dev.horde.org
Comment on [#5913] Lower memory usage while downloading files
> Hi Chuck,
>
> Installed the latest and greatest stable of Horde/Gollem, but couldn't
> really figure out where to place the file.php/ftp.php. I found similar
> files in the Horde VFS directory, but they were of a vastly different
> size, so I haven't been able to test this yet.
>
> However, having had a spare minute to play with this, I did get it
> working with my own function. I've just downloaded a 368MB file
> successfully from Gollem with it. Here's how:
>
> Modified gollem/view.php and added this function:
>
>     function readFileChunked($fileName)
>     {
>         $chunksize = 102400; // how many bytes per chunk
>         $handle = fopen($fileName, 'rb');
>         if ($handle === false) {
>             return false;
>         }
>         while (!feof($handle)) {
>             print fread($handle, $chunksize);
>         }
>         return fclose($handle);
>     }
>
> Then changed this section:
>
>     case 'download_file':
>         $browser->downloadHeaders($filename, null, false, strlen($data));
>         readFileChunked('/exampledir/home/web/' . $filename);
>         /* echo $data; */
>         break;
>
> "/exampledir/home" is where all the user directories are located, and
> "web" is the logged-in user. I didn't know how to get that information
> from Horde/Gollem quickly, so for testing purposes I hardcoded it.
>
> It works a treat. The only reason I still have to keep the PHP
> memory_limit just over the size of the file I'm trying to download is
> this line:
>
>     $data = $GLOBALS['gollem_vfs']->read($filedir, $filename);
>
> which reads the entire file into memory.
>
> So instead of having to set the memory_limit to just over double the
> file size, it now only needs to be just over the size of the file.
>
> I'm not used to working with objects in PHP, so I haven't been able to
> retrieve the directory from the $GLOBALS['gollem_vfs'] object
> (although I could print the array and view it).
> Now if we can get that one line fixed so that the object doesn't read
> the file contents into memory anymore, we'd be laughing.
>
> Cheers! Olger.
>
> > Please give these two commits a shot, assuming that you are using
> > either the file or FTP backend in Gollem:
> >
> > http://lists.horde.org/archives/cvs/Week-of-Mon-20071126/072746.html
> > http://lists.horde.org/archives/cvs/Week-of-Mon-20071126/072745.html
> >
> > I went ahead and used fpassthru because I didn't see anything that
> > indicated that it read the whole file into memory - just that on some
> > older versions it might leak, which is a problem but a different one.
> > :)
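The fpassthru() approach mentioned in the quoted reply can be sketched roughly as below. This is a minimal illustration of streaming a file to the client without buffering it in PHP memory; the `streamDownload()` function and the raw `header()` calls are assumptions for the sketch, not the actual Horde/Gollem API used in the linked commits:

```php
<?php
// Minimal sketch: send a file to the client in a streaming fashion.
// fpassthru() copies the remaining stream straight to the output
// buffer, so PHP never holds the whole file in memory and memory_limit
// no longer has to exceed the file size.
function streamDownload($path, $downloadName)
{
    $size = filesize($path);
    if ($size === false) {
        return false;
    }

    // Plain header() calls stand in for Horde's downloadHeaders() here.
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $downloadName . '"');
    header('Content-Length: ' . $size);

    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    $bytes = fpassthru($handle);
    fclose($handle);
    return $bytes !== false;
}
```

Compared with the chunked fread() loop above, fpassthru() achieves the same constant-memory behavior in a single call, which is presumably why it was chosen in the referenced commits.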
Attachment
Watch this ticket
N
ew Ticket
M
y Tickets
S
earch
Q
uery Builder
R
eports
Saved Queries
Open Bugs
Bugs waiting for Feedback
Open Bugs in Releases
Open Enhancements
Enhancements waiting for Feedback
Bugs with Patches
Enhancements with Patches
Release Showstoppers
Stalled Tickets
New Tickets
Horde 5 Showstoppers