<?xml version="1.0" encoding="UTF-8"?> 
<?xml-stylesheet href="https://dev.horde.org/themes/horde//default/feed-rss.xsl" type="text/xsl"?> 
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"> 
 <channel> 
  <title>Lower memory usage while downloading files</title> 
  <pubDate>Fri, 10 Apr 2026 18:27:28 +0000</pubDate> 
  <link>https://bugs.horde.org/ticket/5913</link> 
  <atom:link rel="self" type="application/rss+xml" title="Lower memory usage while downloading files" href="https://bugs.horde.org/ticket/5913/rss" /> 
  <description>Lower memory usage while downloading files</description> 
 
   
   
  <item> 
   <title>Gollem (probably Horde as an application entirely) needs qui</title> 
    <description>Gollem (probably Horde as an application entirely) needs quite a large memory footprint (PHP memory_limit) for downloading mail attachments or downloading files from Gollem.



I haven&#039;t tested this problem in the latest and greatest of Horde, but all tickets and forums I&#039;ve seen indicate this is no different in the current release.



The problem comes from the fact that the entire file that is about to be downloaded is read entirely into memory (on the server) before it is pushed out to the browser. This basically means that a 200MB file stored in Gollem needs a PHP memory_limit of at least 200MB to be able to download it. If one foreach is used on the array storing the file in memory, the memory_limit doubles.



Getting around this problem is easy: push out the file in chunks instead of all at once. I have no idea how Gollem (or Horde) pushes a file out to a browser, but I&#039;d imagine it would be similar to what I do below.

I think the code below is easy enough to read without me blabbering on about it... It&#039;s code I use myself for a similar sort of function and, as far as I&#039;m concerned, it&#039;s available for Horde to use. I basically created the code myself using code templates and ideas I found in various places on the net.

Using this code, PHP&#039;s memory_limit can be left at default 8MB, and any sized file can be downloaded.



Questions? Do ask!



function ReadFileChunked($FileName) {
    $chunksize = 102400; // how many bytes per chunk
    $handle = fopen($FileName, &#039;rb&#039;);
    if ($handle === false) { return false; }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        print $buffer;
    }
    return fclose($handle);
}

 

$File[&quot;File&quot;] = &quot;200MBfiletobedownloaded.zip&quot;;
$File[&quot;Size&quot;] = filesize($File[&quot;File&quot;]);
header(&quot;Content-Type: application/force-download;&quot;);
header(&quot;Content-Disposition: attachment; filename=\&quot;&quot;.$File[&quot;File&quot;].&quot;\&quot;&quot;);
header(&quot;Content-Length: &quot;.$File[&quot;Size&quot;]);
ReadFileChunked($File[&quot;File&quot;]);
exit;

</description> 
   <pubDate>Thu, 22 Nov 2007 00:12:25 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t38907</link> 
  </item> 
   
  <item> 
   <title>Thanks for the ticket. The challenge in Gollem is we&#039;re very</title> 
   <description>Thanks for the ticket. The challenge in Gollem is we&#039;re very often not reading static files - we&#039;re reading from a database, or from an FTP server, or an SMB share, or ...



In any case, we can take another look at this, but I&#039;m caught up in Thanksgiving stuff here in the U.S. for now, so it&#039;ll be a few days at least.</description> 
   <pubDate>Thu, 22 Nov 2007 06:41:32 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t38910</link> 
  </item> 
   
  <item> 
   <title>Ah, I think I can wait a few days... ;-)



Looking at the s</title> 
   <description>Ah, I think I can wait a few days... ;-)



Looking at the sort of files that are being stored in Gollem (and the same goes for attachments as well), those wouldn&#039;t dynamically change size. If they do, something is probably wrong. But the code can easily be expanded to include some file checks every time it reads a chunk.
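
The per-chunk check mentioned above could look roughly like this. A sketch only, not Gollem code; the function name and the stop-on-change policy are my own assumptions:

```php
<?php
// Sketch of the per-chunk file check suggested above. Illustrative only:
// the function name and the stop-on-change policy are assumptions.
function ReadFileCheckedChunked($FileName, $chunksize = 102400)
{
    $expected = filesize($FileName);
    $handle = fopen($FileName, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        // Force a fresh stat so a size change is actually noticed.
        clearstatcache();
        if (filesize($FileName) !== $expected) {
            // The file changed underneath us; stop rather than send garbage.
            fclose($handle);
            return false;
        }
        print fread($handle, $chunksize);
    }
    return fclose($handle);
}
?>
```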



Cheers! Enjoy Thanksgiving!

Olger.</description> 
   <pubDate>Thu, 22 Nov 2007 07:03:52 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t38911</link> 
  </item> 
   
  <item> 
   <title>As chuck says, this is entirely undoable if using a backend </title> 
    <description>As Chuck says, this is entirely undoable when using a backend like FTP, since PHP&#039;s FTP get functions don&#039;t allow us to chunk data from the FTP server. At a minimum, this kind of block/chunk reading needs to be put in the VFS drivers, not Gollem.</description> 
   <pubDate>Fri, 23 Nov 2007 07:20:12 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t38932</link> 
  </item> 
   
  <item> 
   <title>This probably gets much easier and more efficient if we can </title> 
   <description>This probably gets much easier and more efficient if we can switch to streams for some backends in Horde 4.</description> 
   <pubDate>Fri, 23 Nov 2007 11:04:01 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t38935</link> 
  </item> 
   
  <item> 
   <title>Michael and Jan&#039;s comments are right on the money, hopefully</title> 
   <description>Michael and Jan&#039;s comments are right on the money, hopefully helping Olger see some of the complexities here.



Olger, what does this refer to?



&gt; Looking at the sort of files that are being stored in Gollem (and the same goes for 

&gt; attachments as well), those wouldn&#039;t dynamically change size.



I&#039;m not sure what dynamically changing size would have to do with this.</description> 
   <pubDate>Fri, 23 Nov 2007 23:40:49 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t38969</link> 
  </item> 
   
  <item> 
   <title>Hi Guys,



&gt;&gt; Looking at the sort of files that are being s</title> 
   <description>Hi Guys,



&gt;&gt; Looking at the sort of files that are being stored in Gollem (and 

&gt;&gt; the same goes for

&gt;&gt; attachments as well), those wouldn&#039;t dynamically change size.

&gt;

&gt; I&#039;m not sure what dynamically changing size would have to do with this.



That was a comment on Chuck&#039;s remark about often not reading static files.



But, it&#039;s really not that hard (from my point of view). Right now (or so it would appear), the entire file is read into memory from whatever source (DB, FTP, SMB, etc.), causing memory problems with large files.

As opposed to reading the file into RAM, why not copy it to local disk and then send it to the client in chunks, saving heaps of memory?

The downside of having to use so much memory is that your webserver will start swapping to disk sooner. In general, clients sit on connections that are way slower than average disks can read, so speed is not an issue here.

A server with one GB of RAM will happily serve hundreds of people a 250MB file when this is done in chunks, whereas it&#039;ll choke on two users if those files are read into memory first. 
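
The copy-to-disk idea could be sketched like this, assuming the backend can hand over any readable stream. spoolAndSend is a made-up name, not Horde or VFS API:

```php
<?php
// Hypothetical sketch of the spool-to-disk idea: copy the backend stream
// to a temp file, then push that file out in chunks. Nothing here is
// Horde/VFS API; the names are illustrative only.
function spoolAndSend($source, $chunksize = 102400)
{
    $tmp = tmpfile();
    if ($tmp === false) {
        return false;
    }
    // stream_copy_to_stream() copies in internal chunks, so the whole
    // file never sits in PHP memory at once.
    stream_copy_to_stream($source, $tmp);
    rewind($tmp);
    while (!feof($tmp)) {
        print fread($tmp, $chunksize);
    }
    return fclose($tmp); // tmpfile() removes the spool file on close
}
?>
```

tmpfile() is handy here because the spool file is deleted automatically when the handle is closed.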



BTW, I live in Australia, so you might not get quick responses when you guys are half way through the day... :-)



Cheers! Olger.</description> 
   <pubDate>Sun, 25 Nov 2007 22:35:40 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39009</link> 
  </item> 
   
  <item> 
   <title>&gt;&gt; I&#039;m not sure what dynamically changing size would have to</title> 
   <description>&gt;&gt; I&#039;m not sure what dynamically changing size would have to do with this.

&gt;

&gt; That was a comment on Chuck&#039;s remark about often not reading static files.



I meant files on the local filesystem, vs. remote files, not often-changing files.



&gt; But, it&#039;s really not that hard (from my point of view). Right now, 

&gt; (it would appear anyhows) the entire file is read into memory, from 

&gt; whatever source (DB, ftp, smb, etc). Causing memory problems with 

&gt; large files.

&gt; As opposed to reading the file into RAM, why not copy it to local 

&gt; disk and then sending it to the client in chuncks, saving heaps on 

&gt; memory.



As long as you can create that local copy without reading the file into RAM in the first place, fine. Works for FTP, but SQL is much harder. Etc. You won&#039;t get any argument that lower memory usage is better - but I think you&#039;ll have a better appreciation for things if you _do_ look at the Gollem and VFS code.</description> 
   <pubDate>Sun, 25 Nov 2007 23:02:27 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39012</link> 
  </item> 
   
  <item> 
   <title>Btw, you didn&#039;t post the code you mentioned yet, but is it r</title> 
   <description>Btw, you didn&#039;t post the code you mentioned yet, but is it really necessary to use chunked reads, or will readfile (http://www.php.net/readfile) or fpassthru do the right thing? In a way fpassthru would be ideal because we can return a stream from the VFS library in some backends, or fake it with a local file.</description> 
   <pubDate>Sun, 25 Nov 2007 23:04:41 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39015</link> 
  </item> 
   
  <item> 
   <title>Hi Chuck,



I posted my code in the initial ticket, but jus</title> 
   <description>Hi Chuck,



I posted my code in the initial ticket, but just in case you can&#039;t see it for some reason, here it is again (slightly modified to make it easier to read/use).

All you&#039;d basically do is call the second function with the filename, and the file should be sent to the browser in chunks.



function ReadFileChunked($FileName) {
    $chunksize = 102400; // how many bytes per chunk
    $handle = fopen($FileName, &#039;rb&#039;);
    if ($handle === false) { return false; }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        print $buffer;
    }
    return fclose($handle);
}

function SendFileToBrowser($FileName) {
    header(&quot;Content-Type: application/force-download;&quot;);
    header(&quot;Content-Disposition: attachment; filename=\&quot;&quot;.$FileName.&quot;\&quot;&quot;);
    header(&quot;Content-Length: &quot;.filesize($FileName));
    ReadFileChunked($FileName);
    exit;
}



&gt; Btw, you didn&#039;t post the code you mentioned yet, but is it really 

&gt; necessary to use chunked reads, or will readfile 

&gt; (http://www.php.net/readfile) or fpassthru do the right thing? In a 

&gt; way fpassthru would be ideal because we can return a stream from the 

&gt; VFS library in some backends, or fake it with a local file.



Did a bit of reading up on fpassthru, and you might want to look at the first comment on this page: http://au.php.net/manual/en/function.fpassthru.php.

There are also various other comments about memory usage when downloading files on that page.
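
For comparison, the fpassthru approach would look something like this minimal sketch, assuming a backend could hand back a stream. sendStream is a hypothetical helper, not Horde code:

```php
<?php
// Minimal sketch of the fpassthru alternative discussed above.
// $stream stands in for whatever handle a backend might return;
// sendStream is a hypothetical name, not Horde/VFS API.
function sendStream($stream)
{
    if (!is_resource($stream)) {
        return false;
    }
    // fpassthru() echoes from the current position to EOF without
    // building the whole file up in a PHP string first.
    $bytes = fpassthru($stream);
    fclose($stream);
    return $bytes;
}
?>
```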



Cheers! Olger.</description> 
   <pubDate>Sun, 25 Nov 2007 23:27:53 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39016</link> 
  </item> 
   
  <item> 
   <title>Please give these two commits a shot, assuming that you are </title> 
   <description>Please give these two commits a shot, assuming that you are using either the file or FTP backend in Gollem:



http://lists.horde.org/archives/cvs/Week-of-Mon-20071126/072746.html

http://lists.horde.org/archives/cvs/Week-of-Mon-20071126/072745.html



I went ahead and used fpassthru because I didn&#039;t see anything that indicated that it read the whole file into memory - just that on some older versions it might leak, which is a problem but a different one. :)</description> 
   <pubDate>Thu, 29 Nov 2007 05:20:14 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39123</link> 
  </item> 
   
  <item> 
   <title>Hi Chuck,



Installed the latest and greatest stable of Hor</title> 
   <description>Hi Chuck,



Installed the latest and greatest stable of Horde/Gollem, but couldn&#039;t really figure out where to place the file.php/ftp.php. I found similar files in the Horde VFS directory, but they were of a vastly different size. So I haven&#039;t been able to test this yet.

However, having had a spare minute to play with this, I did get it working with my own function. I&#039;ve just successfully downloaded a 368MB file from Gollem with my own function. Here&#039;s how:



Modified gollem/view.php:

Added function

function ReadFileChunked($FileName) {
    $chunksize = 102400; // how many bytes per chunk
    $handle = fopen($FileName, &#039;rb&#039;);
    if ($handle === false) { return false; }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        print $buffer;
    }
    return fclose($handle);
}



Then changed this section:

case &#039;download_file&#039;:
    $browser-&gt;downloadHeaders($filename, null, false, strlen($data));
    ReadFileChunked(&quot;/exampledir/home/web/&quot;.$filename);
    /* echo $data; */
    break;



&quot;/exampledir/home&quot; is where all the userdirectories are located, &quot;web&quot; is the logged in user. I didn&#039;t know how to get that information from Horde/Gollem quickly, so for testing purposes I hardcoded it.



But it works a treat. The only reason I still have to keep the PHP memory_limit just over the filesize I&#039;m trying to download is because of this line:

$data = $GLOBALS[&#039;gollem_vfs&#039;]-&gt;read($filedir, $filename);

Which reads the entire file into memory.



But as opposed to having to set the memory_limit to just over double the file size, it now needs to be just over the size of the file.

I&#039;m not used to working with objects in PHP, so haven&#039;t been able to retrieve the directory from the $GLOBALS[&#039;gollem_vfs&#039;] object (although I could print the array and view it).

Now if we can get that one line fixed so that the object doesn&#039;t read the file contents into memory anymore, we&#039;d be laughing.



Cheers! Olger.



&gt; Please give these two commits a shot, assuming that you are using 

&gt; either the file or FTP backend in Gollem:

&gt;

&gt; http://lists.horde.org/archives/cvs/Week-of-Mon-20071126/072746.html

&gt; http://lists.horde.org/archives/cvs/Week-of-Mon-20071126/072745.html

&gt;

&gt; I went ahead and used fpassthru because I didn&#039;t see anything that 

&gt; indicated that it read the whole file into memory - just that on some 

&gt; older versions it might leak, which is a problem but a different one. 

&gt; :)

</description> 
   <pubDate>Thu, 29 Nov 2007 23:33:46 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39164</link> 
  </item> 
   
  <item> 
   <title>OK, fixed it myself. Figured out how to use objects (again, </title> 
   <description>OK, fixed it myself. Figured out how to use objects (again, used to program in Turbo Pascal with objects loooong time ago...), and resolved my issue with the hardcoded pathname.

Also fixed my memory issue while I was at it. PHP memory_limit now set to 16MB, and downloading 368MB file. Sweet.



Still in the same file (gollem/view.php). This is what I did:

/* $data = $GLOBALS[&#039;gollem_vfs&#039;]-&gt;read($filedir, $filename);
if (is_a($data, &#039;PEAR_Error&#039;)) {
    $notification-&gt;push(sprintf(_(&quot;Access denied to %s&quot;), $filename), &#039;horde.error&#039;);
    header(&#039;Location: &#039; . Util::addParameter(Horde::applicationUrl(&#039;manager.php&#039;, true), &#039;actionID&#039;, $actionID));
    exit;
} */

Commented this part out completely as it read the file into memory, stuffing it up.



/* Run through action handlers. */
switch ($actionID) {
case &#039;download_file&#039;:
    $File_Dir  = $GLOBALS[&#039;gollem_vfs&#039;]-&gt;_getNativePath($filedir, $filename);
    $File_Size = filesize($File_Dir);
    $browser-&gt;downloadHeaders($filename, null, false, $File_Size);
    ReadFileChunked($File_Dir);
    /* echo $data; */
    break;



Let me know what you think!



Cheers! Olger.</description> 
   <pubDate>Fri, 30 Nov 2007 00:01:22 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39165</link> 
  </item> 
   
  <item> 
   <title>I know that you are trying to be helpful and I appreciate th</title> 
   <description>I know that you are trying to be helpful and I appreciate that. But at some point you need to start to understand the codebase that you are working with.



What your changes do is make sure that Gollem can&#039;t work with any backend except a local filesystem. I hope it&#039;s obvious that that&#039;s not an acceptable change.



My commits were to the CVS version of Horde and Gollem (Horde is at 3.2-RC1, Gollem will be released as Gollem 1.1 once Horde 3.2 is released - see http://wiki.horde.org/ReleaseManagement). If you are interested in helping us in a way that can be incorporated into the code and that will help all Gollem users, you need to test that version as well. You can get snapshots from http://snaps.horde.org/.



You should be able to just drop in the new file.php and view.php files as well, but since I&#039;m not sure which version you mean by &quot;latest and greatest&quot;, I can&#039;t guarantee that. Horde 3.2 is backwards compatible with Horde 3.0, though.</description> 
   <pubDate>Fri, 30 Nov 2007 00:08:11 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39168</link> 
  </item> 
   
  <item> 
   <title>Hi Chuck,



I&#039;m more than happy to help make it work for ev</title> 
   <description>Hi Chuck,



I&#039;m more than happy to help make it work for every filesystem. It took me a while to figure out how to get files out of the CVS system, but I&#039;ve got the files you modified. (BTW, running Horde 3.1.5 and Gollem H3 1.0.3 for testing.)

Tested your modifications, but no joy. Tried downloading a 380MB file with various memory settings, all the way up to an 800MB memory_limit. It keeps eating the memory.



The thing is though, in the gollem/view.php file there is a line:

         $data = $GLOBALS[&#039;gollem_vfs&#039;]-&gt;read($filedir, $filename);

which reads everything that&#039;s returned from the &quot;read ( , )&quot; function into a variable (in memory, which is a local resource!).

In the function &quot;read( , )&quot; you use another variable to read the file into, so basically you&#039;re doubling the memory needed to be able to download a file.



Memory is a local resource. It doesn&#039;t care if the data came from SQL, FTP, or SMB; it&#039;s stored in local memory on the machine serving the browser. As opposed to copying the entire file into memory, why not copy it to disk (a temp directory comes to mind)? (I&#039;ve used a hardcoded &#039;/tmp/&#039; for now, but it&#039;s easy enough to get the system variable.)

Using my previous ReadFileChunked function, that would be easy and compatible with any backend. It would need some additional checking I suppose, but this illustrates the basic idea (gollem/view.php), see attachment.



For some reason it doesn&#039;t copy the file from the VFS to the local filesystem, but you may know why.



I think it gets the basic idea across.



Cheers! Olger.





&gt; I know that you are trying to be helpful and I appreciate that. But 

&gt; at some point you need to start to understand the codebase that you 

&gt; are working with.

&gt;

&gt; What your changes do is make sure that Gollem can&#039;t work with any 

&gt; backend except a local filesystem. I hope it&#039;s obvious that that&#039;s 

&gt; not an acceptable change.

&gt;

&gt; My commits were to the CVS version of Horde and Gollem (Horde is at 

&gt; 3.2-RC1, Gollem will be released as Gollem 1.1 once Horde 3.2 is 

&gt; released - see http://wiki.horde.org/ReleaseManagement). If you are 

&gt; interested in helping us in a way that can be incorporated into the 

&gt; code and that will help all Gollem users, you need to test that 

&gt; version as well. You can get snapshots from http://snaps.horde.org/.

&gt;

&gt; You should be able to just drop in the new file.php and view.php 

&gt; files as well, but since I&#039;m not sure which version you mean by 

&gt; &quot;latest and greatest&quot;, I can&#039;t guarantee that. Horde 3.2 is backwards 

&gt; compatible with Horde 3.0, though.

</description> 
   <pubDate>Tue, 04 Dec 2007 00:23:04 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39309</link> 
  </item> 
   
  <item> 
   <title>&gt; I&#039;m more than happy to help make it work for every filesys</title> 
   <description>&gt; I&#039;m more than happy to help make it work for every filesystem. Took 

&gt; me a while to figure out how to get files out of the cvs system 

&gt; though, but I&#039;ve got the files you modified. (btw, running Horde 

&gt; 3.1.5 and Gollem H3 1.0.3 for testing).

&gt; Tested your modifications



No, you didn&#039;t test the changes to Gollem. You need the VFS updates and the Gollem changes.



&gt; The thing is though, in the gollem/view.php file there is a line:

&gt;          $data = $GLOBALS[&#039;gollem_vfs&#039;]-&gt;read($filedir, $filename);



http://cvs.horde.org/diff.php?r1=1.60&amp;r2=1.61&amp;f=gollem%2Fview.php



Your view.php looks nothing like the current one in Gollem CVS. See http://horde.org/source/ if you still need help with CVS.



&gt; For some reason it doesn&#039;t copy the file from the VFS to the local 

&gt; filesystem, but you may know why.



... because you&#039;re still accessing the VFS. copy() doesn&#039;t work the way you&#039;re trying to use it. To copy to a _different_ filesystem (virtual or otherwise) you need to read the file data.</description> 
   <pubDate>Tue, 04 Dec 2007 05:07:00 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39314</link> 
  </item> 
   
  <item> 
   <title>Is it possible to get a zipped up file of the development co</title> 
   <description>Is it possible to get a zipped up file of the development code (like including the code you put in there) from the cvs? Like just one zipfile for Horde and one for Gollem that I can simply download and unzip?

I tried using the files from the snapshots, but they don&#039;t seem to have the files I&#039;m after.

I haven&#039;t used CVS before (I&#039;m usually a lone programmer, so haven&#039;t had much need for it) and find it very illogical. Tried getting a CVS client but that didn&#039;t help much either.

</description> 
   <pubDate>Tue, 04 Dec 2007 06:44:50 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39323</link> 
  </item> 
   
  <item> 
   <title>&gt; Is it possible to get a zipped up file of the development </title> 
   <description>&gt; Is it possible to get a zipped up file of the development code (like 

&gt; including the code you put in there) from the cvs? Like just one 

&gt; zipfile for Horde and one for Gollem that I can simply download and 

&gt; unzip?

&gt; I tried using the files from the snapshots, but they don&#039;t seem to 

&gt; have to files I&#039;m after.



That&#039;s exactly what the snapshots are. Perhaps you&#039;re not using the right snapshots. From last night&#039;s, you want:



http://ftp.horde.org/pub/snaps/latest/framework-HEAD-2007-12-04.tar.gz

http://ftp.horde.org/pub/snaps/latest/gollem-HEAD-2007-12-04.tar.gz



&gt; I haven&#039;t used cvs before (am usually a lone programmer, so haven&#039;t 

&gt; had much need for it)



You&#039;re missing the point of a version control system then, but that&#039;s a separate issue. :)</description> 
   <pubDate>Tue, 04 Dec 2007 16:34:27 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39362</link> 
  </item> 
   
  <item> 
   <title>OK, lack of time prevented me from going further on this.

T</title> 
   <description>OK, lack of time prevented me from going further on this.

Tried the modifications. It didn&#039;t work so I took a look at view.php and modified it as follows:



case &#039;download_file&#039;:
    $browser-&gt;downloadHeaders($filename, null, false, $GLOBALS[&#039;gollem_vfs&#039;]-&gt;size($filedir, $filename));
    if (is_resource($stream)) {
        while ($buffer = fread($stream, 10240)) {
            print $buffer;
            ob_flush(); flush();
            usleep(50000);
        }
    } else {
        echo $data;
    }
    /* if (is_resource($stream)) {
        fpassthru($stream);
    } else {
        echo $data;
    } */
    break;



That works for me with a 16MB memory_limit downloading a 61MB file. The fpassthru still gobbles up more memory than is desirable. ob_flush() and flush() are necessary to clear the internal buffers of the webserver and PHP, and the usleep(50000) (.05 seconds) allows a bit of time for the buffers to actually be flushed before filling them again.
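
The flush-per-chunk loop above can be pulled out into a standalone function. A sketch under my own naming, not the Gollem code, with the ob_flush() call guarded for contexts that have no output buffer:

```php
<?php
// Sketch of the chunked passthru loop described above, with the
// ob_flush()/flush() calls guarded for contexts without an output
// buffer. The name passthruChunked is illustrative only.
function passthruChunked($stream, $chunksize = 8192)
{
    $sent = 0;
    while (!feof($stream)) {
        $buffer = fread($stream, $chunksize);
        if ($buffer === false) {
            break;
        }
        print $buffer;
        $sent += strlen($buffer);
        // Hand each chunk to the web server right away so PHP never
        // holds more than one chunk in memory.
        if (ob_get_level() > 0) {
            ob_flush();
        }
        flush();
    }
    return $sent;
}
?>
```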

Hows that?



Cheers! Olger.</description> 
   <pubDate>Tue, 11 Dec 2007 00:18:49 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39694</link> 
  </item> 
   
  <item> 
   <title>I made it 8192 bytes without the usleep. I guess we could ma</title> 
   <description>I made it 8192 bytes without the usleep. I guess we could make it 4mb and maybe get in under the default 8mb memory limit? I didn&#039;t include the usleep because you didn&#039;t have it in your original code and it didn&#039;t &quot;smell&quot; right to me (won&#039;t work on windows either).



Thanks for all your work on this! I&#039;m closing the ticket but things can still be tweaked of course.</description> 
   <pubDate>Tue, 11 Dec 2007 05:03:16 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39697</link> 
  </item> 
   
  <item> 
   <title>Hehe, I&#039;m happy with 8mb memory_limit... I agree that not us</title> 
   <description>Hehe, I&#039;m happy with 8mb memory_limit... I agree that not using usleep is better, even if it uses just .05 seconds. Windows? What&#039;s that...? ;-)

Glad to be of help! When will the next stable release see the light?



Cheers! Olger.



&gt; I made it 8192 bytes without the usleep. I guess we could make it 4mb 

&gt; and maybe get in under the default 8mb memory limit? I didn&#039;t include 

&gt; the usleep because you didn&#039;t have it in your original code and it 

&gt; didn&#039;t &quot;smell&quot; right to me (won&#039;t work on windows either).

&gt;

&gt; Thanks for all your work on this! I&#039;m closing the ticket but things 

&gt; can still be tweaked of course.

</description> 
   <pubDate>Tue, 11 Dec 2007 05:19:09 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39701</link> 
  </item> 
   
  <item> 
   <title>&gt; When will the next stable release see the light?



All of</title> 
   <description>&gt; When will the next stable release see the light?



All of these changes will be in the Horde 3.2 release series - see http://wiki.horde.org/ReleaseManagement for some details. The VFS changes were in the latest Horde 3.2 release candidate, and the Gollem changes will be in Gollem 1.1, to be released after Horde 3.2 is out.</description> 
   <pubDate>Tue, 11 Dec 2007 19:28:54 +0000</pubDate> 
   <link>https://bugs.horde.org/ticket/5913#t39720</link> 
  </item> 
   
   
 
 </channel> 
</rss> 
