Return multiple arrays using a buffer without overloading memory

3

I'm developing an application where, in a report, I need to print around 50,000 rows. When the SELECT is executed, this error occurs:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 24 bytes) in C:\wamp\www\simplaFramework\System\Framework\Model.class.class.php on line 884.

The result of the following query is being stored in an array because I will process it later:

    $data = array();
    $count = 0;
    while ($row = $q->fetch($pdo_fetch)) {
        $data[] = $row;
        ++$count;
    }

My idea was to split the result into 2 or more arrays whenever it reaches a certain amount of memory usage. For example: when usage approaches 128 MB (I get this value with memory_get_usage()), I would divide the result into 2 or more arrays and return them.

But how can I do this? Should I use a Buffer?

asked by anonymous 27.02.2014 / 21:24

4 answers

3

This problem happens because PHP uses buffered queries by default. This means that PHP receives all result rows from the server before the query call returns.

To resolve the problem you should use unbuffered queries. That way PHP returns one row at a time for the application to process, without first storing everything in memory.

$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);
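
With that attribute set, here is a minimal sketch of the rest of the flow, assuming each row can be handled (e.g. written to the report) as soon as it is fetched; processaLinha() is a hypothetical per-row handler and the query is a placeholder:

    $q = $pdo->query('SELECT ...');

    // Each fetch() pulls a single row from the server,
    // so memory usage stays roughly constant regardless of the row count.
    while ($row = $q->fetch(PDO::FETCH_ASSOC)) {
        processaLinha($row); // hypothetical per-row handler
    }

    // Release the result set before running anything else on this connection.
    $q->closeCursor();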

You need to be careful: using unbuffered queries means leaving a transaction open until all results have been returned, which can block or slow access to the tables by other parallel queries.

Another consequence is that with some database drivers you cannot run another query on the same connection until all the results have been fetched.

A workaround to avoid these problems is to use unbuffered queries and save the results to a local file, for example in CSV format, writing the result rows to the end of the file, and then re-read the file to render the results one at a time. This avoids leaving the transaction open and frees the connection to execute other queries while the results are processed.
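
A sketch of that workaround, assuming the unbuffered connection set up above; the query, the temporary file handling and the rendering step are all illustrative:

    // 1) Stream the unbuffered result straight into a temporary CSV file.
    $tmp = tempnam(sys_get_temp_dir(), 'rel');
    $out = fopen($tmp, 'w');
    $q = $pdo->query('SELECT ...');
    while ($row = $q->fetch(PDO::FETCH_ASSOC)) {
        fputcsv($out, $row);
    }
    fclose($out);
    $q->closeCursor(); // the connection is free again

    // 2) Re-read the file one row at a time to render the report.
    $in = fopen($tmp, 'r');
    while (($linha = fgetcsv($in)) !== false) {
        // render $linha...
    }
    fclose($in);
    unlink($tmp);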

28.02.2014 / 06:04
3

Implement pagination in the queries, that is, collect the result set 1,000 rows at a time (for example) using MySQL's LIMIT clause:

SELECT ... LIMIT 0, 1000

SELECT ... LIMIT 1000, 1000

SELECT ... LIMIT 2000, 1000

(etc.)

In my case, I would run a first query just to count the total number of rows, and then divide that total by the number of results per page. Then just apply what I suggested above :)
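
A sketch of that loop, assuming a $pdo connection; the table name, columns and page size are illustrative:

    $porPagina = 1000;
    $total = (int) $pdo->query('SELECT COUNT(*) FROM tabela')->fetchColumn();

    for ($offset = 0; $offset < $total; $offset += $porPagina) {
        $stmt = $pdo->prepare('SELECT * FROM tabela LIMIT :off, :qtd');
        $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
        $stmt->bindValue(':qtd', $porPagina, PDO::PARAM_INT);
        $stmt->execute();

        // Process this page and discard it before fetching the next one.
        foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
            // ...
        }
    }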

27.02.2014 / 21:37
3

I do not understand much about buffers in PHP, but I think the right approach in this case would be to "drop" what is stored in the array and then go get the rest.

I think an interesting option would be to use ajax here: you go to the server, fetch a limited amount of data, print it on the screen and, if the request has not finished completely, go back to the server via ajax and grab the next batch of data.

For this you would use a single flag indicating whether the processing on the server has reached the end or not. If it has not, the client is told to go back to the server and continue from where it left off.

You can even combine this with @rodrigorigotti's hint, dividing the query result into chunks and resuming from where it left off.
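
On the server side this could be just a script that receives the current offset and answers with a chunk plus the flag. A rough sketch, where the script name, chunk size, table and field names are all assumptions:

    // pagina.php - called repeatedly via ajax with ?offset=N
    $porPagina = 500;
    $offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

    $stmt = $pdo->prepare('SELECT * FROM tabela LIMIT :off, :qtd');
    $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
    $stmt->bindValue(':qtd', $porPagina, PDO::PARAM_INT);
    $stmt->execute();
    $linhas = $stmt->fetchAll(PDO::FETCH_ASSOC);

    // The flag tells the client whether to come back for more.
    header('Content-Type: application/json');
    echo json_encode(array(
        'linhas'   => $linhas,
        'terminou' => count($linhas) < $porPagina,
    ));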

27.02.2014 / 21:46
3

If you can break the records up, use LIMIT/OFFSET paging as cited by @rodrigorigotti.

Otherwise, try one of the following approaches.

Possible solutions

Faster:

Increase the memory limit of the script (use only in urgent cases) by putting this code at the beginning of the script:

ini_set('memory_limit', '512M');

FetchAll

Change the code to use fetchAll(). This eliminates the loop with N assignments. Remember that it is necessary to use prepared statements.

$registros = $q->fetchAll();
$total = count($registros);
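
For reference, a minimal prepared-statement version of that fetch; the query and its parameter are illustrative assumptions:

    $q = $pdo->prepare('SELECT * FROM tabela WHERE data >= :inicio');
    $q->execute(array(':inicio' => $inicio));

    $registros = $q->fetchAll();
    $total = count($registros);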
27.02.2014 / 21:54