In Laravel, I have a query that searches thousands of rows in a database, and I'm trying to compile the results into a CSV file. In an attempt to reduce memory usage, I fetch 500 rows at a time and write them out as CSV.
$callback = function () use ($query) {
    $file = fopen('php://output', 'w');
    $query->chunk(500, function ($rows) use ($file) {
        foreach ($rows as $key => $row) {
            fputcsv($file, array_map(...$rows...));
        }
        log_info("Memory used " . memory_get_usage());
    });
    fclose($file);
};
$headers = [ ... ];
return response()->stream($callback, 200, $headers);
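For reference, a fleshed-out version of this might look like the sketch below. The model, relation, column names, and the transform that stands in for the elided array_map call are all hypothetical placeholders; only the chunk-and-stream structure matches the code above.

public function export() // hypothetical controller action, assuming use App\Models\Order;
{
    // Assumed query; the real one is more complex and pulls in related models.
    $query = Order::with('customer')->orderBy('id');

    $callback = function () use ($query) {
        $file = fopen('php://output', 'w');

        // Header row (hypothetical column names).
        fputcsv($file, ['id', 'customer', 'total', 'created_at']);

        $query->chunk(500, function ($rows) use ($file) {
            foreach ($rows as $row) {
                // Flatten each model into scalar values for the CSV.
                fputcsv($file, [
                    $row->id,
                    $row->customer?->name,
                    $row->total,
                    $row->created_at,
                ]);
            }
        });

        fclose($file);
    };

    $headers = [
        'Content-Type'        => 'text/csv',
        'Content-Disposition' => 'attachment; filename="export.csv"',
    ];

    return response()->stream($callback, 200, $headers);
}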
The actual query is a bit more complex and involves fetching related models, which also need to be hydrated. When I run this, it begins generating the CSV file and, after a while, runs out of memory. This is in my log:
(59): Memory used 17208328;
(59): Memory used 25105328;
(59): Memory used 30601328;
...
(59): Memory used 127380496;
(59): Memory used 129352584;
(59): Memory used 131207672;
[2025-11-23 23:50:15] qa.ERROR: Allowed memory size of 134217728 bytes exhausted (tried to allocate 16384 bytes)
What I Tried
I tried putting the following inside the chunk callback, hoping that it would free memory. It had no effect on memory consumption; the sketch after the snippet shows roughly where the calls sat.
unset($rows);
flush();
gc_collect_cycles();
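Concretely, the chunk callback ended up looking roughly like this (a sketch; the row-writing part is unchanged from above):

$query->chunk(500, function ($rows) use ($file) {
    foreach ($rows as $row) {
        // write the row with fputcsv() exactly as in the callback above
    }

    // Clean-up attempts, placed after each chunk is written;
    // none of them changed the numbers reported by memory_get_usage().
    unset($rows);
    flush();
    gc_collect_cycles();

    log_info("Memory used " . memory_get_usage());
});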
Comments:
- Try calling ob_end_clean() before you start streaming the response to see if that makes a difference.
- [...] flush() would release that memory [...]
- When you write to php://output, this writes to the output buffer (keyword being buffer), so if the output exceeds 128MB, it will trip the memory limit (but 128MB is a lot of text).
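To illustrate the suggestion in the comments, clearing and then flushing PHP's output buffers inside the streaming callback could look roughly like this (a sketch, untested against the actual export):

$callback = function () use ($query) {
    // Drop any output buffers Laravel/PHP may have started, so the CSV
    // is sent to the client instead of accumulating in memory.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    $file = fopen('php://output', 'w');

    $query->chunk(500, function ($rows) use ($file) {
        foreach ($rows as $row) {
            // write the row with fputcsv() as above
        }

        // Push whatever has been written so far out to the client.
        flush();
    });

    fclose($file);
};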