I'm doing this to stream a CSV file to the browser:
while (ob_end_clean());
$fp = fopen('php://output', 'w');
fputcsv($fp, $header_fields);
foreach ($j as $r) {
    $row = array();
    foreach ($fields as $field) {
        $row[] = $r->$field;
    }
    // stream each row
    fputcsv($fp, $row);
}
fclose($fp);
exit;
My question is: if the CSV contains over 100,000 rows, will this cause an out-of-memory error, or does it use memory efficiently?
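For what it's worth, here is a minimal, self-contained sketch of the same pattern that I used to check memory behavior locally. `row_source()` is a stand-in generator for the real data source (`$j` above), and the field names are made up; the idea is that each row is built, written with `fputcsv()`, and discarded, so peak memory should stay roughly flat no matter how many rows are written:

```php
<?php
// Hypothetical stand-in for the real data source ($j): a generator
// yields one row at a time, so rows are never all in memory at once.
function row_source(int $count): Generator {
    for ($i = 0; $i < $count; $i++) {
        yield ['id' => $i, 'name' => "name_$i"];
    }
}

$fp = fopen('php://output', 'w'); // swap in a file path to test locally
fputcsv($fp, ['id', 'name']);     // header row

foreach (row_source(100000) as $row) {
    fputcsv($fp, $row); // each row is written immediately, then discarded
}
fclose($fp);

// Peak memory should be small and roughly constant regardless of row count
fwrite(STDERR, 'peak: ' . memory_get_peak_usage(true) . " bytes\n");
```

On my understanding, the only thing that would make this balloon is an output buffer capturing the stream (which is what the `while (ob_end_clean());` guard is for) or materializing all rows into an array before the loop.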