I have an export function that converts data posted from a form into a CSV file. The issue is that when the number of rows exceeds roughly 500, the CSV output stops short.
My php.ini file shows that my upload_max_filesize
and post_max_size
are both set to 0, which should mean unlimited, so my assumption is that the size of the posted form isn't the issue.
My current code looks like this:
$data = $_POST['ids'];

$ebayMmlSql = 'SELECT `UKM_Make`, `UKM_Model`, `UKM_CCM`, `UKM_Submodel`, `Year`, `UKM_StreetName`
    FROM `ebay_mml`
    WHERE `ePID` = :epid';
$aiSql = 'INSERT INTO `model_epid_matches` (`epid`, `model_id`) VALUES (:epid, :model);';

$rows = array();
$i = 0;

foreach ($data as $key => $ids)
{
    foreach ($ids as $id)
    {
        if (isset($id['epid']) && $id['epid'] !== '') {
            # run select
            $ebayMmlQry = $handler->prepare($ebayMmlSql);
            $ebayMmlQry->execute(array(':epid' => $id['epid']));

            # if insert is true, run insert
            if (isset($_POST['insert']) && $_POST['insert'] == true) {
                if (isset($id['new']) && $id['new'] == true) {
                    $aiQry = $handler->prepare($aiSql);
                    $aiQry->execute(array(':epid' => $id['epid'], ':model' => $id['mod_id']));
                }
            }

            $rows[$key][$i]['action'] = ($i == 0 ? 'Revise' : '');
            $rows[$key][$i]['item_id'] = ($i == 0 ? 'Insert ItemID for '. $key : '');
            $rows[$key][$i]['relationship'] = ($i == 0 ? '' : 'Compatibility');
            $rows[$key][$i]['relationship_details'] = '';

            foreach ($ebayMmlQry->fetchAll(PDO::FETCH_ASSOC) as $k => $value)
            {
                $rows[$key][$i]['relationship_details'] = 'UKM_Make='. $value['UKM_Make'] .'|';
                $rows[$key][$i]['relationship_details'] .= 'UKM_Model='. $value['UKM_Model'] .'|';
                $rows[$key][$i]['relationship_details'] .= 'UKM_CCM='. $value['UKM_CCM'] .'|';
                $rows[$key][$i]['relationship_details'] .= 'UKM_Submodel='. $value['UKM_Submodel'] .'|';
                $rows[$key][$i]['relationship_details'] .= 'Year='. $value['Year'] .'|';
                $rows[$key][$i]['relationship_details'] .= 'UKM_StreetName='. $value['UKM_StreetName'];
            }

            $i++;
        }
    }
    $i = 0;
}

# testing count of rows before gen_csv_file on screen
# $total = 0;
# foreach ($rows as $key => $row)
# {
#     echo '<pre>';
#     var_dump($key. ': ' .count($row));
#     echo '</pre>';
#     $total += count($row);
# }
# var_dump($total);

gen_csv_file($rows, 'ebay.csv');
function gen_csv_file($data, $file)
{
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename='. $file);
    header('Pragma: no-cache');
    header('Expires: 0');

    $output = fopen('php://output', 'w');

    fputcsv(
        $output,
        array(
            'Action(SiteID=UK|Country=GB|Currency=GBP|Version=941)',
            'ItemID',
            'Relationship',
            'RelationshipDetails'
        )
    );

    foreach ($data as $key => $row)
    {
        foreach ($row as $item)
        {
            fputcsv($output, $item);
        }
    }

    fclose($output);
}
I can't seem to get to the root of the issue - maybe a memory issue?
@AymDev linked this question: Warning: Input variables exceeded 1000
It was a good read, however:
1) The highest-voted answer led me to var_dump the count of $_POST, which returned 1 - that doesn't exceed 1000, so I'm not sure why it still fails.
2) The accepted answer suggests setting the Content-Type
header, which I've already done.
I could always set max_input_vars
via ini_set
or .htaccess to avoid changing it on the server ...
Thanks,
The answer was to create a .htaccess file. After @Inazo's comment I checked my logs and saw this:
PHP Warning: Unknown: Input variables exceeded 1000. To increase the limit change max_input_vars in php.ini
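A quick way to confirm which limit is actually in effect (and, later, whether a config change took hold) is to dump it with ini_get() - a minimal sketch:

```php
<?php
// max_input_vars is PHP_INI_PERDIR: PHP parses the request body into
// $_POST *before* any script code runs, so the limit cannot be raised
// at runtime; ini_get() only reports the value currently in effect.
var_dump(ini_get('max_input_vars')); // a default install reports "1000"
```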
I first increased the limit in the .ini file to 25000 and tested the code. Everything worked, but it felt wrong to set it to such an excessive amount. After @AymDev linked another question, I saw a suggestion to use ini_set
or a .htaccess file instead.
Using ini_set('max_input_vars', 25000)
at the top of my script changed nothing - max_input_vars is PHP_INI_PERDIR, and the request has already been parsed into $_POST before the script runs, so it can't be raised at runtime. But using .htaccess like this:
php_value max_input_vars 25000
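(One caveat, as an aside: php_value lines in .htaccess only take effect when PHP runs as an Apache module (mod_php); under CGI/FastCGI or PHP-FPM Apache won't recognise the directive. In those setups the per-directory equivalent is a .user.ini file:)

```ini
; .user.ini - per-directory override for CGI/FastCGI setups
; (picked up on the next request once user_ini.cache_ttl expires, 300s by default)
max_input_vars = 25000
```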
The .htaccess approach worked - so the fix had to happen at server (per-directory) level, not at script level. As for why count of $_POST
returned 1 yet still exceeded 1000: count() only counts the top-level keys of $_POST (here just ids), whereas max_input_vars counts every submitted variable, including each element of the nested arrays.
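To illustrate that last point: count($_POST) sees only top-level keys, while max_input_vars applies to every submitted name=value pair, however deeply nested. A minimal sketch (the 600-row payload below is made up for illustration):

```php
<?php
// Counts leaf values the way max_input_vars does: every scalar submitted
// via the form is one input variable, regardless of array nesting depth.
function count_input_vars(array $data): int
{
    $count = 0;
    foreach ($data as $value) {
        $count += is_array($value) ? count_input_vars($value) : 1;
    }
    return $count;
}

// Simulated $_POST: one top-level key holding 600 nested entries of 2 fields each
$post = ['ids' => array_fill(0, 600, ['epid' => '123', 'mod_id' => '4'])];

echo count($post), "\n";            // 1    - what count($_POST) reports
echo count_input_vars($post), "\n"; // 1200 - what max_input_vars actually limits
```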