Good afternoon.

I have a problem with a PHP foreach loop. I parse an XML file (~20 MB) using SimpleXML and then insert the data into MySQL. The XML contains over 37,000 items, so I must loop 37,000 times to read the data. Every 100 iterations I build a string like: insert into my_table values (...)

But I get a 502 error around the 10,500th iteration. I tried sending the string after the loop, but got the error again. My settings are:

memory_limit = 240
max_execution_time = 500

How can I solve this problem? Thanks and best regards.
I think the problem is that your script is timing out. You can overcome this by calling set_time_limit(0) in your script, or by changing max_execution_time in your php.ini:
while (1) {
    set_time_limit(0);
    // do something
}
You also need to increase your memory_limit by editing your php.ini and restarting your webserver. Read the documentation for set_time_limit().
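For example, raising both limits at the top of the import script might look like the sketch below; the 512M value is just an illustrative choice, tune it to your data:

```php
<?php
// Remove the execution time cap for this script only (0 = no limit).
set_time_limit(0);

// Raise the memory ceiling for this request. ini_set() returns the
// old value on success, or false if the option cannot be changed.
$old = ini_set('memory_limit', '512M');

// ... run the long XML import here ...
```

Note that ini_set() only affects the current request, so it is safer for a one-off import than changing php.ini globally.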
I'd queue the 37,000 items into several batches and process them one after another or asynchronously. I've done this a few times in PHP. A better language for jobs like this would be Python or Ruby on Rails.

However, try creating batches of the items.
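One way to batch the items is to build a single multi-row INSERT per batch instead of one statement per item, which cuts the number of round-trips to MySQL dramatically. Below is a sketch with a hypothetical buildBatchInsert() helper; the quoting via addslashes() is naive and for illustration only, real code should use prepared statements or proper escaping:

```php
<?php
// Hypothetical helper: turn a table name and an array of rows into one
// multi-row INSERT statement. Quoting here is naive and for illustration
// only; use prepared statements in production.
function buildBatchInsert($table, array $rows)
{
    $tuples = array_map(function (array $row) {
        $quoted = array_map(function ($v) {
            return "'" . addslashes($v) . "'";
        }, $row);
        return '(' . implode(',', $quoted) . ')';
    }, $rows);
    return "INSERT INTO $table VALUES " . implode(',', $tuples);
}

// Example: one statement for a whole batch of rows.
$sql = buildBatchInsert('my_table', array(
    array(1, 'foo'),
    array(2, 'bar'),
));
// $sql is: INSERT INTO my_table VALUES ('1','foo'),('2','bar')
```

In the XML loop you would collect rows into an array, and every 100 items send one such statement and reset the array; don't forget to flush the final partial batch after the loop.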
I use this function to convert the strings into CSV format:
<?php
// Convert an array of fields into a single CSV-formatted line.
function convertStrToCsv($data, $delimiter = ';', $enclosure = '"')
{
    ob_start();
    $fp = fopen('php://output', 'w');
    fputcsv($fp, $data, $delimiter, $enclosure);
    fclose($fp);
    return ob_get_clean();
}
… then I save the function's output to a file and finally use this query to load the CSV data into the database:

LOAD DATA LOW_PRIORITY LOCAL INFILE '$file' IGNORE INTO TABLE `$table` CHARACTER SET utf8 FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '\"' ESCAPED BY '\"' LINES TERMINATED BY '\n';
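Putting it together, a minimal sketch of the CSV-then-LOAD-DATA approach could look like this. The rows and temp file are made up for illustration, and the final exec() call is left commented out because it needs a live PDO connection with LOCAL INFILE enabled:

```php
<?php
// Write each row as one CSV line, matching the LOAD DATA options above
// (';' as field separator, '"' as enclosure).
$rows = array(
    array(1, 'first'),
    array(2, 'second;item'),  // a delimiter inside a field gets enclosed
);

$file = tempnam(sys_get_temp_dir(), 'csv');
$fp = fopen($file, 'w');
foreach ($rows as $row) {
    fputcsv($fp, $row, ';', '"');
}
fclose($fp);

$table = 'my_table';
$sql = "LOAD DATA LOW_PRIORITY LOCAL INFILE '$file' IGNORE INTO TABLE `$table` "
     . "CHARACTER SET utf8 FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '\"' "
     . "ESCAPED BY '\"' LINES TERMINATED BY '\n'";

// $pdo->exec($sql);  // requires a PDO connection with local_infile enabled
```

Because the whole batch goes into MySQL in one statement, the PHP side only pays for the file write, which keeps both memory and execution time far below looping over 37,000 single INSERTs.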
Happy coding!