I have an XML file containing information I want to store or update in a database. My server redirects me to the previous page if the script doesn't finish executing within 30 seconds (I changed max_execution_time, but it didn't help).
I want to split the file into multiple arrays and send them over AJAX to be processed in separate requests, hoping to shorten the execution time of each one.
The file contains 38k rows, and in 30 seconds I can add about 6,700 new objects to the DB or update about 3,800 existing ones.
So is there a way to do this? I'm very new to AJAX, so I don't even know where to start looking for a solution.
EDIT1:
<?php
$time = microtime(TRUE);

// Load the feed and flatten it into a plain PHP array (SimpleXML -> JSON -> array)
$xml   = simplexml_load_string(file_get_contents($feed));
$json  = json_encode($xml);
$array = json_decode($json, TRUE);
$array = $array['Row'];

set_time_limit(0);
ini_set('memory_limit', '4000M');
//echo ini_get('max_execution_time');
//die();

$new      = 0;
$existent = 0;

foreach ($array as $produs)
{
    $prod = Products::model()->findByAttributes(array('cod' => $produs['ProductId']));

    if (!$prod)
    {
        // Product does not exist yet: create it
        $prod           = new Products;
        $prod->cod      = $produs['ProductId'];
        $prod->price    = $produs['PriceSRP'];
        $prod->name     = $produs['Name'];
        $prod->furnizor = 'ABCData';
        $prod->brand    = $produs['HierarchyNameLevel1'];
        //$prod->stock  = $produs['Available'];
        // 'Da'/'Nu' (yes/no) are the values stored in the stock column
        $prod->stock    = ($produs['Available'] == "+") ? 'Da' : 'Nu';

        $prod->category = $prod->getCategory($produs['MinorGroup'], 'ABC');
        if (!$prod->category)
            continue;

        if (!$prod->save())
        {
            echo '<pre>';
            var_dump($prod->getErrors());
            echo '</pre>';
        }
        else { $new++; }
    }
    else
    {
        // Product already exists: update it
        $prod->brand     = $produs['HierarchyNameLevel1'];
        $prod->price     = $produs['PriceSRP'];
        $prod->last_edit = date('Y-m-d H:i:s');
        $prod->stock     = ($produs['Available'] == "+") ? 'Da' : 'Nu';

        if (!$prod->save())
        {
            echo '<pre>';
            var_dump($prod->getErrors());
            echo '</pre>';
        }
        else { $existent++; }
    }
}

echo 'added ' . $new . ' and updated ' . $existent . ' products in ';
echo (microtime(TRUE) - $time) . ' seconds!';
?>
It appears I may have been unclear in my initial post, so: this is my existing code. The $feed file has 38k items in it that I need to process, adding new DB entries or updating existing ones.
If I run the full 38k file, after 30 seconds the browser performs a history.back() triggered by the Apache server. I would have liked to process the file from crond, handling for example one entry every second, but that is impossible since I have no access to crond on that particular server. I've tried splitting the file up manually and it works perfectly fine for ~6,700 new entries or 3,500-4,000 existing ones (since it has to find them, load them, update them and save them).
So my initial question was whether it is possible to do this over AJAX, so the server won't stop the script from executing if it runs longer than 30 seconds (I don't even know whether the server will treat the AJAX call as a new request, with the existing script not waiting for it to respond).
var array = $('.def-mask :checkbox:checked').serialize();

$.ajax({
    url: 'ajax/battle.php',
    type: 'post',
    data: { playerReady: 1, attack: attack, defence: array },
    success: function(data) {
        alert(data);
    }
});
I would save the XML file in a temp folder, then make an AJAX request that runs the import from a specific offset for (e.g.) 100 records:
function processScript(offset) {
    $.ajax({
        type: "POST",
        url: "some.php",
        data: { offset: offset },
        dataType: 'json',
        success: function(data) {
            // dataType 'json' means jQuery has already parsed the response
            if (data.offset > 0) {
                processScript(data.offset);
            }
        }
    });
}

processScript(0);
In some.php
you would want to return a JSON object with a property 'offset' containing the offset of the next block of elements to process. When the whole XML file has been processed, set offset to 0.
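A minimal sketch of what some.php could look like, assuming the feed has already been saved to a temp file. The temp path, the chunk size of 100 and the processRow() helper are placeholders; processRow() stands for the add-or-update logic you already have in your question:

<?php
// Sketch of some.php - assumes the XML feed was already saved to a temp file.
// CHUNK_SIZE, the temp path and processRow() are illustrative placeholders.
define('CHUNK_SIZE', 100);

$offset = isset($_POST['offset']) ? (int) $_POST['offset'] : 0;

// Re-load the saved feed and flatten it the same way as in the original script
$xml   = simplexml_load_string(file_get_contents('/tmp/feed.xml'));
$rows  = json_decode(json_encode($xml), TRUE);
$rows  = $rows['Row'];
$total = count($rows);

// Process only the rows belonging to the current chunk
$slice = array_slice($rows, $offset, CHUNK_SIZE);
foreach ($slice as $produs) {
    processRow($produs); // your existing add-or-update logic goes here
}

// Tell the client where to continue; 0 means the whole file is done
$next = $offset + CHUNK_SIZE;
if ($next >= $total) {
    $next = 0;
}

header('Content-Type: application/json');
echo json_encode(array(
    'offset'    => $next,
    'processed' => min($offset + CHUNK_SIZE, $total),
    'total'     => $total,
));
?>

Because each chunk is a separate request, every call gets a fresh 30-second window, which is the whole point of splitting the work this way.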
The above code is enough to get you started. You will also want to do some sort of error checking in the success
function, as well as give the user a progress notification (e.g. "3,600 of 38,000 lines processed").
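For example, a rough sketch of how the callback could be extended, assuming some.php also returns the 'processed' and 'total' counts as in the sketch above (the #progress element is a placeholder for whatever you use to display status):

function processScript(offset) {
    $.ajax({
        type: "POST",
        url: "some.php",
        data: { offset: offset },
        dataType: 'json',
        success: function(data) {
            // Show progress, assuming the server returns 'processed' and 'total'
            $('#progress').text(data.processed + ' of ' + data.total + ' lines processed');
            if (data.offset > 0) {
                processScript(data.offset);
            }
        },
        error: function(xhr, status, err) {
            // Basic error reporting; you might also retry the same offset here
            alert('Import failed at offset ' + offset + ': ' + status);
        }
    });
}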