How do I restructure a long-running PHP process so that it doesn't time out? [duplicate]

I have a simple JavaScript function like so:

$(document).ready(function(){
    var request = $.ajax({
        url: "read_images.php",
        type: "GET",
        dataType: "html"
    });
    request.done(function(msg) {
        $("#mybox").html(msg);
        document.getElementById('message').innerHTML = '';
    });
    request.fail(function(jqXHR, textStatus) {
        alert( "Request failed: " + textStatus );
    });
});

The PHP script it calls loops over the contents of a folder, runs some checks, and returns a response. The script is as follows:

//Get all Images from server, store in variable
$server_images = scandir('../images/original');

//Remove first 3 elements, which are not correct
array_shift($server_images);
array_shift($server_images);
array_shift($server_images);

$j = 0;
for($i = 0; $i < count($server_images) && $i < 3000; $i++) {
    $server_image = $server_images[$i];

    //Make sure that the server image does not have a php extension
    if(!preg_match('/.php/', $server_image)) {

        //Select products_id and name from table where the image name is equal to server image name
        $query = "SELECT `name`
                  FROM `images`
                  WHERE `name` = '$server_image'";
        $mro_images = $db->query($query);
        $mro_images_row = $mro_images->fetch();
        $mro_image = $mro_images_row['name'];

        //If no results are found
        if(empty($mro_image)) {
            $images[$j] = $server_image;
            $j++;
        }
    }
}

It works if the loop is restricted to 2000 iterations, but if I try e.g. 3000 iterations the result is:

HTTP/1.1 500 Internal Server Error 31234ms

I've tried increasing the PHP execution limit, but it had no effect; after contacting my host I was told:

Unfortunately in our environment we don't have any way to increase the loadbalancer timeout beyond 30 seconds

Therefore: How can I restructure this code to avoid hitting the execution time limit?

The code below indicates the basic logic to follow. It isn't tested and shouldn't be taken as a drop-in example.

Use a JavaScript loop

Instead of making a slow process slower, write your JavaScript to ask for smaller chunks of data in a loop.

For example, the JS could use a while loop:

$(document).ready(function(){
    var done = false,
        offset = 0,
        limit = 20;

    while (!done) {
        var url = "read_images.php?offset=" + offset + "&limit=" + limit;

        $.ajax({
            async: false,
            url: url
        }).done(function(response) {

            if (response.processed !== limit) {
                // asked to process 20, only processed <=19 - there aren't any more
                done = true;
            }

            offset += response.processed;
            $("#mybox").html("Processed total of " + offset + " records");

        }).fail(function(jqXHR, textStatus) {

            $("#mybox").html("Error after processing " + offset + " records. Error: " textStatus);

            done = true;
        });
    }

});

Note that in the above example the ajax call is forced to be synchronous. Normally you don't want to do this, but here it makes the example easier to write and possibly easier to understand.
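If you would rather keep the requests asynchronous, one alternative (a sketch only, untested, using the same hypothetical offset/limit protocol as above) is to chain them: each completed request starts the next one until the server reports a short chunk:

$(document).ready(function () {
    var offset = 0,
        limit = 20;

    function processChunk() {
        $.ajax({
            url: "read_images.php?offset=" + offset + "&limit=" + limit,
            dataType: "json"
        }).done(function(response) {
            offset += response.processed;
            $("#mybox").html("Processed total of " + offset + " records");

            if (response.processed === limit) {
                processChunk(); // a full chunk came back - ask for the next one
            }
        }).fail(function(jqXHR, textStatus) {
            $("#mybox").html("Error after processing " + offset + " records. Error: " + textStatus);
        });
    }

    processChunk();
});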

Do a fixed amount of work per PHP request

The PHP code also needs modifying to expect and use the GET arguments being passed:

$stuff = scandir('../images/original');

// Read the paging arguments sent by the JavaScript loop, casting to int
$offset = (int) $_GET['offset'];
$limit  = (int) $_GET['limit'];

$server_images = array_slice($stuff, $offset, $limit);

foreach($server_images as $server_image) {
    ...
}
...

$response = array(
    'processed' => count($server_images),
    'message' => 'All is right with the world'
);

header('Content-Type: application/json');
echo json_encode($response);
die;

In this way the amount of work a given PHP request has to do stays fixed even as the overall amount of data grows (assuming the number of files in the directory doesn't grow to impractical numbers).
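The original script also skipped the first few scandir() entries and anything with a .php extension. A sketch (again untested) of doing that filtering before slicing, so that offset and limit always refer to the same filtered list from request to request:

$stuff = scandir('../images/original');

// Drop the "." / ".." entries and any .php files before slicing,
// mirroring the checks in the original loop
$stuff = array_values(array_filter($stuff, function($name) {
    return $name !== '.' && $name !== '..' && !preg_match('/\.php$/', $name);
}));

$server_images = array_slice($stuff, (int) $_GET['offset'], (int) $_GET['limit']);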

If everything works with 2000 iterations but fails at 3000, try raising the time limit to allow PHP to execute longer. Under normal circumstances this is not a good idea, so make sure you know what you are doing and have a good reason for increasing the execution time.

set_time_limit ( 60 );

http://www.php.net/manual/en/function.set-time-limit.php

This could also be due to the script exhausting the available memory. Create a file containing the phpinfo() function and check the value of memory_limit.

<?php phpinfo(); ?>

You can then increase the limit from within the script with ini_set() (or via an .htaccess file). But again, make sure you really want the script to consume more memory, and be careful.

ini_set('memory_limit', '128M'); #change 128 to suit your needs
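If your host runs PHP as an Apache module, the same limits can also be raised from an .htaccess file instead of from the script (this assumes mod_php; under CGI/FastCGI setups these directives are ignored or may trigger an error):

php_value memory_limit 128M
php_value max_execution_time 60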

Also check what count($server_images) actually returns.

If count() returns 0 there is nothing to loop over, so verify that scandir() found what you expect before entering the loop:

//Get all Images from server, store in variable
$server_images = scandir('../images/original');

//Remove first 3 elements, which are not correct
array_shift($server_images);
array_shift($server_images);
array_shift($server_images);

$j = 0;

if(count($server_images) > 0){    
    for($i=0;$i<count($server_images) && $i<3000;$i++) {
      //Do something
    }
}