I am not very experienced, so I could use some help in troubleshooting this... Kinda stuck right now. :-( Thanks in advance:
I am developing a module for an ecommerce platform. I have a script at www.mystore.com/modules/mymodule/myscript.php
myscript.php has this code in it:
$command = '/do/some/process/ -make_log_files';
echo shell_exec($command);
//here I would like to do some stuff with the log file, now that I know the process has completed.
All this works fine in theory. The process initiated by $command
starts and completes after a while (I can tell by the log files it creates). It should also be noted that there is no evidence that myscript.php accesses any of the mystore files: I logged a message in mystore's most fundamental include files, and I see that message nowhere if I simply exit() before the shell_exec().
Because the process is so slow and takes a long time to finish, when I navigate with my browser to www.mystore.com/modules/mymodule/myscript.php
the browser says "waiting for mystore.com ..." and keeps showing its loading animation.
After a while, however, it will suddenly show the 404 page of mystore.
EDIT: All the following stuff is really useless in this case and I found a much better solution.
I have already added these things to the beginning of myscript.php to try to prevent that from happening:
ini_set('output_buffering', 'off');
ini_set('zlib.output_compression', false);
while (@ob_end_flush());
ini_set('implicit_flush', true);
ob_implicit_flush(true);
header("Content-type: text/plain");
header('Cache-Control: no-cache');
http_response_code(200);
for($i = 0; $i < 1000; $i++)
{
echo ' '; //echo 1000 whitespaces
}
ob_flush();
flush();
Instead of shell_exec I have also tried these other calls, either with the same result, or with them printing an empty array (or similar) before the process has completed.
$process = popen($command, "r");
echo "'$process'; " . gettype($process) . "\n";
$read = fread($process, 2096);
echo $read;
pclose($process);
and
exec($command, $output);
print_r($output);
and
passthru($command);
none did any better.
EDIT: Turns out, because I'm running on FastCGI, this whole "no buffering, send data straight to the browser" approach won't work.
If calling my script will result in a 404 after a certain amount of time, how could I ever make sure my process has completed?
I should note that my command does NOT end with >/dev/null &. That way I am intentionally not putting it in the background.
I have also tried running my script with the command's parameters changed so that the process finishes really quickly. Everything works as expected then.
I realize there are other solutions (a cron job, or JavaScript that checks the log files), but what if I simply want the script to wait for the process to complete? Is that not possible?
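For completeness, the log-file idea mentioned above could be sketched like this (a hedged sketch only: the marker-file convention, function name, and timeout are made up for illustration, and assume the background process creates a marker file when it finishes):

```php
<?php
// Hypothetical helper: wait until a marker/log file appears, with a timeout.
// The background process is assumed to create $marker when it finishes.
function wait_for_marker(string $marker, int $timeout = 10): bool
{
    $start = time();
    while (!file_exists($marker)) {
        if (time() - $start >= $timeout) {
            return false; // gave up; the process is (probably) still running
        }
        sleep(1);         // poll once per second
        clearstatcache(); // don't let PHP's stat cache hide the new file
    }
    return true;          // marker found: the process has completed
}
```

A page could then call wait_for_marker() with a short timeout and tell the user to check back later, instead of blocking the whole request on shell_exec().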
EDIT:
Thanks to drew010's helpful comments I realized the following:
call.php:
exec('php home/usr/mystore.com/modules/mymodule/myscript.php > /dev/null 2>&1 &');
echo 'all things set in motion, you are good to go!';
And now I can simply go to www.mystore.com/modules/mymodule/call.php
which will run the script in the background! And once the script is in the background there are no timeouts, and my PHP code patiently waits for the slow process to be done and then does the rest of its job. :-)
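One detail worth noting about call.php: if the script path ever contains spaces or other shell metacharacters, it should be escaped. A sketch of assembling that command line (the helper name and the '/usr/bin/php' CLI location are assumptions, not part of the original post):

```php
<?php
// Hypothetical helper that assembles the detached command line from call.php.
// '/usr/bin/php' is an assumed location for the CLI binary.
function background_cmd(string $script, string $php = '/usr/bin/php'): string
{
    // escapeshellarg() protects the path; "> /dev/null 2>&1 &" discards all
    // output and detaches the job so exec() returns immediately.
    return $php . ' ' . escapeshellarg($script) . ' > /dev/null 2>&1 &';
}

// Usage, with the path from the original post:
// exec(background_cmd('home/usr/mystore.com/modules/mymodule/myscript.php'));
```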
The 404 error is probably a result of some sort of timeout being reached.
It is common for web servers to have a CGI timeout where they will terminate the request if they do not receive a response from the CGI backend within a certain amount of time.
Since your script was running for some time a timeout was probably reached which terminated the connection.
In the past I've seen this happen: the website shows a 404, but when you look in the server logs you see a 500 error indicating the server didn't receive a response from the backend within the timeout period.
If running the task from the terminal is possible, that is probably a better option, or you can try increasing your timeouts. In any case, I usually like my CGI timeout to be slightly higher than the PHP max_execution_time directive so they are somewhat in sync.
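As an example of that pairing (directive names here assume Apache with mod_fcgid; other FastCGI setups use different ones, and the values are illustrative):

```ini
; php.ini -- PHP gives up first...
max_execution_time = 300
```

```apache
# Apache vhost (mod_fcgid) -- ...and the gateway timeout sits slightly higher
FcgidIOTimeout   330
FcgidBusyTimeout 330
```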
A 404 error should NEVER be the result of a timeout.
Trying to keep a process running for a long time in a webserver is a very bad idea - they are not designed for the job. Developers and administrators will go to great lengths to protect the system against anything which appears to be misbehaving. There are solutions, but they introduce other complications, a full discussion of which goes a bit beyond the scope of a post here.
If a timeout is occurring then it should be returning a 504 error (not a 500 error, although we're getting a bit closer).
The method you think is solving the problem:
exec('php home/usr/mystore.com/modules/mymodule/myscript.php > /dev/null 2>&1 &');
is actually creating more issues: the job may randomly stop, zombie processes may start backing up, Apache worker processes may be blocked...
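For illustration only (this does not remove the fundamental problem above): if you do launch work this way, at least record the PID so a later request can notice when the job has died. The paths, log/PID file locations, and the availability of the posix extension are all assumptions here:

```php
<?php
// Sketch: launch the worker detached, record its PID, check on it later.
// Script path and file locations are made up for illustration.
$cmd = 'nohup php /home/usr/mystore.com/modules/mymodule/myscript.php'
     . ' > /tmp/myscript.log 2>&1 & echo $!';
$pid = (int) trim(shell_exec($cmd));       // $! is the backgrounded job's PID
file_put_contents('/tmp/myscript.pid', $pid);

// A later request can then ask whether the job is still alive
// (posix_kill with signal 0 only tests for existence, sends nothing):
$alive = function_exists('posix_kill') && $pid > 0 && posix_kill($pid, 0);
```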