I'm testing a complex project involving several PHP files that call each other via cURL. I run the main function 15 times, but it only completes 4 iterations and then fails with "Maximum execution time of 30 seconds exceeded". When I look at the database, I see that the operations were executed exactly 4 times, every time I run it. What could it be? The error points at the line where I call the other PHP file: $data = curl_exec($ch);
Check the max_execution_time setting in php.ini and adjust it. Values like 120 or 240 seconds are still reasonable, but do NOT be tempted to set it to 0 to disable the limit entirely. Instead, fix your script, e.g. by caching the remote responses instead of fetching them on every run. In general I expect your design to be broken: connecting your own scripts to each other via cURL indicates that you either need a proper API or there is some other design issue.
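As an illustration of the caching idea, here is a minimal sketch; the helper name fetchCached, the temp-file cache location, and the 300-second lifetime are assumptions for the example, not part of the original answer:

function fetchCached(string $url, int $ttl = 300)
{
    // Hypothetical helper: serve a cached copy while it is fresh,
    // so repeated runs do not hit the remote script every time.
    $cacheFile = sys_get_temp_dir() . '/curl_' . md5($url) . '.cache';
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10); // fail fast instead of hanging
    $data = curl_exec($ch);
    curl_close($ch);

    if ($data !== false) {
        file_put_contents($cacheFile, $data); // refresh the cache
    }
    return $data;
}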
Someone else has already covered max_execution_time, so here are some general thoughts:
At the start of your PHP function, set a time limit for it to execute. Pass 0 for no limit:
function doWork() {
    set_time_limit(0); // 0 removes the execution time limit for this request
    /* do your work here */
}
You can also set a time limit on the cURL request itself:
curl_setopt($ch, CURLOPT_TIMEOUT, 400); // maximum number of seconds the whole transfer may take
If you need to request multiple pages, you can fetch them simultaneously (in parallel) with the curl_multi_* functions, as in the sketch below.
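A minimal sketch of the curl_multi_* approach; the URL list is made up for illustration:

$urls = ['http://example.com/a.php', 'http://example.com/b.php'];

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run all handles until every transfer has finished.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($active && $status === CURLM_OK);

// Collect the responses and clean up.
foreach ($handles as $ch) {
    $data = curl_multi_getcontent($ch);
    // ... process $data ...
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

Because the requests run concurrently, the total wall-clock time is roughly that of the slowest request rather than the sum of all of them.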