PHP - Divide et Impera to avoid Fatal error: Maximum execution time exceeded

Hi everyone!

I am building a 'Seeker' to get some prices from 7 websites. I am using cURL to fetch these prices and I am running into this problem: Fatal error: Maximum execution time exceeded. I have read about this and seen some solutions, but I want to know if I can divide the process further than it already is. So here is my app:

The customer comes to my website, searches for some item codes (the worst case is 5 different codes at once) and sends a request for prices. So I am posting an array of at most 5 codes to my MainController, which distributes the array to every single controller, like this:

$codesArray -> MainController -> foreach(ControllersList as Controller){Controller->getPrices($codesArray)}
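
In code, that dispatch looks roughly like the sketch below (simplified; the class names and everything besides getPrices() are illustrative, not my exact code):

```php
<?php
// Simplified sketch of the dispatch described above.
class MainController
{
    /** @var array per-website controller instances */
    private $controllersList;

    public function __construct(array $controllersList)
    {
        $this->controllersList = $controllersList;
    }

    public function fetchPrices(array $codesArray): array
    {
        $prices = [];
        // Each controller runs sequentially, so the execution time
        // of all seven lookups adds up within one request.
        foreach ($this->controllersList as $name => $controller) {
            $prices[$name] = $controller->getPrices($codesArray);
        }
        return $prices;
    }
}
```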

I assume the execution time is being counted differently than I want, because I tested it with 3 codes and the "Fatal error: Maximum execution time of 30 seconds exceeded" comes 33 seconds from the beginning, on the fifth controller. I think it is impossible for the first four controllers to search for 3 items each in just 3 seconds and then exceed the limit on the fifth, so it means the server is counting the whole MainController process, not each controller's work by itself. My question is: how can I divide the process properly so that each controller gets its own share of execution time?

I hope I made myself understood; sorry for the bad English and the rough explanation.

NOTE: I tested each controller on its own with an array of 7 codes and I never got this fatal error; that is why I want to divide the process, but I am not sure how.

NOTE II: Sending one code at a time would result in n*7 more cURL executions, because with the array the process only repeats the search function, not the whole connect to website -> check login -> login -> search flow.

The timer applies to the request as a whole; it is not split across the different controller calls. The whole execution time can't exceed 30 seconds (you can increase the limit via max_execution_time in php.ini, or with set_time_limit()). Also, these are not multiple processes: PHP is single-threaded, so you can't spawn another thread and run it. You can try making the calls in parallel using curl_multi_init() and then doing the processing afterwards (I'm assuming the calls take more time than the actual processing).
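
Here is a minimal sketch of the curl_multi approach, assuming you just need the raw response body from each site (the URLs are placeholders; your controllers would build the real ones, including any login step):

```php
<?php
// Fetch several price pages in parallel with curl_multi.
$urls = [
    'https://shop-one.example/search?code=ABC123',
    'https://shop-two.example/search?code=ABC123',
];

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    // Per-request cap so one slow site can't eat the whole time budget.
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Run all transfers at once and wait until they are done.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // block until there is activity
    }
} while ($running && $status === CURLM_OK);

// Collect the responses, then clean up.
$responses = [];
foreach ($handles as $url => $ch) {
    $responses[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

With this, the total wall-clock time is roughly that of the slowest site instead of the sum of all seven, which should keep you well under the 30-second limit.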