What are the limits on running scripts concurrently in an nginx / php-fpm setup?

The problem is that I have to use curl, and sometimes the curl requests take a long time because of timeouts. I have set the timeout to 1 second, so no request should take more than 1 second, but the server is still unable to process other PHP requests.

My question is: how many concurrent scripts (running at the same time) can nginx/php-fpm handle? What I see is that a few requests lasting 1 second make the whole server unresponsive. What settings can I change so that more requests are processed at the same time?

php-curl did not respond in a timely manner because of its DNS cache.

The problem was that I had to access files from a CDN, but the IP behind the domain changed frequently, and unfortunately curl keeps a DNS cache.

So from time to time it would try to access files from IPs that were no longer valid but were still in php-curl's DNS cache.
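For reference, had I wanted to keep php-curl, libcurl's DNS cache can be shortened or disabled per handle. This is only a sketch with a placeholder CDN URL, and whether it helps depends on how the handles are reused:

```php
<?php
// Sketch only: shorten/disable curl's per-handle DNS cache so a changed
// CDN IP is picked up. The URL is a placeholder.
$ch = curl_init('https://cdn.example.com/file.bin');
curl_setopt($ch, CURLOPT_DNS_CACHE_TIMEOUT, 0); // 0 disables the cache (libcurl default is 60s)
curl_setopt($ch, CURLOPT_TIMEOUT, 1);           // overall 1-second budget, as in the question
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
$body = curl_exec($ch);
curl_close($ch);
```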

I had to drop php-curl completely and use a plain file_get_contents(...) request. This completely solved the problem.
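For completeness, this is roughly what the replacement looks like; a minimal sketch with a placeholder CDN URL, using a stream context so the 1-second timeout is preserved:

```php
<?php
// Plain HTTP GET without curl; the URL is a placeholder.
$context = stream_context_create([
    'http' => [
        'timeout' => 1.0, // seconds, matching the 1-second budget above
    ],
]);

$body = file_get_contents('https://cdn.example.com/file.bin', false, $context);

if ($body === false) {
    error_log('CDN fetch failed or timed out');
}
```

Since there is no long-lived curl handle here, each call goes through the system resolver again (subject to whatever caching the OS itself does), so stale IPs stop sticking around.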

Multicurl is indeed not the solution to your problem, but asynchronicity probably is. I am not sure that tweaking nginx is the solution either. It would scale better if you considered one of the following options:

  • You can abstract curl with Guzzle http://docs.guzzlephp.org/en/latest/ and use its approach to async calls and promises (see the first sketch after this list).

  • You can use Gearman http://gearman.org/getting-started/ which will enable you to send an async message to a remote server, which processes the instruction with a worker script you register for that message. (I use this mechanism for non-blocking logging; see the second sketch after this list.)
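The first sketch shows the Guzzle approach. It assumes Guzzle is installed via Composer; the URL and file paths are placeholders:

```php
<?php
require 'vendor/autoload.php';

use GuzzleHttp\Client;

$client = new Client(['timeout' => 1.0]); // per-request timeout in seconds

// getAsync() returns a promise immediately instead of blocking the worker.
$promise = $client->getAsync('https://cdn.example.com/file.bin');

$promise->then(
    function ($response) {
        // Runs once the transfer completes.
        file_put_contents('/tmp/file.bin', (string) $response->getBody());
    },
    function ($reason) {
        // Runs on timeout or connection failure.
        error_log('CDN fetch failed: ' . $reason->getMessage());
    }
);

// Promises settle when the handle is ticked; wait(false) settles this one
// without rethrowing on rejection.
$promise->wait(false);
```

The real win comes from dispatching several promises and settling them together (e.g. with GuzzleHttp\Promise\Utils::settle() in recent versions), so the slow CDN calls overlap instead of queueing.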
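The second sketch shows the client side of the Gearman option. It assumes the pecl gearman extension, a gearmand running locally on the default port, and a worker registered under the hypothetical function name fetch_cdn_file:

```php
<?php
// Fire-and-forget: doBackground() returns as soon as the job is queued;
// a separate worker process does the slow transfer.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730); // default gearmand host/port

$client->doBackground(
    'fetch_cdn_file', // hypothetical function name registered by a worker
    json_encode(['url' => 'https://cdn.example.com/file.bin'])
);
```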

Either way, your call will return in milliseconds and won't block nginx, but your code will have to change a little bit.