I have a PHP script that comes down to the following:
while ($arr = mysql_fetch_array($result))
{
    $url = $arr['url'];
    exec("curl $url > /dev/null &");
}
Each $url points to a remote script.
My question is: what can I expect if I try to cycle through 2,000 URLs this way?
Will opening that many cURL connections cripple my server? Could I fire all 2,000 in less than one minute?
What I am trying to do is save my users from having to set up cron jobs, by opening the connections and running their remote scripts for them.
Can you guys advise? I'm out of my league today.
Hudson
Take a look at curl_multi_init. It does not fork a separate process per request, so it should be much easier on your server.
I would advise firing only 3-15 requests at a time, depending on the load your server can handle and the complexity of the remote scripts. Firing 2,000 at once will probably make you run out of file descriptors or hit some other limit.
You need to keep an eye on the number of open files (connections) so you don't hit the file-max limit on your server.
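For what it's worth, here is a minimal sketch of that batching approach, assuming the 2,000 URLs have already been pulled out of MySQL into an array. The $urls array, the $batchSize of 10 and the 10-second timeout are placeholders for illustration, not anything from your setup; each batch is pushed through curl_multi and allowed to finish before the next one starts.

$urls = array(/* ... the 2,000 URLs pulled from MySQL ... */);
$batchSize = 10; // somewhere in the 3-15 range suggested above

foreach (array_chunk($urls, $batchSize) as $batch) {
    $mh = curl_multi_init();
    $handles = array();

    foreach ($batch as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // keep the response out of stdout
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't let one slow script stall the batch
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Drive the batch until every transfer in it has finished.
    do {
        curl_multi_exec($mh, $running);
        if (curl_multi_select($mh) === -1) {
            usleep(100000); // avoid busy-looping if select() reports an error
        }
    } while ($running > 0);

    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}

With a batch size of 10 you only ever hold about 10 connections (and file descriptors) open at once, instead of 2,000, which is what keeps you clear of the limits mentioned above.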