PHP background sync using HTTP requests

I have a PHP script that is supposed to sync data from thousands of HTTP links every 2 minutes and update the database.

However, some of the websites are slow, and my current approach, which loops over the links one by one with foreach, takes around 15 minutes.

Is there a better way to achieve this task in a shorter time?

foreach ($emails as $email) {

    imap_open(......);

    // update db

}

Thanks

Your sample code doesn't show how you actually fetch each link, so it's difficult to advise precisely. However, it may be that you are using a connection method which 'blocks' until the response is received, meaning PHP stops and waits for each request to complete, one at a time.

What you need is to connect to multiple systems at once and poll for responses.

fsockopen() can do this, if you call stream_set_blocking($fp, false) on the socket before writing the HTTP request headers. fread() will then return immediately with whatever data is available, so you can poll many open sockets in a single loop instead of waiting on each one in turn.
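A minimal sketch of that polling approach, wrapped in a hypothetical helper function; the 5-second connect timeout, port 80, and the simple HTTP/1.1 request line are assumptions, and real code would also need error handling and HTTPS support:

```php
<?php
// Open every socket in non-blocking mode, then poll them all in one loop.
function fetch_all_nonblocking(array $urls): array {
    $sockets = [];
    $responses = [];
    foreach ($urls as $url) {
        $host = parse_url($url, PHP_URL_HOST);
        $path = parse_url($url, PHP_URL_PATH) ?: '/';
        $fp = @fsockopen($host, 80, $errno, $errstr, 5); // assumed plain HTTP
        if (!$fp) {
            continue; // skip hosts we cannot reach
        }
        stream_set_blocking($fp, false); // fread() now returns immediately
        fwrite($fp, "GET $path HTTP/1.1\r\nHost: $host\r\nConnection: close\r\n\r\n");
        $sockets[$url] = $fp;
        $responses[$url] = '';
    }

    // Poll every socket until each has reached end-of-file.
    while ($sockets) {
        foreach ($sockets as $url => $fp) {
            $chunk = fread($fp, 8192); // non-blocking: may be '' if no data yet
            if ($chunk !== false && $chunk !== '') {
                $responses[$url] .= $chunk;
            }
            if (feof($fp)) {
                fclose($fp);
                unset($sockets[$url]);
            }
        }
        usleep(10000); // avoid a busy loop while waiting for data
    }
    return $responses; // raw HTTP responses, headers included
}
```

Total wall time is then roughly the slowest single response rather than the sum of all of them.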

An alternate solution would be to fork the PHP script into many processes, each requesting a different set of sources.
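A sketch of the forking approach, assuming the pcntl extension (CLI SAPI only) and a hypothetical run_workers() helper; the worker count of 20 and the commented-out fetch call are placeholders:

```php
<?php
// Fork one worker per chunk of URLs, so a slow host only delays its own process.
function run_workers(array $allUrls, int $workers = 20): void {
    $size = max(1, (int)ceil(count($allUrls) / $workers));
    $chunks = array_chunk($allUrls, $size);
    $pids = [];
    foreach ($chunks as $chunk) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            die("fork failed\n");
        }
        if ($pid === 0) {
            // Child: handle only this chunk, then exit.
            foreach ($chunk as $url) {
                // file_get_contents($url); ... update db (placeholder)
            }
            exit(0);
        }
        $pids[] = $pid; // Parent: remember the child and keep forking
    }
    foreach ($pids as $pid) {
        pcntl_waitpid($pid, $status); // reap every worker
    }
}
```

Note that each child needs its own database connection; a connection opened before the fork cannot safely be shared between processes.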

Is that enough to point you in the right direction? If not please include some sample code so we know what methods you currently use and somebody may expand it to do what you require.

Yes, do them in parallel.

One solution would be to split the current list of URLs across, say, 20 worker processes. You've not shown us your code, but imagining it currently does something like this:

$urls=mysqli_query($link,
      "SELECT url FROM list");
while ($r=mysqli_fetch_assoc($urls)) {
    ...

You might try something like the following to shard the data set:

$instance=(integer)$argv[1];
if (0==$instance) {
    die("Next time start me with a number between 1 and 20 indicating the thread");
}
$urls=mysqli_query($link,
      "SELECT url FROM list
       WHERE $instance - 1 = CONV(SUBSTR(MD5(url), 1, 4), 16, 10) % 20");
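The same bucketing can be computed on the PHP side, which is useful for checking that the sharding is stable and roughly even; shard() here is a hypothetical helper mirroring the SQL expression above (first 4 hex digits of the MD5, modulo the worker count):

```php
<?php
// Map a URL to a stable bucket 0..$buckets-1, matching
// CONV(SUBSTR(MD5(url), 1, 4), 16, 10) % $buckets on the MySQL side.
function shard(string $url, int $buckets = 20): int {
    return hexdec(substr(md5($url), 0, 4)) % $buckets;
}

// Every run, and every worker, computes the same bucket for the same URL,
// so worker N can safely claim the URLs where shard($url) === N - 1.
$bucket = shard('http://example.com/feed.xml'); // assumed example URL
```

Because MD5 spreads its input uniformly, each of the 20 workers receives roughly a twentieth of the list without the workers needing to coordinate.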

Alternatively, you could batch up the HTTP requests in a single PHP process and run them concurrently using curl_multi_exec().
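A minimal sketch of that, wrapped in a hypothetical fetch_all() helper; the 10-second per-request timeout is an assumption you would tune to your slowest acceptable site:

```php
<?php
// Fetch many URLs concurrently in one process with the curl_multi API.
function fetch_all(array $urls, int $timeout = 10): array {
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout); // a slow site cannot stall the whole batch
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until none are still running.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // sleep until any handle has activity
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch); // false/'' on failure
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

With thousands of URLs you would still want to feed them through in batches of a few hundred handles at a time, rather than opening every connection at once.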