What is the fastest way to execute many PHP pages? I will not need to see them (i.e. scrape or retain any information from any HTML that is output), but they need to load; the PHP code within each page needs to be executed.
I first started out doing foreach (...) { echo '<iframe src=...' }
(very ugly, very slow, and a very bad idea).
Now I'm thinking of setting a jQuery timer that keeps firing AJAX requests as it works through an array of pages.
Is there a better way to go about this?
Multi-threading would let you call URLs as fast as possible, since you can have multiple pages loading at the exact same time. PHP doesn't really have good support for multi-threading, but we can have it do multi-processing. This allows you to kick off a URL request and immediately kick off more without waiting for any of the pages to load.
The following code is an example of how to accomplish this using wget. It will be fast, but it does have the downside of not telling you whether each request succeeded or failed.
<?php
foreach($pages as $page){
    // -q -O /dev/null discards wget's output; the trailing & backgrounds the
    // process so the loop can immediately move on to the next URL.
    exec('/usr/bin/wget -q -O /dev/null '.escapeshellarg($page).' > /dev/null 2>&1 &');
}
?>
This could be taken a step further by calling your own PHP script instead of wget, in which case your program can log the URLs that failed to load.
So a complete example could be something like the following:
run.php
<?php
foreach($pages as $page){
    // Launch loadPage.php via the PHP CLI in the background for each URL.
    exec('php loadPage.php '.escapeshellarg($page).' > /dev/null 2>&1 &');
}
?>
loadPage.php
<?php
// The URL to load is passed as the first command-line argument.
$url = $argv[1];

$handle = curl_init($url);
curl_setopt($handle, CURLOPT_RETURNTRANSFER, TRUE);
$response = curl_exec($handle);
$httpCode = curl_getinfo($handle, CURLINFO_HTTP_CODE);
curl_close($handle);

if($httpCode != 200){
    // Append to the log so earlier failures are not overwritten.
    $fp = fopen('error.log', 'a');
    fwrite($fp, 'The URL '.$url.' had an issue. It returned a '.$httpCode.' error.'."\n");
    fclose($fp);
}
?>
You could do the following:
<?php
$pages = array();
$pages[] = 'page1.php';
$pages[] = 'page2.php';
$pages[] = 'page3.php';
$pages[] = 'page4.php';
$pages[] = 'page5.php';
foreach($pages as $page):
include $page;
endforeach;
?>
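Since the question says the output of the pages isn't needed, a small variation on the loop above is to wrap each include in output buffering so whatever the pages print is simply thrown away (just a sketch, assuming the pages only need to run):
<?php
$pages = array('page1.php', 'page2.php', 'page3.php');

foreach($pages as $page):
    ob_start();       // capture anything the page echoes
    include $page;    // run the page's PHP code
    ob_end_clean();   // discard the captured output
endforeach;
?>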
Well, although there will be minimal difference, from my understanding foreach is the slowest type of loop. Your real issue is the iframes; they are notoriously slow. What you should be doing is using include(), require(), or even require_once():
$pages = array();
$pages[] = "Page1.php";
$pages[] = "Page2.php";
$pages[] = "Page3.php";
foreach($pages as $page)
{
require($page);
}
Avoid using iframes, as I believe support for them is actually planned to be dropped. If you need the content to appear a certain way or with a certain style, just use CSS to position it how you had it when you were using an iframe, but use a div in its place.
I hope this helps.
The most efficient thing to do by far is thread it ...
http://docs.php.net/manual/en/book.pthreads.php
http://pecl.php.net/package/pthreads (does not contain the latest release; use git if you can)
I think it's highly likely that the only reason the code is spread across several pages (which I assume are local) is that this was the only way to achieve concurrency in PHP. If that's correct, then you should most likely rewrite the code, since you no longer need your server for concurrency. Even if they are external sites on different physical machines, threading the requests is the thing to do ...
and it's soooooo easy
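For example, here is a minimal sketch using the pthreads extension (this assumes a thread-safe PHP build with pthreads installed, that $pages holds the URLs you want to hit, and PageLoader is just an illustrative name):
<?php
// Each PageLoader runs one request in its own thread.
class PageLoader extends Thread {
    public $url;
    public $httpCode;

    public function __construct($url) {
        $this->url = $url;
    }

    public function run() {
        // We only care that the page executes, so the body is discarded.
        $handle = curl_init($this->url);
        curl_setopt($handle, CURLOPT_RETURNTRANSFER, true);
        curl_exec($handle);
        $this->httpCode = curl_getinfo($handle, CURLINFO_HTTP_CODE);
        curl_close($handle);
    }
}

$threads = array();
foreach ($pages as $page) {
    $thread = new PageLoader($page);
    $thread->start();      // fire off the request immediately
    $threads[] = $thread;
}

foreach ($threads as $thread) {
    $thread->join();       // wait for every request to finish
    if ($thread->httpCode != 200) {
        error_log('The URL '.$thread->url.' returned HTTP '.$thread->httpCode);
    }
}
?>
All of the requests run concurrently, and you still get success/failure information back from each thread once it joins.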