I followed a tutorial on making a web crawler app. It simply pulls all the links from a page and then follows them. I have a problem with pushing the links from the foreach loop into the global variable. I keep getting an error saying the second argument to in_array() should be an array, which is what I set it to. Is there anything here you can see that's breaking the code?
Error:
in_array() expects parameter 2 to be array, null given
PHP:
<?php
$to_crawl = "http://thechive.com/";
$c = array();

function get_links($url){
    global $c;
    $input = file_get_contents($url);
    $regexp = "<a\s[^>]*href=(\"??)([^\" >]*?)\\1[^>]*>(.*)<\/a>";
    preg_match_all("/$regexp/siU", $input, $matches);
    $l = $matches[2];
    $base_url = parse_url($url, PHP_URL_HOST);

    foreach($l as $link){
        if(strpos($link, '#')){
            $link = substr($link, 0, strpos($link, '#'));
        }
        if(substr($link, 0, 1) == "."){
            $link = substr($link, 1);
        }
        if(substr($link, 0, 7) == "http://"){
            $link = $link;
        } elseif(substr($link, 0, 8) == "https://"){
            $link = $link;
        } elseif(substr($link, 0, 2) == "//"){
            $link = substr($link, 2);
        } elseif(substr($link, 0, 1) == "#"){
            $link = $url;
        } elseif(substr($link, 0, 7) == "mailto:"){
            $link = "[".$link."]";
        } else{
            if(substr($link, 0, 1) != "/"){
                $link = $base_url."/".$link;
            } else{
                $link = $base_url.$link;
            }
        }
        if(substr($link, 0, 7) != "http://" && substr($link, 0, 8) != "https://" && substr($link, 0, 1) != "["){
            if(substr($link, 0, 8) == "https://"){
                $link = "https://".$link;
            } else{
                $link = "http://".$link;
            }
        }
        if(!in_array($link, $c)){
            array_push($c, $link);
        }
    }
}

get_links($to_crawl);

foreach($c as $page){
    get_links($page);
}
foreach($c as $page){
    echo $page."<br />";
}
?>
Declaring $c as "global" inside the function on each call is bad design. You should avoid "global" whenever possible.
Here I see two options:
1/ Pass your array by reference as a parameter of the "get_links" function. That lets the function fill the array directly.
Example:
function getlinks($url, &$links){
    //do your stuff to find the links
    //then add each link to the array
    $links[] = $oneLink;
}

$allLinks = array();
getlinks("thefirsturl.com", $allLinks);
//call getlinks as many times as you want
//then your array will contain all the links
print_r($allLinks);
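Applied to your crawler, that could look roughly like this. It's only a simplified sketch: it reuses your regex, but it keeps just absolute http/https links and skips the rest of your normalisation logic:
function get_links($url, array &$links){
    $input = @file_get_contents($url);
    if($input === false){
        return; // page could not be fetched, skip it
    }
    // same regex-based extraction as in your code
    $regexp = "<a\s[^>]*href=(\"??)([^\" >]*?)\\1[^>]*>(.*)<\/a>";
    preg_match_all("/$regexp/siU", $input, $matches);
    foreach($matches[2] as $link){
        if(substr($link, 0, 2) == "//"){
            $link = "http:".$link; // protocol-relative URL
        }
        if(substr($link, 0, 7) != "http://" && substr($link, 0, 8) != "https://"){
            continue; // ignore relative, anchor and mailto links in this sketch
        }
        if(!in_array($link, $links)){
            $links[] = $link;
        }
    }
}

$allLinks = array();
get_links("http://thechive.com/", $allLinks);
foreach($allLinks as $page){
    get_links($page, $allLinks);
}
print_r($allLinks);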
Or 2/ Make "get_links" return an array of links, and merge it into a bigger array that stores all your links.
function getlinks($url){
    $links = array();
    //do your stuff to find the links
    //then add each link to the array
    $links[] = $oneLink;
    return $links;
}
$allLinks = array();
$allLinks = array_merge($allLinks, getlinks("thefirsturl.com"));
//call getlinks as many times as you want. Note array_merge() appends the new links;
//the + operator would keep the existing values for clashing numeric keys
print_r($allLinks);
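For example, the two-level crawl from your question could then be written like this (assuming getlinks() is the return-an-array version sketched above):
$allLinks = getlinks("http://thechive.com/");
foreach($allLinks as $page){
    $allLinks = array_merge($allLinks, getlinks($page));
}
// array_merge() can introduce duplicates, so deduplicate at the end
$allLinks = array_unique($allLinks);
print_r($allLinks);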