I have 500 links to XML files (online files). I want to pass those files to a function that reads the content of these files, parses them, and then stores them in my database (MySQL). I use this function to get the links:
function get_links()
{
    $user_file = @fopen("./fullsoccer.TXT", "r");
    if ($user_file)
    {
        while (!feof($user_file))
        {
            $lines[] = fgets($user_file, 4096);
        }
        fclose($user_file);
    }
    return $lines;
}
Then I use this function to read each link's content and parse it:
function doParse($parser_object) {
    $links = get_links();
    $i = 0;
    while (!empty($links[$i]))
    {
        if (!($fp = fopen($links[$i], "r")));
        {
            // loop through data
            while ($data = fread($fp, 4096)) {
                // parse the fragment
                xml_parse($parser_object, $data, feof($fp));
            }
        }
        $i++;
    }
}
The links are returned successfully (I print them inside the doParse function), but the problem is that doParse only parses the file of the last link. Why does the function parse only the last link's file and skip the previous 499 files? Is it an fopen problem? Can fopen read more than one file?
Change
$links=get_links();
to
$links=file("./fullsoccer.TXT", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
The file() function reads the file contents into an array, one line per item. The flags matter here: fgets() (and file() without FILE_IGNORE_NEW_LINES) keeps the trailing newline on each line, so every URL except the last one has a "\n" appended and fopen() fails on it. That is why only the last link gets parsed. (Note also the stray semicolon after your if, and the inverted ! test: together they make the read block run whether or not fopen() succeeded.)
You can use foreach instead of the while loop; see the sketch after the link below. To get the content of a link you may use file_get_contents(), but the best way is to use cURL, because the server may have security restrictions when you fetch the data just with fopen().
http://php.net/manual/en/book.curl.php
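For example, a minimal sketch of the rewritten loop (the file name comes from the question; handler registration is only hinted at in a comment since that part of the setup isn't shown, and a fresh parser is created per link because an expat parser can only parse one document):

function doParse() {
    // one URL per line, with line endings stripped
    $links = file("./fullsoccer.TXT", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($links as $link) {
        // trim() also guards against stray \r left by Windows line endings
        $data = file_get_contents(trim($link));
        if ($data === false) {
            continue; // skip links that could not be fetched
        }
        $parser = xml_parser_create();
        // register here the same element/character-data handlers you
        // already attach to $parser_object, then parse and store the result
        xml_parse($parser, $data, true); // TRUE = final (and only) chunk
        xml_parser_free($parser);
    }
}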
I have found a cURL example of how to download data from a link:
function get_data($url) {
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);        // return the response instead of printing it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout); // give up connecting after 5 seconds
    // some servers reject requests that carry no browser-like user agent
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322)');
    $data = curl_exec($ch);                             // FALSE on failure
    curl_close($ch);
    return $data;
}
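Inside the foreach loop sketched above you would then fetch with get_data() instead of file_get_contents():

$data = get_data($link); // cURL fetch with timeout and user agent

The practical difference is that the cURL version sets a connect timeout and a browser-like user agent, while a plain fopen()/file_get_contents() call sends no user agent by default, which some servers refuse.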