I have the following seemingly simple code in PHP, but the problem is that it shows all valid links as "not valid". Any help appreciated:
<?php
$m = "urllist.txt";
$n = fopen($m, "r");
while (!feof($n)) {
    $l = fgets($n);
    if (filter_var($l, FILTER_VALIDATE_URL) === FALSE) {
        echo "NOT VALID - $l<br>";
    } else {
        echo "VALID - $l<br>";
    }
}
fclose($n);
?>
The string returned by fgets() contains a trailing newline character that needs to be trimmed before you can validate it. Try the following code; I hope it helps:
<?php
$m = "urllist.txt";
$n = fopen($m, "r");
while (!feof($n)) {
    $l = fgets($n);
    if (filter_var(trim($l), FILTER_VALIDATE_URL)) {
        echo "VALID - $l<br>";
    } else {
        echo "NOT VALID - $l<br>";
    }
}
fclose($n);
?>
I have tried it with the following URLs:
http://stackoverflow.com/
https://www.google.co.in/
https://www.google.co.in/?gfe_rd=cr&ei=bf4HVLOmF8XFoAOg_4HoCg&gws_rd=ssl
www.google.com
http://www.example.com
example.php?name=Peter&age=37
and got the following results:
VALID - http://stackoverflow.com/
VALID - https://www.google.co.in/
VALID - https://www.google.co.in/?gfe_rd=cr&ei=bf4HVLOmF8XFoAOg_4HoCg&gws_rd=ssl
NOT VALID - www.google.com
VALID - http://www.example.com
NOT VALID - example.php?name=Peter&age=37
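Note that the two failures above are expected: FILTER_VALIDATE_URL requires a scheme, so www.google.com and example.php?name=Peter&age=37 are rejected. If your list may contain scheme-less hostnames you want to accept anyway, here is a sketch of one workaround (the helper name validate_url_lenient is mine, and it assumes scheme-less entries should be treated as http):

```php
<?php
// Sketch: prepend "http://" to scheme-less entries before validating.
// Assumption: a line like "www.google.com" should count as an http URL.
function validate_url_lenient($url) {
    $url = trim($url);
    // If there is no scheme (e.g. "http://", "ftp://"), assume http.
    if (!preg_match('#^[a-z][a-z0-9+.-]*://#i', $url)) {
        $url = 'http://' . $url;
    }
    return filter_var($url, FILTER_VALIDATE_URL) !== false;
}

var_dump(validate_url_lenient("www.google.com"));          // bool(true)
var_dump(validate_url_lenient("http://stackoverflow.com/")); // bool(true)
```

Whether that is the right behaviour depends on your data; a relative path like example.php?name=Peter&age=37 would also pass once a scheme is prepended, so only use this if every line is meant to be a host or URL.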
Maybe you have some whitespace characters (such as a trailing newline) at the end of each line. You can use trim() on $l before validating. Note, though, that filter_var() returns the filtered string on success, not TRUE, so comparing with === TRUE will always fail; compare against FALSE instead:
filter_var(trim($l), FILTER_VALIDATE_URL) !== FALSE
Maybe this will help you.
Try this code. It should help; I have tested it and it is working. (Note the branches: filter_var() returns a truthy value when the URL is valid.)
<?php
$m = "urllist.txt";
$n = fopen($m, "r");
while (!feof($n)) {
    $l = fgets($n);
    if (filter_var(trim($l), FILTER_VALIDATE_URL)) {
        echo "URL is valid";
    } else {
        echo "URL is not valid";
    }
}
fclose($n);
?>
Please try the different filter flags available to see where it fails
(see http://www.w3schools.com/php/filter_validate_url.asp)
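For instance, here is a small sketch that runs one URL through the optional FILTER_VALIDATE_URL flags to see which requirement trips it up (the URL and labels are just examples):

```php
<?php
// Sketch: test a URL against the optional FILTER_VALIDATE_URL flags.
// FILTER_FLAG_PATH_REQUIRED demands a path (e.g. "/page"),
// FILTER_FLAG_QUERY_REQUIRED demands a query string (e.g. "?a=b").
$url = "http://www.example.com";

$checks = array(
    "default"        => null,
    "path required"  => FILTER_FLAG_PATH_REQUIRED,
    "query required" => FILTER_FLAG_QUERY_REQUIRED,
);

foreach ($checks as $label => $flag) {
    $ok = ($flag === null)
        ? filter_var($url, FILTER_VALIDATE_URL)
        : filter_var($url, FILTER_VALIDATE_URL, $flag);
    echo $label . ": " . ($ok !== false ? "VALID" : "NOT VALID") . "\n";
}
```

With the URL above, only the default check passes, since it has neither a path nor a query string.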
You can also try the good old regex approach:
if (!preg_match("/\b(?:(?:https?|ftp):\/\/|www\.)[-a-z0-9+&@#\/%?=~_|!:,.;]*[-a-z0-9+&@#\/%=~_|]/i", $url)) {
    echo "NOT VALID - $url<br>";
}
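For example, here is a sketch running a few of the URLs from the question through that regex; unlike filter_var(), it also accepts www.-prefixed hosts without a scheme:

```php
<?php
// Sketch: apply the regex from above to some of the question's test URLs.
$pattern = "/\b(?:(?:https?|ftp):\/\/|www\.)[-a-z0-9+&@#\/%?=~_|!:,.;]*[-a-z0-9+&@#\/%=~_|]/i";

$urls = array(
    "http://stackoverflow.com/",
    "www.google.com",
    "example.php?name=Peter&age=37",
);

foreach ($urls as $url) {
    echo (preg_match($pattern, $url) ? "VALID" : "NOT VALID") . " - $url\n";
}
// Output:
// VALID - http://stackoverflow.com/
// VALID - www.google.com
// NOT VALID - example.php?name=Peter&age=37
```

Keep in mind this regex only looks for a URL-like substring anywhere in the string; anchor it with ^ and $ if you need the whole line to be a URL.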