I set up my IX Webhosting account to run the following cron job every 15 minutes:
/usr/bin/wget -O- http://xxx.com/php/xxx.php
I want the php folder blocked off from all outside requests for security reasons, so I set its .htaccess file to Deny from all. But when the .htaccess file is present, the cron job is rejected with a 403 error.
I thought server-side cron jobs were not blocked by .htaccess. Is there any way around this?
Change the cron job to:
/usr/bin/php /path/to/web/root/php/greader_forceupdate.php xxxx
/usr/bin/php may need to be adjusted if PHP is installed elsewhere. /path/to/web/root is the path on the filesystem.
To access the parameter, use $_SERVER['argv'][1].
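On the script side, a minimal sketch of reading that argument (the CLI guard and the $param variable are illustrative additions, not part of the original script):

<?php
// greader_forceupdate.php -- sketch of the cron entry point.
// Under the CLI SAPI, command-line arguments are available in
// $_SERVER['argv']: argv[0] is the script path, argv[1] the first parameter.
if (PHP_SAPI !== 'cli') {
    exit; // optional: refuse to run as a web request at all
}
$param = isset($_SERVER['argv'][1]) ? $_SERVER['argv'][1] : null;
// ... run the forced update using $param ...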
This is still a web request from localhost: wget is a regular HTTP client requesting the page, and the web server is serving it (even though it may produce no output for humans to read), so the .htaccess rules apply. Instead of denying all requests, allow localhost:
Order deny,allow
Allow from 127.0.0.1
Deny from all
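If you want a second layer of defence inside the script itself, the same check can be done at the PHP level (a minimal sketch; the ::1 entry only matters if the request can arrive over IPv6):

<?php
// Top of xxx.php: refuse any request that did not come from this machine.
// This duplicates the .htaccess rule at the application level.
$local = array('127.0.0.1', '::1');
if (!in_array($_SERVER['REMOTE_ADDR'], $local, true)) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}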
You can add something like:
Order deny,allow
Allow from 192.168.0.1/24
Allow from 127
Deny from all
in your .htaccess to allow access from the local intranet range (192.168.0.1/24) and from loopback (Allow from 127 matches all of 127.0.0.0/8, including 127.0.0.1).
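For reference, a /24 mask covers the 256 addresses 192.168.0.0 through 192.168.0.255. A quick PHP sketch of the same mask arithmetic (in_cidr is a hypothetical helper, not part of the answers above):

<?php
// Hypothetical helper: does $ip fall inside $subnet/$bits?
function in_cidr($ip, $subnet, $bits) {
    $mask = -1 << (32 - $bits);  // e.g. /24 keeps the top 24 bits
    return (ip2long($ip) & $mask) === (ip2long($subnet) & $mask);
}
var_dump(in_cidr('192.168.0.42', '192.168.0.0', 24)); // bool(true)
var_dump(in_cidr('192.168.1.42', '192.168.0.0', 24)); // bool(false)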