Script to fetch files from download pages and upload/save them on the server

I would like to know the possibilities for automating a currently manual procedure.

I have a script that reads CSV files into a MySQL database. The CSV files are fetched by visiting different sites, mostly ASP or PHP pages that automatically start a download of the CSV to the local Downloads folder. I then upload the file to my server, where the csv->db script handles it.

Is there a way to fetch the CSV and place it directly on the server, e.g. with a PHP script that I can set to run automatically with a cron job?

Hope this question is OK for this forum. Thanks!

You can put all the URLs in a text file and download them with wget on Linux; for example:

wget --input-file=file_with_urls_listed_one_per_line.txt

You can also get wget for Windows in case your server is not a Linux box.
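To make this fully automatic, the wget command can be scheduled with cron so the files land directly in a directory on the server, skipping the manual download/upload step entirely. A minimal sketch, assuming a URL list at /home/user/urls.txt and a target directory /var/www/csv-inbox (both hypothetical paths you would adjust):

```shell
# Fetch every CSV listed in the URL file into the target directory.
# -P sets the download directory; -N (timestamping) skips files that
# have not changed on the remote server since the last run.
wget --input-file=/home/user/urls.txt -P /var/www/csv-inbox -N

# Crontab entry (install with `crontab -e`): run the fetch every night
# at 02:00 server time.
# 0 2 * * * wget --input-file=/home/user/urls.txt -P /var/www/csv-inbox -N
```

If your existing csv->db import is itself runnable from the command line, you could chain it in the same cron line with `&&` so the import only runs after a successful download.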