Laravel and Goutte scraper [closed]

I am going to scrape 4 websites using Laravel and Goutte.

There are 900 URLs, and I have no idea how to dispatch them.

(I have already written the crawler code and have no questions about that part.)

But I don't know how to send the URLs. Should I use a queue, a cron job, or something else?

Do you know of any package, tool, or approach? I have no idea how to send 900 URLs, 5 times a day.

If you have already written the crawl code for the websites, you can extract the links and store them in a CSV file. Then write another script that reads an exact number of those URLs from the CSV file and returns them to you. This is very easy in Ruby with the CSV library.
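The batching idea above could be sketched in Ruby like this. The file name `urls.csv`, the helper name `read_url_batch`, and the batch size are all assumptions for illustration, not part of the original answer:

```ruby
require "csv"

# Read a batch of URLs from a one-column CSV file (one URL per row),
# starting at a given offset. Returns [] when the offset is past the end.
def read_url_batch(path, offset, batch_size)
  urls = CSV.read(path).flatten
  urls[offset, batch_size] || []
end

# Example: write a small CSV of URLs, then read it back in batches of 2.
sample_urls = [
  "https://example.com/a",
  "https://example.com/b",
  "https://example.com/c"
]
CSV.open("urls.csv", "w") do |csv|
  sample_urls.each { |u| csv << [u] }
end

puts read_url_batch("urls.csv", 0, 2).inspect
puts read_url_batch("urls.csv", 2, 2).inspect
```

A cron job (or, in Laravel, the scheduler plus a queued job) could then call such a script five times a day, advancing the offset each run until all 900 URLs have been processed.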