I've got an app based on PhantomJS that works like this:

1. A PHP script gets data from my database (Postgres) as an array.
2. Via shell_exec I run a PhantomJS script, passing the array from (1) as an argument.
3. In PhantomJS I process the data: I check the WHOIS record of each domain and collect its expiration date. The resulting array is stored in a file.
4. Finally, PhantomJS runs another PHP script that reads the stored file and saves the data in my database.
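Simplified, the PhantomJS side (steps 2–3) looks something like this (the WHOIS lookup itself is omitted, and the file name is just a placeholder):

```javascript
// whois.js — the PHP script passes the domain array as one JSON-encoded CLI argument
var system = require('system');
var fs = require('fs');

var domains = JSON.parse(system.args[1]);

// ... check WHOIS for each domain and collect its expiration date ...
var results = {}; // e.g. { "example.com": "2025-01-01" }

// Step 3: store the result in a file for the second PHP script to pick up
fs.write('results.json', JSON.stringify(results), 'w');
phantom.exit();
```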
I'm wondering if there is a better option. Maybe doing everything in the PhantomJS script? Maybe there is a JS client for Postgres?
I'd change the workflow starting from step 3 and start saving data right away (PhantomJS is no stranger to crashing, so it may not always get to step 4).
You could send the data via an AJAX or POST request to an endpoint of your own. It could be another PHP script served over HTTP, even on localhost: do another page.open to it and send the data in the request body.
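A rough sketch of that idea — the URL http://localhost/save.php and the `data` field name are hypothetical, and the PHP side would read the payload from $_POST['data']:

```javascript
// POST the collected WHOIS results to a local PHP endpoint
var page = require('webpage').create();

var results = { 'example.com': '2025-01-01' }; // stand-in for the collected data
var body = 'data=' + encodeURIComponent(JSON.stringify(results));

page.open('http://localhost/save.php', 'POST', body, function (status) {
    if (status !== 'success') {
        console.error('POST to save.php failed');
    }
    phantom.exit();
});
```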
An even more reliable approach: after processing the data, execute a local PHP script and feed it the data via the CLI (or save the data to a file as before and pass the file path to the script).
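Something along these lines, using PhantomJS's (experimental) child_process module — the path /path/to/save.php is a placeholder, and here the data is passed as a single JSON-encoded argument rather than a file path:

```javascript
// Run a local PHP script directly from PhantomJS once the data is ready
var childProcess = require('child_process');

var results = { 'example.com': '2025-01-01' }; // stand-in for the collected data

childProcess.execFile('php', ['/path/to/save.php', JSON.stringify(results)], null,
    function (err, stdout, stderr) {
        if (err) {
            console.error('save.php failed: ' + stderr);
        }
        phantom.exit();
    });
```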