I need to parse large Excel tables into my database. I'm using PHPExcel. The files are too big to import at once, and I have to deal with max execution time and memory limits.
So the system currently works this way: the operator uploads the files in the browser, and then the import script runs repeatedly until the whole file is parsed. The PHP script imports only part of the Excel file and returns a value; if there are still unparsed rows, the script is run again via AJAX.
Now I want to move this task to cron, but I don't know how, because I don't know how many times the script needs to run before the job is done.
Is there some way to execute the script again and again until it is done, but without managing it in the browser (via AJAX, reloads, etc.)?
Make the script run in the background using the exec() function. See the PHP manual entry for exec() for details.
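A minimal sketch of that approach: the upload handler builds a shell command that launches a CLI importer detached from the web request, so the browser doesn't have to wait. The script name `import.php` and the file path are assumptions, not names from the question.

```php
<?php
// Build a command that runs the importer in the background.
// nohup + trailing '&' detach the process from the web request;
// output is discarded so PHP doesn't wait for it.
function backgroundCommand(string $script, string $file): string
{
    return "nohup php " . escapeshellarg($script) . " "
         . escapeshellarg($file) . " > /dev/null 2>&1 &";
}

// In the upload handler you would then call something like:
// exec(backgroundCommand('import.php', $uploadedPath));
echo backgroundCommand('import.php', '/tmp/data.xlsx'), "\n";
```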
exec() won't do what you're looking for. You can overcome the memory and lifetime limits with these settings:
memory_limit = -1
max_execution_time = 0
but execution time is unlimited for CLI PHP anyway, as mentioned.
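If you'd rather not change php.ini globally, the same limits can be lifted from inside the import script itself, a sketch:

```php
<?php
// Lift the limits for this one script instead of editing php.ini.
ini_set('memory_limit', '-1'); // no memory limit
set_time_limit(0);             // no execution time limit (already 0 on CLI)

echo ini_get('memory_limit'), "\n";
```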
I would put all the uploaded files into a working directory and have a cron script come along every minute and process the unprocessed files (one at a time, or N at a time, but you shouldn't need that). The cron script could touch a flag file when it's done with a file (and another when it starts working on one, if you've got multiple processes), so it knows which files are finished or in use by another process. An AJAX endpoint could test for the existence of that flag file, or similar, to notify the client.
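A sketch of such a cron worker, assuming flag files named `<file>.working` and `<file>.done`; the directory layout and the `$import` callback (which would wrap the actual PHPExcel import) are illustrative, not from the question:

```php
<?php
// Cron worker sketch: claim and process each unhandled .xlsx file in a
// working directory. Flag files let parallel workers skip claimed files
// and let an AJAX status endpoint see which files are finished.
function processPending(string $workDir, callable $import): array
{
    $processed = [];
    foreach (glob("$workDir/*.xlsx") as $file) {
        $lock = "$file.working";
        $done = "$file.done";

        // Skip files already finished or claimed by another worker.
        if (file_exists($done) || file_exists($lock)) {
            continue;
        }

        touch($lock);       // claim the file
        $import($file);     // e.g. the PHPExcel import routine
        touch($done);       // the AJAX endpoint can poll for this flag
        unlink($lock);
        $processed[] = $file;
    }
    return $processed;
}
```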
Why not just let the job run every x minutes continually, with this kind of thing:
Check if there are any files needing to be parsed
If NO
exit
else
Parse X number of rows from where we left off,
    keeping track of where we are up to.
The script will just happily run in the background checking if there's anything to do; if there's nothing, it will just quit.
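The steps above can be sketched like this, assuming the offset is persisted in a small file between cron runs; the offset file and the in-memory `$rows` array are illustrative stand-ins (the real script would read rows with PHPExcel and insert them into the database):

```php
<?php
// One cron run: parse up to $chunk rows starting at the saved offset,
// then persist the new offset so the next run resumes where this stopped.
// Returns the number of rows handled (0 means the file is finished).
function importChunk(array $rows, string $offsetFile, int $chunk): int
{
    $offset = file_exists($offsetFile)
        ? (int) file_get_contents($offsetFile)
        : 0;

    if ($offset >= count($rows)) {
        return 0;                         // nothing left to do: just exit
    }

    $batch = array_slice($rows, $offset, $chunk);
    foreach ($batch as $row) {
        // insert $row into the database here
    }

    file_put_contents($offsetFile, $offset + count($batch));
    return count($batch);
}
```

Each cron invocation then does one bounded chunk of work, and the job finishes by itself once importChunk() starts returning 0.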