I've made a CSV parser, but sometimes it returns a memory exhausted error. The parser consists of a jQuery loop that starts multiple PHP files via AJAX to parse a file. Some of the files return the error, and I was wondering whether this has something to do with the files running at the same time. So my question is: can I prevent the memory exhaustion error by queuing up the parsing?
If you're receiving the PHP error "Fatal error: Allowed memory size of ### bytes exhausted", that means the single PHP process handling that request ran out of memory, regardless of any other processes. The memory you set for PHP (memory_limit) isn't divided amongst the running processes or anything like that; that amount is reserved for each instance.
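To illustrate, here is a minimal sketch of what one of those AJAX-triggered scripts could look like. The file name parse_chunk.php, the data.csv input, and the 256M value are assumptions for the example, not your actual code; the point is that each request gets its own memory_limit, so queuing requests client-side won't change how much memory any single request is allowed to use. Reading the file row by row (e.g. with fgetcsv) is one way to keep a single request under its limit.

```php
<?php
// parse_chunk.php -- hypothetical endpoint hit by each AJAX call.
// Each request runs in its own PHP process with its own memory_limit,
// so queuing requests on the client does not pool or free this memory.

// Optionally raise the limit for this script only (other scripts unaffected).
ini_set('memory_limit', '256M');

$handle = fopen('data.csv', 'r');   // assumed input file
if ($handle === false) {
    http_response_code(500);
    exit('Could not open file');
}

// Reading row by row keeps memory roughly constant, unlike loading
// the whole file at once with file() or file_get_contents().
while (($row = fgetcsv($handle)) !== false) {
    // ... process one row ...
}
fclose($handle);

// Report this process's own limit and peak usage for debugging.
echo json_encode([
    'memory_limit' => ini_get('memory_limit'),
    'peak_bytes'   => memory_get_peak_usage(true),
]);
```

Returning the peak usage like this lets you see, per request, which file actually comes close to the limit, which is usually more telling than how many requests run in parallel.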