Does anyone have experience using a long-running Magento process to mitigate bootstrap overhead? For example, a typical Magento API call to an order or customer resource can take 1s or more, with potentially half of that time spent on Magento overhead rather than on the API resource in question.
So, what if a Magento PHP process were spun up and kept in memory, waiting for API requests, so that it could handle them without having to load Magento from scratch each time?
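To illustrate, something like this is what I have in mind (purely a sketch; the request format, file layout, and the order lookup are just examples):

```php
<?php
// Purely illustrative: a worker that bootstraps Magento once and then
// serves requests read from STDIN, so each request skips the usual
// Mage::app() startup cost. The request format here is made up.
require_once 'app/Mage.php';

Mage::app();   // pay the bootstrap cost a single time

while (($line = fgets(STDIN)) !== false) {
    $request = json_decode(trim($line), true);
    if (!is_array($request) || empty($request['increment_id'])) {
        continue;
    }

    // Handle the request with the already-initialized application,
    // e.g. load an order by its increment ID.
    $order = Mage::getModel('sales/order')
        ->loadByIncrementId($request['increment_id']);

    fwrite(STDOUT, json_encode($order->getData()) . "\n");
    fflush(STDOUT);
}
```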
Most of my searches for long-running PHP scripts turn up questions about troubleshooting scripts that take longer than expected to run because of the amount of data they're processing, so I'm finding it difficult to locate good resources on this kind of thing, if it's even possible.
UPDATE: To be a bit more specific with my needs:
You may want to look into proc_open, though you'll need to handle a lot of the process management that the OS would normally do for you.
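The basic shape of it is a parent script opening pipes to a child PHP process, something along these lines (worker.php and the message format are placeholders):

```php
<?php
// Minimal sketch of proc_open usage: launch a child PHP process and
// talk to it over its stdin/stdout pipes. 'worker.php' is a placeholder.
$descriptors = [
    0 => ['pipe', 'r'],  // child's stdin  (we write to it)
    1 => ['pipe', 'w'],  // child's stdout (we read from it)
    2 => ['pipe', 'w'],  // child's stderr
];

$process = proc_open('php worker.php', $descriptors, $pipes);

if (is_resource($process)) {
    fwrite($pipes[0], json_encode(['increment_id' => '100000001']) . "\n");
    fclose($pipes[0]);

    $response = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    fclose($pipes[2]);

    proc_close($process);
    echo $response;
}
```

Keeping track of which children are alive, restarting dead ones, and throttling how many run at once is all on you, which is the management burden mentioned above.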
However, if the problem is speed rather than just wanting a way to pipe/fork to make use of the available hardware, I would start by finding bottlenecks throughout the system and by caching before diving into this: WSDL caching, DB normalization, opcode caching, or even memcached or reverse-proxy caching. Alan does have WSDL caching in his Mercury API product (http://store.pulsestorm.net/products/mercury-api).
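As a quick example of the WSDL caching point, on the client side it can be as little as making sure PHP's standard SOAP cache settings are on so the WSDL isn't re-fetched and re-parsed on every call; the endpoint URL below is illustrative:

```php
<?php
// Standard php.ini SOAP cache directives, set at runtime here for
// illustration; they can equally live in php.ini.
ini_set('soap.wsdl_cache_enabled', '1');
ini_set('soap.wsdl_cache_dir', '/tmp');
ini_set('soap.wsdl_cache_ttl', '86400');

// Example endpoint only; point this at your own Magento SOAP API.
$client = new SoapClient('http://example.com/api/v2_soap?wsdl=1', [
    'cache_wsdl' => WSDL_CACHE_DISK,
]);
```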
I have used proc_open before when importing over 500k customer records (through Magento's model stack, I may add), with addresses, into Magento on a 32-core system in less than 8 hours using this same approach. One PHP file acted as the main entry point, and new processes based on chunks of data were forked out to a secondary PHP file that did the actual importing.
I did leverage a small script for multi-threading on the import I mentioned. This isn't an exact answer to your question, as it isn't all that specific technically, but hopefully it offers some insight into what's possible:
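Roughly, the pattern looked like this (a reconstruction rather than the original script; import_chunk.php, the CSV input, and the chunk size are placeholders):

```php
<?php
// Reconstruction of the chunk-and-fork pattern, not the original script.
// A main entry point splits the source data into chunks and hands each
// chunk to a separate PHP worker via proc_open, so the import can use
// many cores at once.
$rows     = file('customers.csv');
$chunks   = array_chunk($rows, 5000);
$maxProcs = 32;                        // roughly one worker per core
$running  = [];

foreach ($chunks as $i => $chunk) {
    $chunkFile = sys_get_temp_dir() . "/chunk_{$i}.csv";
    file_put_contents($chunkFile, implode('', $chunk));

    $running[$i] = proc_open(
        'php import_chunk.php ' . escapeshellarg($chunkFile),
        [
            1 => ['file', sys_get_temp_dir() . "/chunk_{$i}.log", 'w'],
            2 => ['file', sys_get_temp_dir() . "/chunk_{$i}.err", 'w'],
        ],
        $pipes
    );

    // Throttle: wait for a free slot before launching more workers.
    while (count($running) >= $maxProcs) {
        foreach ($running as $key => $proc) {
            if (!proc_get_status($proc)['running']) {
                proc_close($proc);
                unset($running[$key]);
            }
        }
        usleep(100000);
    }
}

// Wait for the remaining workers to finish.
foreach ($running as $proc) {
    while (proc_get_status($proc)['running']) {
        usleep(100000);
    }
    proc_close($proc);
}
```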