Need some advice on the best approach.
We are about to start a new CI (CodeIgniter) web project that needs to pull data heavily from external web services/APIs.
Is it better to manipulate the data programmatically (in objects or arrays) when I need to sort it, or to store it in a database and query it with ORDER BY, GROUP BY, etc.?
Is there a known architecture or framework for this?
What's the best approach used nowadays, e.g. how do aggregator websites pull data from many different vendor APIs?
I would suggest fetching the data with cURL, manipulating it as arrays, and then storing the result.
Make sure you build in some kind of caching as well, so you don't end up making unnecessary requests.
The reasoning behind this approach is to process the data once rather than on every request to your site.
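Something along these lines, in plain PHP. The endpoint URL, the cache file path, and the one-hour TTL are just placeholders for illustration:

```php
<?php
// Fetch the vendor API response with cURL, decode it into an array,
// massage it in PHP, and cache the processed result in a local file.

define('CACHE_FILE', '/tmp/vendor_products.json'); // placeholder cache location
define('CACHE_TTL', 3600);                         // 1 hour; tune to how fresh you need the data

function get_products()
{
    // Serve from the cache if it exists and is still fresh.
    if (file_exists(CACHE_FILE) && (time() - filemtime(CACHE_FILE)) < CACHE_TTL) {
        return json_decode(file_get_contents(CACHE_FILE), true);
    }

    // Cache miss: hit the vendor API.
    $ch = curl_init('https://api.example.com/products'); // placeholder endpoint
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $body = curl_exec($ch);
    curl_close($ch);

    $products = ($body === false) ? null : json_decode($body, true);
    if (!is_array($products)) {
        return array(); // request or decode failed; handle however suits your app
    }

    // Example of manipulating the array in code: sort by a 'price' field.
    usort($products, function ($a, $b) {
        if ($a['price'] == $b['price']) return 0;
        return ($a['price'] < $b['price']) ? -1 : 1;
    });

    // Store the processed result so later requests skip the API call entirely.
    file_put_contents(CACHE_FILE, json_encode($products));

    return $products;
}
```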
After a while, I've come up with a plan and it's working great!
The first request takes 3-4 seconds to complete (the first call to the web service grabs the data and stores it in the cache), while subsequent requests take 0.002 seconds thanks to the cached data. Four hours later the cycle repeats, so the data is refreshed from the web service every four hours. If you are the first user to hit the site after each refresh, you are the unlucky chap, but you take the hit for all the other chaps.
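Roughly, the controller looks like this, using CodeIgniter's file cache driver. The cache key, the view name, and the fetch helper are placeholder names I'm using for illustration; the TTL of 14400 seconds is the 4-hour cycle described above:

```php
<?php
class Products extends CI_Controller {

    public function index()
    {
        // CodeIgniter's caching driver with the file adapter.
        $this->load->driver('cache', array('adapter' => 'file'));

        // Try the cache first; only the first request after expiry pays the cost.
        $products = $this->cache->get('vendor_products'); // placeholder cache key

        if ($products === FALSE) {
            // Cache miss: this is the "unlucky chap" request that hits the web service.
            $products = $this->fetch_from_webservice();

            // Keep the result for 4 hours (14400 seconds) to match the refresh cycle.
            $this->cache->save('vendor_products', $products, 14400);
        }

        $this->load->view('products_list', array('products' => $products)); // placeholder view
    }

    private function fetch_from_webservice()
    {
        // Placeholder vendor endpoint; in reality this is the slow 3-4 second call.
        $ch = curl_init('https://api.example.com/products');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        $body = curl_exec($ch);
        curl_close($ch);

        return ($body === FALSE) ? array() : json_decode($body, TRUE);
    }
}
```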