I have a PHP webapp that has to access a remote read-only database which happens to be dead slow (8-10 seconds per query). The problem is that I can't do anything about the slowness of that database, but I can do whatever I can on my side to deal with it. The data in the database doesn't change much, perhaps once or twice per day, and each query returns a relatively small result set, so we were thinking about these options:
1). Dump the remote database daily to a local database; the problem is that data changes will be delayed by up to a day.
2). Use a cache for each query; the problem is that the first query after a data change will always be dead slow.
Does anyone have another idea?
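For reference, option 2 would look roughly like this. This is only a minimal sketch of a file-based per-query cache; the cache directory, TTL, helper name and connection details are placeholders, not an existing implementation:

```php
<?php
// Minimal sketch of option 2: a file-based cache wrapped around every remote query.
// CACHE_DIR, CACHE_TTL and the function name are assumptions for illustration.

const CACHE_DIR = '/var/cache/myapp/queries';
const CACHE_TTL = 43200; // 12 hours; tune to how often the remote data actually changes

function cachedQuery(PDO $remoteDb, string $sql, array $params = []): array
{
    if (!is_dir(CACHE_DIR)) {
        mkdir(CACHE_DIR, 0700, true);
    }

    $file = CACHE_DIR . '/' . md5($sql . serialize($params)) . '.json';

    // Serve from the local cache while the entry is still fresh
    if (is_file($file) && (time() - filemtime($file)) < CACHE_TTL) {
        $entry = json_decode(file_get_contents($file), true);
        return $entry['rows'];
    }

    // Cache miss: hit the slow remote database (8-10 s), then store the result.
    // The SQL and parameters are stored too, so a cron job could replay the query later.
    $stmt = $remoteDb->prepare($sql);
    $stmt->execute($params);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    file_put_contents(
        $file,
        json_encode(['sql' => $sql, 'params' => $params, 'rows' => $rows]),
        LOCK_EX
    );

    return $rows;
}
```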
How about using option 2 but also having a cron job that regularly updates the cache (at least for your most commonly accessed pages/searches)?
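A rough sketch of what that cron-driven refresh could look like, assuming a file-based cache along the lines of the one sketched in the question (the script name, paths, cron schedule and connection details are all placeholders). It simply replays every query found in the cache, so web requests always hit warm data:

```php
<?php
// refresh-cache.php - hypothetical script run from cron, e.g.:
//   0 * * * * /usr/bin/php /var/www/app/bin/refresh-cache.php
// Replays every cached query against the slow remote database and
// overwrites the stored results, so users never pay the 8-10 second cost.

require __DIR__ . '/cached-query.php'; // defines CACHE_DIR and cachedQuery()

$remoteDb = new PDO('mysql:host=remote.example.com;dbname=readonly', 'user', 'pass');

foreach (glob(CACHE_DIR . '/*.json') as $file) {
    $entry = json_decode(file_get_contents($file), true);

    // Drop the stale file first so cachedQuery() is forced to re-query
    unlink($file);
    cachedQuery($remoteDb, $entry['sql'], $entry['params']);
}
```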
Is there any possibility of adding an option:
3). Get the maintainers of the slow database to fix it, urgently?
If not, I think you could adopt Matt B's idea, but deal with it locally.
2). Use a (local) cache for each query. Twice a day, run a back-end cron job that runs all the most common queries, so that your local data is 'primed' to find new content quickly. That way, your users should only rarely suffer from the delay, if they're doing something unusual that requires a call to the 'master' database. A sketch of such a priming job follows.
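As a sketch of that twice-daily priming job (the query list, cron times, paths and connection details are placeholders; it assumes a per-query cache helper like the one sketched in the question):

```php
<?php
// prime-cache.php - hypothetical twice-daily priming job, e.g.:
//   0 6,18 * * * /usr/bin/php /var/www/app/bin/prime-cache.php
// Re-runs the most common queries so the local cache is 'primed'
// before users ask for them.

require __DIR__ . '/cached-query.php'; // defines CACHE_DIR and cachedQuery()

$remoteDb = new PDO('mysql:host=remote.example.com;dbname=readonly', 'user', 'pass');

// The queries your users run most often; hard-coded here for illustration
$commonQueries = [
    'SELECT id, name, price FROM products WHERE active = 1',
    'SELECT id, name FROM categories ORDER BY name',
];

foreach ($commonQueries as $sql) {
    // Remove any existing entry so cachedQuery() is forced to hit the remote DB
    $file = CACHE_DIR . '/' . md5($sql . serialize([])) . '.json';
    if (is_file($file)) {
        unlink($file);
    }
    cachedQuery($remoteDb, $sql); // slow here, but users never see this delay
}
```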