I am facing quite a specific optimization problem.
I currently have 4 normalized tables of data.
Every second, possibly thousands of users will pull down up-to-date info from these tables using AJAX.
The thing is, I can predict relatively easily which subset of data they need: the most recent 100 or so entries in those 4 normalized tables.
I have been researching denormalization... but feel that perhaps there is an easier solution.
I was thinking that I could run one SQL query every second to condense the needed info, store the result in a temporary cached table, and have all of the user queries draw from that instead. The complex join across the 4 tables would then only run once per second, and each user would just do a simple lookup against the cached table.
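Something along these lines is what I have in mind (all table and column names below are made up, and I'm not sure this is the right way to structure it):

-- summary table holding the pre-joined "latest 100" rows
CREATE TABLE latest_entries_cache (
    entry_id   INT PRIMARY KEY,
    col_a      VARCHAR(255),
    col_b      VARCHAR(255),
    created_at DATETIME
);

-- refreshed roughly once a second by a cron job or similar
TRUNCATE latest_entries_cache;
INSERT INTO latest_entries_cache (entry_id, col_a, col_b, created_at)
SELECT t1.id, t2.col_a, t3.col_b, t1.created_at
FROM table1 t1
JOIN table2 t2 ON t2.t1_id = t1.id
JOIN table3 t3 ON t3.t1_id = t1.id
JOIN table4 t4 ON t4.t1_id = t1.id
ORDER BY t1.created_at DESC
LIMIT 100;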
I really don't know if this is feasible. Comments on this or any other suggestions would be much appreciated.
Thanks!
Are you able to use memcache?
It helps reduce DB load.
If you're running MySQL 5.1.6 or later, you can take advantage of Events:
-- runs the refresh on a fixed schedule inside the server itself
CREATE EVENT e
  ON SCHEDULE EVERY 1 MINUTE
  DO
    INSERT INTO tempTable (`col1`, `col2`, `col3`)
    SELECT `col1`, `col2`, `col3`
    FROM table1
    WHERE col1 = 'value';
Denormalization shouldn't be the result of speculation, but of testing. What are your problem queries? Do you have any yet?
And is the rest of your setup (hardware, web server, business-logic layer) ready for that kind of traffic, or is the traffic itself still hypothetical at this point?
Have you tried query caching?

SELECT SQL_CACHE * FROM myTable INNER JOIN etc..... ;

If the data changes often, this may not be of much (or any) help, since the cache is invalidated automatically every time the underlying data changes.
Note: always guard your database, so if you can, cache outside of the DB. Also, you may need to check the server variables (SHOW VARIABLES LIKE '%query_cache%') to make sure query caching is actually on.
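For reference, on MySQL versions that still include the query cache (it was removed in 8.0), the check and a rough enable sequence might look like this (the size is an arbitrary placeholder, and depending on the version and startup settings you may have to configure this in my.cnf instead of at runtime):

-- see whether the query cache is enabled and how large it is
SHOW VARIABLES LIKE 'query_cache%';

-- enable and size it; 16 MB here is just an example value
SET GLOBAL query_cache_type = 1;
SET GLOBAL query_cache_size = 16777216;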