I have a page displaying horse racing data, backed by a database of over a GB.
Depending on how much data is being displayed, the page can take up to a minute to fully render.
It's faster if I remove one row that involves a lot of calculation and MySQL queries, but removing it would impact my end users.
What would your advice be to make the page load faster?
My latest thought is to display the page without the heavy calculation, and then make an AJAX call (when the user clicks a button) to calculate the data and display it; a rough sketch of what I mean is below.
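Something like this, where calc_stall.php and functions.php are made-up names and the parameters mirror what my stallWin() function (shown further down) already takes:
<?php
// calc_stall.php -- hypothetical AJAX endpoint: the main page renders without
// the heavy figures, and the button fetches them from here on demand.
include ("functions.php");   // assumed file containing stallWin()

header('Content-Type: application/json');

// Validation/escaping omitted for brevity
$stall    = (int) $_GET['stall'];
$course   = $_GET['course'];
$distance = $_GET['distance'];
$runners  = (int) $_GET['runners'];

echo json_encode(array('percent' => stallWin($stall, $course, $distance, $runners)));
?>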
Any other ideas?
Here is the code which is - I reckon - causing slowness:
function stallWin($stall, $course, $distance, $runners) {
    include ("dbstring.php");

    // Consider races with a field size within +/- 3 of this race's runners
    $threeless = $runners - 3;
    $threemore = $runners + 3;

    // First query: count the qualifying races (same course and distance, similar field size)
    // where the winner was drawn in a stall between 1 and the number of runners
    $hsql = "select course, runners, distance, date from ".$table."horse where course = '".$course."' and runners >= '".$threeless."' and runners <= '".$threemore."' and distance = '".$distance."' and xposition = '1' and draw > 0 and draw <= '$runners'";
    $hresult = mysql_query($hsql, $db);
    $htotalraces = mysql_num_rows($hresult);

    // Second query: total wins recorded from this particular stall (draw) in those races
    $hsql = "select draw, sum(IF(xposition=1,1,0)) as wins from ".$table."horse where course = '".$course."' and runners >= '".$threeless."' and runners <= '".$threemore."' and distance = '".$distance."' and draw = '$stall'";
    $hbox = '';
    $hresult = mysql_query($hsql, $db);
    $totalrut = 0;
    $hpercent = 0; // avoid an undefined variable if no rows come back
    while ($hmyrow = mysql_fetch_array($hresult)) {
        $hpercent = number_format((($hmyrow['wins'] / $htotalraces) * 100), 0);
        // $hbox and $totalrut are built here, but only the percentage is returned
        $hbox .= '<tr><td style="text-align:center;" class="likbox">'.$hmyrow['draw'].'</td><td style="text-align:center;" class="likbox">'.$hmyrow['wins'].'</td><td style="text-align:center;" class="likbox">'.$hpercent.'</td></tr>';
        $totalrut = $totalrut + $hmyrow['wins'];
    }
    return $hpercent;
}
This function is called for each horse in each race, so if the page displays 10 races with 10 horses each, I call it 100 times.
I can't embed images here as I don't have enough reputation. EXPLAIN screenshot (1): http://screencast.com/t/6RlOLtnQKs. EXPLAIN screenshot (2): same URL as the previous one with /qCvBt8Hst at the end (I don't have enough reputation to post more than two links).
I've also tried to use a PHP profiler, but I ran into issues with the SQL logs. In that example the page took 16 seconds to display, which is OK-ish, but it can take up to a minute...
You have a few alternatives, which you can use individually or in combination.
Using a caching system to reduce the load on the database would be ideal and really fast; a hybrid approach (caching plus the options below) is also worth keeping in mind. A rough sketch of the caching idea follows this list.
Using (as already suggested) better indexes to speed up retrieval of the rows. Also consider fulltext indexes if you search on text columns.
Using a pagination script, which would let you split up this huge amount of data.
But without any sample code, the suggestions can only be this general.
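As a rough sketch of the caching idea (the cache directory, lifetime and key scheme are assumptions, and stallWin() is the function from the question):

function stallWinCached($stall, $course, $distance, $runners) {
    // Hypothetical file-based cache: repeated calls with the same parameters
    // skip the two MySQL queries entirely.
    $cacheDir  = '/tmp/stallwin_cache';   // assumed cache location
    $cacheLife = 3600;                    // assumed lifetime: 1 hour
    $cacheFile = $cacheDir . '/' . md5($stall . '|' . $course . '|' . $distance . '|' . $runners);

    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0777, true);
    }

    // Serve a fresh cached value if one exists
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $cacheLife) {
        return file_get_contents($cacheFile);
    }

    // Otherwise do the expensive work once and store the result
    $percent = stallWin($stall, $course, $distance, $runners);
    file_put_contents($cacheFile, $percent);
    return $percent;
}

Within a single page load, the first query (the race count) depends only on course, distance and runners, so even memoising that one value per race in a static array would remove most of the repeated work across the 100 calls.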
In my opinion:
1. Load only what is necessary.
2. If most of the data is images, compress them (an easy approach is to resize them on the PHP side).
3. Compress the data you send and do the extraction on the client side (one way to do this is sketched below).
4. I presume the site can be accessed without logging in. Hide the folders you don't want exposed to the whole world (e.g. the folder for personal images), and expose only what is relevant to the user (a security issue).
5. Spread your database across several domains (this can be achieved even with the same hosting supplier).
6. Split the calculation wisely between the client side and the server side.
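For point 3, one simple interpretation is PHP's built-in output compression (a sketch, assuming your pages are plain PHP scripts with zlib available):

<?php
// Compress the whole response with gzip; the browser decompresses it
// transparently, so less data travels over the wire.
ob_start('ob_gzhandler');   // must run before any output is sent
?>
<!-- rest of the page as normal -->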
Good luck!
You're probably missing some indexes. Any time you search by a column or a group of columns, you should create an index on those columns to speed up the search; you can literally reduce a query's execution time from minutes to fractions of a second by doing this. If, for example, you often search the results table on the tracklocation and date columns, you could run this query to create an index to speed up the process:
#                               name of the index         columns being indexed
ALTER TABLE `results` ADD INDEX `results_location_date` (`tracklocation`, `date`);
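In your case, both queries in the question filter the horse table on course, distance and runners, so (assuming the table's real name is whatever $table."horse" expands to) something along these lines would be a starting point:

# Assumed table name -- adjust to match your $table prefix
ALTER TABLE `horse` ADD INDEX `horse_course_distance_runners` (`course`, `distance`, `runners`);

course and distance are equality matches in both queries, so they go first; runners is a range condition, so it goes last.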
MySQL has a command called EXPLAIN that shows you information about the query you want to run: which indexes it could use, which one it actually picks, roughly how many rows it's going to look at, and so on. You run it by simply putting the word EXPLAIN in front of your query (i.e. EXPLAIN SELECT * FROM ...). You can use this to locate bottlenecks and identify missing indexes; after adding an index, run it again and see what's different.
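Applied to the first query from the question (with placeholder values for course, distance and runners, and again assuming the table is simply called horse), that would look like:

EXPLAIN SELECT course, runners, distance, date
FROM horse
WHERE course = 'Ascot'            # placeholder values for illustration
  AND runners >= 7 AND runners <= 13
  AND distance = '5f'
  AND xposition = '1'
  AND draw > 0 AND draw <= 10;

If the key column in the output is NULL and rows is close to the table's total row count, the query is scanning the whole table, and an index like the one above should change both.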