I have found a few answers on here which I have followed; they worked locally, but after pushing up to our test box I'm finding this specific DB query lasts over 1300 seconds. Going into MySQL and running SHOW PROCESSLIST;
shows it hanging on "Copying to tmp table".
It's Laravel 4.2, pretty old legacy code which I'm just trying to stabilise whilst working on a later version. The code below runs roughly every 30 seconds as per the api_call, which is all well and good except it's not finishing and the client receives a 504 Gateway Time-out.
Am I doing something recursive, or could scanning a really large table be the issue here?
All I'm trying to do is run a groupBy, but instead of keeping the first row in each group I want the last one, in case other details have been updated since.
Any help provided would be highly appreciated.
public function api_prevnames()
{
    // Default to an empty collection so the ->toArray() call below
    // doesn't fail when the user is not a repeat visitor.
    $names = new Illuminate\Support\Collection;

    if (Auth::user()->repeat_vistor == 'Y') {
        $names = DB::table('visitors')
            ->select(DB::raw('first_name,last_name,email,car_reg,OPTIN,vistor_company'))
            ->where('user_id', Auth::user()->id)
            ->where('hidden', 0)
            ->where('email', '<>', '')
            ->whereRaw('id IN (select MAX(id) FROM visitors GROUP BY first_name, last_name, email)')
            ->get();
    }

    return JSONResponseGenerator::successResponse($names->toArray());
}
Which generates this query
select first_name,last_name,email,car_reg,OPTIN,vistor_company from `visitors` where `user_id` = '439' and `hidden` = '0' and `email` <> '' and id IN (select MAX(id) FROM visitors GROUP BY first_name, last_name, email)
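For reference, EXPLAIN can confirm where the time goes. This is a diagnostic sketch and the exact output depends on your schema and MySQL version, but the MAX(id) subquery will typically show a full scan with "Using temporary", which matches the "Copying to tmp table" state seen in SHOW PROCESSLIST:

```sql
-- EXPLAIN shows the execution plan without running the query.
-- Expect the subquery to appear as a full table scan (type: ALL)
-- with "Using temporary" in the Extra column.
EXPLAIN
SELECT first_name, last_name, email, car_reg, OPTIN, vistor_company
FROM `visitors`
WHERE `user_id` = 439
  AND `hidden` = 0
  AND `email` <> ''
  AND id IN (SELECT MAX(id)
             FROM visitors
             GROUP BY first_name, last_name, email);
```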
The previous code, which I've added below, runs in just under a couple of seconds:
$names = DB::table('visitors')
    ->select(DB::raw('first_name,last_name,email,car_reg,OPTIN,vistor_company'))
    ->where('user_id', Auth::user()->id)
    ->where('hidden', 0)
    ->where('email', '<>', '')
    ->groupBy('first_name', 'last_name', 'email')
    ->get();
You can try to run the following:
SELECT
    first_name,
    last_name,
    email,
    car_reg,
    OPTIN,
    vistor_company
FROM (
    SELECT
        MAX(id) AS id
    FROM `visitors`
    WHERE
        `user_id` = 439
        AND `hidden` = 0
        AND `email` <> ''
    GROUP BY
        first_name,
        last_name,
        email
) AS subQ
NATURAL JOIN `visitors`;
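The NATURAL JOIN works here only because subQ exposes a single column, `id`, so MySQL joins on that alone. If you ever add a column to the subquery's select list the join condition silently changes, so an explicit join (same query, just spelled out) is the safer form:

```sql
-- Same query as above with the join condition written explicitly,
-- so it keeps working if subQ's column list changes.
SELECT v.first_name, v.last_name, v.email, v.car_reg, v.OPTIN, v.vistor_company
FROM (
    SELECT MAX(id) AS id
    FROM `visitors`
    WHERE `user_id` = 439
      AND `hidden` = 0
      AND `email` <> ''
    GROUP BY first_name, last_name, email
) AS subQ
INNER JOIN `visitors` AS v ON v.id = subQ.id;
```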
Joins are usually faster than the other way around, but I don't think it will help much here, because you apparently are not grouping by indexed columns. To fix that you would have to change the structure, and I highly recommend doing so: try reducing the maximum length of first_name, last_name and email so that you can create a combined index on those three columns.

If you can change the database structure itself without breaking half your system, you should also consider normalising this table, e.g. one table with visitors and another with the visits of those visitors (related via foreign keys), so you can group by a key instead of grouping by three long, unindexed strings.
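As a sketch of those structural changes (the column lengths, index name and table names below are assumptions, not taken from your schema; pick lengths that fit your data and stay within MySQL's index key-length limit, or use prefix lengths such as email(100) instead):

```sql
-- Assumption: the three columns are currently long VARCHARs. Shrink
-- them so a combined index fits within the key-length limit, then
-- index them so the GROUP BY can use the index.
ALTER TABLE `visitors`
    MODIFY first_name VARCHAR(100) NOT NULL,
    MODIFY last_name  VARCHAR(100) NOT NULL,
    MODIFY email      VARCHAR(150) NOT NULL,
    ADD INDEX idx_name_email (first_name, last_name, email);

-- Normalisation sketch (hypothetical table names): one row per
-- person, one row per visit, so grouping happens on an indexed
-- integer key instead of three strings.
CREATE TABLE visitor_people (
    id         INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    first_name VARCHAR(100) NOT NULL,
    last_name  VARCHAR(100) NOT NULL,
    email      VARCHAR(150) NOT NULL,
    UNIQUE KEY uq_person (first_name, last_name, email)
);

CREATE TABLE visits (
    id        INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
    person_id INT UNSIGNED NOT NULL,
    user_id   INT UNSIGNED NOT NULL,
    hidden    TINYINT(1) NOT NULL DEFAULT 0,
    FOREIGN KEY (person_id) REFERENCES visitor_people (id)
);
```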