Here's essentially what my PHP script does:
    $query = $db->query("
        SELECT *
        FROM `media`
        WHERE `id` > '$last_id'
        AND `accounts_used` != ''
        ORDER BY `id` ASC
        LIMIT 100
    ");

    foreach ($query['results'] as $row) {
        $last_id = $row['id'];
        $accounts_used = explode(", ", $row['accounts_used']);
        $db->connect();
        foreach ($accounts_used as $liked_account) {
            $account_id = str_replace(" ", "", $liked_account);
            $media_id = $row['id'];
            $insert_array = array(
                "account_id" => $account_id,
                "media_id"   => $media_id,
                "timesent"   => "0000-00-00 00:00:00"
            );
            $db->insert("media_likes", $insert_array);
        }
        $db->disconnect();
    }
An example of $row['accounts_used'] looks like this:

    61519, 65894, 63561, 61718, 63567, 66924, 66979, 66972, 66637, 66295, 66842, 64775, 51898, 64631, 65044, 67226, 67582, 66861, 51543, 61564, 65597, 66863
On average the accounts_used column contains around 1,000 to 2,000 unique IDs separated by commas; I want to explode on the comma and insert each ID into another table. media_id holds an integer roughly five digits long, and account_id is about the same.
The script takes around 3 minutes to complete. Are there any specific ways to improve this, or a quicker way to do it?
A few things I can see that might help right off the bat (a minimal sketch combining all three follows the list):

1 - you are opening and closing your database connection for every row of data you process; connect once before the loop and disconnect once after it
2 - you can use a prepared statement so the MySQL connection "pre-compiles" your INSERT statement instead of parsing it on every execution
3 - use a transaction around the inserts so your indexes only get updated once, at commit
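
Here is a minimal sketch applying all three points with PDO; the DSN, credentials, and error-mode option are assumptions, since your $db wrapper's actual API isn't shown:

    <?php
    // 1 - connect once, outside any loop (DSN and credentials are placeholders)
    $pdo = new PDO("mysql:host=localhost;dbname=mydb;charset=utf8mb4", "user", "pass",
        [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);

    // Fetch the batch of media rows; binding :last_id also removes the
    // string interpolation from the original query
    $select = $pdo->prepare(
        "SELECT `id`, `accounts_used` FROM `media`
         WHERE `id` > :last_id AND `accounts_used` != ''
         ORDER BY `id` ASC LIMIT 100"
    );
    $select->execute([':last_id' => $last_id]);

    // 2 - prepare the INSERT once so MySQL parses and plans it a single time
    $insert = $pdo->prepare(
        "INSERT INTO `media_likes` (`account_id`, `media_id`, `timesent`)
         VALUES (:account_id, :media_id, :timesent)"
    );

    // 3 - one transaction around all the inserts: one commit at the end
    $pdo->beginTransaction();
    foreach ($select as $row) {
        $last_id = $row['id'];
        foreach (explode(",", $row['accounts_used']) as $liked_account) {
            $insert->execute([
                ':account_id' => trim($liked_account),
                ':media_id'   => $row['id'],
                ':timesent'   => "0000-00-00 00:00:00",
            ]);
        }
    }
    $pdo->commit();

With several thousand IDs per row, the transaction alone usually gives the biggest win, since otherwise each single-row INSERT is committed (and flushed to disk) individually.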
It's also possible that the table simply has a lot of rows, so the problem is not in the way you insert but in the optimization of the table itself. If that's the case, you could try changing the indexes and foreign keys, which might improve the speed. You can also try an in-line (multi-row) insert to see if it helps, as sketched below.
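
Here is a rough sketch of such a multi-row INSERT, batching 1,000 tuples per statement; the batch size is arbitrary, and $pdo and $row are assumed to come from the sketch above:

    <?php
    // A multi-row ("in-line") INSERT: many tuples per statement instead of
    // one network round trip per ID. $pdo and $row are assumed from above.
    $flush = function (array $tuples, array $params) use ($pdo) {
        $sql = "INSERT INTO `media_likes` (`account_id`, `media_id`, `timesent`) VALUES "
             . implode(", ", $tuples);
        $pdo->prepare($sql)->execute($params);
    };

    $tuples = [];
    $params = [];
    foreach (explode(",", $row['accounts_used']) as $liked_account) {
        $tuples[] = "(?, ?, ?)";              // one placeholder group per ID
        array_push($params, trim($liked_account), $row['id'], "0000-00-00 00:00:00");

        if (count($tuples) >= 1000) {         // flush a full batch
            $flush($tuples, $params);
            $tuples = $params = [];
        }
    }
    if ($tuples) {                            // flush the remainder
        $flush($tuples, $params);
    }

If you raise the batch size, keep an eye on MySQL's max_allowed_packet, since the whole statement has to fit in a single packet.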