I need some help improving my current code. I have a huge array of about 20,000 objects. It looks like this:
Array
(
    [0] => Player Object
        (
            [name] => Aaron Flash
            [level] => 16
            [vocation] => Knight
            [world] => Amera
            [time] => 900000
            [online] => 1
        )

    [1] => Player Object
        (
            [name] => Abdala da Celulose
            [level] => 135
            [vocation] => Master Sorcerer
            [world] => Amera
            [time] => 900000
            [online] => 1
        )

    [2] => Player Object
        (
            [name] => Ahmudi Segarant
            [level] => 87
            [vocation] => Elite Knight
            [world] => Amera
            [time] => 900000
            [online] => 1
        )

    [3] => Player Object
        (
            [name] => Alaskyano
            [level] => 200
            [vocation] => Royal Paladin
            [world] => Amera
            [time] => 900000
            [online] => 1
        )

    [4] => Player Object
        (
            [name] => Aleechoito
            [level] => 22
            [vocation] => Knight
            [world] => Amera
            [time] => 900000
            [online] => 1
        )
And so on... with about 20,000 Player Objects in total.
Now I want to insert them all into my database. I'd like to avoid looping over every player, because the loop is causing serious performance issues and nearly brings my computer to a halt. I'd like to do it in a single query, all at once.
But how can I get the Player Object attributes, like the "name", "level" and "vocation" of each individual object, without looping through them?
This is what my code looks like:
// Insert player list to database
$sql = $db->prepare("INSERT INTO players (name, level, vocation, world, month, today, online) VALUES (:name, :level, :vocation, :world, :time, :time, :online) ON DUPLICATE KEY UPDATE level = :level, vocation = :vocation, world = :world, month = month + :time, today = today + :time, online = :online");
foreach ($players as $player) {
    $query = $sql->execute([
        ":name" => $player->name,
        ":level" => $player->level,
        ":vocation" => $player->vocation,
        ":world" => $player->world,
        ":time" => $player->time,
        ":online" => $player->online
    ]);
}
Right now that foreach at the bottom loops through all 20,000 player objects in my array, reading their name/level/vocation/world and so on for each one.
Is there a better way to do this? My approach can't be the best solution. I can hear my PC working overtime, and it feels as if it's about to crash.
While I still doubt that transactions and/or batched inserts are a viable solution to your resource usage problem, they're still a better solution than preparing massive statements like Dave has suggested.
Give these a shot and see if they help.
The following assumes that PDO's error handling mode is set to throw exceptions, e.g.: $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
If, for some reason, you can't use exception mode, then you'll need to check the return value of execute() each time and throw your own exception.
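If you're stuck without exception mode, the manual check could be wrapped in a small helper. A minimal sketch, where `executeOrThrow()` is a hypothetical helper of my own, not part of PDO:

```php
// Hypothetical helper: without ERRMODE_EXCEPTION, PDOStatement::execute()
// returns false on failure, so check it ourselves and convert the
// failure into an exception.
function executeOrThrow($stmt, array $params)
{
    if ($stmt->execute($params) === false) {
        $info = $stmt->errorInfo();
        throw new RuntimeException("Query failed: " . ($info[2] ?? 'unknown driver error'));
    }
}
```

You would then call `executeOrThrow($sql, [...])` inside the try blocks in place of the bare `$sql->execute([...])`.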
Single transaction:
$sql = $db->prepare("INSERT INTO players (name, level, vocation, world, month, today, online) VALUES (:name, :level, :vocation, :world, :time, :time, :online) ON DUPLICATE KEY UPDATE level = :level, vocation = :vocation, world = :world, month = month + :time, today = today + :time, online = :online");
$db->beginTransaction();
try {
    foreach ($players as $player) {
        $sql->execute([
            ":name" => $player->name,
            ":level" => $player->level,
            ":vocation" => $player->vocation,
            ":world" => $player->world,
            ":time" => $player->time,
            ":online" => $player->online
        ]);
    }
    $db->commit();
} catch( PDOException $e ) {
    $db->rollBack();
    // at this point you would want to implement some sort of error handling
    // or potentially re-throw the exception to be handled at a higher layer
}
Batched Transactions:
$batch_size = 1000;
// $sql is the same prepared statement as in the single-transaction example
for( $i=0, $c=count($players); $i<$c; $i+=$batch_size ) {
    $db->beginTransaction();
    try {
        for( $k=$i; $k<$c && $k<$i+$batch_size; $k++ ) {
            $player = $players[$k];
            $sql->execute([
                ":name" => $player->name,
                ":level" => $player->level,
                ":vocation" => $player->vocation,
                ":world" => $player->world,
                ":time" => $player->time,
                ":online" => $player->online
            ]);
        }
    } catch( PDOException $e ) {
        $db->rollBack();
        // at this point you would want to implement some sort of error handling
        // or potentially re-throw the exception to be handled at a higher layer
        break;
    }
    $db->commit();
}
I think the biggest performance gain you will get is by not doing one query per insert, but doing a single query for all inserts. Something like:
$sql = "INSERT INTO players (name, level, vocation, world, month, today, online) VALUES ";
$inserts = [];
$values = [];
$idx = 0;
foreach ($players as $player) {
    $idx++;
    $inserts[] = "(:name{$idx}, :level{$idx}, :vocation{$idx}, :world{$idx}, :month{$idx}, :today{$idx}, :online{$idx})";
    $values[":name{$idx}"] = $player->name;
    $values[":level{$idx}"] = $player->level;
    $values[":vocation{$idx}"] = $player->vocation;
    $values[":world{$idx}"] = $player->world;
    $values[":month{$idx}"] = $player->time;
    $values[":today{$idx}"] = $player->time;
    $values[":online{$idx}"] = $player->online;
}
$sql .= implode(",", $inserts);
// note: VALUES() must reference columns from the insert list (month/today);
// there is no "time" column in this statement
$sql .= " ON DUPLICATE KEY UPDATE level = VALUES(level), vocation = VALUES(vocation), world = VALUES(world), month = month + VALUES(month), today = today + VALUES(today), online = VALUES(online)";
$query = $db->prepare($sql)->execute($values);
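One caveat with the all-in-one statement: MySQL caps a prepared statement at 65,535 placeholders, and 20,000 rows × 7 parameters is well past that. A sketch that keeps the multi-row approach but splits the work into chunks; the chunk size of 1,000 and the `buildPlayerInsert()` helper name are my own assumptions:

```php
// Hypothetical helper: builds one multi-row INSERT for a chunk of players,
// returning the SQL string and its bound values.
function buildPlayerInsert(array $chunk)
{
    $inserts = [];
    $values  = [];
    foreach (array_values($chunk) as $idx => $player) {
        $inserts[] = "(:name{$idx}, :level{$idx}, :vocation{$idx}, :world{$idx}, :month{$idx}, :today{$idx}, :online{$idx})";
        $values[":name{$idx}"]     = $player->name;
        $values[":level{$idx}"]    = $player->level;
        $values[":vocation{$idx}"] = $player->vocation;
        $values[":world{$idx}"]    = $player->world;
        $values[":month{$idx}"]    = $player->time;
        $values[":today{$idx}"]    = $player->time;
        $values[":online{$idx}"]   = $player->online;
    }
    $sql = "INSERT INTO players (name, level, vocation, world, month, today, online) VALUES "
         . implode(",", $inserts)
         . " ON DUPLICATE KEY UPDATE level = VALUES(level), vocation = VALUES(vocation), world = VALUES(world), month = month + VALUES(month), today = today + VALUES(today), online = VALUES(online)";
    return [$sql, $values];
}

// 1,000 rows = 7,000 placeholders per statement, safely under the limit:
// foreach (array_chunk($players, 1000) as $chunk) {
//     list($sql, $values) = buildPlayerInsert($chunk);
//     $db->prepare($sql)->execute($values);
// }
```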