I'm quite new to this, but the code below works great: it pulls in the JSON, loops over it and inserts each record into the database, checking whether the player already exists. If the player exists, only the score and bonus are updated; if it's a new player, a new row is inserted.
My problem is that the list of players is huge, around 10-15K (player 1 to player 15000). When monitoring the Laravel query log for potential performance hogs, I notice the sheer number of queries being run, because the foreach executes the query one row at a time.
Since all 10-15K records are already available in the $scores variable, is there any way to run one big query rather than 10-15K individual queries, while still being protected by PDO placeholders?
I was thinking about building an array in PHP and then feeding that array to the MySQL query, but then how would you protect it with placeholders? Also, this will be run a few times a day.
I'm really confused about how best to do this, so here is the code. The original is much larger, as it also contains teams, location, IP address and so on, but this is a trimmed-down example.
The unique key in the database is set on (id).
$scores = '
{
"1":{
"id":"1",
"player":"1",
"name":"James",
"score":"10.25",
"bonus":"2.10"
},
"2":{
"id":"2",
"player":"2",
"name":"John",
"score":"11.50",
"bonus":"1.10"
}
}';
$decoded = json_decode($scores);
foreach($decoded AS $value)
{
$update = DB::insert("INSERT INTO players (id,player,name,score,bonus) VALUES (:id,:player,:name,:score,:bonus)
ON DUPLICATE KEY UPDATE score=VALUES(score), bonus=VALUES(bonus)",
array(
':id' => $value->id,
':player' => $value->player,
':name' => $value->name,
':score' => $value->score,
':bonus' => $value->bonus
)
);
}
Well, for a start, your INSERT can look like this:
INSERT INTO table_name(col1, col2) VALUES(val1, val2), (val1, val2)
That is one INSERT statement inserting 2 rows, and it's faster than 2 separate INSERTs.
As for Laravel, you can use the insert function as shown in the documentation (see 'Inserting Multiple Records Into A Table').
I'm not sure how it works behind the scenes, though; my guess is that it's faster than the foreach.
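Applied to the question's data, here is a minimal sketch of that batching idea: a hand-built multi-row INSERT ... ON DUPLICATE KEY UPDATE rather than Laravel's insert() helper, so the upsert behaviour is kept and every value still goes through a PDO placeholder. The chunk size of 500 is an arbitrary assumption.
$decoded = json_decode($scores);
$rows = array_values((array) $decoded); // object keyed "1", "2", ... -> plain list

// Build one multi-row statement per chunk, so a single query never
// carries tens of thousands of placeholders.
foreach (array_chunk($rows, 500) as $chunk) {
    $placeholders = array();
    $bindings = array();

    foreach ($chunk as $value) {
        $placeholders[] = '(?, ?, ?, ?, ?)';
        $bindings[] = $value->id;
        $bindings[] = $value->player;
        $bindings[] = $value->name;
        $bindings[] = $value->score;
        $bindings[] = $value->bonus;
    }

    DB::insert(
        "INSERT INTO players (id, player, name, score, bonus)
         VALUES " . implode(', ', $placeholders) . "
         ON DUPLICATE KEY UPDATE score = VALUES(score), bonus = VALUES(bonus)",
        $bindings
    );
}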
Here's a very fast way (other than your IODKU (INSERT ... ON DUPLICATE KEY UPDATE) loop, or @Basic's batching of it).
1. INSERT all the data into a temp table (a la @Traxo's answer).
2. INSERT INTO real SELECT ... FROM tmp LEFT JOIN real ON ... WHERE ... IS NULL
3. UPDATE real JOIN tmp ON ... SET real.bonus = tmp.bonus, real.score = tmp.score;
There are some details, such as BEGIN ... COMMIT, assuming you are using InnoDB (which you should be).
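A rough sketch of that flow in Laravel, assuming MySQL/InnoDB and the players table keyed on id from the question; the staging table name players_tmp and the chunk size are assumptions for illustration:
// Stage the incoming data, then merge it into players in one transaction.
DB::statement("CREATE TEMPORARY TABLE players_tmp LIKE players");

$rows = array_values((array) json_decode($scores));
foreach (array_chunk($rows, 500) as $chunk) {
    DB::table('players_tmp')->insert(array_map(function ($value) {
        return array(
            'id'     => $value->id,
            'player' => $value->player,
            'name'   => $value->name,
            'score'  => $value->score,
            'bonus'  => $value->bonus,
        );
    }, $chunk));
}

DB::transaction(function () {
    // Insert players that don't exist yet
    DB::insert("INSERT INTO players (id, player, name, score, bonus)
                SELECT t.id, t.player, t.name, t.score, t.bonus
                FROM players_tmp t
                LEFT JOIN players p ON p.id = t.id
                WHERE p.id IS NULL");

    // Update score and bonus for players that already exist
    DB::update("UPDATE players p
                JOIN players_tmp t ON t.id = p.id
                SET p.score = t.score, p.bonus = t.bonus");
});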
If your periodic update is really a total replacement, then do it this way:
CREATE TABLE new LIKE real;
-- load all the fresh data into `new`
RENAME TABLE real TO old, new TO real;  -- atomic and instantaneous
DROP TABLE old;
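As a sketch, that swap could be driven from Laravel like this; the table names players_new and players_old are assumptions for illustration:
// Build a fresh copy of the table, then swap it in atomically.
DB::statement("CREATE TABLE players_new LIKE players");

$rows = array_values((array) json_decode($scores));
foreach (array_chunk($rows, 500) as $chunk) {
    DB::table('players_new')->insert(array_map(function ($value) {
        return array(
            'id'     => $value->id,
            'player' => $value->player,
            'name'   => $value->name,
            'score'  => $value->score,
            'bonus'  => $value->bonus,
        );
    }, $chunk));
}

// The RENAME swaps both tables in one atomic step, so readers never see an empty table.
DB::statement("RENAME TABLE players TO players_old, players_new TO players");
DB::statement("DROP TABLE players_old");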