I am trying to figure out whether MySQL will be enough for my use case. I tried inserting 100,000 rows into my local MySQL server, which went fine; I could see the database being populated with the data.
Then I ran the same insert script against Google Cloud SQL. Everything also seemed fine, but for some reason the database stopped inserting entries after entry 67667, even though the response from the server indicated that every insertion was successful.
Does MySQL have some kind of limitation, or what may cause this kind of behavior?
My test script:
// Create connection
$conn = new mysqli($servername, $username, $password);

// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}
$conn->select_db($database);

// Concatenate 100,000 INSERT statements into one string
$insertData = '';
for ($i = 0; $i < 100000; $i++) {
    // `table` stands in for the real table name (it is a reserved word in MySQL)
    $insertData .= "INSERT INTO `table` (isin) VALUES ('" . uniqid() . "');";
}

// Send them all in a single multi-statement call
if ($conn->multi_query($insertData) === true) {
    echo "New records created successfully";
} else {
    echo "Error: <br>" . $conn->error;
}
$conn->close();
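Editor's note: multi_query() returns true as long as the first statement succeeds; an error in a later statement only surfaces while iterating the result sets. A minimal sketch of draining them, assuming the same $conn and $insertData as above, which would expose the first statement that actually failed:

$statement = 0;
if ($conn->multi_query($insertData)) {
    do {
        $statement++;
        // Free any result set the current statement produced
        // (INSERTs produce none, so this is usually false)
        if ($result = $conn->store_result()) {
            $result->free();
        }
    } while ($conn->more_results() && $conn->next_result());
}
if ($conn->error) {
    // Error of the statement where the loop stopped
    echo "Statement #$statement failed: " . $conn->error;
}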
My test table has only two columns: id and the isin number.
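For reference, a minimal sketch of what such a table could look like; the exact DDL is an assumption, not taken from the question (note that table is a reserved word in MySQL, hence the backticks):

-- Assumed schema; the real column types may differ
CREATE TABLE `table` (
    id   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    isin VARCHAR(13)  NOT NULL  -- uniqid() returns a 13-character string
);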
Try using a MySQL batch insert instead, e.g.:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);
Build the (1,2,3),(4,5,6),(7,8,9) part in your loop, then run only one INSERT query.
Try this way:
$val = "";
for ($i = 0; $i < 100000; $i++) {
    $val .= "('" . uniqid() . "'),";
}
$val = rtrim($val, ",");  // drop the trailing comma

// Backquote `table`: it is a reserved word in MySQL
$insertData = "INSERT INTO `table` (isin) VALUES $val";
if (!$conn->query($insertData)) {
    echo "Error: " . $conn->error;
}
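One caveat worth adding to this answer: a single INSERT carrying 100,000 value tuples can exceed the server's max_allowed_packet, which would make the whole statement fail. A sketch that chunks the batch instead, reusing the $conn from the question; the batch size of 1,000 is an arbitrary assumption:

$batchSize = 1000;  // rows per INSERT; keeps each packet small
$values = [];
for ($i = 0; $i < 100000; $i++) {
    $values[] = "('" . uniqid() . "')";
    if (count($values) === $batchSize) {
        $conn->query("INSERT INTO `table` (isin) VALUES " . implode(',', $values))
            or die("Insert failed: " . $conn->error);
        $values = [];
    }
}
// Flush the final partial batch, if any
if ($values) {
    $conn->query("INSERT INTO `table` (isin) VALUES " . implode(',', $values))
        or die("Insert failed: " . $conn->error);
}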