I am inserting some data into a MySQL table for testing purposes. The code below adds twenty rows for each value of $c, and does so until $c is over a million. After that, at some point it starts adding rows that duplicate the ones added before. How can we solve this problem?
for($c=1; $c<=2000000; $c++)
{
for($i=0; $i<=19; $i++)
{
$query = "INSERT INTO selections (c_id, i_id) VALUES('$c', '$i')";
mysql_query($query) or die("Cannot add... " . mysql_error());
}
}
PHP integers are unlikely to wrap at the millions level, so this is probably caused by the column configuration in your database. A simple test in PHP:
for ($i = 2000000; $i < 2000020; $i++) echo "$i";
You will see that PHP handles these numbers correctly.
In MySQL:
SMALLINT (a small integer): the signed range is -32768 to 32767; the unsigned range is 0 to 65535.
MEDIUMINT (a medium-size integer): the signed range is -8388608 to 8388607; the unsigned range is 0 to 16777215.
So if c_id uses one of these data types, every value above the upper limit of the range is clamped to that limit (or rejected in strict mode), and all rows past that point end up with the same c_id.
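If that is the case, widening the column should fix it. A minimal sketch, assuming the table is the selections table from the question and c_id is currently one of the smaller integer types:
-- Widen c_id so it can store values well past 2,000,000
ALTER TABLE selections MODIFY c_id INT UNSIGNED;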
Warning: do not use the mysql_* interface; it is deprecated. Use the PDO or mysqli interface instead.
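For example, the insert loop from the question could look roughly like this with mysqli and a prepared statement (a sketch only; the host, credentials and database name are placeholders you would replace with your own):
<?php
// Placeholder connection details - replace with your own.
$db = new mysqli('localhost', 'user', 'password', 'test');
if ($db->connect_error) {
    die('Cannot connect: ' . $db->connect_error);
}

// Prepare once, execute many times with bound parameters.
$stmt = $db->prepare('INSERT INTO selections (c_id, i_id) VALUES (?, ?)');

for ($c = 1; $c <= 2000000; $c++) {
    for ($i = 0; $i <= 19; $i++) {
        $stmt->bind_param('ii', $c, $i);
        $stmt->execute() or die('Cannot add... ' . $stmt->error);
    }
}

$stmt->close();
$db->close();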
If the purpose of your task is just to populate the data, and you are not tied to inserting it with a PHP script, you can do it directly in MySQL with a simple stored procedure like this:
DELIMITER $$
CREATE PROCEDURE insert_selections(IN max_value INT)
BEGIN
DECLARE i INT DEFAULT 1;
WHILE i <= max_value DO
INSERT INTO selections VALUES
(i, 1), (i, 2), (i, 3), (i, 4), (i, 5), (i, 6), (i, 7), (i, 8), (i, 9), (i, 10),
(i, 11), (i, 12), (i, 13), (i, 14), (i, 15), (i, 16), (i, 17), (i, 18), (i, 19), (i, 20);
SET i = i + 1;
END WHILE;
END$$
DELIMITER ;
And execute it:
CALL insert_selections(2000000);
On my Mac it took several minutes.
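If that is too slow, one option (my suggestion, not part of the procedure above) is to run the call inside a single transaction so MySQL does not commit after every individual INSERT:
-- Wrap the whole run in one transaction to avoid a commit per INSERT
START TRANSACTION;
CALL insert_selections(2000000);
COMMIT;
For a run this large you may prefer to commit in smaller chunks rather than in one giant transaction.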
The code shared seems to work fine on my local setup. I added a combined unique index on the two columns:
CREATE TABLE `selections` (
`c_id` INT(10) NULL DEFAULT NULL,
`i_id` INT(10) NULL DEFAULT NULL,
UNIQUE INDEX `c_id_i_id` (`c_id`, `i_id`)
)
I guess you have two instances of the same page executing, causing the duplicated values.
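With that unique index in place, a duplicate insert would make the script die on a key error. If you just want re-runs (or a second instance of the page) to be harmless, one option, which is my suggestion rather than part of the answer above, is INSERT IGNORE, which skips rows that would violate the unique key:
$query = "INSERT IGNORE INTO selections (c_id, i_id) VALUES('$c', '$i')";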