I have a CSV file that I successfully imported into a database table.
Over time I will update this CSV file with new data, so I was wondering: if I import the same file again, which will contain rows that have already been inserted into the table, is there a way to avoid the duplicates being added a second time?
When researching this I came across an option called "Ignore duplicate rows", but it is not present for me within the import options.
I am using the phpMyAdmin that is packaged with XAMPP (version information: 4.2.7.1, latest stable version: 4.2.8).
Add a UNIQUE key to the database column (or columns) that you think should be unique.
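For example, a minimal sketch assuming the table and column names used in the code below (table_name / col0 are placeholders; adjust to your schema):

-- makes col0 unique, so a re-imported row collides instead of duplicating
ALTER TABLE table_name ADD UNIQUE KEY uniq_col0 (col0);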
Here is the code I have used for uploading CSV files; it stops rows whose key is already present from being inserted again, using ON DUPLICATE KEY UPDATE. Maybe it will help you.
<?php
// Connect using mysqli; the old mysql_* functions are deprecated and removed in PHP 7+.
$conn = mysqli_connect("localhost", "root", "", "dbname")
    or die("Connection failed: " . mysqli_connect_error());

$csv_file = "/path/to/file";

// On a duplicate of the unique key, update col2 instead of inserting a second row.
$sql = "INSERT INTO table_name (col0, col1, col2, col3) VALUES (?, ?, ?, ?)
        ON DUPLICATE KEY UPDATE col2 = VALUES(col2)";
$stmt = mysqli_prepare($conn, $sql) or die(mysqli_error($conn));

$handle = fopen($csv_file, "r") or die("Cannot open CSV file");
$row = 0;
while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) { // 2nd param is the max line length per row
    if ($row > 0) { // skip the header row
        // Prepared statements escape the values safely (no need for addslashes()).
        mysqli_stmt_bind_param($stmt, "ssss", $data[0], $data[1], $data[2], $data[3]);
        mysqli_stmt_execute($stmt) or die(mysqli_stmt_error($stmt));
    }
    $row = 1;
}
fclose($handle);
?>
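If you would rather skip duplicate rows entirely instead of updating a column, MySQL's INSERT IGNORE does that; a sketch with the same hypothetical table:

-- rows whose unique key already exists are silently skipped
INSERT IGNORE INTO table_name (col0, col1, col2, col3)
VALUES ('a', 'b', 'c', 'd');

That matches the "Ignore duplicate rows" behaviour you were looking for in the import screen.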
Cheers!