I have a text file with about 500k lines of data.
I am running a loop to store some info, something like this:
$file = fopen("top-1-500000.txt", "r") or exit("Unable to open file!");
while (!feof($file)) {
    // ...some function that reads a line and builds $value...
    mysql_query("INSERT INTO table (name) VALUES ('$value')");
}
fclose($file);
The issue is that when the loop stops in the middle, I have to check the MySQL database manually to see which lines have already been inserted and delete them from the text file, so the loop doesn't start reading again from the first line. That is a huge effort when there are multiple files.
An alternative way of loading a large file is to use MySQL's LOAD DATA INFILE functionality.
Example:
LOAD DATA INFILE 'top-1-500000.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
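Issued from PHP, it might look roughly like the sketch below. The connection details and table/column names are placeholders, it uses the same legacy mysql_* extension as the question, and LOCAL requires local_infile to be enabled on both the client and the server:

// Rough sketch: run the LOAD DATA statement from PHP instead of looping.
// Host, credentials and table name are placeholders, not from the question.
$link = mysql_connect("localhost", "user", "password") or exit(mysql_error());
mysql_select_db("mydb", $link) or exit(mysql_error($link));

$sql = "LOAD DATA LOCAL INFILE 'top-1-500000.txt' INTO TABLE tbl_name
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\\n'
        IGNORE 1 LINES";

mysql_query($sql, $link) or exit(mysql_error($link));
mysql_close($link);

Loading the whole file in a single statement is also typically much faster than half a million individual INSERTs.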
If you use file()
to read the file, you can keep a counter inside the for loop that tracks the number of iterations; then, when you need to go back to where the loop stopped, start from that counter (see the sketch after the code below).
$file_array = file("top-1-500000.txt");
for ($i = 0; $i < count($file_array); $i++) {
    // ...code here...
    mysql_query("INSERT INTO table (name) VALUES ('$value')");
}
Also, this assumes the file isn't absolutely massive, since file() loads the whole thing into memory.
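For example, the counter could be persisted between runs so the loop picks up where it left off; the progress file and the escaping below are additions of mine, not part of the original answer:

// Sketch: resume a file()-based import by persisting the loop counter.
// "progress.txt" is a placeholder for wherever you keep the counter.
$file_array = file("top-1-500000.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$start = file_exists("progress.txt") ? (int) file_get_contents("progress.txt") : 0;

for ($i = $start; $i < count($file_array); $i++) {
    $value = mysql_real_escape_string($file_array[$i]);
    mysql_query("INSERT INTO table (name) VALUES ('$value')");
    file_put_contents("progress.txt", $i + 1); // next line to process on restart
}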
You are reading the file line by line, so
$line = fgets($handle);
will read a single line at a time (you can pass a maximum length; very old PHP versions assumed 1024 bytes when it was omitted).
There's a function called fseek() with which you can move the internal file pointer and read from that position on. A possible solution is to keep in the database the current byte offset (as reported by ftell($handle)) after each line; when (if) the loop dies in the middle, pass that saved offset to fseek()
and continue to read on. (Multiplying the number of read lines by 1024 only works if every line is exactly that long, so storing the real offset is safer.)
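A minimal sketch of that idea, assuming the offset is kept in a small side file rather than the database (the file name and escaping are my own placeholders):

// Resume reading with fseek()/ftell(). "offset.txt" stands in for wherever
// you store the last byte offset (the answer suggests the database).
$handle = fopen("top-1-500000.txt", "r") or exit("Unable to open file!");

$offset = file_exists("offset.txt") ? (int) file_get_contents("offset.txt") : 0;
fseek($handle, $offset); // jump past everything already processed

while (($line = fgets($handle)) !== false) {
    $value = mysql_real_escape_string(trim($line));
    mysql_query("INSERT INTO table (name) VALUES ('$value')");
    file_put_contents("offset.txt", ftell($handle)); // remember where we are
}
fclose($handle);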
I'm assuming by "stopped in the middle" you mean that the script is timing out. You should use set_time_limit
to prevent your script from timing out (I'm assuming your server config allows you to do this).
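For example (a zero argument removes the limit entirely; it has no effect if the host disallows it):

// Lift the execution time limit for this long-running import script.
set_time_limit(0); // 0 = no time limit

// ...then open the file and run the insert loop as before...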