Is there a faster way to import data from a file into MySQL using PHP?

Okay, so I get around 100k–1M lines of text that I always import into a database. The code that I use is as follows:

$lines = new SplFileObject('/home/file.txt');
while (!$lines->eof()) {
    $lines->next();       // Advance a line each pass; skips the header on the first iteration
    $row = explode(',', $lines->current());
    for ($i = 0; $i < 4; $i++) {
        if (!isset($row[$i])) {
            $row[$i] = null;
        }
    }
    $y = (float) $row[1];
    $z = (float) $row[2];
    $load_query = "INSERT IGNORE INTO new (datetime_gmt,field2,field3)
    VALUES ('".$row[0]."','".$y."','".$z."');";

    if (!$mysqli->query($load_query)) {
        die("CANNOT EXECUTE: " . $mysqli->error . "\n");
    }
}
$lines = null;
$lines = null;

However, it takes waaayyy too long. Is there any faster way to do it, or am I stuck with this method?

PS. I don't want to use MySQL's "LOAD DATA INFILE".

As written, you're running an INSERT statement for every line. It'll be much faster if you build a single multi-row insert of the form INSERT INTO table (foo, bar) VALUES (1, 2), (3, 4), (5, 6); and execute it once at the end. Something along the lines of this, though it could be cleaned up further:

$lines = new SplFileObject('/home/file.txt');
$load_query = "INSERT IGNORE INTO new (datetime_gmt,field2,field3)
    VALUES ";
while (!$lines->eof()) {
    $lines->next();       // Advance a line each pass; skips the header on the first iteration
    $row = explode(',', $lines->current());
    for ($i = 0; $i < 4; $i++) {
        if (!isset($row[$i])) {
            $row[$i] = null;
        }
    }
    $y = (float) $row[1];
    $z = (float) $row[2];
    $load_query .= "('".$row[0]."','".$y."','".$z."'),";    // Accumulate one row per line
}

// Trim the trailing comma and send the whole batch in one round trip
if (!$mysqli->query(rtrim($load_query, ','))) {
    die("CANNOT EXECUTE: " . $mysqli->error . "\n");
}
$lines = null;
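
One caveat: at 100k–1M rows, a single statement can exceed MySQL's max_allowed_packet and fail outright. A minimal sketch of the same idea flushed in fixed-size chunks, assuming a hypothetical flush_batch() helper and an illustrative batch size of 1000 (neither is from the answer above):

// Hypothetical helper: runs one multi-row INSERT for the accumulated rows.
function flush_batch(mysqli $mysqli, array $values) {
    if (empty($values)) {
        return;
    }
    $sql = "INSERT IGNORE INTO new (datetime_gmt,field2,field3) VALUES "
         . implode(',', $values);
    if (!$mysqli->query($sql)) {
        die("CANNOT EXECUTE: " . $mysqli->error . "\n");
    }
}

$batchSize = 1000;   // Illustrative; tune against max_allowed_packet
$values = [];
$lines = new SplFileObject('/home/file.txt');
while (!$lines->eof()) {
    $lines->next();
    $row = explode(',', $lines->current());
    for ($i = 0; $i < 4; $i++) {
        if (!isset($row[$i])) {
            $row[$i] = null;
        }
    }
    $y = (float) $row[1];
    $z = (float) $row[2];
    // Escaping the raw field is defensive; see the note on trust below.
    $values[] = "('" . $mysqli->real_escape_string($row[0]) . "','$y','$z')";
    if (count($values) >= $batchSize) {
        flush_batch($mysqli, $values);   // Send a full chunk, then start a new one
        $values = [];
    }
}
flush_batch($mysqli, $values);   // Flush the final partial batch
$lines = null;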

Also make sure the data is trusted. If the file can come from an outside user, appending directly to the query string creates an SQL injection vector.
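
If the file isn't fully trusted, a prepared statement with bound parameters closes that vector, and wrapping the loop in a single transaction avoids paying a commit per row. A rough sketch; the one-transaction-per-file approach is my suggestion, not part of the answer above:

$lines = new SplFileObject('/home/file.txt');
$stmt = $mysqli->prepare(
    "INSERT IGNORE INTO new (datetime_gmt,field2,field3) VALUES (?,?,?)"
);
$stmt->bind_param('sdd', $datetime, $y, $z);   // Bound once, by reference

$mysqli->begin_transaction();   // One commit at the end instead of one per row
while (!$lines->eof()) {
    $lines->next();
    $row = explode(',', $lines->current());
    $datetime = $row[0] ?? null;
    $y = (float) ($row[1] ?? 0);
    $z = (float) ($row[2] ?? 0);
    if (!$stmt->execute()) {
        die("CANNOT EXECUTE: " . $stmt->error . "\n");
    }
}
$mysqli->commit();
$stmt->close();
$lines = null;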