Efficient way to insert many rows at once with PHP PDO [closed]

I'm programmatically importing rows from an Excel file into a database. The Excel file has 10 columns * 30000 rows. I've imported the data into a PHP array, which is then inserted into the database.

After uploading the file, it takes 7-8 minutes to insert all rows into the database. I know two ways of inserting the rows:

1st method:

Generate a dynamic query, something like:

INSERT INTO table_name (col1, col2,..) VALUES (row1, row1, ....), (row2, row2, ....), (row3, row3, ....), (row4, row4, ....), ...

and run the query to insert all rows.
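
For example, such a query could be built with bound placeholders instead of concatenated raw values. A rough sketch (assuming $con is the PDO connection, $rows holds the imported data, and col1-col3 stand in for the real column names):

$placeholders = array();
$values = array();
foreach ($rows as $row) {
    $placeholders[] = '(?, ?, ?)';   // one placeholder group per row
    $values[] = $row['col1'];        // hypothetical column keys
    $values[] = $row['col2'];
    $values[] = $row['col3'];
}
$sql = 'INSERT INTO table_name (col1, col2, col3) VALUES '
     . implode(', ', $placeholders);
$stmt = $con->prepare($sql);
$stmt->execute($values);

Note that a single huge statement can exceed MySQL's max_allowed_packet, and prepared statements have a placeholder limit, so the rows may need to be split into chunks of a few thousand.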

2nd method:

$result = $con->prepare("INSERT INTO table_name (col1, col2, ...) VALUES (:col1, :col2, ...)");
foreach ($rows as $row) { // Loop through all rows and insert them
    $result->bindParam(':col1', $val1);
    $result->bindParam(':col2', $val2);
    $result->bindParam(':col3', $val3);
    // ... bind the remaining columns; $val1, $val2, ... come from $row
    $result->execute();
}

The 1st method seems messy and inefficient, so I'm using the 2nd, but it inserts only 500-700 rows per second and takes 7-8 minutes overall to insert all rows.

What other methods are more efficient and faster than these?

EDIT: Please do not suggest importing the Excel file directly into MySQL. The data needs to be processed before it is inserted into the database.

Try wrapping your SQL in a TRANSACTION instead, COMMITTING your data at the end. This will free up a lot of resources, as COMMITTING does quite a bit of work (it tells your DB to persist the data you're INSERTing and to update the indexes, assuming you have any).

Also, try to make sure your connection to the database is pooled - this means you won't grab a new connection from the DB on each request, but instead reuse the same one.

The risk of using a transaction, though, is that if one error occurs in your dataset (preventing an INSERT), the entire dataset will be rolled back.

Something like this could work:

<?php
  try {
    $dbh = new PDO('odbc:SAMPLE', 'db2inst1', 'ibmdb2',
      array(PDO::ATTR_PERSISTENT => true)); // POOLED
    echo "Connected\n";
  } catch (Exception $e) {
    die("Unable to connect: " . $e->getMessage());
  }

  try {
    $dbh->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $dbh->beginTransaction();
    // prepare() returns the statement object that is executed in the loop
    $result = $dbh->prepare("INSERT INTO table_name (col1, col2, ...) VALUES (:col1, :col2, ...)");
    foreach ($rows as $row) { // Loop through all rows and insert them
      $result->bindParam(':col1', $val1);
      $result->bindParam(':col2', $val2);
      $result->bindParam(':col3', $val3);
      // ... bind the remaining columns
      $result->execute();
    }

    // Commit inside the try block, so a failed batch is never committed
    $dbh->commit();
  } catch (Exception $e) {
    $dbh->rollBack();
    echo "Failed: " . $e->getMessage();
  }
?>

The fastest way is to wrap several hundred inserts in a transaction and then commit. MySQL will then write many records with a single input/output operation on the hard drive, which is quick.

$result = $con->prepare("INSERT INTO table_name (col1, col2, ...) VALUES (:col1, :col2, ...)");

$inserts = 1000; // rows per transaction
$counter = 0;

foreach ($rows as $row) { // Loop through all rows and insert them

    if ($counter === 0 && !$con->inTransaction()) {
        $con->beginTransaction();
    }

    $result->bindParam(':col1', $val1);
    $result->bindParam(':col2', $val2);
    $result->bindParam(':col3', $val3);
    // ... bind the remaining columns
    $result->execute();

    $counter++;

    if ($counter === $inserts) {
        $con->commit();
        $counter = 0; // reset so the next iteration starts a new transaction
    }
}

// Commit whatever is left over in the final, partially filled batch
if ($con->inTransaction()) {
    $con->commit();
}

The code above commits after every 1000 inserts performed. I haven't tested it, so it may still contain mistakes, but its purpose is to illustrate how to wrap 1000 inserts in a transaction and commit them.