Doctrine2 association persist is very slow

I have a while loop in which I create the main entity, Order. This entity has a lot of fields and associations, but it persists fast, very fast.

But it has a unidirectional one-to-many association (mapped as manyToMany), which takes forever... This association is small, only two fields, so I don't know why it takes so long.

I have an array $items with exactly 17 entries. Inside the while loop, for each Order I add these 17 items to the association.

while($i < 53000)
{
    // ... 
    foreach($items as $k => $v)
    {
        $item = new Item();
        $item->setName($v['name']);
        $item->setValue($order[$v['column']]);

        $order->addItem($item);
    }

    // ...

    $em->persist($order);

    $i++;
    if($i % 200 == 0) // I've tried multiple values here, but everything is slow.
    {
        $em->flush();
        $em->clear(); // Commenting this out doesn't speed things up either.
    }
}

Of course I know that for every Order there are 17 Items created, but as the Item object is so small, I don't think it should take that long...

With the Item foreach, the script takes about 35 minutes to finish. If I remove it, the script only takes 4 minutes to run, which is ridiculous...

The relation mapping is as follows:

  manyToMany:
    items:
      targetEntity: AppBundle\Entity\Item
      cascade: ["all"]
      joinTable:
        name: order_has_item
        joinColumns:
          product_id:
            referencedColumnName: id
        inverseJoinColumns:
          item_id:
            referencedColumnName: id

It is unidirectional, so no mapping is defined on the Item side.

You're generating 53000 Orders, each of which has (as you say) around 17 Items. That's 954000 database entries being generated.

You're flushing every 200 Orders, so each call to $em->flush() writes 3600 entity objects and DB entries (200 Orders plus 200 * 17 Items).

There are at least two performance hits:

  1. Doctrine will generate 3600 entities with all bells and whistles.
  2. The database indices of the affected tables are updated on each insert.

Also, don't forget that if you're doing this in the dev environment, Symfony will log an ass-load of stuff (see app/logs/dev.log) and has several debugging features on.
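On top of that, Doctrine's SQL logger keeps every executed query in memory for the whole run. A minimal sketch to switch it off for a long-running import (setSQLLogger() is part of the DBAL Configuration API; whether it helps noticeably in your setup is something to measure):

// Disable Doctrine's SQL logger so queries aren't accumulated in memory
$em->getConnection()->getConfiguration()->setSQLLogger(null);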

So, how can you shove nearly a million entries into your DB? Doctrine is unfortunately very bad at real bulk operations. It is usually better to use a native SQL query (sacrificing portability, though … but who cares, anyway):

$em->getConnection()->executeUpdate(/*INSERT etc. …*/);
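As a concrete sketch (the item table and its name/value columns are assumptions based on your entity's setters, not your actual schema):

$conn = $em->getConnection();

// Column names are hypothetical, derived from Item::setName()/setValue()
$conn->executeUpdate(
    'INSERT INTO item (name, value) VALUES (?, ?)',
    [$name, $value]
);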

When inserting objects that relate to each other, it's a bit more difficult, because we must ensure consistency:

try
{
    $conn = $em->getConnection();
    $conn->beginTransaction();

    foreach ($orders as $order)
    {
        $conn->executeUpdate(/*INSERT the order …*/);
        $orderId = $conn->lastInsertId();

        foreach ($order->items as $item)
        {
            $item->orderId = $orderId;
            $conn->executeUpdate(/*INSERT the item …*/);
        }
    }

    $conn->commit();
}
catch(\Exception $e)
{
    $conn->rollback();
    throw $e;
}

// NOTE: the try/catch wraps *all* orders, so if one order fails, *no*
// orders will be saved. Alternatively, move the try/catch inside the loop.

(I kept the code short for readability; you'll get the idea. Not tested, just off the top of my head.)
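If you want to squeeze out even more, you can batch several rows into a single INSERT statement. A sketch, assuming MySQL-style multi-row VALUES and the same hypothetical item columns as above, with the order FK stored on the item row as in the snippet before:

// Collect placeholders and parameters for all 17 items of one order
$placeholders = [];
$params = [];

foreach ($order->items as $item)
{
    $placeholders[] = '(?, ?, ?)';
    $params[] = $orderId;    // FK column name is an assumption
    $params[] = $item->name;
    $params[] = $item->value;
}

// One round trip per order instead of 17
$conn->executeUpdate(
    'INSERT INTO item (order_id, name, value) VALUES ' . implode(', ', $placeholders),
    $params
);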

Have a look at the Doctrine documentation to learn more about native queries.


By the way, in your example, if you only write to the DB when ($i % 200 == 0), you're going to lose the last ($i % 200) orders whenever the total isn't a multiple of 200. You'd need to write something like:

$amount = 53000;

while($i < $amount)
{
    // do your thing

    $i++;

    if($i % 200 === 0 || $i >= $amount)
    {
        // flush/clear
    }
}
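Or, slightly simpler (just a sketch): flush unconditionally once more after the loop, so any remainder is always written:

$amount = 53000;
$i = 0;

while ($i < $amount)
{
    // do your thing

    $i++;

    if ($i % 200 === 0)
    {
        $em->flush();
        $em->clear();
    }
}

// pick up whatever the last partial batch left behind
$em->flush();
$em->clear();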