Importing a CSV into MySQL with PHP

I'm trying to import huge CSV files (> 1 GB) into a MySQL database:

/**
 * @param string $file_path Path to the CSV file to import
 */
private function importFileContents($file_path)
{
    $query = sprintf("LOAD DATA LOCAL INFILE '%s' 
                        INTO TABLE file_import_contents 
                        FIELDS TERMINATED BY ';'
                        LINES TERMINATED BY '\n'
                        IGNORE 1 LINES
                        (@col1, @col2, @col3, @col4, @col5, @col6, @col7, @col8, @col9, @col10, @col11, @col12, @col13, @col14, @col15, @col16, @col17, @col18, @col19, @col20, @col21) 
                        set add_params=@col1, available=@col2, category_id=@col3, currency_id=@col4, delivery=@col5, description=@col6, model=@col8, modified_time=@col9, name=@col10, oldprice=@col11, param=@col12, pickup=@col13, picture=@col14, price=@col15, sales_notes=@col16, type=@col17, type_prefix=@col18, url=@col19, vendor=@col20, vendor_code=@col21  
                    ", addslashes($file_path));

    $em = $this->getContainer()->get('doctrine.orm.default_entity_manager');
    $connection = $em->getConnection();
    $statement = $connection->prepare($query);
    $statement->execute();
}

The problem is that when I run an SQL count in the database after the import:

SELECT COUNT(*) FROM file_import_contents;

it returns 1 648 990 rows, but when I run a command in the terminal to count the rows in all the CSV files:

find ./ -type f -name "*csv*" -exec wc -w {} +

it reports 76 319 426 total rows. My PHP settings are:

upload_max_filesize = 32000M
max_execution_time = 300000
max_input_time = 600000
memory_limit = 1024M

...and in the script:

ini_set('memory_limit', '-1');

Why is this happening, and how can I import those files in their entirety? Thanks for any advice.

OK, it's working now. The error was in how I counted the lines in the terminal:

find ./ -type f -name "*csv*" -exec wc -w {} +

The -w flag counts words, not lines; it should be -l:

find ./ -type f -name "*csv*" -exec wc -l {} +

Result: 2 million lines imported in 102 seconds on localhost.

It may be that the server you are running the query on simply cannot provide enough memory to hold your CSV file while it is parsed and inserted into the database, so no matter how you adjust the settings it will always fail.

I'd consider dividing the CSV into smaller chunks, and I'd also check that the data being imported is not malformed, in case the script is not reading line endings correctly. A sketch of the chunked approach is shown below.
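
For example, here is a minimal sketch of that chunked approach, assuming the same file_import_contents table and field separators as in the question; the PDO DSN, credentials, file path and chunk size are placeholders, and local_infile has to be allowed on both the client and the server:

function importCsvInChunks(PDO $pdo, $filePath, $linesPerChunk = 500000)
{
    $source = new SplFileObject($filePath, 'r');
    $source->fgets(); // consume the header line once, instead of IGNORE 1 LINES

    while (!$source->eof()) {
        // Write the next block of lines to a temporary chunk file
        $chunkPath = tempnam(sys_get_temp_dir(), 'csv_chunk_');
        $chunk = new SplFileObject($chunkPath, 'w');

        $written = 0;
        while ($written < $linesPerChunk && !$source->eof()) {
            $line = $source->fgets();
            if (trim($line) === '') {
                continue;
            }
            $chunk->fwrite($line);
            $written++;
        }

        if ($written > 0) {
            // Same FIELDS/LINES options as in the question; the (@col1, ...) SET
            // mapping from the original query would go here as well
            $query = sprintf("LOAD DATA LOCAL INFILE '%s'
                                INTO TABLE file_import_contents
                                FIELDS TERMINATED BY ';'
                                LINES TERMINATED BY '\\n'",
                             addslashes($chunkPath));
            $pdo->exec($query);
        }

        unset($chunk);       // release the handle before removing the temp file
        unlink($chunkPath);
    }
}

// Hypothetical connection details
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'password',
               array(PDO::MYSQL_ATTR_LOCAL_INFILE => true));
importCsvInChunks($pdo, '/path/to/huge.csv');

This keeps each LOAD DATA statement small, and if one chunk fails you know roughly where in the file the problem is instead of losing the whole import.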