Converting an array with 50,000+ entries into a MySQL database

Ok, so I am helping someone with their site. Their original coder thought it would be a great idea to put their entire database in an array.

I am looping through each piece of data in the array and inserting it into a new MySQL table I created. The rows are getting inserted just fine, but the problem is that PHP won't let my script execute for more than 60 seconds.

How can I override this setting?

There's the classic solution, using set_time_limit(), which allows you to set the maximum execution time (in seconds) for a script.

Still, you might want to either insert multiple rows with one query or convert the array into a CSV file and use LOAD DATA INFILE, which is much faster.

Edit: examples

Inserting multiple rows:

INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

Ref: http://dev.mysql.com/doc/refman/5.1/en/insert.html
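A minimal sketch of building such a multi-row INSERT from the array in PHP. The `$rows` variable and its integer values here are placeholders for the real data; with string data you would escape each value through your connection's `real_escape_string()` instead of `intval()`. For 50,000 entries, split the array with `array_chunk()` (e.g. 500–1,000 rows per statement) so a single query does not exceed `max_allowed_packet`:

```php
<?php
// Placeholder data; the real array presumably has 50,000+ entries.
$rows = [
    ['a' => 1, 'b' => 2, 'c' => 3],
    ['a' => 4, 'b' => 5, 'c' => 6],
];

// Build one VALUES tuple per row. intval() is safe only because this
// sketch assumes integer columns; escape strings properly in production.
$values = [];
foreach ($rows as $row) {
    $values[] = '(' . intval($row['a']) . ',' . intval($row['b']) . ',' . intval($row['c']) . ')';
}
$sql = 'INSERT INTO tbl_name (a,b,c) VALUES ' . implode(',', $values) . ';';

// $mysqli->query($sql);  // run against the real connection
```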

Using LOAD DATA:

LOAD DATA INFILE 'data.txt' INTO TABLE db2.my_table;

Ref: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
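To feed LOAD DATA, the array first has to become a CSV file. A sketch using `fputcsv()`, assuming the same placeholder `$rows` array; note that plain LOAD DATA INFILE reads the file on the *server*, so if PHP and MySQL run on different machines you would use LOAD DATA LOCAL INFILE instead:

```php
<?php
// Placeholder data standing in for the real 50,000-entry array.
$rows = [
    ['a' => 1, 'b' => 2, 'c' => 3],
    ['a' => 4, 'b' => 5, 'c' => 6],
];

// Write each row as one CSV line; fputcsv handles quoting and escaping.
$path = sys_get_temp_dir() . '/data.csv';
$fh = fopen($path, 'w');
foreach ($rows as $row) {
    fputcsv($fh, $row);
}
fclose($fh);

// Then, in MySQL:
// LOAD DATA INFILE '/tmp/data.csv' INTO TABLE db2.my_table
//   FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"';
```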

You can call the set_time_limit() function at the top of your script; passing 0 removes the time limit entirely:

set_time_limit(0);

Try ini_set('max_execution_time', $time_in_seconds);


Solutions:

  • Set max_execution_time to 0 in php.ini, or use set_time_limit()

  • Run the PHP script at the command-line. The time limit for CLI scripts is hardcoded to 0 already.

  • Dump the array out to a CSV file and then use LOAD DATA INFILE — typically an order of magnitude faster than inserting one row at a time in a loop.

If you are using mysqli, I would use multi_query() to batch the operations. The whole thing will run a lot faster.
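A sketch of the batching idea: concatenate many INSERT statements into one string and send them in a single round trip. The `$rows` data is a placeholder, and the `$mysqli` calls are commented out since they need a live connection; after multi_query() you must drain the result queue with next_result() before issuing further queries:

```php
<?php
// Placeholder data standing in for the real array.
$rows = [
    ['a' => 1, 'b' => 2, 'c' => 3],
    ['a' => 4, 'b' => 5, 'c' => 6],
];

// Concatenate one INSERT per row into a single multi-statement string.
$batch = '';
foreach ($rows as $row) {
    $batch .= sprintf('INSERT INTO tbl_name (a,b,c) VALUES (%d,%d,%d);',
                      $row['a'], $row['b'], $row['c']);
}

// $mysqli->multi_query($batch);
// while ($mysqli->next_result()) { ; }  // flush remaining result sets
```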