Large reports with PHP and MySQL

Good afternoon.

I know this question is old, but it still has no definitive answer. In software that holds millions (or even billions) of records you need to produce a report, yet it seems impossible to do so without fetching all of that data and processing it in PHP, for example.

The problems:

1 - These records will naturally take a long time to reach PHP, since the data size is measured in GB.
2 - Putting all of that data into a PHP array in order to process it will exhaust the available memory, as the sketch below illustrates.
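Just to make the second problem concrete, here is a minimal sketch of the pattern that causes it. The `sales` table, its columns, and the connection credentials are hypothetical placeholders:

```php
<?php
// Naive approach: pull every row into PHP at once and only then build the report.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// fetchAll() materializes the entire result set in PHP memory, so with
// millions of rows this blows past memory_limit before any work is done.
$rows = $pdo->query('SELECT id, amount, created_at FROM sales')
            ->fetchAll(PDO::FETCH_ASSOC);

$total = 0.0;
foreach ($rows as $row) {
    $total += $row['amount'];
}
```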

Has any of you gone through this dilemma? Even with tables of only thousands of rows I am already experiencing slowness, and the problem is not the query, since it was optimized both by me (and I admit I am not a great connoisseur of databases) and by people more expert than I am; the problem really is in PHP. I have heard of iterating over the result set, but perhaps you know a solution with a fuller example.

I believe this question will serve many users who have gone through this or are still going through it.

Thank you.

Usually when you have this much data you should load only a chunk of it into memory at a time, perform your operation on each chunk, and accumulate the results into the final report. A sketch of this approach follows.
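Here is a minimal sketch of chunked processing with PDO. It assumes a hypothetical `sales` table with an auto-increment `id` primary key and an `amount` column; only the running totals stay in memory, never the whole table:

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// Use real prepared statements so the integer LIMIT parameter binds cleanly.
$pdo->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);

$chunkSize = 10000;
$lastId    = 0;
$total     = 0.0;
$count     = 0;

// Keyset pagination: each query resumes after the last id already processed,
// which stays fast even deep into the table (unlike a growing OFFSET).
$stmt = $pdo->prepare(
    'SELECT id, amount FROM sales WHERE id > :lastId ORDER BY id LIMIT :limit'
);

while (true) {
    $stmt->bindValue(':lastId', $lastId, PDO::PARAM_INT);
    $stmt->bindValue(':limit', $chunkSize, PDO::PARAM_INT);
    $stmt->execute();

    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break; // no more data
    }

    foreach ($rows as $row) {
        $total += $row['amount'];
        $count++;
        $lastId = $row['id'];
    }

    unset($rows); // free the current chunk before fetching the next one
}

printf("Rows: %d, total: %.2f\n", $count, $total);
```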

If your operations are simple enough you can let MySQL do the aggregation for you, provided you have a good amount of RAM (which is cheap to buy).
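For example, a minimal sketch of pushing the aggregation into MySQL itself (again assuming the hypothetical `sales` table): the database scans the rows and PHP only receives the already-summarized report, one small row per group.

```php
<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$sql = 'SELECT DATE(created_at) AS day,
               COUNT(*)         AS orders,
               SUM(amount)      AS revenue
        FROM sales
        GROUP BY DATE(created_at)
        ORDER BY day';

foreach ($pdo->query($sql) as $row) {
    // Each iteration sees one aggregated row, so PHP memory use stays tiny
    // even if the underlying table has millions of records.
    printf("%s: %d orders, %.2f total\n", $row['day'], $row['orders'], $row['revenue']);
}
```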

Also note that if you have more data than memory, it does not matter whether you use PHP or any other language: the algorithm you use is the main concern.