Suppose I have many (thousands of) rows, and I want to divide the data into 3 columns. Which is the better option: 1) using a single for loop, or 2) dividing the array into 3 chunks and then using a for loop on each chunk? Which will be faster?
I would use a single for loop.
But the best approach depends on a few factors. Do we know how many rows you have? If so, you can just divide the count by 3 and start a new column once the loop has processed a third of the rows.
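A minimal sketch of that single-pass approach, assuming `$rows` is a hypothetical numerically indexed array standing in for your data:

```php
<?php
$rows = range(1, 12345);                   // stand-in for your data
$perColumn = (int) ceil(count($rows) / 3); // rows per column

// One pass: the integer division decides which column a row lands in.
$columns = array(array(), array(), array());
foreach ($rows as $i => $row) {
    $columns[(int) ($i / $perColumn)][] = $row;
}
// $columns[0], $columns[1], $columns[2] now hold roughly equal thirds.
```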
Use a single array and **foreach** to traverse it; foreach is the better option. foreach advances through the array one element per step automatically.
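For example, a minimal foreach traversal (the `$para` contents here are just illustrative):

```php
<?php
$para = array('a' => 1, 'b' => 2, 'c' => 3);

// foreach visits each key/value pair in turn.
foreach ($para as $key => $val) {
    echo $key . "=" . $val . "\n";
}
```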
Alternatively, you can use each() to traverse an array (note that each() was deprecated in PHP 7.2 and removed in PHP 8.0):

each — Return the current key and value pair from an array and advance the array cursor
```php
while (list($key, $val) = each($para)) {
    echo $key . "=" . $val;
}
```
Here is a performance test from the manual's comments, with the results:

> Regarding the speed of foreach vs. while(list) = each, I wrote a benchmark script, and the results are that foreach is clearly faster. MUCH faster. Even with huge arrays (especially with huge arrays). I tested with sizes of 100,000, 1,000,000, and 10,000,000. To do the test with 10 million I had to set my memory limit really high; it was close to 1 GB by the time it actually worked. Anyway:
> ```php
> <?php
> function getDiff($start, $end) {
>     $s = explode(' ', $start);
>     $stot = $s[1] + $s[0];
>     $e = explode(' ', $end);
>     $etot = $e[1] + $e[0];
>     return $etot - $stot;
> }
>
> $lim = 10000000;
> $arr = array();
> for ($i = 0; $i < $lim; $i++) {
>     $arr[$i] = $i / 2;
> }
>
> $start = microtime();
> foreach ($arr as $key => $val);
> $end = microtime();
> echo "time for foreach = " . getDiff($start, $end) . ".\n";
>
> reset($arr);
> $start = microtime();
> while (list($key, $val) = each($arr));
> $end = microtime();
> echo "time list each = " . getDiff($start, $end) . ".\n";
> ?>
> ```
>
> here are some of my results:
>
> with 1,000,000:
>
> ```
> time for foreach = 0.0244591236115. time list each = 0.158002853394.
> time for foreach = 0.0245339870453. time list each = 0.154260158539.
> time for foreach = 0.0269000530243. time list each = 0.157305955887.
> ```
>
> then with 10,000,000:
>
> ```
> time for foreach = 1.96586894989. time list each = 14.1371650696.
> time for foreach = 2.02504014969. time list each = 13.7696218491.
> time for foreach = 2.0246758461. time list each = 13.8425710201.
> ```
>
> by the way, these results are with PHP 5.2, I believe, and a Linux machine with 3 GB of RAM and a 2.8 GHz dual-core Pentium.
As Franz Gleichmann commented, in my case this was the right answer: "unless you use multiple threads, it does not matter greatly. because if you have 12345 rows, you will have to process 12345 rows. chunking it up will only help if you parallelize the workload."
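If you do decide to parallelize, one way to split the work is `array_chunk()`; a sketch, with the actual worker dispatch left out (only hinted at in the comment):

```php
<?php
$rows = range(1, 12345); // stand-in for your data

// Split into 3 roughly equal chunks; each could go to its own worker/process.
$chunks = array_chunk($rows, (int) ceil(count($rows) / 3));

foreach ($chunks as $n => $chunk) {
    echo "chunk $n has " . count($chunk) . " rows\n";
}
```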