Creating multiple arrays from a CSV file using PHP

I have a CSV file with 1500+ entries in one column. I can read all of the column's values with this:

        $rowcount = 1;
        $srcFileName = "input/test.csv";
        $file = fopen($srcFileName, "r");
        $inputfielscount = count(file($srcFileName, FILE_SKIP_EMPTY_LINES));
        while ($rowcount < $inputfielscount)
        {
            $row = fgetcsv($file);
            // "salery=" was a parse error; associative keys need =>
            $result = array("id" => $row[0], "des" => "I am jhon", "salary" => "10000");
            $Final = array("listingsEmp" => $result);
            $rowcount++; // without this the loop never terminates
        }
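
Side note: the usual pattern is to let fgetcsv() itself signal end-of-file, which avoids counting lines up front. A minimal sketch:

        $file = fopen($srcFileName, "r");
        while (($row = fgetcsv($file)) !== false) {
            // $row is an array of the current line's columns; $row[0] is the id
        }
        fclose($file);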

After reading the first ten values (rows 1-10) I want to create an array (like array[0] => $result), then repeat the same task for rows 11-20 and create another array (like array[1] => $Final, where this time $Final contains the information for the next batch of ids read from the CSV file), and so on.

For the above requirement I changed the code to this:

        $rowcount = 1;
        $srcFileName = "input/test.csv";
        $file = fopen($srcFileName, "r");
        $result = array();
        while ($rowcount < 20)
        {
            if (($rowcount % 10 == 0) && ($rowcount != 0)) {
                $rowcount++;
                break;
            } else {
                $row = fgetcsv($file);
                // some curl code for fetching data according to csv file field (Id)
                // append with [] so $result accumulates 10 rows instead of being
                // overwritten on every iteration
                $result[] = array("id" => $row[0], "des" => "I am jhon", "salary" => "10000");
                $rowcount++; // advance the counter, otherwise the loop never reaches 10
            }
        }

    $Final=array("listingsEmp"=>$result);

Now I POST this $Final array (indexes 0-9, each entry holding a unique id and its corresponding values) using cURL, get the response, and save it in a CSV file.
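
For reference, the cURL part (elided above as "some curl code") might look roughly like this; the endpoint URL and the JSON encoding are my assumptions, since the question does not show them:

        // Hypothetical endpoint; the real URL is not shown in the question.
        $ch = curl_init("https://api.example.com/bulk");
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($Final));
        curl_setopt($ch, CURLOPT_HTTPHEADER, array("Content-Type: application/json"));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $raw = curl_exec($ch);
        curl_close($ch);
        $Final = json_decode($raw); // decoded object, so $Final->response works below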

        $currenttime = date("Y-m-d-H_i_s");
        $opfile = 'output'.$currenttime.'.csv'; // path where the output csv file is written
        if (!@copy($srcFileName, 'output/'.$opfile)) // 'output/' to match the fopen() below
        {
            $errors = error_get_last();
            echo "COPY ERROR: ".$errors['type'];
            echo "<br />".$errors['message'];
        } else { // echo "File copied from remote!";
            $fp = fopen('output/'.$opfile, "a");
            $fr = fopen($srcFileName, "r");
            $rowcounts = 0;
            // $Final here is assumed to be the decoded cURL response object
            $FinalRES = $Final->response;

            while ($rowcounts < $inputfielscount) {
                    $resultBulk = $FinalRES[$rowcounts];
                    // status and errors belong to the per-row result, not the list itself
                    $resultBulkStatus = $resultBulk->status;
                    $resultBulkErrors = $resultBulk->errors;
                    $errorMsgArray = $resultBulkErrors[0];
                    $BulkErrorsMessage = $errorMsgArray->message;
                    $rows = fgetcsv($fr);
                    if ($resultBulkStatus == 'failure') {
                        $list = array($rows[0], $rows[1], $resultBulkStatus, $BulkErrorsMessage);
                    } else {
                        $list = array($rows[0], $rows[1], $resultBulkStatus, "successfully");
                    }
                    fputcsv($fp, $list);
                    $rowcounts++;
            }
        }

This full code runs once and gives the response for 10 ids. I want to repeat it for the next 10 ids (11-20), then for 21-30, and so on, until the responses for all ids in the CSV file (1500+) have been written to the output CSV file. After that it should display a link to download the output file.
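
Something like this outer loop is what I am after (a sketch only; process_chunk() is a hypothetical placeholder for the cURL call and the CSV writing above):

        $allRows = file($srcFileName, FILE_SKIP_EMPTY_LINES);
        foreach (array_chunk($allRows, 10) as $chunk) {
            // $chunk holds up to 10 csv lines (rows 1-10, then 11-20, ...)
            process_chunk($chunk); // hypothetical: build $Final, POST it, append results
        }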

    <?php $dnldfilw = 'output'.$currenttime.'.csv'; ?>
    <a href='download.php?filename=<?php echo $dnldfilw; ?>'>Download Output file</a>
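
download.php itself is not shown here; a minimal sketch of what it could contain (the basename() guard and the 404 branch are my additions):

    <?php
    // download.php - hypothetical sketch, not from the question
    $filename = basename($_GET['filename']); // strip any path components
    $path = 'output/'.$filename;
    if (is_file($path)) {
        header('Content-Type: text/csv');
        header('Content-Disposition: attachment; filename="'.$filename.'"');
        readfile($path);
    } else {
        http_response_code(404);
    }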

The easiest method is to just use the file() function you are already using. So, shortening the code to some pseudocode:

<?php

    $indexedArray = array();
    $indexedSplit = 10;

    $lines = file($srcFileName, FILE_SKIP_EMPTY_LINES);
    $tempArray = array();
    foreach ($lines as $line) {
        $tempArray[] = $line;
        // flush a chunk once it holds $indexedSplit lines; appending before the
        // check avoids pushing an empty first chunk
        if (count($tempArray) === $indexedSplit) {
            $indexedArray[] = $tempArray;
            $tempArray = array();
        }
    }
    if (!empty($tempArray)) {
        $indexedArray[] = $tempArray; // keep the final, possibly short, chunk
    }

    foreach ($indexedArray as $index => $valueArray) {
        // do the curl magic
        // write results of curl into csv
    }
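
For what it's worth, PHP's built-in array_chunk() does the same splitting in one call:

    $indexedArray = array_chunk(file($srcFileName, FILE_SKIP_EMPTY_LINES), $indexedSplit);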

Your question is poorly phrased, but I think this would be your aim, right?