Download via FTP / PHP stops at the same place every time

I have a PHP script that connects via FTP to a remote server, downloads everything recursively, and adds each file to a MySQL database.

The problem I have is that it always stops after a different amount of time. The files are call recordings, so there are a lot of them. The folder structure looks a little like this:

|/
|¬merged
 |¬20150105
  |¬Time-20150105-091444-From-2244605-To-2244615.wav
 |¬20150106
  |¬Time-20150105-091444-From-1234567-To-1236547.wav

And so on... (Obviously multiple files in each...)

Here's the FTP script:

<?php
include('../includes/init.php');
$ftp_server = "ftp1.FTPSERVER.com"; 
$conn_id = ftp_connect ($ftp_server) 
    or die("Couldn't connect to $ftp_server"); 

$login_result = ftp_login($conn_id, "USER", "PASSWD"); 
if ((!$conn_id) || (!$login_result)) 
    die("FTP Connection Failed"); 

ftp_pasv($conn_id, false);

ftp_sync (".");    // Use "." if you are in the current directory 

ftp_close($conn_id);  

// ftp_sync - Copy directory and file structure 
function ftp_sync ($dir) { 

   global $conn_id, $core; 

    if ($dir != ".") { 
        if (ftp_chdir($conn_id, $dir) == false) { 
            echo ("Change Dir Failed: $dir<BR>
"); 
            return; 
        } 
        if (!(is_dir($dir))) 
            mkdir($dir);
        chdir ($dir); 
    } 

    $contents = ftp_nlist($conn_id, "."); 
    foreach ($contents as $file) { 

        if ($file == '.' || $file == '..') 
            continue; 

        if (@ftp_chdir($conn_id, $file)) { 
            ftp_chdir ($conn_id, ".."); 
            ftp_sync ($file); 
        } 
        else { 
            // Not a directory: download the file and record it in the DB 
            ftp_get($conn_id, $file, $file, FTP_BINARY); 
            $core->addFile('3', $file); 
        } 
    } 

    ftp_chdir ($conn_id, ".."); 
    chdir (".."); 

} 
?>

I run this script via shell to avoid PHP's max execution time.

The addFile function simply reads the filename and places it on a new row in the DB.
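
For context, a minimal sketch of what such a helper might look like is below; the real addFile() was never posted, so the class, table and column names here are assumptions:

<?php
// Hypothetical sketch of addFile(); the actual implementation, table
// and column names are assumptions, since the real code was not posted.
class Core {
    private $db;

    public function __construct(mysqli $db) {
        $this->db = $db;
    }

    public function addFile($typeId, $filename) {
        // One INSERT per downloaded recording
        $stmt = $this->db->prepare(
            "INSERT INTO recordings (type_id, filename) VALUES (?, ?)"
        );
        $stmt->bind_param('ss', $typeId, $filename);
        $stmt->execute();
        $stmt->close();
    }
}
?>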

If it helps, the CLI does output this...

Warning: ftp_get(): 1.763 seconds (measured here), 2.57 Mbytes per second in /home/user/web/domain.com/public_html/test/ftp.php on line 43

Warning: ftp_chdir(): 1.763 seconds (measured here), 2.57 Mbytes per second in /home/user/web/domain.com/public_html/test/ftp.php on line 47

Any help is greatly appreciated.

After a fair amount of time debugging, I managed to work out that the problem actually lay in the $core->addFile() function.

This was running a MySQL query for every file, so it was blocking, and the MySQL connection ended up timing out, which stopped the script.

This question was very hard to answer because I neglected to include the addFile function above. Sorry!

The fix was simple in the end: increase the timeout in /etc/my.cnf (on CentOS).
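
The answer doesn't name the directive, but for a connection that sits idle while large files download, the server-side settings usually involved are wait_timeout and interactive_timeout, so the change presumably looked something like this (values illustrative):

# /etc/my.cnf -- assumed directives; the answer only says the timeout was raised
[mysqld]
wait_timeout        = 28800
interactive_timeout = 28800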

Increase the script execution time in your php.ini file; this may solve your problem. Alternatively, there may be an error associated with the file name of the 220th file which stops your script, or you may need to increase the memory limit:

ini_set('memory_limit','1024M');
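
Along the same lines, a hedged example of lifting both limits in the script itself (note that max_execution_time is already 0 when running under the CLI, so this mainly matters behind a web server):

<?php
// Remove run-time limits for a long download job; under the CLI the
// execution-time limit is already 0, so this mostly matters via a web server.
set_time_limit(0);                 // no execution time limit
ini_set('memory_limit', '1024M');  // as suggested above
?>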