
I am connecting to another server via PHP's ftp_connect().

However, I need to be able to extract all HTML files from its web root, which is causing me a bit of a headache...

I found this post Recursive File Search (PHP) which talks about using the RecursiveDirectoryIterator class; however, that only works for a directory on the same server as the PHP script itself.

I've had a go at writing my own function but I'm not sure I've got it right... assuming that the original path sent to the method is the doc root of the server:

public function ftp_dir_loop($path){

    $ftpContents = ftp_nlist($this->ftp_connection, $path);

    //loop through the FTP listing
    for($i = 0; $i < count($ftpContents); ++$i){

        $path_parts = pathinfo($ftpContents[$i]);

        if( in_array($path_parts['extension'], $this->accepted_file_types) ){

            //remember this file for the cms finder
            $this->html_file_paths[] = $path.'/'.$ftpContents[$i];

        } elseif( empty($path_parts['extension']) ) {

            //no extension, so assume it is a directory and recurse into it
            $this->ftp_dir_loop( $path.'/'.$ftpContents[$i] );
        }
    }
}

Has anyone seen a premade class to do something like this?

You can try

public function ftp_dir_loop($path) {
    $ftpContents = ftp_nlist($this->ftp_connection, $path);
    if ($ftpContents === false) {
        return; // listing failed, skip this path
    }
    foreach ( $ftpContents as $file ) {
        // no dot in the name: assume it is a directory and recurse
        if (strpos($file, '.') === false) {
            $this->ftp_dir_loop($file);
        }
        if (in_array(pathinfo($file, PATHINFO_EXTENSION), $this->accepted_file_types)) {
            $this->html_file_paths[$path][] = substr($file, strlen($path) + 1);
        }
    }
}
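Note that the "no dot in the name" test is only a heuristic: it will recurse into files that happen to have no extension and will skip directories named like `v1.0`. A more reliable approach is `ftp_rawlist()`, which returns the server's long listing so directories can be identified from the line itself. Below is a rough sketch under that assumption (the helper names `is_rawlist_dir`, `rawlist_name`, and `ftp_find_files` are mine, not a standard API, and it assumes a Unix-style LIST format where directory lines start with `d`):

```php
<?php
// Directory lines in a Unix-style LIST output start with "d",
// e.g. "drwxr-xr-x  2 user group 4096 Jan  1 12:00 subdir".
function is_rawlist_dir($line) {
    return $line !== '' && $line[0] === 'd';
}

// The file name is the last whitespace-separated field.
// (This simple split breaks on names containing spaces.)
function rawlist_name($line) {
    $parts = preg_split('/\s+/', trim($line));
    return end($parts);
}

// Recursively collect files under $path whose extension is in $extensions
// (lower-case, e.g. array('html', 'htm')), appending full paths to $found.
function ftp_find_files($conn, $path, array $extensions, array &$found) {
    $lines = ftp_rawlist($conn, $path);
    if ($lines === false) {
        return; // unreadable directory: skip it
    }
    foreach ($lines as $line) {
        $name = rawlist_name($line);
        if ($name === '.' || $name === '..') {
            continue;
        }
        $full = rtrim($path, '/') . '/' . $name;
        if (is_rawlist_dir($line)) {
            ftp_find_files($conn, $full, $extensions, $found);
        } elseif (in_array(strtolower(pathinfo($name, PATHINFO_EXTENSION)), $extensions)) {
            $found[] = $full;
        }
    }
}
```

Some servers return a different LIST format (notably Windows/IIS), so check a sample of `ftp_rawlist()` output from your server before relying on the `d` prefix.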