
I have an FTP server with a very large number of nested directories, and I have to download the whole directory tree recursively. Because of the very large number of directories (most of them contain only one file), the download speed seems to be limited.

My two approaches

  • Using FileZilla (with multiple parallel connections)
  • Download the zipped directory directly from the host

both lead to a very slow download rate of a few kB/s. At this speed I would need 5 days to download all the files ... this is not an option.

What possibilities do I have to download the whole FTP directory as fast as possible?

  • I use FileZilla's parallel connections every day to bring down multiple files at once. It works very well, but it doesn't work recursively (i.e. if the directories are deeply nested it will only download the top-level directories in parallel threads). I'm not aware of anything that is multi-threaded and also recursively loops through the directories. If you find something, let me know.
    – Dave Lucre
    Commented Feb 22, 2017 at 21:02
  • Maybe convert the folder with all subdirectories and files to a .zip or .rar archive and download that file instead.
    – rrobben
    Commented Feb 22, 2017 at 21:07
  • Thank you for all your replies - I found a simple solution and posted it as an answer. Commented Feb 24, 2017 at 8:52

2 Answers


FTP is a horrible protocol - your approach of zipping the files and downloading the archive is appropriate, as it is likely to be a lot faster than navigating the whole directory tree. Indeed, if the transfer is only going at a few kB per second, there is probably some kind of rate limiting or bandwidth bottleneck between you and the server (or possibly an MTU issue).

You did not say whether you have to use FTP. If you can bypass it, try using SCP, or even open up a web server and download the zip file over that. Both of these run over a single TCP connection and are much simpler than FTP.
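
As a rough sketch of those alternatives, assuming a zip has already been created on the server (user, host and paths here are placeholders, not from the question):

    # fetch the archive over SSH with scp
    scp user@ftp.example.com:/path/to/backup.zip .

    # or, if the archive is reachable from the web root, download it over HTTP
    wget https://www.example.com/backup.zip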

If you have to use FTP, try both passive and active modes, and ensure the host has the appropriate connection tracking enabled, or use a VPN directly to the server you are trying to FTP from in order to avoid NAT / firewall issues.
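
A minimal sketch of trying both modes from the command line, using lftp as one example of a client with a passive-mode switch (host, credentials and file name are placeholder assumptions):

    # passive mode (the usual default - the client opens the data connection)
    lftp -u USER,PASS -e "set ftp:passive-mode on;  get backup.zip; bye" ftp.example.com

    # active mode, in case passive data connections are blocked by a firewall
    lftp -u USER,PASS -e "set ftp:passive-mode off; get backup.zip; bye" ftp.example.com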

(Try setting the MTU for the Ethernet connection slightly lower on the host, or set up MTU clamping, and see if that makes a difference.)
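
One hedged way to check for an MTU problem on Linux, with host and interface name as placeholders:

    # probe the path MTU: -M do sets "don't fragment", -s is the payload size
    # (1472 bytes of payload + 28 bytes of headers = 1500)
    ping -M do -s 1472 ftp.example.com

    # if that fails, try smaller sizes until it succeeds, then lower the
    # interface MTU accordingly (eth0 is an assumption)
    ip link set dev eth0 mtu 1400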


Since FTP is not the favored protocol for copying deeply nested directories, I tried to find a way to pack all files into a single file on the server side.

One possibility to pack the files is to use an SSH console with a command like

zip -r <username>/backup_29_08_12.zip <username>
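
If shell access were available, the packing could even be streamed straight to the local machine without first creating the archive on the server; a hedged sketch, with user, host and path as placeholders:

    # create the archive on the fly on the server and write it to a local file
    ssh user@example.com 'tar czf - /www/htdocs/username' > backup.tar.gz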

Since my provider doesn't allow such a connection, this was not possible in my case.

I found out that my provider allows the use of the PEAR package Archive_Tar. The result is a PHP script that packs the files:

<?php
    // Run without time or memory limits - packing a large tree can take a while
    @error_reporting(E_ALL ^ E_WARNING);
    @ini_set("max_execution_time", 0);
    @ini_set("memory_limit", "-1");

    // Subdirectory to pack ("" = everything below $path)
    $directory = "";

    // Patterns to exclude from the archive
    $ignore = array("*.sql.gz", "*.tar.gz");

    // Web space root, e.g. /www/htdocs/<username>/
    $path = preg_replace('/(\/www\/htdocs\/\w+\/).*/', '$1', realpath(__FILE__));
    include "Archive/Tar.php";

    // Archive name: <folder>_<date>_<time>.tar.gz
    $archivename = preg_replace('/.+\/(.+)\/$/', '$1', $path).date('_Y-m-d_His').".tar.gz";
    $archive = new Archive_Tar($archivename, true);   // true = gzip compression
    $archive->setIgnoreList($ignore);
    $archive->createModify($path.$directory, "", $path);
    echo "Backup done";
?>

Calling this script in a folder with CHMOD 0777 took about half an hour. After that I could download the packed archive without any speed limit.
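
For reference, the download step might look like the following sketch, assuming the script sits in the web root and you know the generated archive name (the URLs and file names here are hypothetical placeholders):

    # trigger the packing script over HTTP (placeholder name), then fetch the archive
    curl https://www.example.com/pack.php
    wget https://www.example.com/username_2017-02-24_085200.tar.gz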
