This is not cURL, but it's also a command-line tool and it works well:
If you're not bound to curl, you might want to use wget in recursive mode, restricted to one level of recursion. Try the following:
wget --no-verbose --no-parent --recursive --level=1 \
--no-directories --user=login --password=pass ftp://ftp.myftpsite.com/
--no-parent
: Do not ever ascend to the parent directory when retrieving recursively.
--level=depth
: Specify the maximum recursion depth. The default maximum depth is five levels.
--no-directories
: Do not create a hierarchy of directories when retrieving recursively.
--delete-after
: deletes each local copy right after it has been downloaded (per the wget manual, this is meant for pre-fetching through a proxy), so use it only when you do not need to keep the files.
--no-host-directories
: download straight into the current directory ('.') instead of creating a directory named after the host.
--no-clobber
: skip downloads that would overwrite existing files.
--continue
: resume a partially-downloaded file; helps on unstable connections.
cd
: combine the command with cd to set the destination directory.
So the full command can look a bit more involved:
cd /home/destination/folder \
&& wget --no-verbose --no-parent --recursive --level=1 \
--no-directories --no-host-directories \
--no-clobber --continue \
--user="login" --password="pass" ftp://ftp.myftpsite.com/
Feel free to run this command from crontab to automate delivery and keep the local folder in sync:
crontab -e
*/5 * * * * cd /home/destination/folder && wget --no-verbose --no-parent --recursive --level=1 --no-directories --no-host-directories --no-clobber --continue --user=login --password=pass ftp://ftp.myftpsite.com/
Note that the --delete-after flag removes each local copy right after download; it never deletes anything on the server, so add it only when you process files immediately and do not need to keep them locally.
If the command contains special characters (e.g. in the password), put it in a separate .sh file and have crontab execute only that script. Crontab is really tricky when it comes to escaping characters.
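A minimal sketch of that setup (the script path /tmp/ftp-sync.sh, the credentials, and the FTP host are placeholders, not real values):

```shell
# Write the wget command into a standalone script so cron never has to
# parse special characters such as % in the password (placeholders only).
cat > /tmp/ftp-sync.sh <<'EOF'
#!/bin/sh
cd /home/destination/folder || exit 1
exec wget --no-verbose --no-parent --recursive --level=1 \
     --no-directories --no-host-directories \
     --no-clobber --continue \
     --user="login" --password='p@ss%word' ftp://ftp.myftpsite.com/
EOF
chmod +x /tmp/ftp-sync.sh
sh -n /tmp/ftp-sync.sh   # syntax check only; does not contact any server
```

The crontab entry then shrinks to `*/5 * * * * /tmp/ftp-sync.sh`, with no quoting or escaping needed in the crontab line itself.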