
I maintain a community website with a hosting company. Once every two weeks I need to download a backup to my home Linux host as an offsite copy. The backup is about 110 MB in total. My site is otherwise idle.

I use sftp to download. The download triggers an alert at the hosting company: I/O or network over limit. The limit is 5M bytes or bits per second. The hosting company then automatically shuts down my website.

My sftp command is very generic:

sftp myname@mysite <<EOF
   get -p file1_100M
   get -p file2_5M
EOF

I see a lot of posts about speeding up file transfers. My goal is actually to slow the transfer down, as much as possible.

My Linux is Ubuntu 18. The hosting company also runs Linux, a LAMP stack (Linux, Apache, MySQL, PHP) with standard cPanel.

1 Answer


I am not sure about sftp, but you can use scp with a specific command-line parameter:

-l limit
Limits the used bandwidth, specified in Kbit/s.

So your command will be:

scp -l 5000 myname@mysite:file1_100M .
  • This is an interesting suggestion. I need to check whether I can use scp to connect to the hosting company.
    – oldpride
    Commented Sep 1, 2022 at 18:46
  • Actually, following your lead, I can see sftp also has a '-l limit' option. Great!
    – oldpride
    Commented Sep 1, 2022 at 18:47
  • @oldpride, in such case use it :)
    Commented Sep 1, 2022 at 18:48
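
Putting the comments together, a hedged sketch of the original heredoc with OpenSSH sftp's own `-l` option added (same Kbit/s units as scp's `-l`; the hostname and file names are taken from the question, and 4000 Kbit/s is an assumed value chosen to stay under the 5M limit):

```shell
# Cap the transfer at 4000 Kbit/s (~500 KB/s), below the host's 5M/s threshold.
# -l is OpenSSH sftp's bandwidth-limit option, specified in Kbit/s.
sftp -l 4000 myname@mysite <<EOF
   get -p file1_100M
   get -p file2_5M
EOF
```

At 4000 Kbit/s, the ~110 MB backup should take roughly 110 × 8 / 4 ≈ 220 seconds, i.e. under four minutes, which is a reasonable price for staying under the limit.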
