I maintain a community web site hosted with a hosting company. Once every two weeks I need to download a backup to my home Linux host as an offsite copy. The backup is about 110 MB in total. The site is otherwise idle.
I use sftp to download it. The download triggers an alert at the hosting company that IO or network usage is over the limit, and they then automatically shut down my website. The limit is 5M per second (I am not sure whether that is bytes or bits).
My sftp command is nothing special:
sftp myname@mysite <<EOF
get -p file1_100M
get -p file2_5M
EOF
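If it matters, my client appears to be OpenSSH's sftp, which (as I understand it) has a -l flag that caps bandwidth in kbit/s. A sketch of what I have in mind, assuming that flag does what I think (the hostname and filenames are my own placeholders from above):

```shell
# Hypothetical throttled transfer: -l takes kbit/s in OpenSSH sftp.
# The host's limit of 5 MB/s is 40000 kbit/s, so 8000 kbit/s
# (roughly 1 MB/s) should stay well under it.
sftp -l 8000 myname@mysite <<EOF
get -p file1_100M
get -p file2_5M
EOF
```

At ~1 MB/s the 110 MB backup would take around two minutes, which is fine for me. Is this the right way to do it, or is there a better tool for rate-limited transfers?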
I see a lot of posts about speeding up file transfers. My goal is actually the opposite: to slow the transfer down, as much as possible.
My home machine runs Ubuntu 18. The hosting company also runs Linux, in a LAMP stack (Linux, Apache, MySQL, PHP) with standard cPanel.