
I'm writing a bash script and don't know how to solve my problem.

I want to download multiple files at the same time, but I want to save the files under modified names.

I have a variable with URLs.

example.com/aaa.txt
example.com/ooo/bbbbbb.txt 
example.info/c.txt 

The order of the URLs is important.

After downloading I want:

1.txt
2.txt
3.txt

I tested parallel and wget, but I do not know how to change the file names.

PS: Limit of 5 simultaneous downloads.

1 Answer


What I really need is to see a sample of your variable of URLs, but nonetheless, here's the basis of what I'm thinking:

 #!/usr/bin/env bash

 # Read URLs.txt one line at a time and number the output files.
 x=1

 while read -r URL; do

         wget "$URL" --output-document="$x.txt"
         x=$((x + 1))

 done < URLs.txt

I'll edit it if you have more information, but at the moment I've set it up so that it reads URLs.txt, which has to be one URL per line and in the same directory.
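
If you need the downloads to actually run in parallel, here's a rough one-liner sketch using GNU parallel, since you mention having tested it (untested on my end; it assumes GNU parallel is installed and URLs.txt has no blank lines):

 # Number each line (nl separates the number and the URL with a tab),
 # then let parallel split the columns: {1} is the number, {2} is the URL.
 nl -w1 URLs.txt | parallel -j5 --colsep '\t' wget {2} --output-document={1}.txt

-j5 caps it at five simultaneous jobs. The downloads may finish out of order, but the file names still follow the order of the URLs in the file.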

  • Your script is very similar to my code. Unfortunately, this isn't what I'm looking for: in your example, the files aren't downloaded in parallel. Only one wget runs in the loop at a time. It works, but it is slow. My URLs come from analyzing the file directory.djvu (djvudump ./directory.djvu).
    – Creek
    Commented Mar 20, 2016 at 20:29
  • Perhaps make five separate URL.txt files and try: bash script1.sh & bash script2.sh & bash script3.sh & bash script4.sh & bash script5.sh & (see the sketch after this thread). Commented Mar 20, 2016 at 21:22
  • I also thought about it. I hope I can find a better solution.
    – Creek
    Commented Mar 20, 2016 at 21:27
  • length=$(wc -l URLs.txt | cut -f1 -d ' ') && split -d -l $(($length/5)) URLs.txt URLs.txt would make the splitting fast Commented Mar 20, 2016 at 21:39
  • Seems like a lot of results here: startpage.com/do/… Commented Mar 20, 2016 at 21:43
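
For completeness, here is a rough sketch of the split-and-run idea from the comments above (untested; it assumes bash and GNU split, whose -d option produces numeric chunk names like URLs.txt00 through URLs.txt04). Each chunk gets a numbering offset so the global 1.txt, 2.txt, ... order is preserved:

 #!/usr/bin/env bash

 # Split URLs.txt into at most 5 chunks: URLs.txt00, URLs.txt01, ...
 total=$(wc -l < URLs.txt)
 per=$(( (total + 4) / 5 ))    # round up so at most 5 chunks are created
 split -d -l "$per" URLs.txt URLs.txt

 # Run one sequential download loop per chunk in the background.
 n=0
 for chunk in URLs.txt0?; do
         (
                 x=$(( n * per ))
                 while read -r URL; do
                         x=$((x + 1))
                         wget -q "$URL" --output-document="$x.txt"
                 done < "$chunk"
         ) &
         n=$((n + 1))
 done

 wait    # block until all five background loops have finished

The trade-off against the parallel one-liner above is that one chunk of slow downloads can leave its slot idle near the end; parallel hands out URLs one at a time, so it keeps all five slots busy.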
