
I have a text file of URLs, each in the form of:

http://domain.com/a/b/c/d.jpg

I want to download all of these, but save each file under the name:

c_d.jpg

In other words, for each file, I want to save the file under its original filename prefixed by the name of its parent directory.

How would I go about doing this on Windows?

I'm fine with using a command line tool, such as wget or curl, just give me the arguments.

Thanks.

1 Answer

I'm not sure how to do this in a pure Windows environment, but under Cygwin you could try this (requires bash, sed, and wget):

while read -r link; do cmd=$(echo "$link" | sed 's/.*\/\(.*\)\/\(.*\)/wget & -O \1_\2/'); echo "$cmd"; eval "$cmd"; done < links.txt

where links.txt is your file.

Of course, you can tweak the sed expression to convert the link into a filename in any way you like.
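If you'd rather avoid sed and eval entirely, the same filename can be built with plain bash parameter expansion. This is a sketch of that idea; target_name is a hypothetical helper name, not part of wget, and it assumes links.txt holds one URL per line:

```shell
# Derive "parent_filename" from a URL using bash parameter expansion.
# target_name is a hypothetical helper name chosen for this example.
target_name() {
  local link=$1
  local file=${link##*/}    # everything after the last slash, e.g. d.jpg
  local dir=${link%/*}      # the URL with the filename stripped off
  local parent=${dir##*/}   # the parent directory name, e.g. c
  printf '%s_%s\n' "$parent" "$file"
}

# Usage, assuming wget is installed and links.txt exists:
# while read -r link; do wget "$link" -O "$(target_name "$link")"; done < links.txt
```

Since nothing is re-parsed by the shell, this version is also safe for URLs containing characters that would otherwise need quoting.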

Cheers
