All Questions
15 questions
1 vote · 0 answers · 1k views
Wget downloading old version of file
I'm trying to create an auto-updater script. I have a script that will continually download files from a Nexus repository. If the file in the repository is updated, the script will continue to ...
0 votes · 1 answer · 644 views
wget download file behind form
I'm trying to download my LaTeX project periodically from sharelatex.com so I can archive it myself.
Unfortunately, I cannot get past the login form.
This is my attempt; what am I doing wrong?
#!/bin/bash
# Log in to the ...
2 votes · 1 answer · 3k views
wget script: %0D at the end of filename
I have a bash script that loops over the wget command, but I don't know why there is a %0D at the end of every filename except the first.
for var in "${array[@]}"
do
wget -P output_folder "${var}"...
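The %0D is a URL-encoded carriage return: the array was most likely filled from a file with Windows (CRLF) line endings, so each entry carries a trailing \r. A minimal sketch of stripping it before calling wget (the example URLs and array contents are assumptions, and the actual download is commented out):

```shell
#!/bin/bash
# Hypothetical array standing in for the one in the question, with the
# trailing carriage returns that produce the %0D suffix.
array=($'http://example.com/a.txt\r' $'http://example.com/b.txt\r')

for var in "${array[@]}"; do
    var="${var%$'\r'}"               # strip a trailing carriage return, if present
    echo "clean: $var"
    # wget -P output_folder "$var"   # real download, omitted in this sketch
done
```

Alternatively, converting the source file once with `tr -d '\r' < urls.txt > urls.unix.txt` fixes every entry up front.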
0 votes · 0 answers · 273 views
Wget which stops when it has a download limit
I have over 200 wget processes running in screens on my server. I set a low download limit on each one with --limit-rate=1K. The only problem is that after one hundred seconds wget treats the download as finished, so this ...
0 votes · 1 answer · 161 views
Trace web redirection in bash script
I want to make a script that downloads the image at its best resolution (not the preview image) from a deviantART link, as if I had clicked the "Download" button.
However, it seems that deviantART redirects the ...
1 vote · 1 answer · 834 views
Send HTTP request to website with password and username, then record results
I need to record certain numbers (temperature and others) from a web-based monitoring service (LaCrosse Alerts). However, you must log in to use this service. I have an account, and am starting to ...
2 votes · 1 answer · 3k views
wget execute script after download
I need wget to run a script every time it completes a transfer.
This is my full command line:
wget -i urls-to-downloads.txt
My problem is that I want to remove, as it downloads, the ...
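wget itself has no post-download hook, but looping over the URL list in the shell provides one: call wget once per URL and run the follow-up command after each transfer. A sketch under the assumption that the list holds one URL per line; the network call is replaced by an echo and the per-file hook (post-process.sh) is hypothetical:

```shell
#!/bin/bash
# Hypothetical URL list standing in for urls-to-downloads.txt.
printf 'http://example.com/a.iso\nhttp://example.com/b.iso\n' > urls-to-downloads.txt

# Read the list line by line instead of passing the whole file to wget -i,
# so a command can run after each completed transfer.
while IFS= read -r url; do
    [ -z "$url" ] && continue      # skip blank lines
    echo "fetched: $url"           # wget "$url" in the real script
    # ./post-process.sh "$url"     # hypothetical per-file hook
done < urls-to-downloads.txt

rm urls-to-downloads.txt           # clean up the sketch's temporary file
```

In the real script, `wget "$url" && ./post-process.sh "$url"` makes the hook fire only on a successful transfer.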
2 votes · 1 answer · 330 views
wget downloads all files except for the images I want
I'm having trouble using wget to download images from a gallery.
As a starting point I use the overview page. It has thumbnails that link to the individual pages with the large images.
Here is the script I use:
wget --...
8 votes · 8 answers · 40k views
Downloading a RAR file from Mediafire using WGET
Example: http://www.mediafire.com/?tjmjrmtuyco
This was what I tried...
wget -A rar [-r [-l 1]] <mediafireurl>
That is to say, I tried with and without the recursive option. It ends up ...
1 vote · 1 answer · 419 views
How to run a command inside a script without specifying the exact location?
So I have a PHP script that is using wget to download some images. It looks something like this:
exec('wget -O saveImage.png http://location.of.image');
When I run my script on a different machine, ...
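When the same script runs on machines where wget lives in different directories, a common fix is to resolve the binary's absolute path at runtime rather than hard-coding one location. A sketch of the underlying lookup in shell (sh is used as the probe binary so the sketch runs anywhere; substitute wget in practice):

```shell
#!/bin/bash
# Resolve a binary's absolute path at runtime instead of assuming a fixed
# location such as /usr/bin/wget vs /usr/local/bin/wget across machines.
bin="$(command -v sh)"   # probe with sh so the sketch runs everywhere; use wget in practice
if [ -z "$bin" ]; then
    echo "binary not found in PATH" >&2
    exit 1
fi
echo "resolved: $bin"
```

From PHP, the same lookup can be done once (e.g. via `shell_exec('command -v wget')`) and the resulting absolute path used inside `exec()`, or the PATH for the PHP process can simply be made consistent across machines.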
4 votes · 3 answers · 357 views
How to pass a file that changes name to another command in Bash?
I frequently use wget to download tarballs and zip files from the web, and then either untar them or unzip them. I do:
wget google.com/somefile.zip
unzip somefile.zip
rm somefile.zip
Is there a way for ...
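One way to stop retyping the changing name is to capture the URL once and derive the local filename with basename. A sketch using the example URL from the question (the network call is commented out):

```shell
#!/bin/bash
url="google.com/somefile.zip"   # example URL from the question
file="$(basename "$url")"       # derive the local name: somefile.zip
echo "local file would be: $file"
# wget "$url" && unzip "$file" && rm "$file"   # real workflow, network call omitted
```

Wrapping those three lines in a function that takes the URL as `$1` turns the whole download-extract-delete cycle into a single command.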
17 votes · 4 answers · 47k views
Using Wget to Recursively Crawl a Site and Download Images
How do you instruct wget to recursively crawl a website and only download certain types of images?
I tried using this to crawl a site and only download JPEG images:
wget --no-parent --wait=10 --...
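For reference, the usual combination is --recursive with --no-parent and an --accept list of image suffixes; wget fetches HTML pages to discover links but keeps only files matching the accept list. A sketch that builds and prints the command instead of running it (the gallery URL is a placeholder):

```shell
#!/bin/bash
# Sketch of a recursive, JPEG-only crawl; the URL is a placeholder.
cmd=(
    wget
    --recursive          # follow links within the site
    --no-parent          # never ascend above the start directory
    --wait=10            # pause between requests, as in the question
    --accept 'jpg,jpeg'  # keep only files with these suffixes
    http://example.com/gallery/
)
printf 'would run: %s\n' "${cmd[*]}"   # print instead of executing the network call
```

Note that pages needed for link discovery are downloaded temporarily and then removed, so the accept list does not prevent the crawl itself from working.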
6 votes · 3 answers · 11k views
How to download this webpage with Wget?
I want to download the webpage http://forum.ubuntu-it.org/, but it requires a username and password. So I have used this:
wget --save-cookies cookies.txt --post-data 'user=goyamy&passwrd=...
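Behind the truncated command is a two-step pattern: POST the credentials once while saving session cookies, then reuse the cookie jar for the page you actually want. A sketch that only prints the commands; the login endpoint and password are placeholders, since the question elides them:

```shell
#!/bin/bash
# Placeholders: the real login endpoint and password are not shown in the question.
login_url='http://forum.ubuntu-it.org/LOGIN_ENDPOINT'
post_data='user=goyamy&passwrd=PLACEHOLDER'

# Step 1: authenticate and persist the session cookie.
echo "wget --save-cookies cookies.txt --keep-session-cookies --post-data '$post_data' '$login_url'"
# Step 2: request the protected page with the saved cookies.
echo "wget --load-cookies cookies.txt http://forum.ubuntu-it.org/"
```

--keep-session-cookies matters here: without it, cookies that expire at the end of the session are not written to cookies.txt, and the second request arrives logged out.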
3 votes · 1 answer · 2k views
Logging into webpage via script
I'm trying to automate the extraction of some information from a website that first requires me to log in. I have done this in the past (years ago) using wget, but that method no longer seems to work -...
0 votes · 1 answer · 525 views
Using WGET to Get All of the .mov Files from Diggnation
I'm trying to download every single episode of Diggnation using WGET to grab the small QuickTime movies. It would be too tiring to go through every single page to download them. Here's what I'm ...