I am trying to use Wget to download a page, but I cannot get past the login screen.

How do I send the username/password using POST data on the login page and then download the actual page as an authenticated user?

9 Answers

Based on the manual page:

# Log in to the server.  This only needs to be done once.
wget --save-cookies cookies.txt \
     --keep-session-cookies \
     --post-data 'user=foo&password=bar' \
     --delete-after \
     http://server.com/auth.php

# Now grab the page or pages we care about.
wget --load-cookies cookies.txt \
     http://server.com/interesting/article.php

Make sure the --post-data parameter is properly percent-encoded (especially ampersands!) or the request will probably fail. Also make sure that user and password are the correct keys; you can find out the correct keys by sleuthing the HTML of the login page (look into your browser’s “inspect element” feature and find the name attribute on the username and password fields).
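
For example, suppose the password were p&ss@word (a hypothetical value): the & and @ must be percent-encoded so the ampersand is not parsed as a field separator:

# Hypothetical credentials: "p&ss@word" is sent as "p%26ss%40word".
wget --save-cookies cookies.txt \
     --keep-session-cookies \
     --post-data 'user=foo&password=p%26ss%40word' \
     --delete-after \
     http://server.com/auth.php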

  • add --keep-session-cookies to the first command, or the second? Commented Nov 9, 2011 at 2:56
  • You don't need -p (--page-requisites) for this.
    – ændrük
    Commented Jan 6, 2012 at 17:24
  • It's also worth adding --delete-after to the first retrieval so you don't end up saving the result page from logging in. Commented Jan 2, 2013 at 15:41
  • I was getting the error WGET64: missing URL; I put the whole wget command on one line and removed the `\` characters.
    – Mowgli
    Commented Mar 28, 2013 at 1:23
  • --keep-session-cookies is needed for the first command only. It tells the first command to include session cookies when saving cookies to the file. The second command simply reads all cookies from the provided file.
    – wadim
    Commented May 11, 2014 at 17:09

You can log in via Firefox, and copy the needed headers afterwards:

Use "Copy as cURL" in the Network tab of Firefox's browser developer tools and replace curl's flag -H with wget's --header (and also --data with --post-data if needed).

  • Awesome! Also pointed me to the option of using curl instead of wget, since it can do the same thing and I don't even need to change the parameters.
    – Jan
    Commented Apr 8, 2019 at 6:35
  • This worked for me, whereas wget with the correct cookie did not; I suspect the web service checks multiple request headers, even seemingly unimportant ones like "User-Agent" or "Cache-Control."
    – Arthur
    Commented Apr 20, 2020 at 19:16
  • @Arthur For me this solution was the only one that worked. I tried to remove as much header data from the request as possible and ended up with essentially just the cookie data. So I suspect wget supplied the data in a wrong way. Commented May 19, 2020 at 9:51
  • How can you just say "via browser"? Is that Chrome or Firefox?
    – barlop
    Commented Oct 15, 2022 at 7:33
  • This can also be done in Opera. In that case, two different options show up for me, "Copy as cURL (cmd)" and "Copy as cURL (bash)". After I chose the "Copy as cURL (cmd)" option, I needed to make the following changes as well: replace certain special characters in the parameter values (colons, ":") with their percent encodings (%3A for colons), and remove the "^" characters peppered throughout the copied command.
    – J. D.
    Commented May 16, 2023 at 19:31

I gave the cookies of an existing session directly to wget with --no-cookies and the Cookie HTTP request header. In my case it was a Moodle university login, where logging in is more complex (multiple requests with a login ticket). I added --post-data because it was a POST request.

For example, to get the full Moodle users list:

wget --no-cookies --header "Cookie: <name>=<value>" --post-data 'tab=search&name=+&personsubmit=Rechercher&keywords=&keywordsoption=allmine' https://moodle.unistra.fr/message/index.php

  • Awesome tip. This is useful when you can access the cookie from your own machine and then use that from another headless machine from the command line. :)
    – Tuxdude
    Commented Jul 27, 2016 at 18:29
  • You can also set multiple cookies at the same time: --header "Cookie: access_token=IKVYJ;XSRF-TOKEN=5e10521d"
    – Phil C
    Commented May 25, 2018 at 13:28

I had the same problem. My solution was to log in via Chrome and save the cookie data to a text file. This is easily done with the Chrome cookie.txt export extension.

When you export the cookie data, the extension also shows an example of how to use it with wget; a simple copy-paste command line is provided to you.
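
Assuming the extension exported a Netscape-format cookies.txt (the path and URL below are placeholders), the wget call looks something like:

wget --load-cookies /path/to/cookies.txt https://example.com/protected/page.html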

  • Unfortunately not applicable in automated scripting.
    – Znik
    Commented Aug 21, 2015 at 13:49
  • The question doesn't specify automated scripting. This solution allows 99% of the work to be automated. Commented Feb 6, 2019 at 14:11
  • Unfortunately, Google must be too smart for this trick. I still get a login page. Commented Aug 22, 2019 at 14:14
  • Of course, Google uses secret reCAPTCHAs... as I've seen so many places, using standard programmatic APIs is the most practical option in this case. Commented Aug 22, 2019 at 14:38
  • The link you posted is unfortunately down. This one works: chrome.google.com/webstore/detail/get-cookiestxt/… To use it with wget: wget --load-cookies /path/to/cookies.txt Commented Jun 21, 2022 at 14:15

I wanted a one-liner that didn't download any files; here is an example of piping the cookie output into the next request. I only tested the following on Gentoo, but it should work in most *nix environments:

wget -q -O /dev/null --keep-session-cookies --save-cookies /dev/stdout --post-data 'u=user&p=pass' 'http://example.com/login' | wget -q -O - --load-cookies /dev/stdin 'http://example.com/private/page'

(This is one line, though it likely wraps in your browser.)

If you want the output saved to a file, change -O - in the second command to -O /some/file/name.ext
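
For instance, to save the private page to result.html (same placeholder credentials and URLs as above):

wget -q -O /dev/null --keep-session-cookies --save-cookies /dev/stdout --post-data 'u=user&p=pass' 'http://example.com/login' | wget -q -O result.html --load-cookies /dev/stdin 'http://example.com/private/page'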

You don't need cURL to POST form data. --post-data 'key1=value1&key2=value2' works just fine. Note: you can also pass wget the name of a file containing the POST data.
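
A sketch of both variants, with placeholder field names and URLs:

# POST data given inline:
wget --post-data 'key1=value1&key2=value2' http://example.com/form

# POST data read from a file (same key1=value1&key2=value2 format):
wget --post-file=postdata.txt http://example.com/form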

If they're using basic authentication:

wget http://username:[email protected]/page.html
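
If you'd rather keep the password out of the URL (and out of your shell history), wget's dedicated options do the same thing; a sketch with placeholder credentials:

# Equivalent to embedding the credentials in the URL:
wget --user=username --password=somepass http://example.com/page.html

# Or prompt for the password interactively:
wget --user=username --ask-password http://example.com/page.html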

If they're using POSTed form data, you'll need to send a POST request instead, e.g. with wget's --post-data (shown above) or something like cURL.

  • I don't have access to change anything on the server; it is read-only. Commented Aug 24, 2009 at 20:13
  • So? None of this requires you to change anything on the server.
    – ceejayoz
    Commented Aug 24, 2009 at 20:15
  • The OP asked for wget and clearly needs an answer with cookies.
    – hiburn8
    Commented Apr 1, 2021 at 23:13
  • @hiburn8 Just skipping past the "if they're using basic authentication", I see? If the OP needs anything twelve years later, they're probably in trouble.
    – ceejayoz
    Commented Apr 2, 2021 at 13:19

An example of using wget on a server to download a big file whose link can be obtained in your browser.

This example uses Google Chrome.

Log in where you need to, and start the download. Then go to your downloads and copy the link.

Then open DevTools on the page where you logged in, go to the Console, and get your cookies by entering document.cookie

Now, go to the server and download your file:

wget --header "Cookie: <YOUR_COOKIE_OUTPUT_FROM_CONSOLE>" <YOUR_DOWNLOAD_LINK>

  • This answer does not seem to scale well to Google, where there are two pages of cookies! Commented Aug 22, 2019 at 14:09
  • Of course, Google uses secret reCAPTCHAs... as I've seen so many places, using standard programmatic APIs is the most practical option in this case. Commented Aug 22, 2019 at 14:38

A solution that uses lynx and wget.

Note: lynx must have been compiled with the --enable-persistent-cookies flag for this to work.

When you want to use wget to download a file from a site that requires login, you just need a cookie file. To generate that cookie file, I chose lynx, a text-mode web browser. First you need a configuration file telling lynx to save cookies. Create a file named lynx.cfg and put this configuration into it:

SET_COOKIES:TRUE
ACCEPT_ALL_COOKIES:TRUE
PERSISTENT_COOKIES:TRUE
COOKIE_FILE:cookie.file

Then start lynx with this command:

lynx -cfg=lynx.cfg http://the.site.com/login

Input the username and password, and select 'preserve me on this pc' or something similar. If the login succeeds, you will see a beautiful text rendering of the site's web page. Then log out. In the current directory, you will find a cookie file named cookie.file. This is what we need for wget.

Then wget can download files from the site with this command:

wget --load-cookies ./cookie.file http://the.site.com/download/we-can-make-this-world-better.tar.gz

  • What if the login requires JavaScript? lynx does not seem to support JavaScript.
    – Tiberiu
    Commented Dec 13, 2018 at 17:41
