
I'm looking for a shell script that checks a list of URLs and reports their status. For example:

http://abc.com/index.php
http://abc1.com/index.php

I may place them in a separate file or inside the script itself. I need the HTTP response code for each of these URLs; if a URL works, the response should be HTTP/1.1 200 OK.

If any of the URLs fails, I need an error message output for that particular URL.

1 Answer

You will have to install wget if you don't have it already:

#!/bin/bash
# Check each URL with wget's spider mode (no download) and report failures.
IFS=$'\n'
LIST='http://www.google.com
http://www.drk.com.ar/daphne.php
http://www.google.com/this-is-an-error
http://serverfault.com/questions'
for I in $LIST
do
  if ! wget -q --no-cache --spider "$I"; then
    echo "Error: $I"
  fi
done
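
Since you mentioned you might keep the URLs in a separate file, and you asked for the actual response code, here is a minimal sketch using curl instead of wget. It assumes the URLs live in a file named urls.txt (one per line; the filename is my assumption, not from your question); curl's -w '%{http_code}' prints the status code for each request:

#!/bin/bash
# Read URLs from urls.txt (assumed filename) and print each HTTP status code.
while IFS= read -r url; do
  [ -z "$url" ] && continue   # skip blank lines
  # -s: silent, -o /dev/null: discard the body, -w: print only the status code
  code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
  if [ "$code" = "200" ]; then
    echo "OK    $code $url"
  else
    echo "Error $code $url"
  fi
done < urls.txt

A code of 000 means curl could not connect at all (DNS failure, refused connection, timeout), as opposed to the server answering with an error status.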
