We want to check if a URL is down or not. But sometimes, the environment is down for maintenance for 3-4 hours and we don't want to keep sending emails during that time.
I have written a shell script that performs the URL check, and a cron job runs it every 30 minutes.
The actual requirements are:
- Check if the URL is up. If it is down, send an email.
- The cron job will execute the script again. If step 1 sent an email, send another email asking whether the environment is under maintenance.
- The cron job will execute the script again. If the URL is still down, don't do anything.
- Keep checking the URL; if it is responding, don't do anything. But if it goes down again, follow steps 1-3.
The script works. Could you please review it and suggest a nicer way of writing it? I'm learning shell scripting and don't know all the available options.
#!/bin/bash
# Check the URLs listed in urls.txt
MAddr="[email protected]"
TIME=`date +%d-%m-%Y_%H.%M.%S`
SCRIPT_LOC=/user/inf/ete4/eteabp4/eid_scripts/jsing002

for url in `awk '{print 1ドル}' $SCRIPT_LOC/urls.txt`
do
    /usr/bin/wget -t 0 --spider --no-check-certificate $url > wget.output 2>&1
    HTTPCode=`(/usr/bin/wget -t 0 --spider --no-check-certificate $url) 2>&1 | grep HTTP | tail -1 | cut -c 41-43`
    # Second column of urls.txt is the environment name for this URL
    ENV=`(grep $url $SCRIPT_LOC/urls.txt | awk '{print 2ドル}')`
    echo $HTTPCode

    E1=`/bin/grep -ise 'refused' -ise 'failed' wget.output`
    if [ "$E1" != "" ] || [ $HTTPCode -ge 500 ]
    then
        status="DOWN"
        echo "Step 1"
        echo "${ENV}""_DOWN"
        if [ -f "${ENV}""_DOWN" ];
        then
            # Second consecutive failure: ask about a maintenance window
            echo "step 2"
            echo "Please check if $ENV is in a maintenance window. The check for $url has failed twice. The next failure email will be sent if the preceding test was SUCCESSFUL" | /bin/mail -s "Is $ENV in a maintenance window?" $MAddr
            mv "${ENV}""_DOWN" "${ENV}""_DOWN""_2"
            echo "Step 3"
        elif [ -f "${ENV}""_DOWN""_2" ];
        then
            # Third or later consecutive failure: stay quiet
            echo "this is elif statement"
        else
            # First failure: send the DOWN alert and leave a marker file
            echo "E1 is empty. Site is down"
            echo "Site is down. $url is not accessible" | /bin/mail -s "$ENV is $status" $MAddr
            touch "${ENV}""_DOWN"
        fi
    else
        if [ $HTTPCode -eq 200 ]
        then
            status="UP"
            echo $status
            rm "${ENV}""_DOWN""_2"
        fi
    fi
done
Content of urls.txt:
http://mer01bmrim:30270/rim/web E2E-RIMLITE4
http://mer01csmap:18001/console ABP_WL-E2E1
http://mer02sitap:18051/console ABP_WL-E2E2
http://mer03sitap:18101/console ABP_WL_E2E3
1 Answer
- Quoting: Be in the habit of always double-quoting your variables when you use them, e.g. "$url" instead of $url. Otherwise, nasty vulnerabilities could happen if a variable's value contains spaces or shell metacharacters. URLs, especially, often contain special characters such as & and ?.
- Structure: When processing input with multiple columns, one row at a time, the idiom to use is

      while read url env ; do
          # Do stuff here
      done < "$SCRIPT_LOC/urls.txt"

  If the columns are delimited by something other than whitespace:

      while IFS=: read user pwhash uid gid gecos homedir shell ; do
          # Do stuff here
      done < /etc/passwd
- Status of wget: You run wget twice for each URL. Instead of trying to get the HTTP status code, consider using just the exit status of wget to indicate success or failure.

      if wget -q -t 0 --spider --no-check-certificate "$url" ; then
          # Handle success
      else
          # Handle failure
      fi

  I've used the --quiet flag here. Also, I think that an infinite retry count (-t 0) is a bad idea.
- HTTP status interpretation: HTTP status codes other than 200 (e.g. 2xx or 3xx) could also indicate some kind of success.
- Filesystem littering: You litter the current directory with temporary files wget.output, "${ENV}_DOWN" and "${ENV}_DOWN_2". Perhaps you could append all the state to a single log file. SCRIPT_LOC could probably be computed using $(dirname 0ドル). A consolidated sketch that combines these points follows after this list.
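Putting these suggestions together, here is a minimal sketch of how the loop could look. It is illustrative only, not a drop-in replacement: the per-environment ".failures" counter files, the retry limit of 3, and the exact mail wording are assumptions that go beyond the original script and answer.

    #!/bin/bash
    # Sketch: read two whitespace-separated columns (URL, environment name) from
    # urls.txt and keep one failure-counter file per environment instead of the
    # _DOWN / _DOWN_2 marker files.
    MAddr="[email protected]"
    SCRIPT_LOC=$(dirname "0ドル")

    while read -r url env ; do
        [ -z "$url" ] && continue              # skip blank lines
        state_file="$SCRIPT_LOC/${env}.failures"

        if /usr/bin/wget -q -t 3 --spider --no-check-certificate "$url" ; then
            rm -f "$state_file"                # site answered: reset the counter
            continue
        fi

        # Failure: bump the counter kept in the state file
        failures=$(cat "$state_file" 2>/dev/null)
        failures=$(( ${failures:-0} + 1 ))
        echo "$failures" > "$state_file"

        case "$failures" in
            1) echo "Site is down. $url is not accessible" |
                   /bin/mail -s "$env is DOWN" "$MAddr" ;;
            2) echo "Please check whether $env is in a maintenance window. The check for $url has failed twice." |
                   /bin/mail -s "Is $env in a maintenance window?" "$MAddr" ;;
            *) : ;;                            # already alerted twice: stay quiet
        esac
    done < "$SCRIPT_LOC/urls.txt"

Because the counter is reset on every success, a later outage starts the email sequence again from step 1, which matches the requirement in the question.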
- Comment: @200_success Thank you for replying. I'm going to include the quoting, --quiet in wget, and filesystem littering suggestions. For the structure section, I am using the 2nd column to indicate the environment represented by the URL. That's why I'm using awk to get the URL and then the 2nd-column value in the email, to notify which environment is down. Is this a good way to implement it? – user2950074, Nov 4, 2013 at 15:56