I don't want anything to execute if any preceding step fails:
#!/bin/sh
file="v0.9"
renamed=$file".tar.gz"
dir="utils/external/firepython/"
location="https://github.com/darwin/firepython/tarball/$file"
wget --no-check-certificate $location --output-document=$renamed && \
mkdir -p $dir && \
gunzip $renamed && \
echo "extracting to $dir" && \
tar xf $file".tar" --directory $dir --strip-components 1 && \
echo "Cleaning up..." && \
rm -r $file".tar" && \
echo "Done"
- If you use tar -zxf it will gunzip the file for you. – Michał Piaskowski, Apr 11, 2011 at 14:35
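A hedged one-line illustration of that suggestion, using the question's own variables:

tar -zxf "$renamed" --directory "$dir" --strip-components 1   # -z decompresses the .tar.gz directly, so the separate gunzip step is unnecessary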
5 Answers
You're looking for set -e. From POSIX:
-e
When this option is on, if a simple command fails for any of the reasons listed in Consequences of Shell Errors or returns an exit status value >0, and is not part of the compound list following a while, until, or if keyword, and is not a part of an AND or OR list, and is not a pipeline preceded by the ! reserved word, then the shell shall immediately exit.
In other words, plain commands cause the shell to exit if they fail. (You can use something like command || true to allow command to return nonzero.) If you need to perform some cleanup, you can set a trap for the EXIT pseudo-signal.
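A minimal sketch of the whole question rewritten that way, with an EXIT trap doing the cleanup (the cleanup function name is illustrative):

#!/bin/sh
set -e

file="v0.9"
renamed="$file.tar.gz"
dir="utils/external/firepython/"
location="https://github.com/darwin/firepython/tarball/$file"

# Runs on every exit, successful or not, so intermediate files never linger.
cleanup () {
    rm -f "$file.tar" "$renamed"
}
trap cleanup EXIT

wget --no-check-certificate "$location" --output-document="$renamed"
mkdir -p "$dir"
gunzip "$renamed"
tar xf "$file.tar" --directory "$dir" --strip-components 1
echo "Done"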
And better double-quote all your variable expansions. That way your script won't fail horribly if you ever point it at a directory or a URL containing ?, *, or a space.
set -e
wget --no-check-certificate "$location" --output-document="$renamed"
mkdir -p "$dir"
gunzip "$renamed"
echo "extracting to $dir"
tar xf "$file.tar" --directory "$dir" --strip-components 1
echo "Cleaning up..."
rm -r "$file.tar"
echo "Done"
Another useful shell idiom to pass optional arguments to a shell script without hassle is to set variables only if they're unset. That way you can pass arguments through the environment, e.g. file=v0.9.1 myscript.
: "${file=v0.9}"
: "${renamed=$file.tar.gz}"
: "${dir=utils/external/firepython/}"
: "${location=https://github.com/darwin/firepython/tarball/$file}"
@Gilles' answer about set -e is right on target. Alternatively, if only one or two commands in a script are must-haves, you can use important-command || exit to drop out of the script if any one of them fails.
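For instance (optional-step is a placeholder for any command whose failure is tolerable):

optional-step                 # the script keeps going even if this fails
important-command || exit     # abort immediately, preserving this command's exit status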
I often include an auxiliary function in my scripts called 'flunk' that handles any cleanup that needs to be done if something fails. It might look something like this:
flunk () {
    echo "SCRIPT FAILED: 1ドル"
    rm $TMPFILES
    exit 1
}
command
important-command || flunk "Could not do X"
command
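Since flunk relies on $TMPFILES, here is a hedged sketch of how the surrounding script might set it up (the file names are made up):

# Scratch files that flunk should remove if anything goes wrong.
# Deliberately a space-separated list, matching the unquoted rm in flunk.
TMPFILES="/tmp/firepython.$$.tar.gz /tmp/firepython.$$.tar"

wget --no-check-certificate "$location" --output-document="/tmp/firepython.$$.tar.gz" \
    || flunk "Could not download $location"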
- I was actually wondering about failure scenarios like that, but held back since that was outside the scope of the question. – tshepang, Apr 12, 2011 at 7:36
- @Tshepang: For most tasks, I prefer to program defensively: return an error code if there's anything suspicious. So failing commands abort the script, and only specifically-vetted commands are allowed to fail. Caleb's approach is right in cases where there's a very important command you want to execute, and its preparatory steps are optional; that's not the case here. – Gilles 'SO- stop being evil', Apr 12, 2011 at 17:56
- The other thing that I don't think you can do with just set -e is handling cleanup if there are things you need to do when a command fails. For example, in my backup scripts I often have mounts. If something goes wrong with the backup process, I still want the end of the script to run so that it cleanly unmounts the drives. Putting each of those aspects in functions, then using a flunk function like the one above, allows you to do any cleanup you want before the script closes, even when your important-command bombed for some reason. – Caleb, Apr 12, 2011 at 19:31
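A hedged sketch of the pattern that comment describes; the device, mount point, and rsync invocation are all made-up examples:

backup_mnt=/mnt/backup

unmount_all () {
    umount "$backup_mnt"
}

flunk () {
    echo "SCRIPT FAILED: 1ドル"
    unmount_all               # cleanup still runs even though the backup itself failed
    exit 1
}

mount /dev/sdb1 "$backup_mnt" || exit 1
rsync -a /home/ "$backup_mnt/home/" || flunk "backup failed"
unmount_all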
You could probably use a pipe instead of creating and deleting the downloaded file:
mkdir -p "$dir"
echo "extracting to $dir"
wget --no-check-certificate "$location" --output-document=- |
tar zxvf - --strip-components 1 --directory="$dir"
echo "Done"
Sometimes using && or || can induce a race condition, so it may be better to rewrite this with if statements instead; I do not see at a glance why the exit status with the one-liner/"pipeline" approach is not doing it for you. And you can always test whether the extracted files are present with something like test -f "dir/whatever.file" || exit 1.
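A hedged sketch of what the question's steps might look like with if statements and a final existence test ("whatever.file" remains a placeholder):

if ! wget --no-check-certificate "$location" --output-document="$renamed"; then
    echo "download failed" >&2
    exit 1
fi
mkdir -p "$dir"
if ! gunzip "$renamed"; then
    echo "gunzip failed" >&2
    exit 1
fi
if ! tar xf "$file.tar" --directory "$dir" --strip-components 1; then
    echo "extraction failed" >&2
    exit 1
fi
rm -r "$file.tar"

# Sanity check that an expected file really landed in $dir.
test -f "$dir/whatever.file" || exit 1
echo "Done"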
- Could you provide justification or a citation to back up your remark about possible race conditions? – 200_success, Jul 6, 2017 at 6:34
- TBH I am not entirely sure. This is just some advice that an old POSIX-savvy friend gave me. I was hoping someone might ask this, because I would like to know more myself. However, I do think it's better to use if ; then ; else in general; some pitfalls can be avoided. Please see: mywiki.wooledge.org/BashPitfalls#cmd1_.26.26_cmd2_.7C.7C_cmd3 – Chev_603, Jul 7, 2017 at 17:36
Just a reminder: these kinds of scripts tend to fail if wget or gunzip is not installed, which can happen in some minimal installations.
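One way to guard against that, as a hedged sketch using the standard command -v test:

for tool in wget gunzip tar; do
    if ! command -v "$tool" >/dev/null 2>&1; then
        echo "required tool '$tool' is not installed" >&2
        exit 1
    fi
done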
- If wget or gunzip are not installed, it will not tend to fail, but certainly fail ;-) – janos, Jul 6, 2017 at 8:30
- Agreed, it's a good idea to check the existence of whatever your script depends on: if ! gunzip -V >/dev/null 2>&1 ; then { echo gunzip not found; exit 1 ;} ; else { do_whatever ;} ; fi – Chev_603, Jul 7, 2017 at 17:43