Despite having written countless shell scripts over the years, typically as ephemeral helpers or for personal use, I still find myself regularly consulting my accumulated notes on the subject – mostly because I can never remember the syntax.
It seemed silly to not have a public URL for these notes, so here’s a brief overview of fairly mundane tips on writing shell scripts. I expect you’re already familiar with some basics (like looking up further details) and will assume you’re using Bash, though most of this should work just as well for Bourne-compatible shells.
Let’s start with some boilerplate:
```shell
#!/usr/bin/env bash
set -euo pipefail
```
This unofficial strict mode helps avoid common traps:

- `set -e` aborts if any command within the script exits with a non-zero value; this is pretty much mandatory
- `set -u` balks at unset variables, which often turns out to be surprisingly useful
- `set -o pipefail` ensures that piping doesn't swallow a command's exit code; note, however, that this might conflict with tools like `head` which do not consume the entire input stream (also, since this is not a POSIX standard, it's only guaranteed to work in Bash)
- optionally, `set -x` prints commands before executing them; great for debugging
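To see what `pipefail` actually changes, here's a small sketch: by default, a pipeline's exit status is that of its last command, so an upstream failure goes unnoticed.

```shell
set +o pipefail
false | true
echo "without pipefail: $?" # 0 - the failure of `false` is swallowed

set -o pipefail
false | true
echo "with pipefail: $?" # 1 - the pipeline now reports the failure
```

(Run without `set -e` here, since that would abort the script at the second pipeline.)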
Conditionals
This is where the aforementioned inability to memorize the syntax kicks in for me, so it helps to have a simple reference to cargo-cult from:
```shell
if [ "$1" = "foo" ]; then
    echo "lorem ipsum"
elif [ "$1" = "bar" ]; then
    echo "dolor sit amet"
else
    echo "…"
fi
```
(If you only care about Bash, you might want to use double brackets `[[ … ]]` instead.)
- string comparison: `[ "$foo" = "…" ]` / `[ "$foo" != "…" ]`; `[ -z "$foo" ]` detects empty strings while `[ -n "$foo" ]` ensures the respective string is not empty
- integer comparison: `[ "$foo" -eq 5 ]` / `[ "$foo" -ne 5 ]`
- `[ -f "$filepath" ]` checks whether a file exists, `[ ! -f "$filepath" ]` that it doesn't; there are various other options to choose from
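A quick, self-contained sketch exercising these operators (variable names are arbitrary):

```shell
foo="hello"
if [ -n "$foo" ] && [ "$foo" = "hello" ]; then
    echo "foo is a non-empty string matching 'hello'"
fi

count=5
if [ "$count" -eq 5 ]; then
    echo "count equals five"
fi

if [ ! -f "/no/such/file" ]; then
    echo "file does not exist"
fi
```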
Command-Line Arguments
Accepting command-line arguments is both helpful and straightforward for simple scenarios.
`$0` is the path to the script itself, so generally not very interesting – except, perhaps, if you want to use it as a reference directory:

```shell
root=`dirname "$0"`
```

(Some folks recommend using `$(…)` instead of backticks, in part because it allows nesting. ¯\\\_(ツ)\_/¯ )
Positional arguments reside in `$1`, `$2` etc.

```shell
input_file="$1"
output_file="$2"
```

(See Functions below for validation options.)
`$@` represents all positional arguments, while `$*` combines all those arguments into a single one.
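The difference matters once arguments contain spaces; a small sketch (the `show_args` function is invented for illustration):

```shell
show_args() {
    printf 'separate: %s\n' "$@" # quoted "$@" expands to one word per argument
    printf 'joined: %s\n' "$*"   # quoted "$*" expands to a single word
}

show_args "lorem ipsum" dolor
```

This prints two `separate:` lines (one per argument) but only a single `joined:` line.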
Parsing Options
`getopts` comes into play if simple positional arguments are not enough and you need named parameters (e.g. `-f /path/to/file`):
```shell
### `:` prefix in front disables verbose error mode
### `:` suffix indicates the respective option expects an argument
while getopts ":hf:" option; do
    case $option in
        f)
            filename="$OPTARG"
            ;;
        h)
            usage
            exit 0
            ;;
        \?)
            echo "ERROR: invalid option '-$OPTARG'" >&2
            usage >&2
            exit 64 # EX_USAGE: command was used incorrectly
            ;;
    esac
done

### discard option arguments to handle subsequent positional arguments
shift $((OPTIND - 1))
target="$1"
```
The Bash Hackers Wiki has a more detailed introduction.
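To see the pattern in action without a separate script file, here's a sketch wrapping it in a function (`parse`, `verbose`, and `target` are invented names; `local OPTIND=1` resets the parser so the function can be called repeatedly):

```shell
parse() {
    local OPTIND=1 option
    verbose=false
    while getopts ":v" option; do
        case $option in
            v)
                verbose=true
                ;;
            \?)
                echo "ERROR: invalid option '-$OPTARG'" >&2
                return 64
                ;;
        esac
    done
    shift $((OPTIND - 1)) # shifts the function's own arguments
    target="${1:-}"
}

parse -v report.txt
echo "verbose=$verbose target=$target" # verbose=true target=report.txt
```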
Functions
You'll note that the help block above invokes `usage`, which is not a standard Unix command but a function like this:
```shell
usage() {
    cat << EOF
my sample script

Usage:
    $ smpl [options] [<summary>]

Options:
    -h
        display this help message
    -f <filename>
        specify some file name
EOF
}
```
Function arguments work just like command-line arguments above:

```shell
abort() {
    message="$*"
    echo "$message" >&2
    exit 1
}

abort "oops" "this wasn't supposed to happen"
```
(Exit codes are typically documented in `/usr/include/sysexits.h`, by the way.)
It’s also possible to specify required arguments or default argument values:
```shell
foo="${1:?}"
bar=${2:-"lorem ipsum"}
```
This will complain if `$1` is not provided, but `$2` is optional and defaults to "lorem ipsum". Default errors for required arguments are fairly unhelpful though ("line 5: 1: parameter null or not set"), so you might want to provide your own:

```shell
foo="${1:?missing foo}"
```
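Put together, a small sketch (the `greet` function and its parameter names are made up here):

```shell
greet() {
    name="${1:?missing name}"  # required; aborts with "missing name" if absent
    greeting="${2:-Hello}"     # optional; defaults to "Hello"
    echo "$greeting, $name"
}

greet "World"         # prints "Hello, World"
greet "World" "Howdy" # prints "Howdy, World"
```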
Miscellaneous
Spawning a subshell by wrapping commands in parentheses can be useful to provide an independent environment:
```shell
cd "$HOME"
pwd            # /home/fnd
(cd /tmp; pwd) # /tmp
pwd            # /home/fnd
```
There are a few variables worth knowing about:
- `$?` contains the exit status of the last command executed
- `$!` contains the ID of the last backgrounded process
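A brief sketch of both (assuming the strict mode above is not active, since `set -e` would abort right at the failing command):

```shell
false
echo "exit status: $?" # 1

sleep 60 &                               # background a long-running command
pid=$!
echo "spawned background process $pid"
kill "$pid"                              # clean up the example process
```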
It’s often helpful to add a confirmation prompt before an operation:
```shell
read -r -p "enter 'yes' to continue: " confirmation
if [ "$confirmation" = "yes" ]; then
    : # …
fi
```

(The `:` no-op is just a placeholder; Bash rejects an `if` branch that contains only a comment.)
You might just want to read a single character, e.g. to pause:
```shell
read -r -p "press any key to continue" -n1 -s; echo
```
Of course such prompts might interfere with automating our script’s invocation, in part because they rely on STDIN, so you might consider adding a command-line option to skip them.
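One way to make the prompt skippable, sketched with a hypothetical `--yes` flag:

```shell
if [ "${1:-}" = "--yes" ]; then
    confirmation="yes" # non-interactive invocation, e.g. from CI
else
    read -r -p "enter 'yes' to continue: " confirmation
fi

if [ "$confirmation" = "yes" ]; then
    echo "proceeding"
fi
```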
Temporary files/directories are created with `mktemp [-d]` – but make sure to clean them up afterwards:
```shell
tmpdir="$(mktemp -d)"

quit() {
    rm -rf "$tmpdir"
}
trap quit EXIT
```
This exit trap ensures that `quit` will be invoked when the script exits.
Further Reading
If you’ve made it down here, you might be interested in these additional resources:
- ShellCheck detects potential flaws in your scripts
- explainshell deconstructs commands and explains each argument
- Bash Pitfalls makes you cry
In case there’s anything I missed or misrepresented, or if you have your own tips to share, let me know in the comments. I might update this article in the future with whatever insights or patterns I pick up.
Many thanks to my colleagues Martin Kühl and Andreas Krüger for sharing their wisdom in reviewing and amending this article.