$ ls -l /tmp/test/my\ dir/
total 0
I was wondering why the following ways to run the above command fail or succeed:
$ abc='ls -l "/tmp/test/my dir"'
$ $abc
ls: cannot access '"/tmp/test/my': No such file or directory
ls: cannot access 'dir"': No such file or directory
$ "$abc"
bash: ls -l "/tmp/test/my dir": No such file or directory
$ bash -c $abc
'my dir'
$ bash -c "$abc"
total 0
$ eval $abc
total 0
$ eval "$abc"
total 0
- mywiki.wooledge.org/BashFAQ/050 — Kamaraj, May 20, 2018
- Security implications of forgetting to quote a variable in bash/POSIX shells — But what if ...? — Scott - Слава Україні, Jul 30, 2018
6 Answers
This has been discussed in a number of questions on unix.SE; I'll try to collect all the issues I can come up with here. Below is
- a description of why and how the various attempts fail,
- a way to do it properly with a function (for a fixed command), or
- with shell arrays (Bash/ksh/zsh) or the `"$@"` pseudo-array (POSIX sh), both of which also allow building the command line in pieces, if you e.g. only need to vary some options,
- and notes about using `eval` to do this.
Some references at the end.
For the purposes here, it doesn't matter much if it's only the command arguments or also the command name that is to be stored in a variable. They're processed similarly up to the point where the command is launched, at which point the shell just takes the first word as the name of the command to run.
Why it fails
The reason you face those problems is the fact that word splitting is quite simple and doesn't lend itself to complex cases, and the fact that quotes expanded from variables don't act as quotes, but are just ordinary characters.
(Note that the part about quotes is similar to every other programming language: e.g. `char *s = "foo()"; printf("%s\n", s);` does not call the function `foo()` in C, but just prints the string `foo()`. That's different in macro processors, like m4, the C preprocessor, or Make (to some extent). The shell is a programming language, not a macro processor.)
On Unix-like systems, it's the shell that processes quotes and variable expansions on the command line, turning it from a single string into the list of strings that the underlying system call passes to the launched command. The program itself doesn't see the quotes the shell processed. E.g. if given the command `ls -l "foo bar"`, the shell turns that into the three strings `ls`, `-l` and `foo bar` (removing the quotes), and passes those to `ls`. (Even the command name is passed, though not all programs use it.)
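One way to make that splitting visible is `printf`, which repeats its format string for each remaining argument, so it prints exactly the word list the shell produced (a small demonstration, not from the original answer):

```shell
# printf repeats '<%s>\n' for every argument it receives, so each word
# the shell split the command line into shows up on its own line:
printf '<%s>\n' ls -l "foo bar"
# <ls>
# <-l>
# <foo bar>
```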
The cases presented in the question:
The assignment here assigns the single string `ls -l "/tmp/test/my dir"` to `abc`:
$ abc='ls -l "/tmp/test/my dir"'
Below, `$abc` is split on whitespace, and `ls` gets the three arguments `-l`, `"/tmp/test/my` and `dir"`. The quotes here are just data, so there's one at the front of the second argument and another at the back of the third. The option works, but the path gets incorrectly processed, as `ls` sees the quotes as part of the filenames:
$ $abc
ls: cannot access '"/tmp/test/my': No such file or directory
ls: cannot access 'dir"': No such file or directory
Here, the expansion is quoted, so it's kept as a single word. The shell tries to find a program literally called `ls -l "/tmp/test/my dir"`, spaces and quotes included.
$ "$abc"
bash: ls -l "/tmp/test/my dir": No such file or directory
And here, `$abc` is split, and only the first resulting word is taken as the argument to `-c`, so Bash just runs `ls` in the current directory. The other words are arguments to bash, and are used to fill `0ドル`, `1ドル`, etc.
$ bash -c $abc
'my dir'
With `bash -c "$abc"` and `eval "$abc"`, there's an additional shell processing step, which does make the quotes work, but also causes all shell expansions to be processed again, so there's a risk of accidentally running e.g. a command substitution from user-provided data, unless you're very careful about quoting.
Better ways to do it
The two better ways to store a command are a) use a function instead, b) use an array variable (or the positional parameters).
Using functions:
Simply declare a function with the command inside, and run the function as if it were a command. Expansions in commands within the function are only processed when the command runs, not when it's defined, and you don't need to quote the individual commands. Though this really only helps if you have a fixed command you need to store (or more than one fixed command).
# define it
myls() {
ls -l "/tmp/test/my dir"
}
# run it
myls
It's also possible to define multiple functions and use a variable to store the name of the function you want to run in the end.
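A minimal sketch of that, with hypothetical function names (`echo` stands in for the real commands):

```shell
# Two fixed commands wrapped in functions:
short_list() { echo "running short listing"; }
long_list()  { echo "running long listing"; }

# A bare function name stored in a variable works fine as a command,
# since it contains no whitespace or quotes:
chosen=long_list
"$chosen"
# running long listing
```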
Using an array:
Arrays allow creating multi-word variables where the individual words can contain whitespace. Here, the individual words are stored as distinct array elements, and the `"${array[@]}"` expansion expands each element as a separate shell word:
# define the array
mycmd=(ls -l "/tmp/test/my dir")
# expand the array, run the command
"${mycmd[@]}"
The command is written inside the parentheses exactly as it would be written when running the command. The processing the shell does is the same in both cases, just in one it only saves the resulting list of strings, instead of using it to run a program.
The syntax for expanding the array later is slightly horrible, though, and the quotes around it are important.
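The effect of the quoted `[@]` expansion can be checked with `printf` (a quick demonstration, assuming bash):

```shell
mycmd=(ls -l "/tmp/test/my dir")

# Each array element stays one word; the embedded space survives:
printf '<%s>\n' "${mycmd[@]}"
# <ls>
# <-l>
# </tmp/test/my dir>
```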
Arrays also allow you to build the command line piece-by-piece. For example:
mycmd=(ls) # initial command
if [ "$want_detail" = 1 ]; then
mycmd+=(-l) # optional flag, append to array
fi
mycmd+=("$targetdir") # the filename
"${mycmd[@]}"
or keep parts of the command line constant and use the array to fill in just a part of it, like options or filenames:
options=(-x -v)
files=(file1 "file name with whitespace")
target=/somedir
somecommand "${options[@]}" "${files[@]}" "$target"
(`somecommand` being a generic placeholder name here, not any real command.)
The downside of arrays is that they're not a standard feature, so plain POSIX shells (like `dash`, the default `/bin/sh` in Debian/Ubuntu) don't support them (but see below). Bash, ksh and zsh do, however, so it's likely your system has some shell that supports arrays.
Using "$@"
In shells with no support for named arrays, one can still use the positional parameters (the pseudo-array `"$@"`) to hold the arguments of a command.
The following should be portable script bits that do the equivalent of the code bits in the previous section. The array is replaced with `"$@"`, the list of positional parameters. Setting `"$@"` is done with `set`, and the double quotes around `"$@"` are important (these cause the elements of the list to be individually quoted).
First, simply storing a command with arguments in `"$@"` and running it:
set -- ls -l "/tmp/test/my dir"
"$@"
Conditionally setting parts of the command line options for a command:
set -- ls
if [ "$want_detail" = 1 ]; then
set -- "$@" -l
fi
set -- "$@" "$targetdir"
"$@"
Only using `"$@"` for options and operands:
set -- -x -v
set -- "$@" file1 "file name with whitespace"
set -- "$@" /somedir
somecommand "$@"
Of course, `"$@"` is usually filled with the arguments to the script itself, so you'll have to save them somewhere before re-purposing `"$@"`.
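One way around that is to do the repurposing inside a function, since a function gets its own positional parameters and leaves the script's untouched (a sketch; `run_cmd` is a hypothetical name):

```shell
# The function's "$@" starts out as the function's own arguments;
# set -- only changes the function's copy, not the caller's:
run_cmd() {
    set -- printf '<%s>\n' "$@"   # prepend the command to run
    "$@"
}
run_cmd one "two words"
# <one>
# <two words>
```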
To conditionally pass a single argument, you can also use the alternate value expansion `${var:+word}` with some careful quoting. Here, we include `-f` and the filename only if the filename is nonempty:
file="foo bar"
somecommand ${file:+-f "$file"}
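What the expansion produces in each case can again be checked with `printf` (a small demonstration; `somecommand` is just a placeholder word here):

```shell
# Nonempty: ${file:+-f "$file"} expands to the two words -f and foo bar
file="foo bar"
printf '<%s>\n' somecommand ${file:+-f "$file"}
# <somecommand>
# <-f>
# <foo bar>

# Empty: the whole expansion disappears, leaving just the command name
file=""
printf '<%s>\n' somecommand ${file:+-f "$file"}
# <somecommand>
```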
Using eval (be careful here!)
`eval` takes a string and runs it as a command, just as if it had been entered on the shell command line. This includes all quote and expansion processing, which is both useful and dangerous.
In the simple case, it allows doing just what we want:
cmd='ls -l "/tmp/test/my dir"'
eval "$cmd"
With `eval`, the quotes are processed, so `ls` eventually sees just the two arguments `-l` and `/tmp/test/my dir`, like we want. `eval` is also smart enough to concatenate any arguments it gets, so `eval $cmd` could also work in some cases, but e.g. all runs of whitespace would be changed to single spaces. It's still better to quote the variable there, as that ensures it gets to `eval` unmodified.
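The whitespace point can be seen when the stored command quotes a run of spaces (a small demonstration):

```shell
cmd='echo "a   b"'

eval "$cmd"   # the string reaches eval intact;          prints: a   b
eval $cmd     # word-split first into: echo  "a  b" ...  the pieces are
              # rejoined with single spaces;             prints: a b
```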
However, it's dangerous to include user input in the command string given to `eval`. For example, this seems to work:
read -r filename
cmd="ls -ld '$filename'"
eval "$cmd";
But if the user gives input that contains single quotes, they can break out of the quoting and run arbitrary commands! E.g. with the input `'$(whatever)'.txt`, your script happily runs the command substitution. It could have been `rm -rf` (or worse) instead.
The issue there is that the value of `$filename` was embedded in the command line that `eval` runs. It was expanded before `eval`, which saw e.g. the command `ls -ld ''$(whatever)'.txt'`. You would need to pre-process the input to be safe.
If we do it the other way around, keeping the filename in the variable and letting the command `eval` runs expand it, it's safer:
read -r filename
cmd='ls -ld "$filename"'
eval "$cmd";
Note the outer quotes are now single quotes, so expansions within do not happen. Hence, `eval` sees the command `ls -ld "$filename"` and expands the filename safely itself.
But that's not much different from just storing the command in a function or an array. With functions or arrays, there is no such problem, since the words are kept separate the whole time, and there's no quote or other processing of the contents of `filename`.
read -r filename
cmd=(ls -ld -- "$filename")
"${cmd[@]}"
Pretty much the only reason to use `eval` is when the varying part involves shell syntax elements that can't be brought in via variables (pipelines, redirections, etc.). However, you'll then need to quote/escape everything else on the command line that needs protection from the additional parsing step (see the link below). In any case, it's best to avoid embedding input from the user in the `eval` command!
References
- Word Splitting in BashGuide
- BashFAQ/050 or "I'm trying to put a command in a variable, but the complex cases always fail!"
- The question Why does my shell script choke on whitespace or other special characters?, which discusses a number of issues related to quoting and whitespace, including storing commands.
- Escape a variable for use as content of another script
- How can I conditionally pass an argument from a POSIX shell script?
- you can get around the eval quoting thing by doing `cmd="ls -l $(printf "%q" "$filename")"`. Not pretty, but if the user is dead set on using an `eval`, it helps. It's also very useful for sending the command through similar things, such as `ssh foohost "ls -l $(printf "%q" "$filename")"`, or in the spirit of this question: `ssh foohost "$cmd"`. — phemmer, May 20, 2018
- Not directly related, but have you hard-coded the directory? In that case, you might want to look at alias. Something like `alias abc='ls -l "/tmp/test/my dir"'`. — Hopping Bunny, May 23, 2018
1"... useful but dangerous..." Forsooth. :)paul garrett– paul garrett2022年07月22日 21:40:04 +00:00Commented Jul 22, 2022 at 21:40
- Fantastic answer @ilkkachu, I would love to use the arrays in my case, however I am not able to make it work during expansion when the string has to escape; for example, suppose we have the following code: `cmd=("find" "." "\( -name \"*.txt.gz\" -o -name \"*.sh\" \)"); echo "${cmd[*]}"; "${cmd[@]}"`. This fails as it is expanded with double quotes, with the error `find: ‘\\( -name "*.txt.gz" -o -name "*.sh" \\)’: No such file or directory` — do you know how we are able to fix this? `eval` works without issues, i.e. `eval "${cmd[*]}"` instead of `"${cmd[@]}"`. — jtimz, Feb 7, 2023
- If you have an array and want to print it with proper quoting, you could try `echo "${tmptest[@]@Q}"`, which should at least do something like that. Here, it prints `'mkdir' 'a; echo c'`, where quoting the `mkdir` is of course unnecessary, but whatever. (As far as I understand, the output of `@Q` should be usable as input to Bash, but I can't think of all the details right now, so I'm not exactly sure if there's still some gotcha. And yes, it's Bash only; zsh has better ways for the same.) — ilkkachu, Jul 25, 2023
The safest way to run a (non-trivial) command is `eval`. Then you can write the command as you would on the command line, and it is executed exactly as if you had just entered it. But you have to quote everything.
Simple case:
abc='ls -l "/tmp/test/my dir"'
eval "$abc"
Not so simple case:
# command: awk '! a[0ドル]++ { print "foo: " 0ドル; }' inputfile
abc='awk '\''! a[0ドル]++ { print "foo: " 0ドル; }'\'' inputfile'
eval "$abc"
- It's worth noting the security issue using `eval` poses. See the "Using eval (be careful here!)" section in this answer: unix.stackexchange.com/a/444949/151000. — typically, Jun 27, 2021
The second quote sign breaks the command.
When I run:
abc="ls -l '/home/wattana/Desktop'"
$abc
It gave me an error.
But when I run
abc="ls -l /home/wattana/Desktop"
$abc
There is no error at all.
There was no way to fix this at the time (for me), but you can avoid the error by not having spaces in the directory name.
This answer says the eval command can be used to fix this, but it doesn't work for me :(
- Yeah, that works as long as there's no need for e.g. filenames with embedded spaces (or ones containing glob characters). — ilkkachu, May 20, 2018
If it needs an array to execute, make it into an array!
IFS=' ' read -r -a command_arr <<< "${command}"
"${command_arr[@]}"
The first line converts the string into an array; the second line executes the command.
This does not appear to work with chained commands, e.g. using `&&` or `;`.
- This completely missed the point of why an array is needed. — muru, Aug 27, 2023
- It's intended to add a convenient syntax option, not a comprehensive answer, which is covered above. Not all answers need to fit into a box. — ingyhere, Aug 28, 2023
- What's convenient about it? Done the way in this answer, there's really not much difference between this and just doing `set -f; $command; set +f`. It won't handle spaces in arguments well, which is one of the reasons why arrays are actually meant to be used. — muru, Aug 28, 2023
- Speaking of boxes, this definitely looks like something someone would do to check a box named "Use arrays to execute commands stored in variables". — muru, Aug 28, 2023
Although @ilkkachu referenced bash's word splitting, I think it would be good to explicitly point out the importance of the IFS shell variable. For example in bash:
OLD_IFS="$IFS"
IFS=$'\x1a'
my_command=$'ls\x1a-l\x1a-a\x1a/tmp/test/my dir'
$my_command
IFS="$OLD_IFS"
would run the command stored in my_command as expected. `\x1a` is Ctrl-Z in ASCII (the SUB character) and a good delimiter choice. This works as long as the command to be executed does not contain any Ctrl-Z character, which is arguably more likely to hold than with whitespace. I also said bash, since ANSI-C style quoting `$'...'` is not POSIX as of now.
This technique works either when you have a hard-coded command or when you are constructing one. Just don't forget to reset IFS to its previous value.
- Update: IEEE Std 1003.1-2024 was just published; ANSI-C style quoting is now in POSIX. — cubernetes, Jun 15, 2024
- Possibly, but `$'\x1a'` is still a valid character in a Unix filename. — Jun 15, 2024
- Absolutely. If you are not using eval and rely on word splitting, there is absolutely no way to 100% reliably achieve the desired result. You can however parse the command beforehand and determine which character to use as IFS, and then run it. This is only useful if you're playing some kind of CTF though; in practice use eval, which even allows for constructs that are not just simple commands, so pipelines, AND-lists, subshells, etc. E.g. `cmd="rev | (tr -d abc | nl)" && echo example | eval "$cmd" | grep elp` — cubernetes, Jun 16, 2024
- Pretty sure you don't have to reset IFS. — Cestarian, May 17, 2025
Another trick to run any (trivial/non-trivial) command stored in the `abc` variable is:
$ history -s $abc
and then press Up-Arrow or Ctrl-P to bring it onto the command line. Unlike any other method, this way you can edit the command before execution if needed.
This appends the variable's content as a new entry to the Bash history, from which you can recall it with Up-Arrow.
In combination with another command that replays the last listed history command, you can replay it without pressing a key:
$ fc -e : -1