A function that funnels all filenames into a file and opens that file in vim. The user then changes the names, saves, and quits. Finally the function renames the files in the folder with the new names provided in the file.
bulkrename() {
    mkdir -p ~/.temp
    ls > ~/.temp/newfilenames && nvim ~/.temp/newfilenames
    local files=(*)
    for filenum in $(seq 1 $(ls | wc -l)); do
        mv ${files[${filenum}]} $(sed "${filenum}q;d" ~/.temp/newfilenames)
    done
    rm ~/.temp/newfilenames
}
1 Answer
Always double-quote variables used in command arguments
The arguments of `mv` should have been double-quoted, like this:
mv "${files[${filenum}]}" "$(sed "${filenum}q;d" ~/.temp/newfilenames)"
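Without the quotes, any filename containing whitespace (or glob characters) is split or expanded before `mv` even sees it. A tiny made-up demonstration:

touch 'a b'               # suppose this is the only file in the directory
files=(*)                 # files[0] is now 'a b'
mv ${files[0]} renamed    # word splitting turns this into: mv a b renamed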
The script is fragile
As it is, the script looks very fragile:
- The list of files for editing is generated by `ls`, and then the edited list items are paired up with the `files` array generated with `*`. I'm not sure the ordering is guaranteed to be consistent, and I think it would be painful to track down in the documentation if this is indeed the case. It would be easier to generate both lists in a way to ensure consistent ordering.
- Using `ls` to generate the list is problematic. The output of `ls > ...` will depend on active aliases. `command ls > ...` would be safer.
- Files whose names don't change will raise errors when executing `mv same same`.
- If there are duplicate lines after editing, one of the original files may silently disappear.
- The script may behave unexpectedly in certain corner cases:
  - the user deleted a line from the file
  - the user inserted a line in the file
  - the list of files in the directory changed while editing the file
- The user may not have a way to abort the operation. With default shell settings, even if `nvim` exits with failure, the script goes ahead with the renames, which is probably not what a user would want.
- Even with the double-quoting fixed, the script will not work for files whose names contain newlines. I think that's acceptable and not worth the pain to make it work, but it would be good to document (in a comment).
To mitigate these issues, I suggest the following (a short sketch follows the list):
- Create an array from `*`, let's call it `oldnames`
- Save `oldnames` to the work file: `printf '%s\n' "${oldnames[@]}" > "$work"`
- Let the user edit the work file
- Check the exit code, and abort on failure (the user can cause failure by exiting `nvim` with `:cq`)
- Load the content of the work file into another array: `mapfile -t newnames < "$work"`
- Add a sanity check to verify that the number of files matches before and after.
- Run `mv` only for the files whose names changed, and use `-i` to avoid overwriting existing files, and `-v` to show what was actually renamed.
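For instance, the abort and sanity-check steps might look roughly like this inside the function, using the `oldnames`, `newnames`, and `$work` names from the list above:

nvim "$work" || return 1     # user aborted with :cq; rename nothing
mapfile -t newnames < "$work"
if (( ${#oldnames[@]} != ${#newnames[@]} )); then
    echo "number of lines changed, aborting" >&2
    return 1
fi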
Use `mktemp` to create temporary files
The script is not safe to use concurrently: two instances running at the same time would clobber the same ~/.temp/newfilenames file. It's easy enough to create a unique temporary file using `mktemp`.
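For example, reusing the `$work` name from the list above:

work=$(mktemp) || return 1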
Use `trap` to clean up temporary files
To make sure that temporary files get cleaned up when the script exits, use `trap`, for example:
trap 'rm -f "$tmpfile"' EXIT
Declare this trap right before creating `tmpfile`.
Declare all local variables as local
It's good you declared `local files`. There is `filenum` too.
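For example, near the top of the function:

local files filenum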
Don't use `seq`
The `seq` utility is not installed by default on all systems, and Bash has a native way to write counting loops:
for ((i = 0; i < size; i++)); do ...; done
If you use Bash arrays, reap all the benefits
Instead of `$(ls | wc -l)` to find the count of files, you already have that in the `files` array: `${#files[@]}`.
Improve performance
Calling `sed` in a loop to get the n-th line of a file is inefficient. It would be better to read the lines into an array, and then use a counting loop with the two arrays, for example:
for ((i = 0; i < ${#oldnames[@]}; i++)); do
    old=${oldnames[i]}
    new=${newnames[i]}
    if [[ "$old" != "$new" ]]; then
        mv -vi "$old" "$new"
    fi
done
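Putting it all together, a rewrite along these lines might look roughly like the sketch below. It is written as a standalone script rather than a shell function, so the EXIT trap only lives for one invocation; treat it as an untested illustration of the points above, not a drop-in replacement.

#!/usr/bin/env bash
# bulkrename: edit the names of the files in the current directory in an editor.
# Note: filenames containing newlines are not supported.

bulkrename() {
    local work oldnames newnames i old new

    oldnames=(*)

    # unique work file, removed when the script exits
    work=$(mktemp) || return 1
    trap 'rm -f "$work"' EXIT

    printf '%s\n' "${oldnames[@]}" > "$work"

    # abort if the editor exits with failure (:cq in nvim)
    nvim "$work" || return 1

    mapfile -t newnames < "$work"

    # sanity check: adding or deleting lines is not supported
    if (( ${#oldnames[@]} != ${#newnames[@]} )); then
        echo "bulkrename: number of lines changed, aborting" >&2
        return 1
    fi

    for ((i = 0; i < ${#oldnames[@]}; i++)); do
        old=${oldnames[i]}
        new=${newnames[i]}
        if [[ "$old" != "$new" ]]; then
            mv -vi -- "$old" "$new"    # -- guards against names that begin with a dash
        fi
    done
}

bulkrename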
- I implemented all of your suggestions. Learned a lot from them. Thank you. New question. – user85459, Jul 27, 2019 at 19:42