
I have been working on a bash script for a few days and got stuck on filenames containing single and double quotes. Say I want to iterate over the following files in a directory:

'file1.txt'
'file2.txt'
"file3.txt"
file"3.txt
file'4.txt
very 'bad file.txt
also "very bad.txt

I know that this is bad file naming – but, you know, not all people are programmers who know how to properly name files. So I am trying to cope with these edge cases.

I would like to loop through the files and use them in some if statements, for example:

dir="my/dir/"
files="$(ls -N "$dir"*)"

shopt -s extglob

for f in $files; do
  if [[ "$f" = *.@(png|jpg|jpeg|JPG|JPEG|webp) ]]; then
    cp "$f" "$dir""somefilename.pdf"
  fi
done

Now the problem arises that bash interprets the double and single quotes in the filenames and splits up the file list – errors follow…

So my question is: How to cope with these cases in bash?

  • Sorry forgot that – updated the code…
    – benjamin10
    Commented Dec 19, 2023 at 23:15
  • Good update. In the future, you should include code so readers can easily recreate your test data, i.e. mkdir ./myTestDir ; cd ./myTestDir ; touch \'file1.txt\' ; touch \"file3.txt\" etc.
    – shellter
    Commented Dec 20, 2023 at 0:29
  • You could just use find instead and do -exec things to the files, and/or have the filenames separated by null characters for xargs to process
    – Xen2050
    Commented Dec 20, 2023 at 8:49
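
The find-based approach from the comment above can be sketched as follows; the temporary directory and filenames here are illustrative. find hands each pathname to cp as a single argument, so quotes and spaces in names are never split:

```shell
#!/usr/bin/env bash
# Create a throwaway directory with two awkwardly named files.
dir=$(mktemp -d)
mkdir "$dir/backup"
touch "$dir/file'4.txt" "$dir/also \"very bad.txt"

# -exec passes each matched path to cp intact, one file per invocation.
# Equivalently: find "$dir" -maxdepth 1 -type f -print0 | xargs -0 -I{} cp -- {} "$dir/backup/"
find "$dir" -maxdepth 1 -type f -exec cp -- {} "$dir/backup/" \;
```

Both files end up in the backup directory, quotes and all.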

1 Answer


Don't parse the output of ls. That way lies madness and all sorts of strife. If you are trying to do things to or with all files in a given directory, let the shell take care of the filenames for you:

workdir="/path/to/directory"
for file in "${workdir}"/*; do
    case "$file" in
        *.png|*.jpg|*.jpeg|*.JPG|*.JPEG|*.webp)
            cp -- "$file" "${workdir}/somefilename.pdf"
            ;;
    esac
done

Note that I have taken the predicate directly from your question, which has the dubious logic of repeatedly overwriting somefilename.pdf with each matching file – probably not what you want. Addressing that is beyond the scope of the question of iterating over files, the short(er) answer to which is simply for file in path/to/place/*; do stuff_to "$file"; done.
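
An alternative that also sidesteps word splitting is to glob straight into an array; a minimal sketch (the temporary directory and extensions are illustrative):

```shell
#!/usr/bin/env bash
# Each glob match becomes exactly one array element, so quotes and
# spaces inside filenames survive intact.
workdir=$(mktemp -d)
touch "$workdir/very 'bad file.jpg" "$workdir/also \"very bad.webp" "$workdir/notes.txt"

shopt -s nullglob                       # a non-matching pattern expands to nothing
files=( "$workdir"/*.jpg "$workdir"/*.webp )

count=0
for f in "${files[@]}"; do              # quoted expansion: one word per file
    count=$((count + 1))
done
echo "$count"                           # prints 2
```

The quoted "${files[@]}" expansion is what keeps each pathname as a single word, no matter what characters it contains.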

  • In the example solution given, the list of files in the directory is captured at run time when the for loop starts; any files added while the script is running beyond that point will not be taken into account.
    – DopeGhoti
    Commented Dec 20, 2023 at 0:06
  • @benjamin10 that's how for f in * works too, but if you wanted to reassure yourself, just build an array using globs: files=( my/dir/* ) and then use that: for f in "${files[@]}"
    – muru
    Commented Dec 20, 2023 at 0:09
  • @benjamin10 no, used correctly like in my comment, arrays made using globs handle any valid paths correctly.
    – muru
    Commented Dec 20, 2023 at 0:34
  • FYI, using the shell's * to list files omits those beginning with a dot . (aka the "hidden" files), unless the shell option dotglob is enabled.
    – Xen2050
    Commented Dec 20, 2023 at 8:39
  • Not quite, but close to tears I was 😂 I tried in despair to format the output of ls so that it handled the single and double quotes... Thanks anyway!
    – benjamin10
    Commented Dec 21, 2023 at 9:05
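
The dotglob behaviour mentioned above is easy to verify; a quick sketch in a throwaway directory:

```shell
#!/usr/bin/env bash
dir=$(mktemp -d)
touch "$dir/.hidden" "$dir/visible"

plain=( "$dir"/* )        # dotfiles excluded by default
shopt -s dotglob
all=( "$dir"/* )          # .hidden now matches too (but never . or ..)

echo "${#plain[@]} ${#all[@]}"    # prints "1 2"
```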
