How do I escape the output of grep so that bash can read the file names correctly?

I have a text file that is the output of a find command. For each line, I want to make a symlink. For now, I'm just testing my loop with ls. However, I am not quoting the output of grep properly, so file names containing special characters get split apart and the filesystem commands on each file fail.

$ tree dir/
dir/
├── Another & file ' name.txt
└── file name.txt

0 directories, 2 files

$ cat files.txt
dir
dir/Another & file ' name.txt
dir/file name.txt

$ grep file files.txt | awk -v q='"' '{printf(q"%s"q"\n", $0);}'
"dir/Another & file ' name.txt"
"dir/file name.txt"

$ while read p ; do 
    echo $p; ls $(grep file files.txt | awk -v q='"' '{printf(q"%s"q"\n", $0);}') ; 
done < files.txt
dir
ls: cannot access '"dir/Another': No such file or directory
ls: cannot access '&': No such file or directory
ls: cannot access 'file': No such file or directory
...
dir/Another & file ' name.txt
ls: cannot access '"dir/Another': No such file or directory
ls: cannot access '&': No such file or directory
ls: cannot access 'file': No such file or directory
...

I've tried both single quotes and double quotes. How can I escape this so that I can execute commands on the paths output by grep?

  • Try using a for loop instead of parsing the output of ls.
    – clk
    Commented Jul 28, 2016 at 4:37
  • What is that loop supposed to do? It looks like you're taking each line in files.txt one by one, and then grepping the lines out of the same file again? Do you want to find duplicates or partial matches from the file, ...? If you just want to do something with every file (line) matching the grep, would something like this work: grep file files.txt | while read p ; do echo "<$p>" ; ln -s "$p" make-links-here/ ; done
    – ilkkachu
    Commented Jul 28, 2016 at 8:48
  • @ilkkachu in this contrived example, the loop really does nothing. In my real-life project, the loop finds one match out of thousands of lines in the file of filesystem entries, and makes a symlink to it.
    – user394
    Commented Jul 28, 2016 at 12:11

1 Answer

In

ls $(grep file files.txt)

You're using the split+glob operator incorrectly and that's where your problem lies. You don't want to insert quotes in the output of grep as that output is not interpreted as shell code (thankfully!), but you need to tune your split+glob operator.

Here

  1. You don't want the glob part (where a word like *, for instance, expands to the list of matching file names in the current directory), so you need to disable it with:

     set -o noglob
    
  2. You want to split the output of grep on newline characters only (that still means your file names can't contain newline characters, but that's a limitation of the files.txt format itself) with:

     IFS='
     '
    

or

    IFS=$'\n'

in some shells.

Then you can invoke your split+glob operator (leaving $(...) unquoted), this time safely:

ls -d -- $(grep file files.txt)
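
Putting those two steps together (a minimal sketch; restoring the defaults afterwards is only needed if the rest of your script relies on them):

set -o noglob   # 1. disable the glob part
IFS='
'               # 2. split on newline characters only
ls -d -- $(grep file files.txt)
set +o noglob   # restore globbing
unset IFS       # restore default word splitting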

That split+glob operator is a misfeature inherited from the Bourne shell. With modern shells, there are other ways to split some text into the list of its lines.

With zsh:

ls -d -- ${(f)"$(grep file files.txt)"}

The f parameter expansion flag splits on newline (linefeed); it's short for ps[\n] (s[string] splits on an arbitrary string, and p makes escape sequences like \n understood in that string). We quote $(...) to disable the split+glob operator, though in the case of zsh, there's only the split part (no glob).
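
Applied to your real use case of making symlinks (a sketch; make-links-here/ is a hypothetical destination directory borrowed from the comment above):

for p (${(f)"$(grep file files.txt)"}) ln -s -- $p make-links-here/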

With bash:

readarray -t files < <(grep file files.txt)
ls -d -- "${files[@]}"
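
And the same symlink use case in bash (again with the hypothetical make-links-here/ destination):

readarray -t files < <(grep file files.txt)
for p in "${files[@]}"; do
  ln -s -- "$p" make-links-here/   # quoting "$p" preserves spaces, & and '
done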

Where you could want to insert quotes is when using xargs, as xargs's input format is a blank-separated list where the separators can be quoted with single quotes, double quotes or backslashes (though the quoting rules differ from the shell's).

So you could do:

 sed '
   s/"/"\\""/g;   # escape the " characters themselves
   s/^/"/;s/$/"/; # insert one at the beginning and one at the end
 ' < files.txt | xargs ls -d --
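
Or, if your xargs is GNU xargs (an assumption; -d is a GNU extension), you can skip the quoting entirely and treat each input line as one argument:

grep file files.txt | xargs -d '\n' ls -d --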