
I am trying to use find -exec with multiple commands without any success. Does anybody know if commands such as the following are possible?

find *.txt -exec echo "$(tail -1 '{}'),$(ls '{}')" \;

Basically, I am trying to print the last line of each txt file in the current directory, with a comma and the filename appended at the end of the line.


14 Answers


find accepts multiple -exec portions to the command. For example:

find . -name "*.txt" -exec echo {} \; -exec grep banana {} \;

Note that in this case the second command will only run if the first one returns successfully, as mentioned by @Caleb. If you want both commands to run regardless of their success or failure, you could use this construct:

find . -name "*.txt" \( -exec echo {} \; -o -exec true \; \) -exec grep banana {} \;
  • how to grep twice? this is failing: find ./* -exec grep -v 'COLD,' {} \; -exec egrep -i "my_string" {} \;
    – rajeev
    Commented Jan 22, 2013 at 16:08
  • @rajeev The second exec will only run if the return code for the first returns success; otherwise it will be skipped. This should probably be noted in this answer.
    – Caleb
    Commented Mar 20, 2014 at 14:54
  • Note the use of -n in some of the other answers to suppress the newline generated by echo, which is handy if your second command produces only one line of output and you want the two to be easier to read. Commented Apr 19, 2018 at 12:07
  • Pipe the results of the first -exec into grep? find . -iname "*.srt" -exec xattr -l {} | grep "@type" \; > text.txt
    – John
    Commented Sep 29, 2020 at 17:52
  • Here is a way to run the second command (grep banana) only if the first (echo) failed: find . -iname '*.zip' \( -exec unzip {} \; -o -exec 7z x {} \; \). My use case is that I want to try to unzip *.zip files with unzip, then with 7z if unzip fails (or cannot be found).
    – CDuv
    Commented Nov 28, 2022 at 15:03
Another option is to run several commands from a single shell invocation:

find . -type d -exec sh -c "echo -n {}; echo -n ' x '; echo {}" \;
  • If you want to run Bash instead of Bourne you can also use ... -exec bash -c ... instead of ... -exec sh -c .... Commented Oct 15, 2016 at 21:00
  • Never embed {} in shell code. See unix.stackexchange.com/questions/156008/… Commented Sep 3, 2017 at 7:03
  • +1 @Kusalananda Injecting filenames is fragile and insecure. Use parameters. See SC2156.
    – pambda
    Commented Sep 30, 2017 at 9:04
  • find . -type d -exec sh -c 'echo -n $0; echo -n " x "; echo $0' "{}" \; solves the issues raised in the above comments.
    – Jamie Pate
    Commented Oct 28, 2022 at 23:46
  • I hate it when I have to spawn a subshell to do this, because it makes variable substitution too complicated; you need to escape the variable twice, because simply doing sh -c "command ${variable}" will likely break. Commented Mar 3, 2023 at 18:10

One of the following:

find *.txt -exec awk 'END {print $0 "," FILENAME}' {} \;

find *.txt -exec sh -c 'echo "$(tail -n 1 "$1"),$1"' _ {} \;

find *.txt -exec sh -c 'echo "$(sed -n "\$p" "$1"),$1"' _ {} \;
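The sh -c pattern also batches well with + instead of \;, passing many files per shell invocation. A sketch with throwaway file names and contents:

```shell
# Sketch: batch form of the sh -c pattern. "$0" is set to _, and the
# matched files arrive as "$1", "$2", ... which the bare for loop iterates.
tmpdir=$(mktemp -d)
printf 'first\nlast-a\n' > "$tmpdir/a.txt"
printf 'only-b\n'        > "$tmpdir/b.txt"

out=$(find "$tmpdir" -name '*.txt' \
    -exec sh -c 'for f; do echo "$(tail -n 1 "$f"),$f"; done' _ {} + | sort)

rm -rf "$tmpdir"
```

This prints one "lastline,filename" pair per matched file, with far fewer shell startups than the \; form when many files match.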
  • What is the underscore before {} for?
    – qed
    Commented Aug 1, 2013 at 10:05
  • @qed: It is a throw-away value that holds the place of $0. Try this with "foobar" instead of "_": find /usr/bin -name find -exec sh -c 'echo "[$0] [$1]"' foobar {} \; - the output: "[foobar] [/usr/bin/find]". Commented Aug 15, 2013 at 1:20
  • @XuWang: Yes, I would say that's the case. As you know, $0 is usually the program name (ARGV[0]). Commented Aug 15, 2013 at 3:31
  • It is critical, for this method, that the script passed to sh -c is in single quotes, not double. Otherwise $1 is in the wrong scope.
    – Nick
    Commented Mar 27, 2015 at 15:02
  • @Nick Quotes have nothing to do with it; you can write $1 inside double quotes as long as you escape the dollar sign ("\$1"). You can escape other characters as well ("\""). Commented Jun 5, 2016 at 5:46

Another way is like this:

multiple_cmd() { 
    tail -n1 "$1" 
    ls "$1"
}
export -f multiple_cmd
find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;

In one line:

multiple_cmd() { tail -1 "$1"; ls "$1"; }; export -f multiple_cmd; find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;
  • "multiple_cmd()" - is a function
  • "export -f multiple_cmd" - will export it so any other subshell can see it
  • "find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;" - find that will execute the function on your example

This way, multiple_cmd can be as long and as complex as you need.
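The export -f mechanism can be checked in isolation; this is a bash-specific sketch (stamp is a throwaway function name, and the filename is passed via $0 as in the answer above):

```shell
# Sketch: functions exported with export -f are visible to child bash
# shells, which is what lets find's bash -c child call them.
stamp() { echo "file: $1"; }
export -f stamp

# The child bash receives demo.txt as $0 and hands it to the function:
out=$(bash -c 'stamp "$0"' demo.txt)
```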

  • Perfect, just what I needed!
    – Anentropic
    Commented Feb 1, 2019 at 17:14
  • @Thomas It does, but try this one-liner, tested on OS X. I made a directory called 'aaa' with some files/dirs in there and cd'd into it. Then: ~/aaa$ acmd() { echo x \"$1\" x; }; export -f acmd; find . -exec bash -c 'acmd {}' \;
    – barlop
    Commented May 11, 2020 at 21:54
  • It stays stuck on ">"
    – Sandburg
    Commented Jul 7, 2020 at 7:48
  • @Sandburg The one-liner version had a typo which I fixed now (and both versions had quoting problems; see When to wrap quotes around a shell variable?). Going forward, probably avoid one-liner versions for maintainability and legibility anyway.
    – tripleee
    Commented May 19 at 12:20

There's an easier way:

find ... | while read -r file; do
    echo "look at my $file, my $file is amazing";
done

Alternatively:

while read -r file; do
    echo "look at my $file, my $file is amazing";
done <<< "$(find ...)"
  • Filenames can have newlines in them; this is why find has the -print0 argument and xargs has the -0 argument. Commented Nov 30, 2016 at 21:37
  • @abasterfield I always hope never to find those in the wild lol Commented Dec 6, 2016 at 22:56
  • What I wanted to do was "find ... -exec zcat {} | wc -l \;", which didn't work. However, find ... | while read -r file; do echo "$file: zcat $file | wc -l"; done does work, so thank you! Commented Aug 10, 2017 at 12:37
  • In the comment above I have backticks around "zcat $file | wc -l". Unfortunately SO turns those into formatting, so I've added it as an actual answer with the correct code visible. Commented Aug 10, 2017 at 12:41
  • @GregDougherty You can escape the backticks with backslashes, like so: \` (still, that's another good reason to use $() instead). Commented Aug 14, 2017 at 5:48

Extending @Tinker's answer,

In my case, I needed to make a command | command | command inside the -exec to print both the filename and the found text in files containing a certain text.

I was able to do it with:

find . -name config -type f \( -exec  grep "bitbucket" {} \; -a -exec echo {} \;  \) 

the result is:

    url = git@bitbucket.org:a/a.git
./a/.git/config
    url = git@bitbucket.org:b/b.git
./b/.git/config
    url = git@bitbucket.org:c/c.git
./c/.git/config
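As one of the comments notes, grep -H can fold the filename into the match in a single -exec. A sketch with throwaway files and contents:

```shell
# Sketch: grep -H prefixes each match with its filename, so one -exec
# can replace the grep+echo pair (the output format differs slightly).
tmpdir=$(mktemp -d)
mkdir -p "$tmpdir/a"
printf '\turl = git@bitbucket.org:a/a.git\n' > "$tmpdir/a/config"

out=$(find "$tmpdir" -name config -type f -exec grep -H "bitbucket" {} \;)

rm -rf "$tmpdir"
```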
  • You can also print the filename and the grepped content on a single line by passing /dev/null as a second argument to the grep command in one -exec parameter: find . -name config -type f -exec grep "bitbucket" {} /dev/null \;
    – Bill Feth
    Commented Mar 24, 2020 at 14:56
  • In this case, you could do: find . -name config -type f -exec grep -nl "bitbucket" {} \; and it will only print the names of the files that match. Commented Sep 2, 2021 at 10:02
  • FYI: grep -H does exactly that: prints the file name along with the matching line.
    – Emsi
    Commented Mar 9, 2023 at 12:02
  • The -a is really not doing anything useful here; the default behavior of find is already to abandon the current file if -exec fails. (The -a option is useful in more complex command lines where you have conditions in parentheses etc.)
    – tripleee
    Commented May 19 at 12:19

I don't know if you can do this with find, but an alternate solution would be to create a shell script and to run this with find.

lastline.sh:

echo "$(tail -1 "$1"),$1"

Make the script executable

chmod +x lastline.sh

Use find:

find . -name "*.txt" -exec ./lastline.sh {} \;
  • Backticks are deprecated; please encourage the use of $(...), which is more readable, font-independent, and easy to nest. Thank you. Commented Mar 12, 2011 at 18:41

Thanks to Camilo Martin, I was able to answer a related question:

What I wanted to do was

find ... -exec zcat {} | wc -l \;

which didn't work. However,

find ... | while IFS= read -r file; do echo "$file: $(zcat "$file" | wc -l)"; done

does work, so thank you!

  • Even after fixing trivial quoting errors, this is still prone to various issues around file names with newlines etc etc.
    – tripleee
    Commented May 19 at 12:44
  • Anyway, find ... -exec sh -c 'echo "$1: $(zcat "$1")"' _ {} \; does what you ask, robustly (or, also more efficiently, find ... -exec sh -c 'for f; do echo "$f: $(zcat "$f" | wc -l)"; done' _ {} +)
    – tripleee
    Commented May 19 at 13:07

Denis's top answer resolves the problem, but strictly speaking it is not a find with several commands in only one -exec, as the title suggests. To run several commands inside a single -exec, we have to look for something else. Here is an example:

Keep the last 10000 lines of .log files which have been modified in the last 7 days, using one -exec command with several references to the matched file

  1. See what the command will do on which files:

    find / -name "*.log" \
        -type f -mtime -7 \
        -exec sh -c '
            echo tail -10000 "$1" \> fictmp; echo cat fictmp \> "$1"' _ {} \;
    
  2. Do it (note: no more \>, but only > this time):

    find / -name "*.log" \
        -type f -mtime -7 \
        -exec sh -c '
            tail -10000 "$1" > fictmp; cat fictmp > "$1"; rm fictmp' _ {} \;
    
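If two such runs could overlap, a per-file temporary from mktemp is safer than a fixed fictmp name. A sketch of the same idea, not part of the original answer (the log content is a throwaway example):

```shell
# Sketch: truncate a file to its last 3 lines via a per-file temp file.
log=$(mktemp)
printf '%s\n' one two three four five > "$log"

find "$log" -type f -exec sh -c '
    tmp=$(mktemp) &&
    tail -n 3 "$1" > "$tmp" &&
    cat "$tmp" > "$1" &&
    rm -f "$tmp"' _ {} \;

lines=$(wc -l < "$log")
first=$(head -n 1 "$log")
rm -f "$log"
```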
  • But this will break if one of the filenames has a space, I believe. Commented Jun 5, 2016 at 4:17
  • I updated this to avoid the obvious issues.
    – tripleee
    Commented May 19 at 12:53

I usually embed the find in a small for-loop one-liner, where find is executed in a command substitution with $().

Your command would look like this then:

for f in $(find *.txt); do echo "$(tail -1 $f), $(ls $f)"; done

The good thing is that instead of {} you just use $f and instead of the -exec … you write all your commands between do and ; done.

Not sure what you actually want to do, but maybe something like this?

for f in $(find *.txt); do echo $f; tail -1 $f; ls -l $f; echo; done
  • It's worth noting that according to ShellCheck this is not the best solution. SC2044: For loops over find output are fragile. Use find -exec or a while read loop. There is a great example and description on the ShellCheck wiki: github.com/koalaman/shellcheck/wiki/Sc2044 Commented May 18, 2021 at 15:08
  • This is also exactly what BashPitfalls #1 advises against. Commented Nov 30, 2021 at 16:54
  • Unfortunately, this needs more downvotes. Don't recommend command substitutions for this without a stern warning and/or an explanation of how it could go wrong.
    – tripleee
    Commented May 19 at 12:17

I found this solution (maybe it has already been mentioned in a comment, but I could not find an answer with it):

You can execute multiple commands in a row using bash -c:

find . <SOMETHING> -exec bash -c "EXECUTE 1 && EXECUTE 2 ; EXECUTE 3" \;

in your case

find . -name "*.txt" -exec bash -c "tail -1 '{}' && ls '{}'" \;

I tested it with a test file:

[gek@tuffoserver tmp]$ ls *.txt
casualfile.txt
[gek@tuffoserver tmp]$ find . -name "*.txt" -exec bash -c "tail -1 '{}' && ls '{}'" \;
testonline1=some TEXT
./casualfile.txt
  • The single quotes are an attempt to fix quoting issues, but they will fail spectacularly if the file name contains literal single quotes. The safe approach is to pass the file name in $1; find . -name "*.txt" -exec bash -c 'tail -1 "$1" && ls "$1"' _ {} \; - see also mywiki.wooledge.org/BashFAQ/020
    – tripleee
    Commented May 19 at 12:14

A find+xargs answer.

The example below finds all .html files and creates a copy with the .BAK extension appended (e.g. 1.html > 1.html.BAK).

Single command with multiple placeholders

find . -iname "*.html" -print0 | xargs -0 -I {} cp -- "{}" "{}.BAK"

Multiple commands with multiple placeholders

find . -iname "*.html" -print0 | xargs -0 -I {} echo "cp -- {} {}.BAK ; echo {} >> /tmp/log.txt" | sh

# if you need to do anything bash-specific then pipe to bash instead of sh

This command will also work with files that start with a hyphen or contain spaces, such as -my file.html, thanks to parameter quoting and the -- after cp, which signals to cp the end of parameters and the beginning of the actual file names.

-print0 pipes the results with null-byte terminators.


For xargs, the -I {} parameter defines {} as the placeholder; you can use whichever placeholder you like. -0 indicates that input items are null-separated.
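As the comments below warn, splicing {} into a shell string is unsafe if filenames contain shell metacharacters. A sketch of the same .BAK idea that keeps the names as arguments instead (file names here are throwaway examples):

```shell
# Sketch: xargs hands each filename to sh as a positional argument,
# so the name never passes through the shell's code string.
tmpdir=$(mktemp -d)
touch "$tmpdir/1.html" "$tmpdir/2.html"

find "$tmpdir" -iname '*.html' -print0 |
    xargs -0 -n 1 sh -c 'cp -- "$1" "$1.BAK" && echo "backed up: $1"' _

baks=$(find "$tmpdir" -name '*.BAK' | wc -l)
rm -rf "$tmpdir"
```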

  • xargs -I{} sh -c '...{}...' has major security problems, and xargs -I{} echo '...{}...' | sh is just as bad. What happens when you get a filename that contains $(/tmp/evil) in its name as literal text? (Yes, every character in that string is valid in a filename.) Or $(rm -rf ~)'$(rm -rf ~)'? Yes, again, single quotes can exist in filenames on UNIX. Commented Nov 30, 2021 at 16:55
  • The safe thing is to keep your names out-of-band from your code. find ... -exec bash -c 'for arg; do something_with "$arg"; done' _ {} + keeps the filenames as arguments, out-of-band from the string interpreted by the shell as code. Commented Nov 30, 2021 at 16:56
You could also use xargs:

find *.txt -type f -exec tail -1 {} \; | xargs -ICONSTANT echo $(pwd),CONSTANT

Another one (working on OS X):

find *.txt -type f -exec echo ,$(PWD) {} + -exec tail -1 {} + | tr ' ' '/'
  • This overlooks a major use case for find: situations where the number of matching files is too large for a command line. -exec is a way to get around this limit. Piping out to a utility misses that benefit. Commented Jan 27, 2017 at 14:32
  • xargs -n exists to choose the number of matches per invocation. xargs -n 1 foocmd will execute foocmd {} for every match.
    – AndrewF
    Commented Apr 3, 2019 at 23:27

Here is my bash script that you can use to find multiple files and then process them all using a command.

Example of usage. This command applies the Linux file command to each found file:

./finder.sh file fb2 txt

Finder script:

#!/bin/bash
# Find files and process them using an external command.
# Usage:
#   ./finder.sh ./processing_script.sh txt fb2 fb2.zip doc docx

counter=0
find_results=()
for ext in "${@:2}"
do
    # @see https://stackoverflow.com/a/54561526/10452175
    readarray -d '' ext_results < <(find . -type f -name "*.${ext}" -print0)

    for file in "${ext_results[@]}"
    do
        counter=$((counter+1))
        find_results+=("${file}")
        echo ${counter}") ${file}"
    done
done
countOfResults=$((counter))
echo -e "Found ${countOfResults} files.\n"


echo "Processing..."
counter=0
for file in "${find_results[@]}"
do
    counter=$((counter+1))
    echo -n ${counter}"/${countOfResults}) "
    eval "$1 '${file}'"
done
echo "All files have been processed."

  • All this so that you can report how many matches you found, and you hadn't thought of using ${#ext_results[@]} to get that number? That's a lot of code for a very pedestrian additional convenience over what find already provides.
    – tripleee
    Commented May 19 at 13:00
