
I'm grepping through a large pile of code managed by git, and whenever I do a grep, I see piles and piles of messages of the form:

> grep pattern * -R -n
whatever/.git/svn: No such file or directory

Is there any way I can make those lines go away?

  • These days I'd recommend using ag, ack, or cgrep instead - they're much faster/better than grep for searching code repositories.
    – lunixbochs
    Commented Aug 3, 2014 at 17:06
  • If you're grepping through code and looking to avoid particular directories, perhaps you should look at ack. It's a source-code aware grep, and as such will actively ignore such VCS directories (as well as vi and emacs backups, non-source files etc.). Commented Oct 9, 2015 at 9:07
  • How can a user get No such file or directory messages for files and/or directories that exist? Or, conversely, how can grep * be getting names of files that don't exist? Is this a race condition, where some other process manipulates the directory tree (creating, renaming and deleting files) while the grep is running? Commented Jun 2, 2017 at 0:25
  • @Scott-СлаваУкраїні probably a race condition or maybe dangling symlinks. For large directories it's certainly possible that some tooling is concurrently mutating the directory - maybe the user runs a build or some other long-running, FS-churning command, and greps as part of other work while that happens in the background.
    – mtraceur
    Commented Oct 8, 2023 at 21:37

12 Answers


You can use the -s or --no-messages flag to suppress errors.

-s, --no-messages suppress error messages

grep pattern * -s -R -n
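A minimal, self-contained sketch of the effect, using a hypothetical temporary directory where a dangling symlink (a common source of these messages) triggers the error:

```shell
# Set up a directory with one real file and one dangling symlink.
mkdir -p /tmp/grep-s-demo
echo 'needle' > /tmp/grep-s-demo/real.txt
ln -sf /nonexistent /tmp/grep-s-demo/dangling

# Without -s, GNU grep -R follows the dangling link and complains:
#   grep: /tmp/grep-s-demo/dangling: No such file or directory
grep -R -n 'needle' /tmp/grep-s-demo || true   # exit status 2 due to the error

# With -s, the message is suppressed but matches still print.
# (The exit status may still be non-zero; -s only silences the message.)
grep -R -n -s 'needle' /tmp/grep-s-demo || true
```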
  • @Alex @Dogbert This does answer the question, but '-s' can mask problems, e.g. when you use xargs with grep. Try creating 2 files in a dir, 'aaa.txt' and 'a b.txt', both containing the string 'some text'. The command /bin/ls -1 | xargs grep 'some text' will give you "no such file or directory" because it breaks up 'a b.txt' into 2 args. If you suppress, you won't notice you missed a file.
    – Kelvin
    Commented Jun 21, 2011 at 21:26
  • @Kelvin So if, e.g., I use find with -print0 and xargs -0, does that solve the issue? Thanks
    – Luka
    Commented Mar 12, 2018 at 1:00
  • @Luka That should solve the issue. You won't run into problems if you always use those NUL options, but if you don't, it's almost guaranteed (IMHO) that you'll forget at the most inopportune time.
    – Kelvin
    Commented Mar 12, 2018 at 20:20
  • this works on Mac OS X where other options (--quiet) do not
    – philshem
    Commented Feb 28, 2019 at 17:10
  • @Luka if you use find you can omit xargs and use -exec from find (if you need shell facilities you can wrap your command with your favorite shell)
    – Et7f3XIV
    Commented Dec 27, 2020 at 3:11
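Kelvin's caveat above is easy to reproduce; a sketch of both the failure mode and the NUL-delimited fix, using hypothetical temp paths:

```shell
mkdir -p /tmp/xargs-demo && cd /tmp/xargs-demo
echo 'some text' > 'aaa.txt'
echo 'some text' > 'a b.txt'

# Broken: xargs splits 'a b.txt' into two arguments, so grep reports
# "No such file or directory" for each half and silently misses the file.
ls -1 | xargs grep 'some text' || true

# Fixed: NUL-delimited names survive spaces (and even embedded newlines).
find . -type f -print0 | xargs -0 grep 'some text'
```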

If you are grepping through a git repository, I'd recommend you use git grep. You don't need to pass in -R or the path.

git grep pattern

That will show all matches from your current directory down.

  • +1 for the useful git-specific command. Won't work for svn though :-)
    – cadrian
    Commented Jun 21, 2011 at 13:58
  • +1 This is the git command I've been missing - this lets me grep for a string from the state of the tree in any commit (by adding the commit after "pattern").
    – Kelvin
    Commented Jun 21, 2011 at 21:38
  • With the fugitive plugin, Ggrep also searches starting from the top of the Git directory instead of the current directory. Commented Mar 10, 2016 at 22:48
  • This appears to be significantly faster than standard grep. (Perhaps it ignores binary files, etc? No idea, but useful.)
    – Daniel
    Commented Mar 24, 2020 at 14:12
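As Kelvin's comment notes, git grep can also search the tree as of an arbitrary commit. A self-contained sketch using a hypothetical throwaway repo under /tmp:

```shell
# Build a throwaway repo with a committed version and a modified working tree.
mkdir -p /tmp/gitgrep-demo && cd /tmp/gitgrep-demo
git init -q
echo 'needle v1' > file.txt
git add file.txt
git -c user.name=demo -c user.email=demo@example.com commit -qm 'first'

echo 'needle v2' > file.txt   # modify the working tree

git grep -n 'needle'          # searches tracked files in the working tree: v2
git grep -n 'needle' HEAD     # searches the committed tree: v1
```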

Errors like that are usually sent to the "standard error" stream, which you can redirect to a file or simply discard for most commands:

grep pattern * -R -n 2>/dev/null
  • Answers the question, but can mask problems. See my comment under Dogbert's answer.
    – Kelvin
    Commented Jun 21, 2011 at 21:41

I have seen this happen several times with broken links (symlinks that point to files that do not exist): grep tries to search the target file, which does not exist (hence the correct and accurate error message).

I normally don't bother when doing sysadmin tasks at the console, but in scripts I look for text files with find and then grep each one:

find /etc -type f -exec grep -nHi -e "widehat" {} \;

Instead of:

grep -nRHi -e "widehat" /etc

I usually don't let grep do the recursion itself. There are usually a few directories you want to skip (.git, .svn...)

You can build clever aliases around invocations like this one:

find . \( -name .svn -o -name .git \) -prune -o -type f -exec grep -Hn pattern {} \;

It may seem overkill at first glance, but when you need to filter out some patterns it is quite handy.

  • +1. This is far better than suppressing errors. However, I think you forgot the -exec before your grep.
    – Kelvin
    Commented Jun 21, 2011 at 21:29
  • What does this mean? \( -name .svn -o -name .git \)
    – sbhatla
    Commented Jul 21, 2016 at 15:10
  • Why not use grep's --exclude or --exclude-dir flags? Commented Sep 25, 2018 at 19:09
  • @sbhatla the parentheses create an order of operations for the find command. stackoverflow.com/questions/24338777/… Commented Sep 25, 2018 at 19:09
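As the last comment suggests, recent GNU grep can do the pruning itself with --exclude-dir, avoiding find entirely. A sketch with hypothetical paths:

```shell
# Layout: a match inside .git that we want skipped, and one we want found.
mkdir -p /tmp/xdir-demo/.git /tmp/xdir-demo/src
echo 'needle' > /tmp/xdir-demo/.git/config
echo 'needle' > /tmp/xdir-demo/src/main.c

# Skip VCS directories during the recursive walk (GNU grep).
grep -R -n --exclude-dir=.git --exclude-dir=.svn 'needle' /tmp/xdir-demo
# expected: only /tmp/xdir-demo/src/main.c:1:needle
```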

Have you tried the -0 option in xargs? It expects NUL-delimited input, so pair it with something that produces NUL-separated names:

find . -type f -print0 | xargs -0 grep 'some text'
  • for find you should add -print0: find -print0 | xargs -0 grep 'text'
    – aliva
    Commented Dec 25, 2012 at 18:14

Use -I in grep.

Example: grep SEARCH_ME -Irs ~/logs.

  • -I skips binary files - it's equivalent to --binary-files=without-match. It doesn't suppress "No such file or directory" messages though.
    – mwfearnley
    Commented Aug 11, 2016 at 8:07
  • If the errors are caused by files/dirs with spaces, the -I is a good solution. For example find . -type f -name "*.txt" | xargs -I{} grep "search_str" "{}"
    – pards
    Commented Jun 7, 2022 at 13:11

I redirect stderr to stdout and then use grep's invert-match (-v) to exclude the warning/error string that I want to hide:

grep -r <pattern> * 2>&1 | grep -v "No such file or directory"

I was getting lots of these errors running "M-x rgrep" from Emacs on Windows with /Git/usr/bin in my PATH. Apparently in that case, M-x rgrep uses "NUL" (the Windows null device) rather than "/dev/null". I fixed the issue by adding this to .emacs:

;; Prevent issues with the Windows null device (NUL)
;; when using cygwin find with rgrep.
(defadvice grep-compute-defaults (around grep-compute-defaults-advice-null-device)
  "Use cygwin's /dev/null as the null-device."
  (let ((null-device "/dev/null"))
    ad-do-it))
(ad-activate 'grep-compute-defaults)

One easy way to make grep return zero status all the time is to use || true:

 → echo "Hello" | grep "This won't be found" || true

 → echo $?
   0

As you can see, the exit status here is 0 (success).


Many answers here, but none of them addressed my case:

 grep "nick-banner" *.html -R -n

failed with:

grep: *.html: No such file or directory

The cause: there were no .html files in the current directory, so the shell passed the glob *.html to grep unexpanded, and grep looked for a file literally named *.html.
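With GNU grep you can let grep itself do the filename filtering during the recursive walk, so the shell glob never comes into play. A self-contained sketch with hypothetical paths:

```shell
# Match only files named *.html anywhere under the tree.
mkdir -p /tmp/incl-demo/sub
echo 'nick-banner' > /tmp/incl-demo/sub/page.html
echo 'nick-banner' > /tmp/incl-demo/sub/notes.txt

grep -R -n --include='*.html' 'nick-banner' /tmp/incl-demo
# expected: only .../sub/page.html matches; notes.txt is skipped
```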

Problem:

This drove me bananas. I tried everything under the (Google) Sun and nothing worked: this grep just spewed repeated "sysctl: reading key ..." errors before finally printing the match:

sudo sysctl -a | grep vm.min_free_kbytes

Solution:

Nothing worked UNTIL I had an epiphany: what if I filtered at the front rather than at the back? Yup, that worked:

sysctl -a --ignore 2>/dev/null | grep vm.min_free_kbytes

Conclusion:

Obviously not every command has an --ignore switch, but this is an example of getting around the problem by filtering BEFORE the grep. Don't get so blinkered that you chase your tail pursuing something that won't work ;-)
