
I've historically performed something like:

find . 2>/dev/null | xargs grep -i something_to_find 2>/dev/null

If my pwd is barfoo (/foo/bar/baz/foofoo/foobar/foobaz/barfoo) it finds matches. However, if I cd to /foo, it no longer finds the matches.


Conditions:

  • permissions are all 775
  • the directories are not symbolic links
  • they are all on the same file system / server

So I'm curious if there is a default -maxdepth that is applied to find, or are there other constraints as to why this would not work?
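For what it's worth, GNU find has no default -maxdepth. This can be checked directly (a diagnostic sketch of mine, not from the question — the scratch directory and nesting are made up): if find silently imposed a depth limit, an unrestricted run and a run with an explicitly huge -maxdepth would produce different counts.

```shell
# Build a deeply nested scratch tree (hypothetical, for demo only).
demo=$(mktemp -d)
mkdir -p "$demo/a/b/c/d/e"

# If find imposed a default depth limit, these two counts would differ.
find "$demo" | wc -l
find "$demo" -maxdepth 100 | wc -l

rm -rf "$demo"
```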


Additional Info:

Some great comments have been posted. Here is some additional info:

  • this is for GNU, not POSIX
  • find --version : GNU find version 4.2.27
  • grep --version : (GNU grep) 2.5.1
  • xargs --version : GNU xargs version 4.2.27
  • removing the redirection of STDERR has no bearing on the result, or lack thereof
  • the path to the files in barfoo (known to work) does not have spaces; however, files in other directories in /foo/bar may have spaces, though I don't see how that would be problematic
  • I realize I wasn't specific on the path, but these are all well-named directories, not to be confused with any devices

Interesting Finding:

The first doesn't work, but the second does:

  1. find . -type f | xargs grep -i something_to_find
  2. find . -type f -name "*.ext" | xargs grep -i something_to_find

Even odder, -name "*.*" does not work; the actual file extension has to be given, which could be problematic when searching broadly.

I'm wondering if there is termination after a maximum error count, or a maximum buffer size. I know there are a lot of files in these directories, but the fact that it works when specifying the file type (limiting results) is interesting.
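One quick way to test the whitespace/quoting theory raised in the comments (a diagnostic sketch of mine, assuming odd characters in names are the problem): ask find itself for names that a whitespace-splitting xargs would mangle.

```shell
# List names containing spaces or quote characters -- exactly the names
# that a plain `find | xargs` pipeline would break apart or choke on.
find . \( -name '* *' -o -name "*'*" -o -name '*"*' \) 2>/dev/null | head
```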

  • I'm inclined to think that by deliberately redirecting stderr to the bit bucket, you may be hiding a useful error message from yourself.
    – rici
    Commented Oct 24, 2012 at 18:54
  • Have you tried to step upward one dir each time (cd .. ; find ...)? At what directory does it happen then?
    – ott--
    Commented Oct 24, 2012 at 19:10
  • I don't think there's a default -maxdepth, but there could be a filesystem limit on the length of pathnames. How long are the real pathnames involved?
    – Barmar
    Commented Oct 24, 2012 at 19:11
  • Your paths don't have spaces in them by any chance?
    – rici
    Commented Oct 24, 2012 at 19:17
  • Remove the 2>/dev/null if you want to have any chance of finding out what's going on. Edit your question to add the error messages.
    Commented Oct 24, 2012 at 23:42

4 Answers


Directories with names that contain spaces, visible from /foo/bar and not from barfoo, are the likely culprits. xargs splits its input on whitespace, and also interprets quotes and backslashes (see the manual for details), so whitespace in file or directory names causes it to pass incomplete file names to grep.

To work around this issue, use find -print0 in conjunction with xargs -0, like this:

find . -print0 2>/dev/null | xargs -0 grep -i something_to_find 2>/dev/null

The -print0 option tells find to separate file names with a NUL (binary zero) character, which cannot appear in a valid file name. The corresponding -0 option tells xargs to use that same character as the separator, and also not to interpret quotes and backslashes.
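A minimal reproduction of the failure and the fix (the scratch directory and file name are hypothetical, just to demonstrate the mechanism):

```shell
# Create one file whose name contains a space, in a scratch directory.
demo=$(mktemp -d)
printf 'something_to_find\n' > "$demo/file with spaces.txt"

# Broken: xargs splits the path on whitespace, so grep is handed three
# nonexistent arguments and finds nothing.
find "$demo" -type f | xargs grep -li something_to_find 2>/dev/null

# Fixed: NUL separators keep the path intact, and the match is found.
find "$demo" -type f -print0 | xargs -0 grep -li something_to_find

rm -rf "$demo"
```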

  • Your explanation makes sense. What I don't completely understand is why it would crash, and not continue on, even if it encountered a file that didn't exist. Regardless, this solved my particular issue +5 (if i could)
    – vol7ron
    Commented Oct 24, 2012 at 22:21
  • Maybe it's not the spaces: xargs could also have encountered an apostrophe in a file name (often present in file names of songs or films), which would make it search for a "matching quote", nowhere to be found. Since you completely discard error output, you wouldn't have seen the "unmatched single quote" error, nor errors from grep. When debugging problems, it's a good idea to at least temporarily remove the 2>/dev/null redirections, so programs can warn you of mistakes.
    – user4815162342
    Commented Oct 24, 2012 at 22:40
  • When I debugged, I removed the error redirection (removing the redirection of STDERR has no bearing on the result, or lack thereof) and I didn't notice anything other than permission denied/does not exist errors (no special characters). I think the does not exist errors were because of the spaces, but it encountered a few of those and still continued on. I'm still stumped on what the true nature was, but am appreciative of your help. I think I tried the -0 on xargs, but didn't do the -print0 on find -- thanks again!
    – vol7ron
    Commented Oct 24, 2012 at 22:44

Given your latest edit, I'd like to point you at my comment above again:

Since you mention "same server": Is there any chance that special files like /proc/kcore or /dev/zero are anywhere in the path? That would certainly stop grep from going any further.....

Since adding an extension produces different results from a run without one, that more or less rules spaces out as the culprit.
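To rule the special-files theory in or out, you can ask find for anything in the tree that isn't a regular file or directory (a diagnostic sketch; which types to check is my choice, not from the original answer):

```shell
# List block/character devices, FIFOs, and sockets under the tree --
# reading any of these can make grep hang or stop.
find . \( -type b -o -type c -o -type p -o -type s \) 2>/dev/null
```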

  • that's a valid comment and I tried to answer it with I realize I wasn't specific on the path, but these are all well-named directories, not to be confused with any devices. It's still possible that a malformed filename with spaces could have split oddly - I'll have to check when I have a little more time; however I doubt it. I typically don't use spaces in Linux. I think in looking at the "errors" the stopping point was at different places, but like I said... I will check because I hate not knowing the cause of problems to prevent them in the future.
    – vol7ron
    Commented Oct 25, 2012 at 3:36

Try

grep -r something_to_find 2>/dev/null

"grep -r ..." will search all files recursively from $PWD.
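If you also want the extension filter from the question, GNU grep can apply it on its own with --include (a sketch of mine; "*.ext" is a placeholder pattern, and this assumes a GNU grep new enough to support the option). Because no xargs is involved, whitespace in file names never gets a chance to cause trouble:

```shell
# Recursive, case-insensitive search limited to *.ext files; grep walks
# the tree itself, so names with spaces are handled correctly.
grep -ri --include='*.ext' something_to_find . 2>/dev/null
```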


Try this:

find . -type f -print0 | tee /tmp/file-list | xargs -0 egrep whatever

Does /tmp/file-list (view it with something that doesn't mind NULs) contain your desired file(s)? If not, it's a find problem. If so, it's an xargs matter.

I've intentionally not 2>'d away the errors. They may be useful.
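To inspect the NUL-delimited /tmp/file-list with ordinary line-oriented tools, translate the NULs to newlines first (a sketch; the sample contents below are made up to simulate what tee would capture):

```shell
# Simulate tee's capture: two NUL-separated paths, one with a space.
printf './a b.txt\0./c.txt\0' > /tmp/file-list

# tr turns the NUL separators into newlines so head/less/grep can read it.
tr '\0' '\n' < /tmp/file-list
```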

