34

Using bash, how can one get the number of files in a folder, excluding directories, from a shell script without the interpreter complaining?

With the help of a friend, I've tried

$files=$(find ../ -maxdepth 1 -type f | sort -n)
$num=$("ls -l" | "grep ^-" | "wc -l")

which returns from the command line:

../1-prefix_blended_fused.jpg: No such file or directory
ls -l :  command not found
grep ^-: command not found
wc -l:   command not found

respectively. These commands work on the command line, but NOT with a bash script.

Given a folder filled with image files named like 1-pano.jpg, I want to grab all the images in the directory to get the largest-numbered file, to tack onto the next image being processed.

Why the discrepancy?

10
  • What's with the quotes around your commands?
    – sarnold
    Commented Jun 21, 2012 at 3:13
  • Also, get rid of the dollar signs when assigning to a variable; files=$(find ...) and num=$( ls -l ... ).
    – chepner
    Commented Jun 21, 2012 at 3:15
  • @sarnold, several scripting tutorials say that having spaces inside execution blocks is a bad thing, and to use double quotes to alleviate the risk.
    – Jason
    Commented Jun 21, 2012 at 3:24
  • @Jason - Most scripting tutorials you find in the "wild" are junk and teach bad habits. mywiki.wooledge.org/BashGuide
    – jordanm
    Commented Jun 21, 2012 at 3:57
  • 2
    @jason: I'd love to see a reference to the guide that proposed foo=$("ls -l") was ever a useful thing to do... :)
    – sarnold
    Commented Jun 21, 2012 at 20:24

11 Answers

43

The quotes are causing the error messages.

To get a count of files in the directory:

shopt -s nullglob
numfiles=(*)
numfiles=${#numfiles[@]}

which creates an array and then replaces it with the count of its elements. This will include files and directories, but not dotfiles or . or .. or other dotted directories.

Use nullglob so an empty directory gives a count of 0 instead of 1.

You can instead use find -type f or you can count the directories and subtract:

# continuing from above
numdirs=(*/)
numdirs=${#numdirs[@]}
(( numfiles -= numdirs ))

Also see "How can I find the latest (newest, earliest, oldest) file in a directory?"
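For the asker's underlying goal, the same globbing approach can be extended to find the largest leading number and compute the next one. This is a minimal sketch, not from the original answer; the helper name next_image_number is made up, and the numeric prefix is assumed to end at the first "-":

```shell
#!/bin/bash
# Sketch: find the highest leading number among files named like "3-pano.jpg"
# and compute the next one. next_image_number is a hypothetical helper name.
next_image_number() {
    local dir=${1:-.} max=0 f n
    shopt -s nullglob
    for f in "$dir"/*-*; do
        n=${f##*/}          # basename
        n=${n%%-*}          # numeric prefix before the first dash
        [[ $n =~ ^[0-9]+$ ]] || continue   # skip non-numeric prefixes
        (( n > max )) && max=$n
    done
    echo $(( max + 1 ))
}

# Demo in a scratch directory:
demo=$(mktemp -d)
touch "$demo/1-pano.jpg" "$demo/3-pano.jpg"
next_image_number "$demo"    # prints 4
rm -rf "$demo"
```

Unlike counting and adding one, this tolerates gaps in the numbering (e.g. if 2-pano.jpg was deleted).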

You can have as many spaces as you want inside an execution block. They often aid readability. The only downside is that they make the file a little larger and may slow initial parsing (only) slightly. There are a few places that must have spaces (e.g. around [, [[, ], ]] and = in comparisons) and a few that must not (e.g. around = in an assignment).
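A minimal illustration of those spacing rules:

```shell
#!/bin/bash
# Where spaces matter and where they don't:
count=5                         # no spaces around = in an assignment
if [ "$count" -eq 5 ]; then     # spaces required around [ and ]
    kind="number"
fi
[[ $count = 5 ]] && kind="number"   # likewise around [[, ]] and = in a comparison
echo "$kind"                    # prints number
```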

1
  • 1
Bash arrays are a bit too much for this kind of use! (They become slow when they are big.)
    – neam
    Commented Jun 21, 2012 at 8:04
22
ls -l | grep -v ^d | wc -l

One line.

6
  • 3
This is not universally correct. This call will in most occasions include a "total <NUM_BLOCKS_ALLOCATED>" line (and thus count one too many, see stackoverflow.com/questions/7401704/…). Add a wildcard to prevent this: ls -l * | grep -v ^d | wc -l Commented May 17, 2017 at 13:07
  • Well spotted, perhaps: ls -l | grep -v ^d | grep -v ^t | wc -l then.
    – mckenzm
    Commented May 17, 2017 at 22:03
  • 1
    I really just wanted to know how many thousands of files were in a directory, so this was close enough for me. Commented Jan 11, 2018 at 18:33
  • 1
    To exclude symbolic links, you can use ls -l | grep -v ^d | grep -v ^t | grep -v ^l | wc -l .
    – Zafer
    Commented Nov 30, 2018 at 11:10
  • 1
    Rather than "anti-grepping", just use ls -l | grep ^- | wc -l. Only file lines start with -.
    – mbomb007
    Commented Jan 31 at 19:40
16

How about:

count=$(find .. -maxdepth 1 -type f|wc -l)
echo $count
let count=count+1 # Increase by one, for the next file number
echo $count

Note that this solution is not efficient: it spawns subshells for the find and wc commands, but it should work.

2
  • 1
    This is similar to what I ended up doing. Also, efficiency in microseconds isn't too much of a concern as this command will be run at an interval of roughly 20-40 minutes.
    – Jason
    Commented Jun 21, 2012 at 15:56
  • Exactly; if you use pipelining correctly, none of the spawns will be in the inner loop. Well, they'll all be part of the inner-loop assembly line; but they won't be the bottleneck. Commented Jun 22, 2012 at 2:46
4

The most straightforward, reliable way I can think of is using the find command to create a reliably countable output.

Counting the characters that find outputs, with wc:

find . -maxdepth 1 -type f -printf '.' | wc --char

or string length of the find output:

a=$(find . -maxdepth 1 -type f -printf '.')
echo ${#a}

or using find output to populate an arithmetic expression:

echo $(($(find . -maxdepth 1 -type f -printf '+1')))
0
3

file_num=$(ls -1 --file-type | grep -v '/$' | wc -l)

This is a bit more lightweight than a find command, and counts all the files in the current directory.

1
  • 1
    For me, ls -1 --file-type returns ls: illegal option -- -. If you're going to provide a vendor-specific solution, at least mention the vendor. Also, this doesn't exclude symlinks.
    – ghoti
    Commented Jun 21, 2012 at 11:47
3

Simple efficient method:

#!/bin/bash
RES=$(find "${SOURCE}" -type f | wc -l)
3
  • 4
    Can you give a brief explanation as to why this works? Code dumping is (usually) discouraged because the OP may not know enough about the syntax to fully understand what you're trying to do.
    – rayryeng
    Commented Sep 11, 2014 at 15:44
  • 1
    Be careful, this does not work if there are file names with linebreaks in them in the directory. Commented Sep 11, 2014 at 16:42
  • RES=$(find ${SOURCE} -maxdepth 1 -type f -printf '.' | wc -c) works regardless of file names characters, since it does not count names but the amount of dot characters printed by find for each found file.
    – Léa Gris
    Commented Aug 25, 2019 at 15:36
1

Get rid of the quotes. The shell treats the quoted string as a single word, so it's looking for a command literally named "ls -l".
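A corrected sketch of the two assignments from the question, with the quotes removed and the stray $ dropped from the left-hand side:

```shell
#!/bin/bash
# The question's two lines with the fixes applied: no quotes wrapping whole
# commands, and no "$" when assigning to a variable.
files=$(find .. -maxdepth 1 -type f | sort -n)
num=$(ls -l | grep -c '^-')    # grep -c does the counting that wc -l did
echo "$num"
```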

1

Short and sweet method which also ignores symlinked directories.

count=$(ls -l | grep ^- | wc -l)

or if you have a target:

count=$(ls -l /path/to/target | grep ^- | wc -l)
0

Remove the quotes and you will be fine.

0

Expanding on the accepted answer (by Dennis W): when I tried this approach I got incorrect counts for dirs without subdirs in Bash 4.4.5.

The issue is that by default nullglob is not set in Bash, so numdirs=(*/) sets a 1-element array containing the literal glob pattern */. Likewise I suspect numfiles=(*) would have 1 element for an empty folder.

Setting shopt -s nullglob (i.e. enabling null globbing) resolves the issue for me. For an excellent discussion of why nullglob is not set by default in Bash, see the answer here: Why is nullglob not default?

Note: I would have commented on the answer directly but lack the reputation points.
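The difference can be demonstrated in an empty scratch directory (a minimal sketch):

```shell
#!/bin/bash
# Demonstrate how nullglob changes the count for an empty directory.
demo=$(mktemp -d)        # empty scratch directory
cd "$demo"

shopt -u nullglob
without=(*); without=${#without[@]}   # glob stays literal: count is 1

shopt -s nullglob
with=(*); with=${#with[@]}            # glob expands to nothing: count is 0

echo "without nullglob: $without, with nullglob: $with"
cd / && rmdir "$demo"
```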

0

Here's one way you could do it as a function. Note: you can pass this function dirs (for a directory count), files (for a file count), or anything else (e.g. both) for a count of everything in the directory. It does not traverse the tree, as we aren't looking to do that.

function get_counts_dir() {

    # -- handle inputs (e.g. get_counts_dir "files" /path/to/folder)
    local type dir prev_dir numfiles numdirs result
    [[ -z "${1}" ]] && type="files" || type="${1,,}"
    [[ -z "${2}" ]] && dir="$(pwd)" || dir="${2}"   # don't lowercase the path

    shopt -s nullglob
    prev_dir=$(pwd)
    cd "${dir}" || return 1

    numfiles=(*)
    numfiles=${#numfiles[@]}
    numdirs=(*/)
    numdirs=${#numdirs[@]}

    # -- handle input types files/dirs/or both
    result=0
    case "${type}" in
        "files")
            result=$(( numfiles - numdirs ))
        ;;
        "dirs")
            result=${numdirs}
        ;;
        *)  # -- returns all files/dirs
            result=${numfiles}
        ;;

    esac

    cd "${prev_dir}"
    shopt -u nullglob

    # -- return result --
    [[ -z ${result} ]] && echo 0 || echo ${result}
}

Examples of using the function:

folder="/home"
get_counts_dir "files" "${folder}"
get_counts_dir "dirs" "${folder}"
get_counts_dir "both" "${folder}"

Will print something like:

2
4
6
