
I'm trying to write a script that will display the name of the oldest file in the directory the script is executed from.

This is what I have so far:

#!/bin/bash
for arg in $*
do
 oldest=$1
 if [[ $arg -ot $oldest ]]
 then
  oldest=$arg
 fi
done

echo "Oldest file: $oldest"

I'm not sure how to increment to the next file to check if it is older

for example:

oldest=$2
oldest=$3
etc..

Trying to run this script in the bash shell with the following args:

myScript `ls -a`

I get a result of:

Oldest File: .
  • for arg in $* is buggy -- try it with arguments that contain whitespace or quoted wildcards to see why. for arg in "$@" or just for arg (as default behavior is to iterate over "$@") is the correct alternative. Of course, all of those are assuming that you're iterating over command-line arguments, not files. Commented Nov 24, 2014 at 3:08
  • ...to iterate over files, it's for arg in *, perhaps with a [ -d "$arg" ] && continue just inside the loop to skip directories. Commented Nov 24, 2014 at 3:10
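
Putting those comments together, a minimal corrected sketch of the original script (keeping the argument-based approach and the -ot test, with the quoting fixed and $oldest seeded only once; this is an illustration, not one of the answers below):

#!/bin/bash
# Seed $oldest with the first argument, then compare it against the rest.
oldest=$1
for arg in "$@"; do
    [[ "$arg" -ot "$oldest" ]] && oldest=$arg
done
echo "Oldest file: $oldest"

Note that this still considers directories such as . if they are passed in; the second comment's for arg in * with a directory check is the way to restrict it to files.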

4 Answers


The ls program has an option to sort by time, and you can just grab the last file from that output:

# These are both "wun", not "ell".
#             v          v
oldest="$(ls -1t | tail -1)"

If you want to avoid directories, you can strip them out beforehand:

# This one's an "ell", this is still a "wun".
#             v                         v
oldest="$(ls -lt | grep -v '^d' | tail -1 | awk '{print $NF}')"

I wouldn't normally advocate parsing ls output, but it's fine for quick-and-dirty jobs if you understand its limitations.
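
A quick way to see that limitation (a deliberately awkward, hypothetical file name, tried in an otherwise empty directory; typical GNU ls prints names unquoted when writing to a pipe):

mkdir /tmp/demo && cd /tmp/demo
touch $'old\nfile'      # one file whose name contains an embedded newline
ls -1t | tail -1        # prints only "file", not the real name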


If you want a script that will work even for crazies who insist on putting control characters in their file names :-) then this page has some better options, including:

# $dir is the directory to search; set dir=. for the current directory
unset -v oldest
for file in "$dir"/*; do
    [[ -z $oldest || $file -ot $oldest ]] && oldest=$file
done

Though I'd suggest following that link to understand why parsing ls output is generally considered a bad idea (and hence why it can still be useful in limited circumstances, such as when you can guarantee all your files are of the YYYY-MM-DD.log variety). There's a fount of useful information over there.


You can use the find command:

find -type f -printf '%T+ %p\n' | sort | head -1
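
If you only want the path without the leading timestamp, one way (an illustration, assuming names contain no newlines, like the command above) is to strip the first field:

find -type f -printf '%T+ %p\n' | sort | head -n 1 | cut -d' ' -f2-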

  • Revise that to { IFS= read -r -d' ' time; IFS= read -r -d '' name; } < <(find -type f -printf '%T+ %p\0' | sort -z) and you work with all possible filenames, including ones with newlines (which your current code doesn't handle). Requires GNU sort, of course, but since you're already depending on a GNU extension to find... Commented Nov 24, 2014 at 4:07

You can use this function to find the oldest file in any directory:

oldest() { 
   oldf=
   for f in *; do
      # not a file, ignore
      [[ ! -f "$f" ]] && continue
      # keep the oldest entry seen so far
      [[ -z "$oldf" || "$f" -ot "$oldf" ]] && oldf="$f"
   done
   printf '%s\n' "$oldf"
}

And call it in any directory as:

oldest
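
If you would rather pass the directory as an argument instead of cd-ing into it first, a small variation along the same lines (the name oldest_in is just illustrative):

oldest_in() {
   local dir="${1:-.}" oldf="" f
   for f in "$dir"/*; do
      # only consider regular files
      [[ -f "$f" ]] || continue
      [[ -z "$oldf" || "$f" -ot "$oldf" ]] && oldf="$f"
   done
   printf '%s\n' "$oldf"
}

And call it as, for example:

oldest_in /var/log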

The following oneliner finds the oldest file in the whole subtree and reports it using the long format and relative full path. What else could you ever need?

ls -ltd $(find . -type f) | tail -1
  • This one fails if the number of files is large, because the argument list is too long. – misnomer Commented May 15, 2023 at 9:57
