
Is it possible to list the largest files on my hard drive? I frequently use df -H to display my disk usage, but this only gives the percentage full, GBs remaining, etc.

I do a lot of data-intensive calculations, with a large number of small files and a very small number of very large files. Since most of my disk space used is in a very small number of files, it can be difficult to track down where these large files are. Deleting a 1 kB file does not free much space, but deleting a 100 GB file does. Is there any way to sort the files on the hard drive in terms of their size?

Thanks.


11 Answers


With standard available tools:

To list the top 10 largest directories under the current directory (du without -a reports directories only; add -a to include files as well): du . | sort -nr | head -n10

To list the largest directories from the current directory: du -s * | sort -nr | head -n10

UPDATE: These days I usually use the more readable form (as Jay Chakra explains in another answer) and leave off the | head -n10, simply letting it scroll off the screen. The last line shows the largest file or directory (tree).

Sometimes, e.g. when you have lots of mount points in the current directory, instead of using -x or multiple --exclude=PATTERN options, it is handier to mount the filesystem on an unused mount point (often /mnt) and work from there.

Mind you that when working with large (NFS) volumes, you can cause a substantial load on the storage backend (filer) when running du over lots of (sub)directories. In that case it is better to consider setting quota on the volume.
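As a quick illustration of the --exclude approach, here is a minimal sketch on a throwaway directory tree (the names "data" and "nfs_mount" are made up for the demo; --exclude and -x are GNU du options):

```shell
# Build a throwaway tree: "data" is what we want to measure,
# "nfs_mount" stands in for a mount point we want du to skip.
tmp=$(mktemp -d)
mkdir -p "$tmp/data" "$tmp/nfs_mount"
dd if=/dev/zero of="$tmp/data/big" bs=1024 count=200 2>/dev/null
dd if=/dev/zero of="$tmp/nfs_mount/huge" bs=1024 count=400 2>/dev/null

# --exclude skips the subtree by name; on a real system, -x would
# additionally keep du from crossing filesystem boundaries.
out=$(du --exclude='nfs_mount' "$tmp" | sort -nr)
printf '%s\n' "$out"

rm -rf "$tmp"
```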

  • For your first option, can't you just list them with ls -Sl | head?
    – Bernhard
    Commented Apr 24, 2012 at 18:23
  • No, du traverses the whole directory tree, whereas ls -S only checks the current directory.
    – jippie
    Commented Jun 5, 2012 at 17:25
  • @jippie Your first command only includes directories. Commented May 14, 2021 at 9:49

Adding to jippie's answer

To list the largest directories from the current directory in human readable format:

du -sh * | sort -hr | head -n10

Sample:

[~]$ du -sh * | sort -hr | head -n10
48M app
11M lib
6.7M    Vendor
1.1M    composer.phar
488K    phpcs.phar
488K    phpcbf.phar
72K doc
16K nbproject
8.0K    composer.lock
4.0K    README.md

It makes it more convenient to read :)

  • The command in this answer does give the correct information, whereas the command in the accepted one doesn't, due to sort -n instead of the correct sort -h.
    – sequence
    Commented Feb 6, 2022 at 11:43
  • This only lists directories, not files. What if it's not one directory but a few hundred directories, each with thousands of files between 2 GB and 90 GB? How do you efficiently list only the large files?
    – Ocean
    Commented May 12, 2022 at 12:28
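For the case raised in the last comment (large files scattered across many directories), one option is to print every regular file's size and path with find and sort that; a sketch using GNU find's -printf, demonstrated on a throwaway tree so it is self-contained:

```shell
# Throwaway demo tree with one big and one small file.
tmp=$(mktemp -d)
mkdir -p "$tmp/a" "$tmp/b"
dd if=/dev/zero of="$tmp/a/big.bin" bs=1024 count=300 2>/dev/null
dd if=/dev/zero of="$tmp/b/small.bin" bs=1024 count=10 2>/dev/null

# %s = size in bytes, %p = path; sort numerically, largest first.
top=$(find "$tmp" -type f -printf '%s\t%p\n' | sort -nr | head -n 10)
printf '%s\n' "$top"

rm -rf "$tmp"
```

On a real system you would start the find from / (or a mount point) rather than a temp directory.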

Try ncdu, as it can give you an overview of disk usage. From its website:

A disk usage analyzer with an ncurses interface, aimed to be run on a remote server where you don't have an entire graphical setup, but have to make do with a simple SSH connection. ncdu aims to be fast, simple and easy to use, and should be able to run in any minimal POSIX-like environment with ncurses installed.


With GNU du:

du -max /dir | sort -n

will display big files as well as big directories (-m reports sizes in megabytes, -a includes files, -x stays on one filesystem), and can be used to identify where you need to do some cleanup.

du -max | sort -n | tail -1000
...
46632   ./i386/update/SuSE-SLES/8/rpm/i586/kernel-source-2.4.21-138.i586.rpm
49816   ./UnitedLinux/apt/i386/RPMS.updates/k_debug-2.4.21-138.i586.rpm
679220  ./UnitedLinux/apt/i386/RPMS.updates
679248  ./UnitedLinux/apt/i386
679252  ./UnitedLinux/apt
691820  ./UnitedLinux/i586
691836  ./i386/update/SuSE-SLES/8/rpm/i586
695192  ./i386/update/SuSE-SLES/8/rpm
695788  ./i386/update/SuSE-SLES/8
695792  ./i386/update/SuSE-SLES
695804  ./i386/update
695808  ./i386
1390184 ./UnitedLinux

(I know, that's quite an old tree :p )


There is a simple and effective way to find the size of every file and directory in Ubuntu:

Applications > Accessories > Disk Usage Analyzer

In this window, click the "Scan Filesystem" button on the toolbar. After a short time (seconds), you will have the disk usage of every directory and file.


You can try this command; it will list all files larger than 20 MB.

find / -type f -size +20000k -exec ls -lh {} \; 2> /dev/null \
  | awk '{ print $NF ": " $5 }'  | sort -hrk 2,2
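A variant that avoids parsing ls output prints the size and path directly from find; a sketch using GNU find's -printf, on a made-up demo tree so the pipeline is self-contained:

```shell
# Throwaway tree: one file over the threshold, one under it.
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/large.dat" bs=1024 count=64 2>/dev/null
dd if=/dev/zero of="$tmp/tiny.dat" bs=1 count=100 2>/dev/null

# -size +20k keeps files over 20 KiB; %s prints the size in bytes.
# On a real system you would start from / and add 2>/dev/null.
big=$(find "$tmp" -type f -size +20k -printf '%s %p\n' | sort -nr)
printf '%s\n' "$big"

rm -rf "$tmp"
```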
  • If the biggest file on your filesystem is 20 MB, you probably wouldn't be running low on disk space. At least with any HD made this millennium.
    – Kevin
    Commented Apr 24, 2012 at 17:15
That's only an example; you can put there whatever you want. It will find everything bigger than 20 MB, not only 20 MB files.
    – patseb
    Commented Apr 24, 2012 at 19:25
  • ls -lh then sort?? ls -s or stat -c %b are probably better.
    – Mikel
    Commented Apr 24, 2012 at 19:54
I don't get it. My example uses ls and sort. He wanted to find files over the whole disk, not just one directory.
    – patseb
    Commented Apr 24, 2012 at 19:57

If you prefer a graphical tool, there's https://github.com/shundhammer/qdirstat


With GNU tools:

find . -type f -printf '%b %p\0' |
  sort -rzn |
  head -zn 20 |
  tr '\0' '\n'

This reports the top 20 largest regular files, in terms of disk usage (%b reports the disk usage in 512-byte units), under the current working directory.

To do it for a whole file system, replace find . with find /mount/point/of/that/filesystem -xdev.

To get the output with human-readable K/M/G... suffixes, you can insert a call to numfmt like:

find . -type f -printf '%b %p\0' |
  sort -rzn |
  head -zn 20 |
  numfmt -z --from-unit=512 --to=iec |
  tr '\0' '\n'

Type the following command:

cd /

then type:

du -sh * | grep G

The command above shows roughly how much space each directory uses (grep G keeps the gigabyte-sized entries, though it also matches any name containing a capital G). After that, you have to decide which directory or file you want to delete.


You can try this command; it will list the largest file in the current directory:

ls -lrS | tail -1
  • shows the current directory, not the entire HDD.
    – slm
    Commented Nov 5, 2014 at 13:05
du -csb `ls` | sort -nr | head -n10
