24

I would like to be able to find the full paths of files in a directory tree that exceed a specific size (say 10 MB).

I am currently aware of Microsoft's Diruse (part of the Windows XP Service Pack 2 Support Tools), which does what I want except that it only lists directory sizes rather than individual files.

9 Answers

36
forfiles /P D:\ /M *.* /S /D +"01/17/2012"  /C "cmd /c if @fsize gtr 209715200 echo @path @fsize @fdate @ftime"

will scan D:\ and its sub-directories, look for all files whose last-modified date is on or after 17-Jan-2012 and whose size is greater than 200 MB (209,715,200 bytes), and then print their details (path, size, date and time).

forfiles is included on some Windows Server versions, but not by default on Windows XP. You can extract it from the "Windows Server 2003 Resource Kit" download at http://www.microsoft.com/download/en/details.aspx?id=17657 (although it says it is for Windows Server, it runs on Windows XP without problems).
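
For the original question (files larger than 10 MB anywhere under a directory, with no date filter), a variant along these lines should work; D:\ is only a placeholder path, and 10485760 is 10 MB in bytes:

forfiles /P D:\ /S /M *.* /C "cmd /c if @fsize gtr 10485760 echo @path @fsize"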

4
  • Superb - this did what I wanted (as soon as I fixed the date for the UK 17/01/2012). Thanks. Commented Jan 23, 2012 at 23:02
  • Fails for me with an ERROR: Invalid flags. message. It seems to be because of @fdate.
    – LWC
    Commented Jul 3, 2020 at 18:26
  • @LWC with /D 01/01/2012 it works
    – CarLaTeX
    Commented Nov 9, 2021 at 10:08
  • This date didn't work for me either - see screenshot
    – LWC
    Commented Nov 9, 2021 at 18:16
12

This sounds like a job for PowerShell's

get-childitem

Navigate to the directory in question and check the available properties with:

get-childitem | get-member

Length and FullName look interesting; for example:

get-childitem | ft fullname, length -auto

Once you have mastered the basics try filtering with a where statement.

get-childitem | where-object {$_.length -gt 10000} | ft fullname, length -auto

Experiment with larger thresholds such as 100000.
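
To match the original question (full paths of files over 10 MB anywhere under the current directory), a recursive sketch along these lines should work; -recurse is the switch mentioned in the comments below, and 10MB is PowerShell's built-in size literal for 10,485,760 bytes:

get-childitem -recurse | where-object {$_.length -gt 10MB} | ft fullname, length -auto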

2
  • 2
    Use Get-ChildItem -recurse to search recursively
    – themadmax
    Commented Mar 13, 2017 at 10:19
  • Mucho faster than the preferred "forfiles" solution! Commented Jun 11, 2020 at 14:50
3

The Linux utilities port at UnxUtils contains the Linux find command.

You should rename find.exe to something else, for example xfind.exe, because find clashes with the Windows find.exe text-search command. You can then find all files larger than 1,000,000 bytes (the c suffix makes find count bytes rather than its default 512-byte blocks) with:

xfind directory -size +1000000c -print

Here is the documentation for the GNU find command, but I do not know exactly how it was implemented in UnxUtils or which version of find it is based on.
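
Assuming the port behaves like GNU find (where, as noted above, -size counts 512-byte blocks unless a c suffix asks for bytes), a query matching the original 10 MB requirement might look like this; C:\data is just a placeholder directory:

xfind C:\data -type f -size +10485760c -print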

3
  • I believe GnuWin32 is more up-to-date: gnuwin32.sourceforge.net ... plus there's always Cygwin: cygwin.com Commented Nov 3, 2009 at 15:07
  • @~quack: You're right, only that the FileUtils package in GnuWin32 seems quite complex to install, requiring quite a few files.
    – harrymc
    Commented Nov 3, 2009 at 15:16
  • Fair enough. I'm a Cygwin user, personally, and I tend to prefer up-to-date over simple-to-install. But thanks for explaining your reasoning. Commented Nov 3, 2009 at 21:39
2

Take Command Console LE (which I have ended up recommending a lot recently), a free replacement for cmd.exe with a lot of extra features, has a command for this: PDIR.

pdir /s /(fpn z) /[s10485760,]
  • /s means recurse into subdirectories; run the command from the directory you want to search.
  • /(fpn z) is the display format for the results, here: full path and file name, followed by the size.
  • /[s10485760,] means size = 10 MB or bigger
1

The find command from the Cygwin utilities does this. For your requirement

find full paths of files in a directory tree that exceed a specific size (say 10MB).

this gives the result:

find -size +10M -type f -printf "%p %s\n"

-size +10M gives you "objects" bigger than 10 megabytes

-type f gives you files only

-printf prints the found files, %p is path, %s is size (in bytes) and \n is the newline.
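
To search a whole Windows drive instead of the current directory, you can pass a starting path via Cygwin's /cygdrive mount; /cygdrive/d (the D: drive) below is just an example:

find /cygdrive/d -type f -size +10M -printf "%p %s\n"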

1

I've just happened upon the command line tool Disksum, which seems similar to diruse, but gives two forms of output:

  • sorted by file counts per directory (ascending)
  • sorted by directory size (ascending)
1

I believe this solution matches the description more closely:

Download the command line executable sfk.exe at http://sourceforge.net/projects/swissfileknife/files/

Use it like this:

sfk.exe list -big

Example output (abbreviated):

[listing 50 of 78 files by size:]
        3951 mens\noname_30.mht
        3996 mens\noname_14.mht
        3996 mens\noname_25.mht
        4060 mens\noname_24.mht
        4263 mens\noname_31.mht
        4701 mens\noname_1.mht
       14568 Thumbnail Restore.zip
       45056 netmeter.exe
     [...]
     12337752 rktools.exe
     16826024 sp35378.exe
     16926496 jre-6u30-windows-i586.exe
     19480227 SugarCE-6.2.4.zip
     21073936 vlc-1.1.11-win32.exe
     22083184 EasyPHP-5.3.8.1-setup.exe
     25517642 MPSOFTWARE.phpDesigner.v8.0.0.145-CRD.rar
     31085033 phpdesigner8usb.zip
     48835640 netbeans 7.exe
     58900704 ZendServer-CE-php-5.2.17-5.6.0-Windows_x86.exe    
     491538432 53400105.iso

If you only want the 10 biggest files, use:

sfk.exe list -big=10

You can customize it further following instructions from: http://stahlworks.com/dev/index.php?tool=list

1
0

I second jitbit's answer:

open the location (a disk or a folder) where you want to look for large files in Windows Explorer, then type "size:gigantic" in the top-right search box (the box will auto-suggest the syntax and other possible options)

(When you type "size:", you will get other options as well.)

1
  • This should be a comment rather than an answer, but as a new user you are not allowed to comment. Commented May 15, 2021 at 0:43
-1

I know that the question is about the command line, but this question keeps coming up in Google, so I am adding another simple way: via Explorer.

  1. open the location (a disk, or a folder) where you want to look for large files in Windows Explorer
  2. in the top-right search box type "size:gigantic" (the box will auto-suggest the syntax and other possible options)
2
  • Question was looking for command line answers.
    – john
    Commented Jan 6, 2014 at 12:36
  • This is a good and valid reply. It shouldn't be downvoted.
    – zar
    Commented Jul 16, 2018 at 18:42
