
I started a decompress operation with `compact /U /S /I /Q > compact-report.txt` on a large directory (120 GB+, ~950M files) and it has been running for about 24 hours now. Most files seem to be decompressed already, but I would like to check whether it is hanging on a particular file by listing all files that are not decompressed yet.

I know that the command `compact /s` lists the compression status of all files recursively, but is there a command that lists just those files that are actually compressed?

Note: I solved it temporarily by running `compact /s > status.txt` and searching the text file for occurrences of "New files added to this directory will not be compressed.". It turns out that the total number remains the same, which leads me to a follow-up question, but I'm still interested in a more direct command.
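
For reference, here is that workaround as a minimal cmd sketch (the output file name `status.txt` comes from above; counting the matching lines with `find /c /v ""` is my own illustration, not part of the original run):

    compact /s > status.txt
    rem Count occurrences of the per-directory status line in the report
    findstr /C:"New files added to this directory will not be compressed." status.txt | find /c /v ""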

  • Yep, a billion files will make NTFS weep in sweet misery. If anyone finds a way of dumping a billion files in rapid succession, let me know. I hope the directory structure was laddered to host the actual files. Usually when I deposit millions or billions of files to disk I use their hash or date timestamp to create folders 5 or 6 deep, and drop the files in those destination folders. Seems to help a lot (a rough sketch of that layout follows these comments).
    – beeks
    Commented Apr 18, 2014 at 3:08
  • @beeks: the directories contain Hg repositories, not much I can do to change the structure of the dirs, the dirnames or the depths.
    – Abel
    Commented Apr 18, 2014 at 12:56
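
For what it's worth, here is a rough batch sketch of the folder "laddering" described above, sharding files into nested folders derived from each file's hash; the paths `C:\staging` and `D:\store` and the choice of SHA-1 are hypothetical and purely illustrative:

    @echo off
    setlocal enabledelayedexpansion
    rem Move each staged file into a three-level folder tree keyed on its hash,
    rem so no single directory ends up holding millions of entries.
    for %%F in (C:\staging\*) do (
        set "hash="
        rem certutil prints a header line, the hash, then a status line; keep only the hash
        for /f "skip=1 delims=" %%H in ('certutil -hashfile "%%F" SHA1') do (
            if not defined hash set "hash=%%H"
        )
        set "hash=!hash: =!"
        set "dest=D:\store\!hash:~0,2!\!hash:~2,2!\!hash:~4,2!"
        md "!dest!" 2>nul
        move "%%F" "!dest!" >nul
    )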
