Verifying file integrity on NTFS under Linux

I have an NTFS volume that I share between a Linux machine and a Windows machine. Recently I have had a couple of strange problems where files have become corrupted. This has affected some media files, but also some archive files. It isn't a major disaster, as I did have some old backups and could recover the data, although more recent backups were also corrupt.

The weird thing with one of the archive files is that some of the backups that now appear corrupt predate times when I had successfully accessed and updated the data in question on the live volume. That really messes with my head, and I wonder if there is some other issue going on as well.

But in any event, what I would like is a simple solution that calculates a hash such as MD5 for every file on the volume, and then lets me periodically re-run the comparison so I can detect any changes (corruption). If possible I would prefer a Linux solution. Potentially this could be done with some sort of shell script (a rough sketch of what I mean is below), but I would prefer a more robust, tested solution. Also, since I have over 1 TB of files, it will need to be fast.
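
For example, this is roughly the kind of thing I have in mind (assuming the volume is mounted at /mnt/ntfs, which is just a placeholder path):

    # Build a checksum manifest covering every file on the volume
    find /mnt/ntfs -type f -print0 | xargs -0 md5sum > ~/ntfs-checksums.md5

    # Later: re-hash the files and report only the ones that no longer match
    md5sum -c --quiet ~/ntfs-checksums.md5
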

I have found programs such as fcheck and AIDE. Are they appropriate for this? Are there any other recommended solutions? Thanks.