
I have a USB-connected hard disk from which thousands of files cannot be read. The problem is that these files seem to be randomly distributed over the drive, and it takes a very long time (over an hour) before the drive gives up on reading each of them. So a simple copy process is out of the question.

I have two possible approaches in mind, though I'm not sure how I would perform either in practice, or whether it is possible at all.

My first thought would be to minimize the time before a copy attempt gives up on a file, to, say, 10 seconds. However, I have not found any way to do this; maybe it would require an OS- or hardware-level change? I have tried killing the copy processes, but short of unplugging the drive, nothing seems to work. (And before anybody suggests it: no, "robocopy /w:10" does not fix this.)

My second thought would be to log every file that is attempted; when a file copy gets stuck, I can unplug the drive and restart the process, skipping any files that have already been logged. Since we are talking about thousands of files, I also need to be able to run multiple processes in parallel.
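
In rough Python, what I imagine looks something like the sketch below. Everything here is a placeholder or an assumption: E: as the failing drive, C:\rescue as the destination, attempted.log as the log name. I also have not verified that terminating the child process actually works while the OS is stuck on a read; my experience above suggests it may not, in which case unplugging and restarting is the fallback, and the log survives that.

    import multiprocessing
    import shutil
    from pathlib import Path

    SOURCE = Path("E:/")          # assumption: the failing USB disk
    DEST = Path("C:/rescue")      # assumption: destination for rescued files
    LOG = DEST / "attempted.log"  # hypothetical name for the attempt log
    TIMEOUT = 10                  # seconds to wait per file

    def copy_one(src: Path, dst: Path) -> None:
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)

    def main() -> None:
        DEST.mkdir(parents=True, exist_ok=True)
        attempted = set(LOG.read_text().splitlines()) if LOG.exists() else set()
        for src in SOURCE.rglob("*"):
            if not src.is_file() or str(src) in attempted:
                continue
            # Log the file *before* trying it, so that after a hang,
            # an unplug and a restart, it is skipped automatically.
            with LOG.open("a") as log:
                log.write(f"{src}\n")
            worker = multiprocessing.Process(
                target=copy_one, args=(src, DEST / src.relative_to(SOURCE))
            )
            worker.start()
            worker.join(TIMEOUT)
            if worker.is_alive():
                # The read is stuck; try to kill the worker and move on.
                # If the OS itself is blocked on the USB read, this may
                # not return until the drive is unplugged.
                worker.terminate()
                worker.join()

    if __name__ == "__main__":
        main()

Running several instances of this against disjoint subdirectories would give the parallelism I mean, as long as each instance uses its own log file.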

Is there anybody who can help me with this? Can you tell me how to perform either option, or suggest another approach?

Thanks for any advice!


1 Answer


You tried robocopy /w:10, but I suspect what you need here is robocopy /r:1 (or a slightly larger number, if you find that it helps save more files).

The robocopy documentation says the default number of retries for the same file is one million; /r:1 reduces this to one. Note also that /w does not change the time until robocopy gives up on a read; it controls the wait between two retries.
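
For example, a sketch with placeholder paths (E: as the failing drive, C:\rescue as the destination):

    robocopy E:\ C:\rescue /e /r:1 /w:1 /log+:C:\rescue\robocopy.log

Here /e copies subdirectories (including empty ones), and /log+ appends a record of every file attempted, which also helps with tracking what has already been tried.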

  • Limiting the number of retries for robocopy doesn't change the fact that it takes over an hour before a read times out. With thousands of unreadable files, this would still take thousands of hours, even with only one retry. That is simply not realistic. I need a method that either decreases the timeout value or lets me unplug the drive and continue with the next file after plugging it back in.
    – Peter
    Commented Sep 7, 2019 at 20:37
  • @Peter: I did not try this, but maybe a sector-copying tool has a better time-out mechanism? Like HDD Raw Copy Tool or Clonezilla? Or the Unix tool dd, which is also available for Windows (a sketch of a dd invocation follows below these comments). Note that if you manage to make an image, you still have to extract the files afterwards; see this question from the Clonezilla FAQ.
    – Doc Brown
    Commented Sep 7, 2019 at 21:21
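
A minimal sketch of the dd route, under assumptions not stated in the thread above: a Unix-like environment (or a dd port for Windows) where the failing disk appears as /dev/sdb (a placeholder; verify the device name first, since imaging the wrong disk is destructive):

    dd if=/dev/sdb of=disk.img bs=64K conv=noerror,sync

conv=noerror makes dd continue past read errors instead of aborting, and sync pads each failed block with zeros so that offsets in the image stay aligned. Note that dd still waits for the OS-level timeout on every bad block, so this does not by itself shorten the hangs; a smaller bs limits how much data each bad block costs.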
