Not the usual 137GB query! (Can't copy a 180GB file)

Associate
Joined
17 Oct 2005
Posts
311
I'm having trouble copying a 186GB file (yes, that is the right size!) between two external USB drives (both Lacie).

The copy keeps failing after about 140GB (~151,000,000,000 bytes, to avoid confusion over 1024 vs. 1000); Explorer gives a "device is not ready" message (from memory, so the wording may not be exact).

I wrote a small C program to attempt the same copy, so that (a) I could resume after a failure and (b) I could get more information about the error. It didn't shed a lot of light, but the problem seems to occur when reading from the source drive rather than when writing to the destination, and perror() reports "Permission denied", which I don't entirely believe, given that there were no problems for the first 140GB.

The source drive is 250GB and the destination is 500GB, and there is ample space on the destination.

I'm running XP/SP2 on a fairly modern machine (two years old or so), so I don't think it should be the 48-bit addressing issue. But the point at which problems occur is a little suspicious.

So does anyone know about any known issues with accessing very large files?
 
matja said:
Have you tried using SetFilePointerEx to seek to a point past ~140GB? You can get a problem like this writing if the NTFS block size is set too small, but I've never seen it happen when reading.
I've been using the low-level I/O routines (_open, _lseeki64, etc.); I haven't specifically tried to jump past the problem point, but I could give that a try and see, I guess.

To answer the other posters:

I really don't fancy the ZIP/RAR suggestion, based on how I've seen those tools perform on files of even a few GB. Tying the machine up for days isn't high on the priority list, and I suspect it would still fail as soon as it reads past the 150GB mark anyway.

There is 250GB free on the destination drive, so space should not be the issue. And as I say, the failure very definitely occurs during a call to _read().

Although, saying that, I made a ~30GB file a couple of years ago which BSODs Windows 2003 Server every time I try to copy it; it remains on the RAID array to this day. NTFS isn't exactly sane :rolleyes:
Joy... I think we are going to move away from the "one humungous file" approach; it seems a little flaky, and solving problems like this remotely is a complete nightmare.

Thanks anyhow!
 