Linux 2GB filesize limit

Hi,

Is there any way to get round the Linux 2GB file size limit?

I'd like to mirror Vista onto my server, but once the download gets past 2GB the file disappears from the directory listing, and I get access denied when trying to open the file in a web browser.

Is there any way I can get round this limit please?

Thanks,
Craig.
 
Berserker said:
Linux doesn't have a 2GB file size limit (unless it's really old). It's the download tool you're using that does, usually because someone only allocated 31 bits to the file size (very common problem).

Let us know which tool you're using, and someone might be able to find an alternative.
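The 31-bit point in the quote above can be illustrated with a quick shell sketch (bash arithmetic is 64-bit, so the numbers don't wrap here; this just shows where the boundary sits):

```shell
# Largest value a signed 32-bit byte counter can hold: 2^31 - 1,
# i.e. one byte short of 2 GiB -- the cutoff described above.
limit=$(( (1 << 31) - 1 ))
echo "$limit"                      # 2147483647 bytes
echo "$(( limit / 1024 / 1024 ))"  # 2047 MiB, just under 2 GiB
```

Any tool counting bytes in a signed 32-bit variable goes negative at exactly that point, which is why downloads break at "2GB" rather than some round decimal figure.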

I've tried curl -o, wget and lynx, and all three have failed past 2GB :(

Craig.
 
Wget 1.10.2 - can't remember the exact version of Red Hat we're using, but it's a recent version.

riddlermarc said:

The seg size was limited to what looked like 2GB on my normal account, so I logged in as root and tried from there. Everything showed as unlimited, but it still wouldn't let me download more than 2GB :(

leezer3 said:
AFAIK, it's some sort of bug in a library used by both curl and wget. I would assume Lynx uses the same library too.
I can only find references to it from 2004/2005, so my first port of call would be a complete update and compilation of the latest kernel.

The filesystem could also be a problem, but only really if you're using something odd.

-Leezer-
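If the filesystem is suspected, there's a quick way to check whether it supports large files at all (a sketch; `getconf FILESIZEBITS` is POSIX, though the exact output can vary by system):

```shell
# Ask how wide file offsets are on the root filesystem.
# 64 means files >2GB are supported; 32 would explain the cutoff.
getconf FILESIZEBITS /
```

ext3 on any 2.4/2.6 kernel should report 64, which would point the finger at the userland tools rather than the filesystem.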

I'll check with my server admin if we're on the latest kernel.
 
Berserker said:
We've got a Red Hat Enterprise Linux 3.0 server with a copy of the 32-bit Vista install. Admittedly I patched the kernel to 2.6.16(ish) on that one, so it's not the original kernel.

Anything running 2.4 or newer with a decent file system (e.g. ext3) should cope fine.

'uname -r' should get the kernel version.

2.6.9-34.0.1.ELsmp
 
R4z0r said:
Are you writing to a Samba share by any chance? If so, there is a 2GB limit (last time I looked into it, anyway)...

Your best bet would be to transfer the file using NFS, SCP, FTP, etc.

Nope, not writing to a samba share.

How would I transfer from an FTP:// URL using SCP?

Thanks,
Craig.
 
R4z0r said:
You'd need to mess about with SSH tunnelling - no point for what you're doing. It doesn't really matter, as you're not copying to an SMB share anyway!

Can you post the output of "mount" and also advise the directory you are downloading to. Well, it doesn't need to be the exact directory but enough so we know which partition!

Also, are you sure the partition you are downloading to is not running out of space? Check with "df -h".

To rule out some possible problems, have you tried downloading from another source - Maybe a Linux DVD ISO to see if you still get the 2GB limit?

Mount:
/dev/hda3 on / type ext3 (rw,usrquota)
none on /proc type proc (rw)
none on /sys type sysfs (rw)
none on /dev/pts type devpts (rw,gid=5,mode=620)
usbfs on /proc/bus/usb type usbfs (rw)
/dev/hda1 on /boot type ext3 (rw)
/dev/hdd3 on /old type ext3 (rw)
none on /proc/sys/fs/binfmt_misc type binfmt_misc (rw)
sunrpc on /var/lib/nfs/rpc_pipefs type rpc_pipefs (rw)
/usr/tmpDSK on /tmp type ext3 (rw,noexec,nosuid,loop=/dev/loop0)
/tmp on /var/tmp type none (rw,noexec,nosuid,bind)

I think I'm using either hda3 or hdd3. I'm using the disk the user files are stored on, which as far as I can tell is hda3, but I'm not 100% certain.

I'm going to test a Linux DVD ISO now, although I don't think it'll work, as the other day I was tarring some stuff up and it broke when the tar got >2GB.

-- Edit --
Looking at it closer now I've been trying to download it onto hda3.

-- Edit 2 --
Also, I did df -h and the server has loads of space left.

-- Edit 3 --
Right, tested it on a SuSE DVD ISO (3.5GB). Luckily I found a nice mirror located in the same country as the server, so it was nice and fast. It did the same thing: after 2GB the file size disappeared from the directory listing, but the file appeared to still be there and the download was still going. When I tried accessing the file from my browser I just got access denied :(

Thanks,
Craig.
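One way to separate the filesystem from the download tools is to write a >2GB file locally. A sketch using GNU dd (the `/tmp/bigfile` path is just an example; the file is sparse, so it uses almost no disk):

```shell
# Seek 3 GiB into a new file and write a single byte.
# If this succeeds, ext3 and the kernel handle >2GB fine, and the
# problem lies in the tools (wget/curl/lynx) or their limits instead.
dd if=/dev/zero of=/tmp/bigfile bs=1 count=1 seek=3G
ls -l /tmp/bigfile   # should report 3221225473 bytes
rm /tmp/bigfile
```

If dd itself fails at the 2GB mark, the limit is below the tools, i.e. in the kernel, the filesystem, or a ulimit on the account.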
 
Augmented said:
So you're attempting to download the file from Apache over HTTP?
Check the version of Apache you're serving the file with. Apache <2.2 does not support files larger than 2GB.

http://httpd.apache.org/docs/2.2/new_features_2_2.html (See Large File Support),
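To check which Apache is serving the file, `httpd -v` (or `apache2 -v` on Debian-style systems) prints a version banner. A small sketch that pulls the major.minor out of it - the helper and the sample line are hypothetical, for illustration only:

```shell
# Hypothetical helper: extract "major.minor" from Apache's version banner.
apache_ver() {
    sed -n 's|^Server version: Apache/\([0-9]*\.[0-9]*\).*|\1|p'
}
# On the real server you would pipe `httpd -v` in; here, a sample line:
echo 'Server version: Apache/2.0.52 (Red Hat)' | apache_ver
# prints "2.0" -- below 2.2, so no large file support over HTTP
```

Note this only affects *serving* the file over HTTP afterwards; it wouldn't stop wget writing the file in the first place.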

I'm not 100% sure what you mean by attempting to download the file from Apache. I'm just trying to get a file from an FTP or HTTP address using wget.

Also, if the version of Apache we had didn't support >2GB, wouldn't the file still show up in FTP?

Thx
Craig.
 
Well, my original plan was to host Vista for a bit for other people on OcUK, but there's not much point now as most people already have it.

Anyway, I'd still like to get it sorted out for the future :)

Wget version 1.10.2. Oh, also, I've tried mget in FTP from the shell as well.
 
Ok, here's what I do:

Log into the shell using my account or the root account (normal accounts seem to have a limit on file size according to ulimit, so I use the root account).

Type in wget http://address.com/directory/file.extension - this starts the download and saves the file into whatever folder I'm currently in.

When the download reaches 2GB I check it through Mozilla, but I can no longer see the file size in the directory listing (I could see it up until 2GB). So I wait for the download to finish and then try to download the file from the server, but get access denied.

I check through FTP whether the file's there, but can't see it at all.

Craig.
 
"naughty"? Beta 2 is public right now and free for download.

Anyway, I don't want to mirror Vista anymore, just wanna get this 2GB problem sorted for future stuff. :)

Can't download to an IIS server as the only server I have access to is a Linux one which is based in Texas.

Craig.
 
Una said:
So you made sure you were running wget as root? (Stupid, tbh.) Otherwise, if you have ulimits on your user accounts and exec wget as one of them, you're going to be limited.

Well, I've been running wget as root to try to sort this problem ever since I found the normal user accounts have a ulimit of 2GB.
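That 2GB ulimit figure can be checked directly. `ulimit -f` reports the maximum file size the shell may write, in 512-byte blocks (a sketch, bash/POSIX sh):

```shell
ulimit -f            # "unlimited", or a block count
# A 2GB cap shows up as 4194304 blocks:
echo $(( 4194304 * 512 ))   # 2147483648 bytes = 2 GiB exactly
# Root (or an edit to /etc/security/limits.conf) can lift it:
# ulimit -f unlimited
```

If the root shell shows "unlimited" but the download still dies at 2GB, the ulimit isn't the culprit and it's back to the tools or the server software.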

adamofgreyskull said:
;) Windows doesn't make you feel like you need a shower? No? Just me?

Just you :) :p
 