12GB vram enough for the 3080? Discuss..

Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
With the 3080 getting an extra 2GB of VRAM, I thought this would be a good time to start a new thread for it.

Personally I think 10GB was enough for the old 3080, as some compromise is acceptable when moving down from the top-tier card. It should also be remembered that no single card can do everything, not even the upcoming 3090 Ti, and buying GPUs is always about compromise.

Anyway, what do you guys think about 12GB on the 3080? Is it enough for your needs?
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,524
Location
Greater London
12GB is not enough. Need 24GB minimum. Poor AMD boys stuck on 16GB :p

Jokes aside, I agree. I felt the 10GB on the 3080 was fine given the price point for the performance you got. It was also released in 2020; it's now 2022 and there are barely a handful of games that have any issues with a lack of VRAM. There still isn't a single must-play title that one would really be missing out on.


I think it's too much. The 11GB of the 2080 Ti was the sweet spot.
:cry:
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
This is just repeating what's already been said in other threads. If 10GB isn't enough then neither is 12GB; it's not enough of an increase to make any meaningful difference. When the 3080 was released, 10GB was fine. Now it's still fine, just about, but we are seeing more and more VRAM being used for reasons I don't really understand, and yes, FC6 seems to take a dump on the 10GB 3080. So it comes down to use case: either you don't care about fringe cases, or you do, and those fringe cases will slowly become the norm. I was hoping DirectStorage would help with this, but given it was supposed to be released early last year, I don't think we'll get it any time soon, and even then I'm sure it'll take years to be supported by most new releases.

Anyway, I get the impression AMD are more efficient with their memory usage, hence why they can get away with a 256-bit bus when Nvidia are pushing 320/384-bit for their top-end cards. I can see that being the reason why Nvidia can't match the VRAM capacity of the AMD cards on their 3080 GPUs...

On the 3080, Nvidia use 1GB modules, which are by far the most common. Ten of them give a 10GB VRAM buffer. To match AMD, they could:

1) keep the 320-bit bus and, using clamshell ('x8') mode, double the number of modules to give a 20GB buffer. This is how the 3090 gets 24GB without needing a 768-bit bus.
2) drop to a 256-bit bus and use 2GB modules to give a 16GB buffer.
3) widen the bus further to 448-bit/14GB or even 512-bit/16GB.
4) use the apparent '1.5GB' modules that are in the GDDR6 spec to give us a 320-bit/15GB 3080. No idea on availability or suitability for the Nvidia architecture though.

I don't think dropping to 256-bit would give Nvidia the performance they want, and going wider adds even more cost and complexity. The cynic in me says they wouldn't do that when leather jacket man can see just how much more they can wring out of their legions by releasing a 12GB card instead (and jacking up the RRP in the process *cough*).
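
For anyone wanting to sanity-check those numbers, here's a minimal sketch of the capacity arithmetic (the helper function and its name are just for illustration; it assumes one GDDR6/6X module per 32-bit channel, with clamshell mode doubling the module count on the same bus):

```python
# Back-of-the-envelope VRAM capacity: one module per 32-bit channel;
# clamshell ("x8") mode fits two modules per channel without widening the bus.

def vram_capacity_gb(bus_width_bits: int, module_gb: float, clamshell: bool = False) -> float:
    modules = bus_width_bits // 32
    if clamshell:
        modules *= 2
    return modules * module_gb

print(vram_capacity_gb(320, 1))                  # 3080 as released: 10GB
print(vram_capacity_gb(384, 1))                  # 12GB 3080: 12GB
print(vram_capacity_gb(320, 1, clamshell=True))  # option 1: 20GB
print(vram_capacity_gb(256, 2))                  # option 2: 16GB
print(vram_capacity_gb(448, 1))                  # option 3: 14GB (512-bit would give 16GB)
print(vram_capacity_gb(320, 1.5))                # option 4: 15GB
print(vram_capacity_gb(384, 1, clamshell=True))  # 3090: 24GB without a 768-bit bus
```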
 
Last edited:

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,524
Location
Greater London
+1
 
Soldato
Joined
1 Feb 2006
Posts
3,389
The amount of VRAM needed is mostly down to the software you use. Most games could run very well with 4GB or less, but developers don't have the time, or just don't want, to stream data in and out to allow this. A good example of this is GTA5: I first played it on an Xbox 360, which has 512MB of total RAM; moving to the PC was an upgrade, but the spec difference was massive. PCs must have massive overheads, or developers just don't bother managing resources on PC. I think some just use large amounts of VRAM as a marketing boost. VRAM is a high-speed cache, not a place to dump the entire level.
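
As a rough illustration of the "cache, not a dump" idea, here's a minimal streaming sketch; the class, names and numbers are hypothetical and not from any actual engine, just the general shape of keeping only recently used textures resident within a budget:

```python
from collections import OrderedDict

class TextureStreamer:
    """Minimal sketch of treating VRAM as a cache: keep only what was
    recently used and evict least-recently-used textures when over budget."""

    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture id -> size in MB

    def request(self, tex_id: str, size_mb: int) -> None:
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)   # mark as recently used
            return
        # Evict least-recently-used textures until the new one fits.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[tex_id] = size_mb         # "upload" to VRAM

streamer = TextureStreamer(budget_mb=4096)      # e.g. a 4GB budget
streamer.request("road_albedo", 64)
streamer.request("building_normal", 128)
```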
 
Soldato
Joined
19 Feb 2007
Posts
14,340
Location
ArcCorp
I think giving GPUs more and more memory could backfire. Let me explain.

There are a lot of games that have jaw-droppingly gorgeous visuals. Going by a lot of my own benchmarks, general playtime etc. with GPU-Z up, I see the vast majority of games using an average of 8.5GB of memory at 1440p max settings on a 3080 Ti, and based on my extensive reading it seems an extra 2GB is allocated most of the time, meaning actual usage is only around 6.5GB at 1440p with the settings cranked.
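
For reference, a rough way to log the same kind of reading without GPU-Z is to poll nvidia-smi (this assumes an NVIDIA card with nvidia-smi on the PATH); note that, like GPU-Z, this reports memory that has been allocated rather than what the game is actively touching, which is exactly the allocation-vs-usage gap described above:

```python
import statistics
import subprocess
import time

def vram_used_mb() -> int:
    # nvidia-smi reports framebuffer memory currently allocated, in MB.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().splitlines()[0])

samples = []
for _ in range(60):               # sample once a second for a minute of gameplay
    samples.append(vram_used_mb())
    time.sleep(1)

print(f"average: {statistics.mean(samples):.0f} MB, peak: {max(samples)} MB")
```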

So my point: devs take shortcuts wherever possible, especially with memory utilisation. If, however, more and more cards have 16GB or more memory, I can easily see devs getting sloppy. Just look at Far Cry 6 with the HD texture pack: the visuals are generally no better than Far Cry 5 with its HD texture pack, minus the RT puddles, yet it can easily chew through memory. Why? Because it's there, a handy fallback for sloppy coding.

IMO keeping memory around the 8GB, 10GB and 12GB area, for the time being, forces devs not to get sloppy.
 
Last edited: