10GB VRAM enough for the 3080? Discuss...

Status: Not open for further replies.
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Way, way back, I remember being told 2GB wasn't enough to run games, when 3GB was being shown off on the 7970. I went tri-SLI with the 680 and ran three screens, and never ran out of VRAM, even though I was told time and time again that I didn't have enough. Move on a few years and now we are being told that 10GB isn't enough to run 8K... What the Dickens lol
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,508
Location
Greater London
Way, way back, I remember being told 2GB wasn't enough to run games, when 3GB was being shown off on the 7970. I went tri-SLI with the 680 and ran three screens, and never ran out of VRAM, even though I was told time and time again that I didn't have enough. Move on a few years and now we are being told that 10GB isn't enough to run 8K... What the Dickens lol
We need 24GB minimum now mate! Otherwise the new games won't even launch :p
 
Associate
Joined
3 Jan 2010
Posts
1,379
You can bet there won't be any issue with 10GB VRAM, but hey, some people think they'd better troll as much as possible until things are out in the open.

Nvidia is clearly so incompetent that you'll see with day-one games how the VRAM limits the 3080 in comparison to the 2080 Ti, as if they only hope to sell cards for a few days until people buy them, test them and storm the forums telling everyone else not to buy, or return their products for full refunds.

You have to live in LaLaLand to believe that scenario. Or to want to be an insufferable troll.
You can bet, but not guarantee. I think 10GB is likely to be 'sufficient' for current games, but expecting it to last another few years or more is entering unknown territory. It still depends on the kind of games you play, but most developers build with the consoles in mind at least, and they're now entering a new generation where validating a PC port will take more than showing you can get better fps if you spend triple the price of the console. The point is that the bar is getting raised from here on out. It's hard to see what that new, raised bar for PC gamers will look like, and I'm definitely no expert on VRAM, so I'm not trying to heckle people, but I do think there's room for demands to change as the standards change.

For me the 10GB has put me off slightly, as it's not guaranteed to be future-proof. I'm sure it's fine for now and will work for a year or two, but I want to buy a full system and have it last 4 or 5 years (I could easily upgrade in that time, but I'd want it to reliably last that long if I didn't). To be fair, I've been tempted to wait for the next set of CPUs anyway since it would be a full system, but the 10GB put the nail in the coffin, with the asking price of the card combined with that lack of future-proofing. It's not a good time to be hedging your bets with the change coming in from the consoles, as it's always the developers that make the difference, and they'll be working with 4K and new effects as standard now, so where does the bar get raised from there? I do trust what Gregster is saying as well; VRAM has been overblown for a while, but I'm tempted to believe this time it could be different, as we're right at the beginning of a change in game development with those consoles.
 
Associate
Joined
3 Jul 2012
Posts
425
It looks increasingly likely Nvidia is going to release some filler cards with more VRAM, given Navi is going to have a 16GB model at an affordable price. I think it's wise to wait: if Nvidia just shafted 2080 Ti owners, they may do the same to 3080 early adopters.
 
Soldato
Joined
23 Apr 2010
Posts
11,896
Location
West Sussex
I don't think it is enough for 4K.

We have not seen any true next-gen games yet (and won't for a bit until they launch on consoles; it's always consoles first), but once they add in all of the heavy textures and other assets? Yeah, 10GB for 4K will not be enough.

It might hold in there at 1440p, but with ultra settings, RT on and shadows maxed in COD MW I use over 9GB at points.
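For anyone wanting to sanity-check a figure like that rather than trusting an in-game overlay, here is a minimal sketch of a VRAM logger using the NVIDIA Management Library via the nvidia-ml-py (pynvml) package. The polling interval is an arbitrary choice, and note that this reports VRAM allocated, which is not necessarily what the game strictly needs (it is the same figure nvidia-smi reports as memory used).

```python
# Minimal VRAM usage logger (a sketch; assumes nvidia-ml-py is installed
# and an NVIDIA GPU is present). Reports allocated memory, which can be
# higher than what the game actually requires.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GB")
        time.sleep(5)  # poll every 5 seconds while the game is running
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```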
 
Soldato
Joined
31 Oct 2002
Posts
9,860
Based on the arguments in this thread you'd think the best card to buy right now is the Radeon VII with its 'next-gen' VRAM of 16GB.

That would be a stupid decision. You need the performance to go with the VRAM; even a child can tell you that.

A 3080 with 16GB+ of VRAM is the sweet spot. Nvidia know this, which is why they'll make the obvious move of a 3080 Ti at a later time, perhaps on a better process with less power draw.

If they released the 3080 Ti now, it would massively impact 3090 sales.
 
Soldato
Joined
18 May 2010
Posts
22,371
Location
London
That would be a stupid decision. You need the performance to go with the VRAM; even a child can tell you that.

A 3080 with 16GB+ of VRAM is the sweet spot. Nvidia know this, which is why they'll make the obvious move of a 3080 Ti at a later time, perhaps on a better process with less power draw.

If they released the 3080 Ti now, it would massively impact 3090 sales.

That's exactly the point I have been trying to make. The 2080 Ti being recommended for Watch Dogs Legion is not because it has 11GB of VRAM, because, as we have just agreed, the Radeon VII has more at 16GB. It's the compute the 2080 Ti is packing that makes it the recommended card at 4K.

And I guarantee you the 3080 will outperform the 2080 Ti at 4K in Watch Dogs Legion and Cyberpunk, amongst others.
 
Soldato
Joined
31 Oct 2002
Posts
9,860
That's exactly the point I have been trying to make. The 2080 Ti being recommended for Watch Dogs Legion is not because it has 11GB of VRAM, because, as we have just agreed, the Radeon VII has more at 16GB. It's the compute the 2080 Ti is packing that makes it the recommended card at 4K.

And I guarantee you the 3080 will outperform the 2080 Ti at 4K in Watch Dogs Legion and Cyberpunk, amongst others.

I still feel 10GB is stingy on the 3080, but I also know in my heart that no game developer will release a game that struggles to run at 4K with 10GB of VRAM on the 3080. They just won't have the balls; Nvidia's market share and mindshare are too strong. They'll all ensure 10GB is fine, until the 4080 rolls out.
 
Soldato
Joined
12 May 2014
Posts
5,235
Hardware Unboxed tested Doom at 4K with two texture settings: Ultra Nightmare (~9GB of VRAM) and Ultra (~7GB of VRAM).

A change in the texture setting did not affect the results of the GPUs that had enough VRAM to handle the highest texture setting (1080 Ti, Radeon VII, 3080 and 2080 Ti).

Every GPU that didn't have sufficient VRAM gained performance by reducing the VRAM usage.
The numbers are from the Hardware Unboxed 3080 review; the only setting changed was texture quality. I have only selected a handful of cards to get a general picture; would you like me to do all of them?

The 980 Ti and 1070 were the only cards that didn't gain a significant boost in average frame rates (6% and 4% respectively). The 980 Ti only has 6GB of memory, so even with Ultra textures it still didn't have enough VRAM; however, there was still an increase in performance, even if it was a small one. I wonder how the results would have differed if the texture settings had been tested at 1440p or 1080p instead.


From the Hardware Unboxed numbers it can be concluded that a change in texture quality in Doom has no impact on the frame rate of the game by itself (see the 1080 Ti, Radeon VII, 3080 and 2080 Ti results). It would be fair to conclude that other games will follow the same trend, where texture settings have no, or only a small, impact on frame rate.

Exceeding the VRAM buffer of these cards by 1GB caused a significant decrease in frame rates.
Reducing the VRAM usage, so that the GPUs did not exceed their 8GB of VRAM, caused average frame rates to increase by 20%-29% and minimum frame rates to increase by 23%-40%.

The 2080 Super and non-Super, the 2070 Super and the 5700 XT ran out of VRAM before they ran out of GPU horsepower.
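For anyone who wants to redo that arithmetic against the review's tables, here is a small sketch of how those percentage gains can be computed. The fps values below are placeholders for illustration only, not Hardware Unboxed's actual figures.

```python
# Percentage fps gained by dropping from Ultra Nightmare to Ultra textures.
# The numbers below are placeholders; swap in the avg/min fps figures
# from the Hardware Unboxed 3080 review for each card of interest.
def pct_gain(nightmare_fps: float, ultra_fps: float) -> float:
    """Percent change in fps when moving to the lower texture setting."""
    return (ultra_fps / nightmare_fps - 1) * 100

placeholder_results = {
    # card: (avg fps at Ultra Nightmare, avg fps at Ultra), hypothetical values
    "hypothetical 8GB card": (60.0, 75.0),
    "hypothetical 11GB card": (90.0, 90.5),
}

for card, (nightmare, ultra) in placeholder_results.items():
    print(f"{card}: {pct_gain(nightmare, ultra):+.1f}% avg fps from lowering textures")
```

With the placeholder numbers, the VRAM-limited card gains +25.0% average fps, the same shape as the 20%-29% gains quoted above, while the card with headroom barely moves.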
 
Soldato
Joined
18 May 2010
Posts
22,371
Location
London
I still feel 10GB is stingy on the 3080, but I also know in my heart that no game developer will release a game that struggles to run at 4K with 10GB of VRAM on the 3080. They just won't have the balls; Nvidia's market share and mindshare are too strong. They'll all ensure 10GB is fine, until the 4080 rolls out.

It is stingy. But Nvidia's reasoning is that GDDR6X is expensive; they wanted to keep the price of the card down.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Hardware Unboxed tested Doom at 4K with two texture settings: Ultra Nightmare (~9GB of VRAM) and Ultra (~7GB of VRAM).

A change in the texture setting did not affect the results of the GPUs that had enough VRAM to handle the highest texture setting (1080 Ti, Radeon VII, 3080 and 2080 Ti).

Every GPU that didn't have sufficient VRAM gained performance by reducing the VRAM usage.
The numbers are from the Hardware Unboxed 3080 review; the only setting changed was texture quality. I have only selected a handful of cards to get a general picture; would you like me to do all of them?

The 980 Ti and 1070 were the only cards that didn't gain a significant boost in average frame rates (6% and 4% respectively). The 980 Ti only has 6GB of memory, so even with Ultra textures it still didn't have enough VRAM; however, there was still an increase in performance, even if it was a small one. I wonder how the results would have differed if the texture settings had been tested at 1440p or 1080p instead.


From the Hardware Unboxed numbers it can be concluded that a change in texture quality in Doom has no impact on the frame rate of the game by itself (see the 1080 Ti, Radeon VII, 3080 and 2080 Ti results). It would be fair to conclude that other games will follow the same trend, where texture settings have no, or only a small, impact on frame rate.

Exceeding the VRAM buffer of these cards by 1GB caused a significant decrease in frame rates.
Reducing the VRAM usage, so that the GPUs did not exceed their 8GB of VRAM, caused average frame rates to increase by 20%-29% and minimum frame rates to increase by 23%-40%.

The 2080 Super and non-Super, the 2070 Super and the 5700 XT ran out of VRAM before they ran out of GPU horsepower.
All they are going to do is ignore it.
We have the proof we need that 10GB isn't enough. And when you show it they will simply say that it's not "powerful enough".
They have to stick to that narrative ;). LOL
 
Associate
Joined
15 Feb 2009
Posts
209
All they are going to do is ignore it.
We have the proof we need that 10GB isn't enough. And when you show it they will simply say that it's not "powerful enough".
They have to stick to that narrative ;). LOL

But where is the evidence that 10GB isn't enough? I still haven't seen any direct evidence after all of this. The 3080 is beating the older-generation 11GB cards with no issues at 4K?
 
Associate
Joined
13 Jul 2020
Posts
500
Doom is the only 'case'. It's not even texture quality, it's pooling: in other words, filling VRAM even when it isn't needed. Texture quality is identical. Good job hanging on to that one exception to prove nothing at all.
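To illustrate what that pooling distinction means in practice, here is a toy sketch (an entirely hypothetical class and numbers, not any real engine's API): an engine can reserve a large texture pool up front, so a VRAM monitor reports the size of the pool rather than what the current scene actually requires.

```python
# Toy illustration of texture pooling; hypothetical, not a real engine API.
# The point: "VRAM used" can track a pre-reserved budget, not the amount
# of texture data the current scene actually needs.
class TexturePool:
    def __init__(self, budget_gb: float):
        self.budget_gb = budget_gb   # reserved up front; shows up as "used"
        self.needed_gb = 0.0         # what the current scene really requires

    def load_scene(self, required_gb: float) -> None:
        # Old textures stay cached until the budget is exhausted, so the
        # reported usage stays at the budget even for lighter scenes.
        self.needed_gb = min(required_gb, self.budget_gb)

    def reported_usage_gb(self) -> float:
        return self.budget_gb        # what an overlay or monitor would show


pool = TexturePool(budget_gb=9.0)    # an engine grabbing ~9GB as its pool
pool.load_scene(required_gb=6.5)     # the scene only needs ~6.5GB
print(pool.reported_usage_gb())      # still prints 9.0, the pooled figure
```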
 