10GB VRAM enough for the 3080? Discuss...

Status
Not open for further replies.
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Exactly mate. Seems people like you and I are some of the few with enough common sense to see that the 3080 should have had more than 10GB. Thankfully, Nvidia agree and are releasing a 20GB version. Or get the 16GB 3070 if you want to save a few bucks.

This is a great idea. We should forget the 3080/90 and focus on the 3070, the worst card in the lineup. We should all be using memory that offers half the bandwidth of GDDR6X :rolleyes:

Or to put it another way, don't be a Dave :D
 
Soldato
Joined
31 Oct 2002
Posts
9,860
I would expect a 16gb 3070 to get beat by a 10gb 3080 in everything now. In a few years (which is really what this vram discussion is about) I bet the 10gb 3080 will still beat a 16gb 3070 in the vast majority of the new games. There may be a game here or there that is coded in such a way that it gobbles up vram without hammering the GPU....those select few titles might put a 16gb 3070 ahead of a 3080 if and when the settings are carefully selected to get that outcome.

? The 16GB 3070 will be cheaper than the 10GB 3080; of course it will be slower.
 
Soldato
Joined
31 Oct 2002
Posts
9,860
This is a great idea. We should forget the 3080/90 and focus on the 3070, the worst card in the lineup. We should all be using memory that offers half the bandwidth of GDDR6X :rolleyes:

Or to put it another way, don't be a Dave :D

Did you even read what you quoted? I said "get the 3070 16GB if you want to save a few bucks". This means that if you want to save money and get a cheaper card, you can get a 16GB 3070.

Nvidia obviously won't price the 3070 16GB higher than the 3080 10GB; that would be madness. My point was that even lower-range cards than the 3080 10GB will have more VRAM.
 
Soldato
Joined
2 Oct 2012
Posts
3,246
People like you make me laugh. You're given examples of 8GB not being enough for some games at 4K, yet you refuse to accept this.

Example - Doom Eternal. Released March 2020. It takes a significant performance hit on 8GB cards when running at 4K, due to running out of VRAM.

The issue is that some don't want to buy a 10GB card that's already on or close to the limits of what's needed for 4K. A brand new GPU is supposed to be the top dog for a few years, not have questions about its capacity from day one.

Nope, you make me laugh. Before the new generation of cards was announced, no one anywhere was saying 8-11GB of VRAM wasn't enough at 4K. What makes me laugh even more is that Dave2150 apparently knows more about how games request and use VRAM than Nvidia does.
If Nvidia think 10GB is enough then it's enough. Just because you decided to crank a game such as Doom Eternal up to max settings and found it struggles at 4K proves nothing. Is that a VRAM problem, or your inability to understand that running a game at 4K max settings will cripple the card, because that's what those settings are supposed to do? Just like Crytek did with Crysis so that their game ages better.
 
Soldato
Joined
20 Aug 2019
Posts
3,030
Location
SW Florida
? The 16GB 3070 will be cheaper than the 10GB 3080; of course it will be slower.

? Maybe you missed what I was replying to:

Exactly mate. Seems people like you and I are some of the few with enough common sense to see that the 3080 should have had more than 10GB. Thankfully, Nvidia agree and are releasing a 20GB version. Or get the 16GB 3070 if you want to save a few bucks.

I would expect a 16gb 3070 to get beat by a 10gb 3080 in everything now. In a few years (which is really what this vram discussion is about) I bet the 10gb 3080 will still beat a 16gb 3070 in the vast majority of the new games. There may be a game here or there that is coded in such a way that it gobbles up vram without hammering the GPU....those select few titles might put a 16gb 3070 ahead of a 3080 if and when the settings are carefully selected to get that outcome.
 
Soldato
Joined
20 Aug 2019
Posts
3,030
Location
SW Florida
My point was that even lower range cards than the 3080 10GB will have more VRAM.

So...what? Slower is slower.

I don't care if a cheaper card has 32gb of vram. If it's slower, it's slower.

I may buy a slower card because it's cheaper, but I'm not going to buy a slower card just because it has more vram.
 
Associate
Joined
2 Oct 2020
Posts
36
I'm reminded of the big debate around CPUs a few years ago, when Crysis came out. One of the big claims was that it was the game of the future, one you could keep coming back to again and again, every generation, with new futuristic CPUs and their eleventy billion GHz clock speeds...

As it turned out gaming didn't go that way, clock speeds didn't go up by much and instead CPUs went the route of multiple cores. Crysis never got to unlock its true potential because it banked on the wrong thing.

My point is there's no need to get too wound up about VRAM or anything, because no one knows what advances this generation will bring or what route devs will take in future. Maybe DLSS will become amazing and we'll all be able to render at 360p blown up to 8K, who knows.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Did you even read what you quoted? I said "get the 3070 16GB if you want to save a few bucks". This means that if you want to save money and get a cheaper card, you can get a 16GB 3070.

Nvidia obviously won't price the 3070 16GB higher than the 3080 10GB; that would be madness. My point was that even lower-range cards than the 3080 10GB will have more VRAM.

Yep. I replied....

This is a great idea. We should forget the 3080/90 and focus on the 3070, the worst card in the lineup. We should all be using memory that offers half the bandwidth of GDDR6X :rolleyes:

Or to put it another way, don't be a Dave :D

Do you really think a slower GPU with slower memory would be a good choice for someone currently considering a 3080? It's not hard to work out. The 3080 GPU is borderline for RT now. Traditionally the amount of VRAM is decided by the bus width. The consoles will use ~10GB for graphics out of only 16GB of total memory, with the Xbox actually having a 6GB/10GB split. Streaming data is the new direction games will take. By the time you need more VRAM you will also need a new GPU.

BTW, the 3070 will also come in an 8GB version, which is what Ubisoft are recommending for Watch Dogs: Legion @ 1440p, or a 10GB 3080 @ 4K.
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK
Exactly mate. Seems people like you and I are some of the few with enough common sense to see that the 3080 should have had more than 10GB. Thankfully, Nvidia agree and are releasing a 20GB version. Or get the 16GB 3070 if you want to save a few bucks.

A quick appeal to people using this argument. Putting aside for a second that these are so far just rumors and not official announcements, can we at least acknowledge that Nvidia are a for-profit organization, and that if there's market demand for something then it's in their best interest to fill that demand. The "fact" that Nvidia would make such a card is not an acknowledgement that such a vRAM config is necessary for gaming, any more than the existence of gold-plated HDMI connectors is an acknowledgement that gold-plated HDMI connectors are better (hint: they're not). The only thing it indicates is that there's a market to sell to.

I would expect a 16gb 3070 to get beat by a 10gb 3080 in everything now. In a few years (which is really what this vram discussion is about) I bet the 10gb 3080 will still beat a 16gb 3070 in the vast majority of the new games. There may be a game here or there that is coded in such a way that it gobbles up vram without hammering the GPU....those select few titles might put a 16gb 3070 ahead of a 3080 if and when the settings are carefully selected to get that outcome.

Basically what I said. Those things are, generally speaking, edge cases, and who seriously cares about them? People care about the mainstream trends. You do not pile assets into vRAM in a modern game engine without it impacting frame rate. An open world game isn't going to load an asset 300 miles away into vRAM for no good reason; if it's going into vRAM, it's because it's on some nearby game tile/zone that will likely be used in the next few seconds or minutes. In the Nvidia 8nm Ampere thread, when we discussed this issue, people brought up numerous examples of "games that use more than 10GB of vRAM" but didn't even check the performance charts to see that the very examples they posted are obliterated in terms of frame rate.

I actually expect my components to become obsolete over time. I also expect to have to turn down settings as my equipment ages and new games come out....and then, at some point, I expect to buy new hardware.

A natural progression felt by anyone who has been in this game for very long. The important part of this post is that as you turn down the settings of newer games on older hardware, the vRAM requirements drop too. The argument that it's FUTURE games that will be the vRAM problem is a bad argument for this reason: future games, like all current games, will demand varying amounts of vRAM based on the settings used.

I'm reminded of the big debate around CPUs a few years ago, when Crysis came out. One of the big claims was that it was the game of the future, one you could keep coming back to again and again, every generation, with new futuristic CPUs and their eleventy billion GHz clock speeds...

As it turned out gaming didn't go that way, clock speeds didn't go up by much and instead CPUs went the route of multiple cores. Crysis never got to unlock its true potential because it banked on the wrong thing.

My point is there's no need to get too wound up about VRAM or anything, because no one knows what advances this generation will bring or what route devs will take in future. Maybe DLSS will become amazing and we'll all be able to render at 360p blown up to 8K, who knows.

This is a good point. And one last thing to add: Crysis was a game that took a gamble on how hardware would go. But if you look at the next-gen consoles, they're making hardware bets that will define how games respond. And the bets that both new consoles are making are on very fast SSDs and on technology that allows direct SSD-to-vRAM (or vRAM-equivalent) transfer. On Xbox and Windows this is Microsoft's DirectStorage, and Nvidia's implementation of it is RTX IO. If you want to make a bet on the future, bet on the massive investment made in these areas. That, and the fact that these console devs basically paired their modern GPUs with no more than 10GB of memory themselves.
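
To make the streaming point concrete, here's a rough toy sketch in Python of the idea that only assets near the player need to be resident in vRAM, with everything else pulled from fast storage on demand. The tile grid, radius and helper names are all invented for illustration; no real engine works exactly like this:

```python
# Toy sketch of distance-based asset streaming: only tiles near the player are
# kept resident in (simulated) VRAM, everything else stays on disk until needed.
# Purely illustrative; real engines use far more sophisticated heuristics.

def resident_tiles(player_tile, world_size, radius=2):
    """Return the set of tile coordinates that should be resident in VRAM."""
    px, py = player_tile
    return {
        (x, y)
        for x in range(max(0, px - radius), min(world_size, px + radius + 1))
        for y in range(max(0, py - radius), min(world_size, py + radius + 1))
    }

def update_streaming(loaded, player_tile, world_size):
    """Load newly needed tiles and evict tiles the player has moved away from."""
    wanted = resident_tiles(player_tile, world_size)
    to_load, to_evict = wanted - loaded, loaded - wanted
    return wanted, to_load, to_evict

loaded = set()
for pos in [(0, 0), (1, 0), (5, 5)]:   # player moving across a 32x32 tile world
    loaded, to_load, to_evict = update_streaming(loaded, pos, 32)
    print(pos, "load", len(to_load), "evict", len(to_evict), "resident", len(loaded))
```

The resident set stays small no matter how big the world is; what matters is how quickly the "load" half of each update can be serviced, which is exactly what DirectStorage/RTX IO style transfers are meant to speed up.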
 
Associate
Joined
1 Oct 2009
Posts
1,033
Location
Norwich, UK

THAT...is super flipping interesting. I had no idea the original point of DLSS was to render at native res, then upscale to above your native resolution, and then downsample that back to native again, to effectively give you a type of anti-aliasing. According to that video it was only later in development that Nvidia shifted this to using lower than your native resolution and upscaling to get native. I assumed that was the original purpose because RTX effects are so slow that doing them at more than 1080p is impossible, and thus to maintain current expectations of 1440p and 4K, smart upscaling was needed.

My bugbear with AA for years and years now has been that all these post-processing AA techniques do not add more information into the scene by taking sub-samples; they just blur the existing image using post-processing filters, which suck. Whereas DLSS used at native res, upsampling past that, gives you a kind of hybrid that results in what looks like pretty damn good AA, especially compared to the soup that is FXAA and the like. This is a bit of a game changer to me; I can see myself way more interested in running true 4K with RTX off and using DLSS to effectively do anti-aliasing. Something about lower native res with DLSS up until this point never sat well with me.

This is a really good find, thank you.
 
Soldato
Joined
6 Feb 2019
Posts
17,566
Yeah, DLSS was originally supposed to be a low-performance-cost supersampling method - so you'd run the game at, say, 1440p natively, then DLSS would upsample to a higher-resolution version and downsample that back to 1440p, to try and produce a better-than-native image with no need for anti-aliasing like TAA, and with a performance cost not much different to usual 1440p.

That quickly got scrapped though, and they went in the opposite direction.

Both methods are trying to achieve the same outcome - an image that looks better than your PC could otherwise natively output at a given performance level - but the steps to get there are different. I suppose the original method either had a higher-than-expected performance cost or the image quality wasn't what they expected.
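
As a toy illustration of that original "render above native, then average back down" idea - this only shows the brute-force supersampling step on a made-up test pattern, with no neural network involved:

```python
import numpy as np

def render(width, height):
    """Toy 'renderer': a hard-edged diagonal pattern that aliases at low res."""
    y, x = np.mgrid[0:height, 0:width]
    return ((x * 7 + y * 3) % 64 < 32).astype(float)

def downsample(img, factor):
    """Box-filter downsample: average factor x factor blocks (the SSAA step)."""
    h, w = img.shape
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

native = render(320, 180)                        # plain native render, hard aliased edges
supersampled = downsample(render(1280, 720), 4)  # render above native, then average down

# The native image only contains pure 0/1 pixels (stair-stepped edges), while the
# averaged-down image contains intermediate edge values (smooth gradients), which
# is the anti-aliasing effect described above.
print(np.unique(native).size, np.unique(supersampled).size)
```

The trick in the original DLSS pipeline was to get that higher-resolution version from a network rather than actually rendering it, so you'd pay roughly native-res cost for something close to supersampled quality.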
 
Associate
Joined
29 Oct 2002
Posts
806
People need to pivot their thinking on this: how games use vRAM has changed radically over the last decade or so, and vRAM scaling doesn't need to be as aggressive anymore, which is why all the new technology is focused on speeding up the SSD/storage-to-GPU link rather than inflating GDDR6X usage.

Yes, exactly. We use a cache to speed up requests which would otherwise have to be fulfilled by a much slower storage system. As storage gets faster and faster you rely upon the cache less and less, which acts as a counterweight to ever-increasing VRAM sizes.

From what I gather, right now at 4K you don't need more than 10GB of VRAM, and that's unlikely to change soon because very few people are running more than 4K resolution. It may be many, many years before consumer gamers need more than that.
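
A rough back-of-the-envelope illustration of that counterweight effect (the access times below are invented round numbers, not benchmarks):

```python
# Rough illustration: average time to fetch an asset when VRAM acts as a cache
# in front of slower storage. All numbers are made up for illustration only.

def avg_fetch_time_ms(hit_rate, vram_ms, storage_ms):
    """Expected fetch time given a cache hit rate and per-tier access times."""
    return hit_rate * vram_ms + (1 - hit_rate) * storage_ms

vram_ms = 0.01   # hypothetical VRAM access time
hdd_ms = 10.0    # hypothetical slow backing storage (old HDD era)
nvme_ms = 0.5    # hypothetical fast NVMe + direct-to-GPU style path

for hit_rate in (0.99, 0.90):
    print(f"hit rate {hit_rate:.0%}: "
          f"HDD backing = {avg_fetch_time_ms(hit_rate, vram_ms, hdd_ms):.3f} ms, "
          f"NVMe backing = {avg_fetch_time_ms(hit_rate, vram_ms, nvme_ms):.3f} ms")
```

With a slow backing store, dropping the hit rate from 99% to 90% is roughly a 10x penalty, so you want as much cache (vRAM) as you can get; with a fast path behind it, the same drop costs far less in absolute terms, which is the argument for faster storage relieving pressure on vRAM capacity.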
 
Soldato
Joined
31 Oct 2002
Posts
9,860
Nope, you make me laugh. Before the new generation of cards was announced, no one anywhere was saying 8-11GB of VRAM wasn't enough at 4K. What makes me laugh even more is that Dave2150 apparently knows more about how games request and use VRAM than Nvidia does.
If Nvidia think 10GB is enough then it's enough. Just because you decided to crank a game such as Doom Eternal up to max settings and found it struggles at 4K proves nothing. Is that a VRAM problem, or your inability to understand that running a game at 4K max settings will cripple the card, because that's what those settings are supposed to do? Just like Crytek did with Crysis so that their game ages better.

You're aware that before these cards launched, the majority of those playing at 4K 60+ FPS were on 2080 Tis? Do you know how much memory a 2080 Ti has?

The 2080 Ti is the bare minimum for 4K at a comfortable FPS in AAA titles. The 3000 series is what takes it mainstream.
 

fx1

Associate
Joined
20 Mar 2015
Posts
173
You're aware that before these cards launched, the majority of those playing at 4K 60+ FPS were on 2080 Tis? Do you know how much memory a 2080 Ti has?

The 2080 Ti is the bare minimum for 4K at a comfortable FPS in AAA titles. The 3000 series is what takes it mainstream.
Dave, you don't really have a clue how VRAM works.

Game devs don't make games for video cards that don't exist. 0.01% of users have more than 11GB of VRAM.

Just because a game dev put 8x MSAA into a game with 200% scaling doesn't mean you are supposed to max those settings out and blast your VRAM use to 20GB. OH MA GAAAD I NEEEEEED 20 GIGGGGGSSSSSS!!!!! CUS OF FLIIIGHT SSSIIIIIM!!!!

Ever considered that the reason those settings are there is that you can choose EITHER high MSAA OR 200% resolution scaling?

If you max your VRAM out you are killing your performance because YOUR GPU CAN'T HANDLE THE FRAMES!

So you are getting worse performance due to maxing out your VRAM, then blaming the lack of VRAM, when in fact you killed your GPU by overloading it. The amount of VRAM is paired with the memory bus and memory bandwidth so that it can balance the load with the GPU.

The ONLY caveat is professional use, which isn't about FPS in games.
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,524
Location
Greater London
Oh, would you look at that: Ubisoft updated their requirements chart, the VRAM needed has gone from 11GB to 10GB, and they now recommend a 3080 over a 2080 Ti for 4K. This was obviously going to happen, as I said, but people were loving using the original requirements to point out how 10GB would not be enough. Who is laughing now? :D


 
Soldato
Joined
6 Feb 2019
Posts
17,566
Oh, would you look at that: Ubisoft updated their requirements chart, the VRAM needed has gone from 11GB to 10GB, and they now recommend a 3080 over a 2080 Ti for 4K. This was obviously going to happen, as I said, but people were loving using the original requirements to point out how 10GB would not be enough. Who is laughing now? :D




VRAM and RAM shouldn't be on system requirements anymore; they are totally pointless and just confuse people - also, developers have no bloody clue.
 