10GB of VRAM enough for the 3080? Discuss..

Status
Not open for further replies.
Soldato
Joined
6 Feb 2019
Posts
17,617
And yet the Radeon VII with its 16GB of VRAM only manages 24fps when even a 2080 Super with 8GB manages 29fps, and the 2080 Ti with its 11GB manages 35fps.

I think there are a lot of things at play when we talk about VRAM. Some games will use it all up just because it's there. That doesn't mean you gain anything from it.

Maybe we need to test it with the pagefile. When a game is actually running out of VRAM, it will start loading data into system RAM, right? So let's check if a 2080 playing FS has higher system RAM usage than a Radeon VII - because in theory, if the game is pushing data into RAM instead of VRAM, that will negatively affect performance. How much, though, is hard to find out.
 
Soldato
Joined
18 May 2010
Posts
22,389
Location
London
Maybe we need to test it with the pagefile. When a game is actually running out of VRAM, it will start loading data into system RAM, right? So let's check if a 2080 playing FS has higher system RAM usage than a Radeon VII

It will start swapping out to system RAM, I believe, yes.

However, swapping out to system RAM would probably cause stutter, because it's not fast enough to swap things in and out.

But DirectStorage / RTX IO would fix that.
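As a rough sketch of why spilling into system RAM hurts, the bandwidth gap alone tells the story. A back-of-envelope comparison (all figures here are illustrative assumptions, not measured values):

```python
# Back-of-envelope cost of texture data spilling from VRAM into system RAM.
# All bandwidth figures are illustrative assumptions, not measured values.

GB = 1024 ** 3
MB = 1024 ** 2

vram_bandwidth = 760 * GB      # ~RTX 3080 GDDR6X memory bandwidth, bytes/s (assumed)
pcie4_x16_bandwidth = 32 * GB  # practical PCIe 4.0 x16 ceiling, bytes/s (assumed)

def fetch_time_ms(size_bytes, bandwidth):
    """Time to move size_bytes at the given bandwidth, in milliseconds."""
    return size_bytes / bandwidth * 1000

spilled = 100 * MB  # suppose 100 MB of textures per frame live in system RAM

local = fetch_time_ms(spilled, vram_bandwidth)
remote = fetch_time_ms(spilled, pcie4_x16_bandwidth)
print(f"from VRAM: {local:.2f} ms, over PCIe: {remote:.2f} ms")
# A 60fps frame budget is ~16.7 ms, so a few ms of PCIe traffic
# per frame is easily visible as stutter.
```

Even with generous assumptions, data that has to cross the PCIe bus every frame costs an order of magnitude more time than data resident in VRAM.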
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Thank you!

People keep talking as if VRAM is the end of the world, but there already exists a monstrously good AMD graphics card with heaps of VRAM... that sadly gets manhandled by the anaemic 8GB RTX 2080.

Even if AMD release GPUs with more VRAM, the 3080 clearly still has the potential to be far better than them.
But we only have old-gen games with which to draw that conclusion.

Will 8GB cards be optimal for next-gen games?

We already have the id Software lead engine programmer saying, "Don't touch anything with < 8GB, which is the absolute minimum going forwards."

Do you want to bet that an 8GB (3070) or 10GB (3080) card will not have VRAM issues with next-gen games at all? Even if it's only 10% of them.
 
Soldato
Joined
19 May 2012
Posts
3,633
It will start swapping out to system RAM, I believe, yes.

However, swapping out to system RAM would probably cause stutter, because it's not fast enough to swap things in and out.

But DirectStorage / RTX IO would fix that.


We will need to see evidence of DirectStorage and RTX IO first.

At the moment it just sounds like another piece of NVIDIA crap tech which we won't hear about after 2 years. I hope I'm wrong, but I got stung by the NVIDIA VirtualLink hype last year, which died a quick death with zero adopted headsets and was removed for the 3xxx series.
 
Soldato
Joined
19 May 2012
Posts
3,633
But we only have old-gen games with which to draw that conclusion.

Will 8GB cards be optimal for next-gen games?

We already have the id Software lead engine programmer saying, "Don't touch anything with < 8GB, which is the absolute minimum going forwards."

Do you want to bet that an 8GB (3070) or 10GB (3080) card will not have VRAM issues with next-gen games at all? Even if it's only 10% of them.


Yeah, it's a good point, no doubt about it.

In all honesty, for my own personal circumstances, I need the 3080 to last me 24 months. Then I'll upgrade.

Hopefully Cyberpunk gives us a good indication of whether 10GB of VRAM is enough.

If the 3080 can do 4K/60+ in Cyberpunk at ultra with all RTX options on, I think we are good to go. Similarly, if AMD can do the same, then I think that card will also be awesome.
 
Soldato
Joined
18 May 2010
Posts
22,389
Location
London
We will need to see evidence of DirectStorage and RTX IO first.

At the moment it just sounds like another piece of NVIDIA crap tech which we won't hear about after 2 years. I hope I'm wrong, but I got stung by the NVIDIA VirtualLink hype last year, which died a quick death with zero adopted headsets and was removed for the 3xxx series.

I agree. But... it's not an Nvidia tech. It's because the next-gen consoles have an architecture that can do things the PC can't, so they have needed to think up a solution for the PC, otherwise we would get left behind.

I expect AMD will have a similar tech on their new cards too.
 
Soldato
Joined
19 May 2012
Posts
3,633
I agree. But... it's not an Nvidia tech. It's because the next-gen consoles have an architecture that can do things the PC can't, so they have needed to think up a solution for the PC, otherwise we would get left behind.

I expect AMD will have a similar tech on their new cards too.


Very good point. I hope you are correct and it works, but with NVIDIA and new technologies I won't hold my breath. If it needs to be adopted by developers, I will already assume it's dead *looks at DLSS and RTX and PhysX*.

We need universal, open solutions to these issues if consoles really are going to pose porting problems... not NVIDIA proprietary rubbish.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
I agree. But... it's not an Nvidia tech. It's because the next-gen consoles have an architecture that can do things the PC can't, so they have needed to think up a solution for the PC, otherwise we would get left behind.

I expect AMD will have a similar tech on their new cards too.

Hopefully. DirectStorage (which is what powers the 'velocity architecture' in the Series X) is a Microsoft API, and as we know, that's coming to the PC. AMD just need to utilise it, and considering the hardware/software solution in the Series X was a joint effort between MS and AMD, it would be a really bizarre choice if AMD didn't take advantage of it on the PC.
 
Soldato
Joined
6 Feb 2019
Posts
17,617
We will need to see evidence of DirectStorage and RTX IO first.

At the moment it just sounds like another piece of NVIDIA crap tech which we won't hear about after 2 years. I hope I'm wrong, but I got stung by the NVIDIA VirtualLink hype last year, which died a quick death with zero adopted headsets and was removed for the 3xxx series.

DirectStorage is Microsoft tech and RTX IO is Nvidia tech - though they do the same thing, Nvidia just wants to try and make everything Nvidia-branded.

It's the real deal, but the PC has been caught napping. Consoles are ready to go with DirectStorage this November and their games will load in 1-2 seconds. The Windows development side of this new tech is only starting now; that's why Nvidia has said it's up to a year away from launch before developers can start porting console games to it.
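For a rough sense of where the "1-2 seconds" figure comes from: load time is basically asset size divided by effective throughput, and hardware decompression raises the effective rate well past what the raw drive delivers. A quick sketch (all throughput figures are assumptions for illustration, not specs):

```python
# Rough load-time arithmetic behind the "1-2 seconds" claim: load time is
# asset size over effective throughput. All figures are assumptions.

def load_time_s(asset_gb, throughput_gb_per_s):
    return asset_gb / throughput_gb_per_s

assets = 9.0  # GB of level data to read in (assumed)

sata_ssd = load_time_s(assets, 0.5)       # SATA SSD, ~0.5 GB/s
nvme_cpu = load_time_s(assets, 2.5)       # NVMe, but CPU-side decompression bottleneck (assumed)
nvme_direct = load_time_s(assets, 8.0)    # NVMe + hardware decompression (assumed effective rate)

print(f"SATA: {sata_ssd:.1f}s  NVMe via CPU: {nvme_cpu:.1f}s  direct path: {nvme_direct:.1f}s")
```

The point isn't the exact numbers - it's that only the direct, hardware-decompressed path gets a full level's worth of assets under the 2-second mark.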
 
Soldato
Joined
19 May 2012
Posts
3,633
DirectStorage is Microsoft tech and RTX IO is Nvidia tech - though they do the same thing, Nvidia just wants to try and make everything Nvidia-branded.

It's the real deal, but the PC has been caught napping. Consoles are ready to go with DirectStorage this November and their games will load in 1-2 seconds. The Windows development side of this new tech is only starting now; that's why Nvidia has said it's up to a year away from launch before developers can start porting console games to it.


Thank you. So it affects game load times... will it affect performance in the interim? I'm going to take a wild guess... yes, but it won't translate to consoles ever being ahead of PCs, because of the crazy GPU grunt we have from NVIDIA and hopefully AMD?
 
Associate
Joined
16 Jul 2019
Posts
102
No, but it'd be nice if you were to quote VRAM usage for a specific game in the context of the 'is this enough VRAM' debate with at least a proviso and an explanation of the point you're trying to illustrate, if you understand how VRAM works.

Your post in isolation is misleading: people who don't know your post history could read it as saying HZD needs 8GB of VRAM fully used at all times to function at 1080p, which doesn't seem to be the case even at 4K.
You quoted and responded to the back end of a quick 'to and fro' from an initial post of mine. What you ask for is all in my original post... as I said previously.
 
Soldato
Joined
9 Nov 2009
Posts
24,856
Location
Planet Earth
But we only have old-gen games with which to draw that conclusion.

Will 8GB cards be optimal for next-gen games?

We already have the id Software lead engine programmer saying, "Don't touch anything with < 8GB, which is the absolute minimum going forwards."

Do you want to bet that an 8GB (3070) or 10GB (3080) card will not have VRAM issues with next-gen games at all? Even if it's only 10% of them.

Nvidia will probably want to refresh this line as soon as possible IMHO. Samsung 7nm EUV is good enough for IBM to have its next CPUs (with huge dies) out next year, clocking at well over 4GHz.

So once they refresh Ampere on a better node, they can probably also increase standard VRAM amounts, as GDDR6X will drop in price. They will only increase VRAM this generation if AMD has something competitive and they feel the marketing need to do so.
 
Associate
Joined
16 Jul 2019
Posts
102
Playing The Witcher 3 at 1920x1200 60FPS, maximum settings (minus HairWorks), HD textures and increased grass density, MSI Afterburner reports that I use as much as circa 3GB of VRAM. Playing Horizon Zero Dawn at 1920x1200 60FPS, high settings, FOV at 90 and motion blur turned off, MSI Afterburner reports as much as circa 7.5GB of VRAM. I understand that allocation of VRAM does not equate to actual usage. I understand how each GDDR generational jump and the associated increase in operating frequency results in increased performance. I do not understand why the RTX 3070 has only 8GB of VRAM when a triple-A game in 2015 requests circa 3GB of VRAM and a triple-A game in 2020 requests almost 8GB.
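As a purely illustrative exercise, extrapolating those two data points (and remembering Afterburner reports allocation, not true usage) gives a rough growth rate for VRAM demand:

```python
# Naive extrapolation from the two data points above: The Witcher 3 at ~3 GB
# (2015) and Horizon Zero Dawn at ~7.5 GB (2020), both as reported by MSI
# Afterburner, which shows allocation rather than true usage. Illustrative only.

w3_2015, hzd_2020 = 3.0, 7.5  # GB allocated
years = 2020 - 2015

growth = (hzd_2020 / w3_2015) ** (1 / years)  # works out to ~1.2x per year

def projected_gb(year):
    """Projected allocation if the same trend continued past 2020."""
    return hzd_2020 * growth ** (year - 2020)

for y in (2021, 2022, 2024):
    print(y, f"{projected_gb(y):.1f} GB")
```

Two games are nowhere near a trend, but if allocation kept compounding at that rate, an 8GB card would be outgrown within a couple of years of the new consoles.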
 
Soldato
Joined
9 Nov 2009
Posts
24,856
Location
Planet Earth
Playing The Witcher 3 at 1920x1200 60FPS, maximum settings (minus HairWorks), HD textures and increased grass density, MSI Afterburner reports that I use as much as circa 3GB of VRAM. Playing Horizon Zero Dawn at 1920x1200 60FPS, high settings, FOV at 90 and motion blur turned off, MSI Afterburner reports as much as circa 7.5GB of VRAM. I understand that allocation of VRAM does not equate to actual usage. I understand how each GDDR generational jump and the associated increase in operating frequency results in increased performance. I do not understand why the RTX 3070 has only 8GB of VRAM when a triple-A game in 2015 requests circa 3GB of VRAM and a triple-A game in 2020 requests almost 8GB.

VRAM amounts might stay constant for the first year of the new consoles, as the old ones still have more volume (so it's half in, half out). But it's from 2022 onwards that we will start to see if there are problems. The thing is, by then both AMD and Nvidia will have newer GPUs out, so TBF neither will care.

It was like the GTX 1060 3GB: it was OK for a while, but it was only from 2018 onwards that you started to see problems with the VRAM amount against the 6GB model.
 
Associate
Joined
16 Jul 2019
Posts
102
VRAM amounts might stay constant for the first year of the new consoles, as the old ones still have more volume (so it's half in, half out). But it's from 2022 onwards that we will start to see if there are problems. The thing is, by then both AMD and Nvidia will have newer GPUs out, so TBF neither will care.

It was like the GTX 1060 3GB: it was OK for a while, but it was only from 2018 onwards that you started to see problems with the VRAM amount against the 6GB model.
I pretty much agree with you; I reckon 6/8GB of VRAM will be the typical standard for a few years.
 