
10GB VRAM enough for the 3080? Discuss..

Status
Not open for further replies.
Soldato
Joined
2 Oct 2012
Posts
3,246
This forum makes me laugh.
Before the consoles or new cards were announced, 8GB and 12GB was plenty. Now, ooh no, it's not enough.
Like all those who think their 1070, 1080, 1080 Ti etc. suddenly lost performance in current games, so they need a 3080 or 3070.
 
Associate
Joined
13 Oct 2008
Posts
1,132
The bus width dictates the number of VRAM chips that MUST be mounted on the PCB. A GPU with a 256-bit bus has 8 memory chip placements, since each memory chip has a 32-bit bus. You can double the memory amount on a 256-bit bus and have 16 memory chips (front and back of the PCB), but you couldn't, for example, have 10 chips. VRAM chips are usually 1GB or 2GB.

192-bit bus = 6 modules (hence the 12GB Navi 22 rumours). Clearly, having 6GB of VRAM on this level of GPU would be a serious problem for 4K gaming.
256-bit bus = 8 modules (so either 8GB or 16GB versions). Going with 8GB would not be an option if this GPU is aimed at 4K.
320-bit bus = 10 modules (now you see why we have 10GB or 20GB versions).
384-bit bus = 12 modules (12x2GB). It would be possible for Nvidia to release a 12GB 3090 as a cut-down, cheaper model that is still decent for 4K gaming. This could be an option if the RX 6900 XT is close to 3090 speed for half the price: Nvidia release a cheaper 12GB 3090 and they have an answer to AMD.

This is helpful to illustrate the RAM packaging options they have available, thanks!
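
As a quick sanity check, the chips-per-bus arithmetic fits in a few lines. A minimal sketch in Python, assuming one 32-bit channel per chip and the usual 1GB/2GB chip densities (the helper name is just for illustration):

```python
# Possible VRAM configurations for a given memory bus width,
# assuming 32-bit chips available in 1GB or 2GB densities.

def vram_options(bus_width_bits):
    chips = bus_width_bits // 32  # one 32-bit channel per chip
    # 1GB per chip, or 2GB per chip (denser chips, or clamshell
    # mode with chips on both the front and back of the PCB)
    return [chips * 1, chips * 2]

for bus in (192, 256, 320, 384):
    print(f"{bus}-bit bus -> {vram_options(bus)} GB options")
# 192-bit -> [6, 12], 256-bit -> [8, 16],
# 320-bit -> [10, 20], 384-bit -> [12, 24]
```

Which reproduces exactly the 6/12, 8/16, 10/20 and 12/24 GB options listed above.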
 
Associate
Joined
11 Apr 2015
Posts
272
This forum makes me laugh.
Before the consoles or new cards were announced, 8GB and 12GB was plenty. Now, ooh no, it's not enough.
Like all those who think their 1070, 1080, 1080 Ti etc. suddenly lost performance in current games, so they need a 3080 or 3070.

Well, for 4K gaming I can tell you my 1080 Ti does not cut it. I am torn between thinking 10GB is enough and thinking it isn't. Maybe for vanilla games, sure, but what about when you add a bunch of 4K texture mods to, say, Fallout 4, or upcoming games like Starfield? 10GB does not seem like enough. My 1080 Ti with 11GB of VRAM at 4K in Fallout 4 with texture mods gives stuttering and FPS drops when walking around and even when shooting things.
 
Soldato
Joined
31 Oct 2002
Posts
9,863
This forum makes me laugh.
Before the consoles or new cards were announced, 8GB and 12GB was plenty. Now, ooh no, it's not enough.
Like all those who think their 1070, 1080, 1080 Ti etc. suddenly lost performance in current games, so they need a 3080 or 3070.

People like you make me laugh. You're given examples of 8GB not being enough for some games at 4K, yet you refuse to accept this.

Example - Doom Eternal. Released March 2020. It takes a significant performance hit on 8GB cards when running at 4K, due to running out of VRAM.

The issue is, some of us don't want to buy a 10GB card that's already on or close to the limits of what's needed for 4K. A brand new GPU is supposed to be top dog for a few years, not have questions over its capacity from day one.
 
Soldato
Joined
18 May 2010
Posts
22,376
Location
London
I don't claim to have the answers.

But my theory is that next-gen games are not going to be comparable to current-gen games in terms of VRAM and the way they use it.

Next-gen games will be able to stream far more data, at much higher speed, from the disk subsystem via technologies such as RTX IO (DirectStorage).

How this will affect VRAM requirements, who knows.

For previous- and current-gen games, which are not engineered to stream from disk and instead store all their assets in VRAM, yes, VRAM is going to be a bottleneck.

But next-gen games, which stream from disk at speeds we currently cannot imagine, may not need to store so many assets in memory.

There is an excellent article on the internet explaining how the PS5 does this. I will go and find it.

---

Here you go:

"The PlayStation 5 features 16GB of GDDR6 unified RAM with 448GB/sec memory bandwidth. This memory is synergized with the SSD on an architectural level and drastically boosts RAM efficiency. The memory is no longer "parking" data from an HDD; the SSD can deliver data right to the RAM almost instananeously.
Essentially the SSD significantly reduces latency between data delivery and memory itself. The result sees RAM only holding assets and data for the next 1 second of gameplay. The PS4's 8GB of GDDR5 memory held assets for the next 30 seconds of gameplay.
"There's no need to have loads of data parked in the system memory waiting to potentially be used. The other way of saying that is the most of the RAM is working on the game's behalf."
The SSD allows Sony to keep RAM capacity down and reduce costs.
"The presence of the SSD reduces the need for a massive inter-generational increase in size.""

Article
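
To put the 1-second vs 30-second figure in perspective, here's a back-of-envelope sketch; the asset consumption rate is my own hypothetical number, not from the article:

```python
# If RAM only has to hold assets for the next T seconds of gameplay,
# the resident working set scales roughly with T * consumption rate.

def resident_budget_gb(lookahead_s, consume_gb_per_s):
    return lookahead_s * consume_gb_per_s

rate = 0.5  # hypothetical: 0.5GB of unique assets consumed per second
print(resident_budget_gb(30, rate))  # HDD-era lookahead: 15.0 GB resident
print(resident_budget_gb(1, rate))   # SSD-era lookahead:  0.5 GB resident
```

Same consumption rate, 30x less RAM needed to hold it, which is the article's whole point.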

---

This is also worth reading.

---

People thinking that we need 16GB of VRAM on a GPU to keep up with a next-gen console are dreaming.

How do they think the console will power itself and run its OS, let alone run the game, even before the topic of VRAM is addressed?

The next-gen Xbox has 10GB of FAST RAM and 6GB of less-fast RAM.

My guess is the FAST RAM will be used as VRAM and the less-fast RAM used for running the console and as game RAM.
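
That guess lines up with the widely reported Series X memory split. A rough sketch with the reported figures (the 2.5GB OS reserve in particular comes from press coverage, so treat it as approximate):

```python
# Rough split of the Series X's 16GB of GDDR6 (reported figures, approximate).
fast_pool_gb = 10.0   # 560GB/s, GPU-optimal - the likely "VRAM"
slow_pool_gb = 6.0    # 336GB/s, standard memory
os_reserve_gb = 2.5   # reportedly reserved from the slow pool for the OS

game_ram_gb = slow_pool_gb - os_reserve_gb
print(f"{fast_pool_gb}GB GPU-optimal + {game_ram_gb}GB standard "
      f"= {fast_pool_gb + game_ram_gb}GB usable by games")
# -> 10.0GB GPU-optimal + 3.5GB standard = 13.5GB usable by games
```

So even the console doesn't hand a game 16GB of 'VRAM'.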

---

So it sounds to me like the consoles will be streaming most of their assets from disk to RAM.

And I see no reason why the 3080, with RTX IO, won't be doing the same. In fact, that is EXACTLY its job and the entire reason it exists.
 
Last edited:
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
This forum makes me laugh.
Before the consoles or new cards were announced, 8GB and 12GB was plenty. Now, ooh no, it's not enough.

I think the following sums up this thread...

8GB = 8192MB
9GB = 9216MB

1GB = 1024MB


People like you make me laugh. You're given examples of 8GB not being enough for some games at 4K, yet you refuse to accept this.

Example - Doom Eternal. Released March 2020. It takes a significant performance hit on 8GB cards when running at 4K, due to running out of VRAM.

The issue is, some of us don't want to buy a 10GB card that's already on or close to the limits of what's needed for 4K. A brand new GPU is supposed to be top dog for a few years, not have questions over its capacity from day one.

Perhaps this was down to the GPU more than the VRAM? Doom Eternal uses just a little over 8GB. The performance comes from megatextures, which result in less fragmentation/garbage collection on the GPU but at the expense of needing more VRAM. This is not a good technique to carry forward and will not benefit from streaming, which is the direction we are heading in within the next year or two.

Top dog for a few years... Well, I imagine the 3080 will do very well for the next two years until Hopper/RDNA3 arrive. Then I will be upgrading again, once more for better RT performance.
 
Soldato
Joined
12 May 2014
Posts
5,236
I don't claim to have the answers.

But my theory is that next-gen games are not going to be comparable to current-gen games in terms of VRAM and the way they use it.

Next-gen games will be able to stream far more data, at much higher speed, from the disk subsystem via technologies such as RTX IO (DirectStorage).

How this will affect VRAM requirements, who knows.
It will free up VRAM to be used for rendering more visual assets on screen.
Unless devs plan on mandating PCIe 4.0 NVMe SSDs, don't expect too much to change on PC. They may require SSDs as minimum spec; we shall see.


People thinking that we need 16GB of VRAM on a GPU to keep up with a next-gen console are dreaming.

How do they think the console will power itself and run its OS, let alone run the game, even before the topic of VRAM is addressed?

The next-gen Xbox has 10GB of FAST RAM and 6GB of less-fast RAM.

My guess is the FAST RAM will be used as VRAM and the less-fast RAM used for running the console and as game RAM.

I still think it is funny that we are talking about a £700+ GPU being on par with a £450 console, and people are saying that this is acceptable and that it's okay if the GPU only lasts one generation before hitting problems. Like I said before, if this was the 3070, then yeah, it would be acceptable and most people would agree.
The bar to impress some people is so low.


---

So it sounds to me like the consoles will be streaming most of their assets from disk to RAM.

And I see no reason why the 3080, with RTX IO, won't be doing the same. In fact, that is EXACTLY its job and the entire reason it exists.

Yeah, it could, when games are coded to take advantage of it; in about 10 years' time, when even the cheapest gaming PC comes with a PCIe 4.0 NVMe drive. I guess at that point we will have moved on to the RTX 8000 series with 100+GB of VRAM.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
People like you make me laugh. You're given examples of 8GB not being enough for some games at 4K, yet you refuse to accept this.

Example - Doom Eternal. Released March 2020. It takes a significant performance hit on 8GB cards when running at 4K, due to running out of VRAM.

The issue is, some of us don't want to buy a 10GB card that's already on or close to the limits of what's needed for 4K. A brand new GPU is supposed to be top dog for a few years, not have questions over its capacity from day one.
Doom Eternal is a terrible example. Have you looked at any comparison screenshots? Ultra Nightmare settings often don't look any better than Ultra, and in some cases it's a bit of a weird regression. I don't know why it's used in these discussions at all.
 
Soldato
Joined
18 May 2010
Posts
22,376
Location
London
But also what gets forgotten about in discussions is price!

Of course a 20GB 3080 is more future-proof than a 10GB 3080.

But it all depends on how much one is willing to shell out for a new GPU.

I thought my 1080 at £450 was expensive.

A 3080 at £700 is already :eek:.

How much more will they charge for a 20GB 3080?
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
But also what gets forgotten about in discussions is price!

Of course a 20GB 3080 is more future-proof than a 10GB 3080.

But it all depends on how much one is willing to shell out for a new GPU.

I thought my 1080 at £450 was expensive.

A 3080 at £700 is already :eek:.

How much more will they charge for a 20GB 3080?

MSRP of the 3080 is still £650. I'd imagine the 20GB version will be ~£800-£850. The extra VRAM will not improve the GPU itself, which is barely managing RT as it is. The GPU will be past its use-by date in two years.
 
Soldato
Joined
6 Feb 2019
Posts
17,595
But also what gets forgotten about in discussions is price!

Of course a 20GB 3080 is more future proof than a 10GB 3080.

But it all depends on how much one is willing to shell out for a new GPU.

I thought my 1080 at £450 was expensive.

A 3080 at £700 is already :eek:.

How much more will they charge for a 20GB 3080?

I've become desensitized to high prices now, but I fully get where you're coming from - the market is nuts.

In 2009 I bought the fastest single GPU for gaming on the market, the AMD Radeon HD 5870, for $350 USD.
 
Soldato
Joined
6 Feb 2019
Posts
17,595
Doom Eternal is a terrible example. Have you looked at any comparison screenshots? Ultra Nightmare settings often don't look any better than Ultra, and in some cases it's a bit of a weird regression. I don't know why it's used in these discussions at all.

That's because you don't have an 8K TV.
 
Soldato
Joined
31 Oct 2002
Posts
9,863
Doom Eternal is a terrible example. Have you looked at any comparison screenshots? Ultra Nightmare settings often don't look any better than Ultra, and in some cases it's a bit of a weird regression. I don't know why it's used in these discussions at all.

You can use the ultra-quality argument for every game. This has been discussed for decades; the difference from medium to high, or from high to ultra, is insignificant to many. If you're someone who belongs to this camp, then go ahead and reduce the graphics on all the games you play from ultra to high, and buy cheaper GPUs. Whatever makes you happy.

Just remember, there are some with different opinions to you, who want to run things on ultra at 4K. This is the advantage of the world we live in: we get to choose whichever option we prefer. Then there are others that, most often loudly, try to impose their will on everyone else:

"I bought my 10GB top-end 3080 and I declare 10GB is more than enough for many years, no matter what other people say! You're all fools if you think a GPU needs more than 10GB VRAM for games!"...... Most of this type will be playing at 1080p or 1440p (they've not realised 3080s shine at 4K only), so the debate doesn't even include them, since 10GB is fine for 1080p and 1440p.
 
Soldato
Joined
17 Aug 2003
Posts
20,158
Location
Woburn Sand Dunes
It's still a terrible example. It doesn't matter what settings people want to use; we shouldn't consider settings that literally gobble VRAM for no real reason unless we are using them as an indication of what could happen in the future, and that is not why most people, yourself included, mention DE's Ultra Nightmare in these discussions.

Dave2150 said:
Example - Doom Eternal. Released March 2020. It takes a significant performance hit on 8GB cards when running at 4K, due to running out of VRAM.

And yet when TPU did an extensive review of the game, the 8GB 2070 and the 11GB 1080 Ti were within 2fps of each other at both 2560x1440 and 3840x2160, and that's using Ultra Nightmare settings.

https://www.techpowerup.com/review/doom-eternal-benchmark-test-performance-analysis/4.html
 
Last edited:
Soldato
Joined
12 May 2014
Posts
5,236
It's still a terrible example. It doesn't matter what settings people want to use; we shouldn't consider settings that literally gobble VRAM for no real reason unless we are using them as an indication of what could happen in the future, and that is not why most people, yourself included, mention DE's Ultra Nightmare in these discussions.



And yet when TPU did an extensive review of the game, the 8GB 2070 and the 11GB 1080 Ti were within 2fps of each other at both 2560x1440 and 3840x2160, and that's using Ultra Nightmare settings.

https://www.techpowerup.com/review/doom-eternal-benchmark-test-performance-analysis/4.html

And Hardware Unboxed has the 1080 Ti spanking the 2070 at 4K Ultra Nightmare (98fps vs 74fps). For comparison, it is 98fps vs 84fps at Ultra quality.

https://youtu.be/csSmiaR3RVE?t=794
 