10GB VRAM enough for the 3080? Discuss...

So what did we decide after 209 pages?

I'll tell you exactly: nothing, just 209 pages of complete nonsense.

The bottom line is the 3080 came with a whole 1GB less than the 2080 Ti. If it had launched with 11GB or 12GB, etc., this thread wouldn't even be a thing.

Anyway, I'm probably getting in the way of some extreme waffle about something someone isn't qualified to pretend to know anything about.
 
This. It’s a silly argument. There’s not one game the 3080 can’t run, and it will be able to keep running them until it runs out of grunt.
 
I agree here; the VRAM is not a bottleneck between the 10GB 3080 and the 16GB 6800 XT.

But you can't help but notice that AAA PC game production is scarce, so we are playing old games on this generation of cards. PC gaming is in a weird state.
 
And even if you could find an extreme example of a game that unusually loads really high on VRAM but somehow does so without putting significant load on the GPU, it would be an exception to the rule. The bottom line is that Nvidia aren't developing the cards for the extreme cases; they're developing them for the average user and a typical use case. If you're the kind of person who loads 1,500 Skyrim mods, for example, you're in an extreme minority. They're not going to put potentially another £100 worth of GDDR6X RAM on a card and crank up the price for everyone just so some extreme 1-in-1,000 circumstance gets marginally better performance.

I've often thought to myself that I should just make one of those cheap asset-flip games on Steam: take a free engine, go to the asset store and spend a few quid on random assets, put them all into a level with trash AI, and make something no one would want to buy. Then tweak the settings so it has some kind of absurd system demands. Then all the people strutting about looking for that one killer app that would define a card as "not 4K ready" or "not enough VRAM" would find it applies universally to everyone. It would cause a kind of short circuit in the brains of people who are only capable of black-and-white thinking.
 
I'm sure game developers will test their games on the 3080 before release? If 10GB isn't enough, perhaps they'll tweak the textures, etc.

Many years ago I had a GPU with only 1GB of VRAM. I just lowered the settings until the game ran okay. I thought nothing of it; I just enjoyed my games.
 
My 3080 lacks the raw horsepower to max out my racing sims in VR... now. On the list of reasons to turn down settings, VRAM can get in line and wait.

I'm sure some games, some day, will be too much for the VRAM buffer, but the GPU itself is already struggling... now.
 
It's the same in generally every thread regarding AMD or Nvidia; you'll probably find it was started by the same people who are claiming over in the FSR thread that DLSS is dead, long live FSR, etc. It boils down to trolling at some point. We're lucky to have a fair few experienced people on these forums, in both camps, who will explain in detail the intricacies of hardware and features, only to be shot down by a keyboard warrior from the other camp. Happens every time.
 
Same with my 3090: the GPU runs out of grunt before the memory does. But modded Skyrim VR is pushing 11GB, so I'm already exceeding 10GB of VRAM; luckily I have plenty to spare.
 
Do we really need to go through this again?

It's really hard to see how much VRAM games require, considering there's a load of caching going on; usage does not equal need.
It's really obvious when you do hit the limit, though: performance drops through the floor, and I don't believe we've got there yet.

So yes, it's most likely going to be fine for at least a few years. Keep an eye on the 3070 at 4K in reviews.
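
For anyone who wants to check the allocated-vs-needed distinction themselves, here's a minimal sketch using NVIDIA's NVML Python bindings (assuming the nvidia-ml-py package is installed; purely illustrative, not the only way). Note that NVML reports what the driver has allocated, caches included, not what a game strictly needs each frame:

```python
# Minimal sketch: ask the driver how much VRAM it has handed out.
# Assumes the nvidia-ml-py package (pip install nvidia-ml-py).
# NVML reports *allocation*, which includes cached/streamable data,
# not the working set a game actually needs.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # values in bytes

print(f"total: {info.total / 2**30:.1f} GiB")
print(f"used:  {info.used / 2**30:.1f} GiB (allocated, caches included)")
print(f"free:  {info.free / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```

Overlay tools generally report this same allocation figure, which is why "usage" can sit near the limit while a chunk of it is just cached data that could be evicted harmlessly.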

What "load of caching going" in Vram? MS/NVIDIA/AMD and game engine set up the caches nowadays us 3D programmers DON'T set up caches in Vram.
Most of the Vram is used for Textures then geometry then other things. You need to see a memory map of how Vram is alllocated in a modern game.
 
I've seen quite a bit of back-and-forth online, some saying 10GB is cutting it a bit short, others saying it's fine. Usage at 1440p is less than at 4K, but there are still a few games where usage goes just above 10GB.

So do we think the rumoured extra cards with 20GB will be there to pre-order alongside the other cards? Or are they waiting to see what AMD does?

If you're happy with blurry textures, low-res geometry and 1080p gaming, then 10GB is fine, or if you play a bunch of old games.
If you want 4K gaming and hi-res textures/geometry, then 12GB or 16GB minimum; the industry is making a shift to *MORE* hi-res textures!
It's not difficult to work out: just one 4096x4096 texture takes a lot of storage even when compressed, along with its normal maps, etc., and geometry (triangle count) is going up too.
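
To put rough numbers on that claim, here's a quick back-of-envelope calculation (Python, purely illustrative; the bytes-per-texel figures are the standard ones for these formats):

```python
# Back-of-envelope VRAM cost of one 4096x4096 texture.
# Bytes-per-texel figures are standard for these formats;
# a full mip chain adds roughly one third (geometric series).
texels = 4096 * 4096

formats = {
    "RGBA8 (uncompressed)": 4.0,    # bytes per texel
    "BC7 (block compressed)": 1.0,
    "BC1 (block compressed)": 0.5,
}

for name, bpt in formats.items():
    base = texels * bpt
    with_mips = base * 4 / 3
    print(f"{name}: {base / 2**20:.0f} MiB base, "
          f"{with_mips / 2**20:.0f} MiB with mips")
```

So a single 4K material with an albedo plus a normal map, both block-compressed with mips, lands around 40 MiB; uncompressed it would be over 170 MiB. A few hundred such materials and the budget is gone either way.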
 
What "load of caching going" in Vram? MS/NVIDIA/AMD and game engine set up the caches nowadays us 3D programmers DON'T set up caches in Vram.
Most of the Vram is used for Textures then geometry then other things. You need to see a memory map of how Vram is alllocated in a modern game.

Looking at analyses of the actual VRAM breakdown in many games today shows that the claim that most VRAM goes to textures isn't true. Many games have texture pools of a fixed size that textures swap in and out of. You can typically measure the size of the texture pool through engine variables, like with Doom, and see that memory usage is around, say, 2GB for textures, whereas the game takes up to 8-10GB of VRAM maxed out with ray tracing. So it's about a 1/4 to 1/5th share, and the same is true in many other games. You can often see the memory breakdown in the graphics settings and confirm it yourself.

This is a holdover from the past, when games couldn't stream textures and so VRAM limited the assets in the world. Once texture streaming became popular, VRAM stopped growing with texture size and started growing with the other effects that need buffers in memory and also load the GPU. The idea that you need more than 10GB to avoid blurry textures is kinda crazy; there's no evidence for that at all. High-res textures are nice, but people cannot tell the difference between UHD textures and something lower quality on a surface that's not right in front of their face, so most of the time the high-res textures are just flushed out of the memory pool and lower-quality variants are used, and the engine streams those textures in and out of memory without a problem.
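
The fixed-size pool idea is easy to sketch. Below is a toy model in Python with made-up names and sizes (not any real engine's streamer); it just shows why a full pool evicts cold textures instead of demanding more VRAM:

```python
# Toy model of a fixed-size texture pool with LRU eviction.
# Names and sizes are hypothetical; real engines are far more
# sophisticated, but the principle is the same: a full pool
# evicts cold textures rather than exhausting VRAM.
from collections import OrderedDict

class TexturePool:
    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.entries = OrderedDict()    # texture_id -> size, LRU order

    def request(self, texture_id, size):
        """Touch a texture: hit the pool, or stream it in, evicting LRU."""
        if texture_id in self.entries:
            self.entries.move_to_end(texture_id)    # cache hit
            return
        while self.used + size > self.capacity and self.entries:
            _, evicted_size = self.entries.popitem(last=False)  # drop coldest
            self.used -= evicted_size
        self.entries[texture_id] = size              # "stream in"
        self.used += size

pool = TexturePool(2 * 2**30)                  # hypothetical 2 GiB pool
pool.request("rock_albedo_4k", 21 * 2**20)     # ~21 MiB, 4K BC7 with mips
pool.request("rock_normal_4k", 21 * 2**20)
print(f"pool used: {pool.used / 2**20:.0f} MiB of 2048 MiB")
```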
 
Educate me, please: which games are you talking about that need 12GB or 16GB?
 
That claim about needing 12GB or 16GB minimum is absolute tosh.
 
I've run graphically demanding games at 1080p on the highest settings and used less than 5GB of VRAM :s
 
What I mean is, if you're buying a next-gen card, get as much VRAM as possible so it lasts a few years; future games are only going to use more 4K and 2K textures.
Example: Watch Dogs Legion uses about 9GB on max settings in 4K mode and has a *HUGE* hi-res texture pack; games *WILL* use more texture and geometry data going forward.
NVIDIA/AMD are not putting 12/16GB on there for no reason; software houses *WILL* use it, so plan properly for the next few years. 10GB is acceptable, but better to get more if you're willing to pay the price of a 3080.
I'm talking about the next few years of game releases.
Texture streaming is slow and can cause "popping" artifacts; going over the PCIe bus to system memory is slow, and if it worked so well they wouldn't put large amounts of VRAM on newer GPUs.
After playing only on a 4K monitor, 1080p looks blurry to me! :)
When you render a 2K or 4K texture map on a 1080p screen, a *LOT* of the original pixels are lost, so the detail is gone!
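
To put a rough number on the streaming cost mentioned above (Python, using theoretical peak PCIe bandwidths; real transfers achieve less, which only makes the popping point stronger):

```python
# Rough cost of streaming one texture over the PCIe bus.
# Bandwidths are theoretical peaks; real transfers achieve less.
texture_bytes = 21 * 2**20     # ~21 MiB: a 4K BC7 texture with mips

for name, bw in [("PCIe 3.0 x16", 16e9), ("PCIe 4.0 x16", 32e9)]:
    ms = texture_bytes / bw * 1000
    print(f"{name}: {ms:.2f} ms per texture")
```

Against a 16.7 ms frame at 60 fps, even a millisecond or so per texture forces the streamer to spread its work across frames; pop-in is what you see when it falls behind.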
 