
10GB VRAM enough for the 3080? Discuss..

Status
Not open for further replies.
I am guessing it will be no less than 16GB.
That would mean going with either a 256-bit or 512-bit bus though, which is why I think 12GB on a 384-bit bus is more likely. Or they may just go with 10GB again and leave a big performance gap to the 4090 / 4080 Ti, like was the case with Turing and Pascal.

Can see AMD taking the lead next gen, especially if they move to MCM on a better TSMC node while Nvidia are stuck with Samsung monolithic dies.
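For context on why bus width and capacity are tied together like that: each GDDR6/6X chip presents a 32-bit interface, so the bus width fixes the number of chips, and total capacity is chip count times per-chip density (1GB or 2GB parts at the time). A back-of-envelope sketch, my own illustration rather than anything official:

```python
# Back-of-envelope: each GDDR6/6X chip has a 32-bit interface, so bus
# width determines chip count, and capacity = chips * per-chip density.
def vram_options(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32                 # one 32-bit channel per chip
    return {d: chips * d for d in densities_gb}  # density (GB) -> total VRAM (GB)

for bus in (256, 320, 384, 512):
    print(bus, vram_options(bus))
```

Which is why a 320-bit card like the 3080 lands on 10GB or 20GB, and a 384-bit one on 12GB or 24GB.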
 
What kind of performance lift are you expecting it to bring over the 3080, though? I can't imagine the gap being big. We are likely talking single-digit gains, countable on one hand, in most of the cases where it actually benefits a given game.

The only thing I can see is it helping some games by about 10%, right? But with some of the doomers above fixated on, let's call it, 'available room': when the 3080 has 1.1GB of VRAM spare in an example game, the 6800 would have 7GB, giving SAM/BAR plenty of room to show an obvious benefit. It's an interesting situation now: although we have the pro-PrincessFrosty analytics camp, when it comes to the specific games that take advantage of this feature, we need to look at whether the 3080, when pressed, offers basically no improvement while the >10GB cards offer 10% lifts..
 
I haven't read too much about SAM/BAR, but I don't think it works quite like that. My understanding is that it removes a middle man: the temporary 256MB swap window. With SAM/BAR, the CPU can directly access all of the VRAM on the GPU, rather than swapping 256MB chunks at any given time.
 
I guess there are no rumours of GDDR7? Was thinking maybe that may be out next year.
 
SAM/BAR doesn't work like that. It doesn't use GPU memory; it just allows the CPU to see/use all the available GPU memory, rather than being limited to 256MB slices.
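A toy way to picture the difference (purely illustrative; real drivers don't work by sorting addresses like this): with the legacy 256MB BAR, the CPU's window into VRAM has to be re-pointed whenever the next address falls outside the current aperture, whereas a resized BAR covering all of VRAM never needs remapping.

```python
# Toy model of BAR windowing (illustrative only, not real driver behaviour):
# count how often the CPU-visible aperture must be re-pointed to reach
# a set of VRAM offsets.
WINDOW_256MB = 256 * 2**20

def remaps_needed(addresses, window=WINDOW_256MB):
    remaps, base = 0, None
    for a in sorted(addresses):
        if base is None or a >= base + window:
            base, remaps = a, remaps + 1  # re-point the aperture
    return remaps

spots = [0, 300 * 2**20, 2 * 2**30, 7 * 2**30]  # scattered VRAM offsets
print(remaps_needed(spots))                     # legacy 256MB window: several remaps
print(remaps_needed(spots, window=10 * 2**30))  # full-VRAM BAR: just one
```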
 
Historically speaking, you upgrade your card because the GPU itself is not powerful enough to hit a desirable FPS number at decent settings, not because it doesn't have enough VRAM.

I'm already there with my 3080/Reverb combo.

I didn't jump on the ultra-high FPS train when 1080p and even 1440p started marketing high-refresh rate monitors, but someone must want those frame rates or the monitors wouldn't be on the market.

VR is the first case where 60fps just couldn't cut it for me. (motion sickness) Now I have finally joined the "60fps is not enough" crowd (for VR), only to be told I'm an outlier in this thread.

I am so late to the need-more-than-60fps party, but if the FPS bar gets set too high at 4K, the VRAM buffer becomes a non-issue with the GPU itself begging for mercy... so I guess the need for more FPS at 4K must be discarded as an unreasonable expectation.

My 3080 is stuck on medium-ish settings in my driving sims with my Reverb, and it's all because it lacks GPU horsepower.
 
Someone will be along shortly to say you represent a small portion of use cases and attempt to play down what you are saying :p

Funny thing is, up until now 10GB not being enough has been even less of an issue than running out of GPU performance. :D
 
For anyone wanting more than 10GB, buy a 3090, 6800 XT or 6900 XT, or even a 12GB 3060; it's not like the 3080 is the only GPU on the market.
 
How about Sampler Feedback Streaming? That seems to be a technology that can increase the effectiveness of VRAM usage. Of course, I think it needs developer work...

I originally thought this was the age of efficiency and new tech, and that Nvidia therefore did not skimp on VRAM but instead would leverage Sampler Feedback, DirectStorage and so on, so that games would not need an increase in VRAM.

Of course, with the emergence of 16GB VRAM consoles, AMD pushing 16GB on their cards and, of course, rumours of 16GB cards from Nvidia, my hopes for this idea quickly withered. But one can still hope. :)
 
Was a guy on the CtrlAltStock discord adamant he would not get the 3080 for the G2 because of VRAM, and was looking at a 6900 XT instead. I tried explaining that, given my experience with the Index, it was likely to be a GPU-grunt problem before a VRAM one, but was simply met with a video of some random guy running 2160p, with supersampling/resolution scaling on top, in a number of games with obviously poor frame rates instead :p
 
Isn't the Reverb the highest-resolution VR headset?

It's also still IMO early generation technology and at £500 too expensive for most folk just for a headset.

How many gamers actually have VR? I know one at my workplace that did, and there were a lot of gamers there; he only does because he has zero dependents and very small outgoings. He doesn't have a car, and all his household bills are split with family. He buys loads of expensive tech on a monthly basis, stuff that even folk earning double wouldn't dream of buying.

Loads of others earning good money don't have one. It's a niche product with even more niche usage.

You have bought into it at an early stage. Personally, for me VR won't be mature enough to become mainstream until it hits 4K, 165Hz and £300 as targets, and ideally better.

But to drive that will no doubt require a £3000 GPU, so even then there is a huge barrier to entry. I reckon ten years' time will be the time, and by then I'll have lost interest in it.

VR is pretty much one of those things that, like 3D TVs, won't really take off, and will either stay ridiculously expensive or be slowly phased out.

VR isn't a mainstream product, so you're very much an early adopter and you have to pay the tax that comes along with that.
 
You missed my point entirely.

90fps is not a crazy frame rate. Just look at all the high refresh monitors on the market.

It's only "crazy" at 4K because GPUs lack the power to run those frame rates at that resolution with max settings.

NOT because the frame rates themselves fail to improve the gaming experience.

While 60fps is acceptable to *me* on a flat screen, the current 1080p and 1440p monitor market seems to indicate that it's not acceptable to a lot of people.

60fps is "acceptable" in this discussion only because it has to be....or the GPU would clearly be the weakest link at 4k and this thread would have died on the first page.
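Some rough pixel-throughput arithmetic backs this up (raw fill numbers only, my own illustration; real GPU load doesn't scale perfectly linearly with pixel count):

```python
# Raw pixels-per-second at different resolution/refresh combos.
def pixels_per_second(width, height, fps):
    return width * height * fps

p1440_144 = pixels_per_second(2560, 1440, 144)  # 1440p high refresh
p4k_60    = pixels_per_second(3840, 2160, 60)   # 4K at the "acceptable" 60
p4k_90    = pixels_per_second(3840, 2160, 90)   # 4K at VR-ish frame rates

print(p4k_90 / p4k_60)     # 90fps at 4K is a straight 1.5x over 60fps
print(p4k_90 / p1440_144)  # and roughly 1.4x the pixel rate of 1440p144
```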
 
Yeah, my point is: don't game at 4K.

I actually convinced the guy at work who used to game at 4K to get rid of it, and he still wanted more, so he went ultrawide; another fail, especially as it's an Alienware 36 or 39 inch.

He was always complaining about how his £2500 (might actually be £3500) PC couldn't play everything at 4K on max. A pre-built PC, customised to his spec.

I told him he should go 1440p, but he was adamant that would be a noticeable drop. It's not, especially in twitch shooters.

I'm still a big advocate for 1080p gaming. It's why my 32" screen is only 1440p, and of my previous dual 27" screens, one was 1440p and one 1080p.

You couldn't really tell the difference unless you really looked for it in game.

I calibrate my monitors using a professional tool. Again, that makes a much bigger difference to picture quality than resolution does, and even my mate with his £5k PC and monitor setup doesn't invest in a calibrator.

VR is different because your eyeballs are literally resting on the screen, which is why I say you have adopted it too early. It's just not there yet to be viable unless you have extremely deep pockets.
 
I will admit I often wonder why people game at 4K... would there be a big difference compared to my Alienware AW3418DW (3440 x 1440)?
 
I don't think 4K is worth it on a monitor, but if you have a 48" OLED sitting around, there's no reason not to use it.
 