10GB VRAM enough for the 3080? Discuss..

That a 3070 can play Cyberpunk OK when paired with a better/newer CPU AND the appropriate settings, in response to someone complaining that their fps is dropping into the 30s and claiming it is "entirely" down to the VRAM, maybe.... You know this post here; now go and compare the frametime graph [although it's a bit of a pointless comparison given it's not like for like, but still, clearly a better CPU would help out the 3070 here]:
Have you got a 3070 and have you tried this game? Looks pretty horrible to me on the 1440p part.

[image: frametime comparison graph]

I'm not going to reply any further, especially to you, because you're really being dishonest at this point.

Case is closed for me. Have a good day.
He's too far gone, he can't be saved, so wise choice.
 
No, I mean from a VRAM standpoint. The video owner probably only runs the game and shuts down everything in the background, hence VRAM is not stressed by extra applications. This is what I experience... and this video is not representative of what I experience. I don't want to shut down background apps just because of 8GB of VRAM.

At 11:41 the game uses 7.7GB of VRAM. When other apps use 700-800MB of VRAM, the game will instead use 6.7GB (as I've evidenced with my own video) and cause stutters due to shared memory usage.
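To put rough numbers on that (the 7.7GB and 700-800MB figures are from the posts above; everything else here is purely illustrative):

```python
# Illustrative only: how a game plus background apps overflow an 8 GB card.
CARD_VRAM_GB = 8.0       # e.g. an RTX 3070's frame buffer
game_gb = 7.7            # game's allocation with a clean desktop (11:41 in the video)
background_gb = 0.75     # browser/Discord/etc., the 700-800 MB mentioned above

spill_gb = max(0.0, game_gb + background_gb - CARD_VRAM_GB)
print(f"Spilled into shared system memory: {spill_gb:.2f} GB")  # -> 0.45 GB
# That 0.45 GB lives in system RAM and is fetched over PCIe, which is far
# slower than on-card GDDR6 -- hence the stutters when it gets touched.
```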

As long as you have 16GB of system RAM or more and a good 6-core CPU, background apps like this should not have any impact on the game at all. I even listen to YouTube videos while playing games; it makes zero difference.
--------------

But on the VRAM: people often look at their VRAM, see 7GB of 8GB used and think "oh, that's fine, I have more than enough". It's not as simple as that. When messing about with game settings, there might be something you're turning on that requires more than 1GB, yet you turn up the graphics setting and usage only goes up by 200MB, so you think "see, it's fine".

However, that 1GB it actually needs has been moved into system memory, or the GPU is now in a cycle of scrubbing streamed-out assets to stream others in and then scrubbing again; both will bottleneck the GPU and cause microstutter.
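A toy model of that scrub cycle may help; the budget, asset sizes, and names below are invented for illustration, not taken from any real engine:

```python
# Toy VRAM streamer: when the working set exceeds the budget, the streamer
# keeps evicting ("scrubbing") assets it will immediately need again.
from collections import OrderedDict

BUDGET_MB = 8192          # hypothetical 8 GB card
resident = OrderedDict()  # asset -> size in MB, least recently used first
evictions = 0

def request(asset, size_mb):
    """Make an asset resident, scrubbing the coldest assets if needed."""
    global evictions
    if asset in resident:
        resident.move_to_end(asset)    # already in VRAM, just refresh recency
        return
    while resident and sum(resident.values()) + size_mb > BUDGET_MB:
        resident.popitem(last=False)   # scrub the least recently used asset
        evictions += 1                 # each one implies a PCIe round trip later
    resident[asset] = size_mb

# A working set just 512 MB over budget: 17 assets x 512 MB = 8.5 GB.
frame_assets = [(f"tex_{i}", 512) for i in range(17)]
for frame in range(60):
    for name, size in frame_assets:
        request(name, size)

print(f"Evictions over 60 frames: {evictions}")  # nonzero every single frame
```

Every eviction stands in for a transfer the GPU ends up waiting on, and note that being only 512MB over budget is enough to keep the cycle going every frame.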

 
Also @humbug, I notice modern apps like browsers have 'hardware acceleration', and if this is enabled it eats into your GPU budget. It wasn't a thing years ago, but you would be surprised what software now uses those resources; one to look out for.
It doesn't affect GPU performance much, but "Electron" apps like Discord, Spotify and the Epic Games Launcher use hefty amounts of VRAM for what they are. It adds up when you casually use them. The Windows compositor dwm.exe also uses more VRAM with more uptime, which is not a factor in the benchmarks because every review video out there will be using as clean a config as possible.
 
However, that 1GB it actually needs has been moved into system memory, or the GPU is now in a cycle of scrubbing streamed-out assets to stream others in and then scrubbing again; both will bottleneck the GPU and cause microstutter.

That is what Resizable BAR/SAM is supposed to help with, i.e. the system no longer has a temporary swap arrangement where 256MB chunks have to be shuffled about; instead the CPU can directly access the GPU's VRAM.
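For a sense of scale: the 256MB figure is the standard legacy BAR aperture and 10GB is the 3080's frame buffer; the walk-through below is just an illustration:

```python
# Without Resizable BAR, the CPU sees VRAM through a fixed 256 MB window,
# so reaching the whole frame buffer means remapping that window repeatedly.
APERTURE_MB = 256
VRAM_MB = 10 * 1024  # 10 GB RTX 3080

remaps = -(-VRAM_MB // APERTURE_MB)  # ceiling division
print(f"Window positions needed to cover all of VRAM: {remaps}")  # -> 40
# With Resizable BAR/SAM the aperture is resized to cover the full 10 GB,
# so the CPU can address any part of VRAM directly, no window shuffling.
```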
 
It doesn't affect GPU performance much, but "Electron" apps like Discord, Spotify and the Epic Games Launcher use hefty amounts of VRAM for what they are. It adds up when you casually use them. The Windows compositor dwm.exe also uses more VRAM with more uptime, which is not a factor in the benchmarks because every review video out there will be using as clean a config as possible.

Didn't know that. Thanks for the heads-up. I don't use those apps, but in my head they shouldn't need VRAM to operate.
 
The funny thing is nV would have gotten away with just a couple more GB on each card. The 3090 is 24GB, so a 3080 at 12GB is presumably doable.

Yes, people wanted to see 16GB, but I think nV would have scraped a pass at 12GB on a 3080.

They just went super-stingy. Unreasonably stingy. To the point where many feel like nV are simply trying to shaft them for the sake of it. It completely put me off the 3000 series and took them off my consideration list.

I'm sure I'm not alone.
 
Iirc, at the time it wasn't possible: it was either 16GB of slower VRAM, which would have cost more (?), or cheaper but faster VRAM with less capacity, i.e. what we got.

Of course everyone would have preferred more VRAM but, alas, it wasn't possible without pushing the price up (£650 was already way more than I wanted to spend, so no way I would have paid any more than that). Those who really wanted more VRAM had the choice of the 3090 or, if they are still waiting, they can get a 3080 Ti with 12GB, probably at the rumoured price of £1k+. For myself, an extra 2GB of VRAM is not worth the additional £350+... I'd rather save that money and put it toward the next gen of GPUs, which will obviously be better than all these current-gen GPUs.....
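The constraint is easy to sanity-check: GDDR6X uses one chip per 32-bit channel and, in 2020, shipped in 1GB and 2GB densities. A quick back-of-envelope (the helper function is mine, just for illustration):

```python
# VRAM capacity = (bus width / 32) chips x per-chip density.
def capacities_gb(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32   # one GDDR6X chip per 32-bit channel
    return [chips * d for d in densities_gb]

print(capacities_gb(320))  # 3080's 320-bit bus -> [10, 20]: no 12 GB option
print(capacities_gb(384))  # full GA102 bus -> [12, 24], i.e. 3080 Ti / 3090 class
# (The 3090's 24 GB actually used 1 GB chips in clamshell, two per channel on
# both sides of the PCB, which doubles the chip count -- same arithmetic.)
```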
 
The funny thing is nV would have gotten away with just a couple more GB on each card. The 3090 is 24GB, so a 3080 at 12GB is presumably doable.

Yea. But.

I will say this: the more VRAM your GPU has, the more it can store or "cache". It can put more data into the buffer and stream it in and out without having to scrub it or move it to system RAM, and the result of that is a smoother, more performant game. Within reason, you can't actually have too much VRAM; the more you have, the better it is.
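Using the same kind of toy LRU model as the streaming sketch earlier in the thread (numbers again purely illustrative), the point shows up directly: once the budget covers the working set, scrubbing stops entirely:

```python
from collections import OrderedDict

def evictions_for(budget_assets, working_set, frames=60):
    """Count evictions for a cyclic per-frame scan over `working_set` assets."""
    cache, evicted = OrderedDict(), 0
    for _ in range(frames):
        for asset in range(working_set):
            if asset in cache:
                cache.move_to_end(asset)   # hit: data stays cached in VRAM
                continue
            if len(cache) >= budget_assets:
                cache.popitem(last=False)  # scrub the coldest asset
                evicted += 1
            cache[asset] = True
    return evicted

print(evictions_for(budget_assets=16, working_set=17))  # just under-sized: 1004
print(evictions_for(budget_assets=20, working_set=17))  # headroom: 0
```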
 
Iirc, at the time it wasn't possible: it was either 16GB of slower VRAM, which would have cost more (?), or cheaper but faster VRAM with less capacity, i.e. what we got.

That is just pure greed. AMD had the same choice remember?
 
Iirc, at the time it wasn't possible: it was either 16GB of slower VRAM, which would have cost more (?), or cheaper but faster VRAM with less capacity, i.e. what we got.
When nV is making 60%+ margins, "it wasn't possible without pushing prices up" rings a bit hollow. I'm sure it was technically possible.
 
Well yeah, I meant in terms of prices for us, the consumers.... Of course AMD and Nvidia are already making a good profit regardless, but would as many people have bought a 3080 if it was £750+? Probably not. Not to mention, would as many people have bought a 3090 if the 3080 had 12/16GB of VRAM? Not a chance.
 
At the end of the day, you're still forgetting, it's the end result that matters.
[…]

Keep up, I've been saying that for a while. I literally quoted it to you twice.

[…] No one cares if a game is reusing assets etc.
[…]

But people care about RT vs raster while disregarding the end result, even if it means that artistically it suffers due to weakass hardware, with band-aids known as DLSS and FSR trying to compensate for the limited number of bounces possible on said weakass hardware. ;)

then why does it matter if the developer is reusing assets???
[…]

I never said reusing assets is a problem or bad. They all reuse assets, some more than others, because of artistic direction.
What I am trying to say, but you can't seem to understand, is that artistic direction leads; everything else is secondary unless there are technical hard limits like GPUs with intentionally gimped VRAM.

For the exact reason we stated they reuse assets in the first place.... to reduce VRAM usage and optimise for performance (and obviously, for the developer, to save time). With that VRAM saved, they can use the spare resources for something else which will make a substantial impact on the graphical end result....
[…]
Who's "we"? You stated **** all and then piggybacked off my explanation.
Textures and models are the biggest users of VRAM (and textures, at least, are cheap calculation-wise). If you are not using VRAM on textures and models, what else are you going to use it on? Sound? (It was a joke; I know how slow some people on these forums are.) Everything else is minuscule in comparison.


Please tell us what technical graphical effects had to be cut because these video games spent too much of their VRAM budget on models and textures?
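For context on why textures dominate the budget, a quick worked example (the 4K size and 4:1 compression ratio are typical figures, not from any specific game):

```python
# Uncompressed RGBA8 texture cost: width x height x 4 bytes, plus ~1/3 for mips.
def texture_mb(width, height, bytes_per_pixel=4, mips=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mips else base  # full mip chain adds about a third
    return total / (1024 ** 2)

print(f"{texture_mb(4096, 4096):.0f} MB")      # one uncompressed 4K texture: ~85 MB
print(f"{texture_mb(4096, 4096) / 4:.0f} MB")  # with 4:1 block compression: ~21 MB
# A few hundred unique 4K materials still costs gigabytes even compressed,
# which is why reusing assets is the cheapest VRAM optimisation available.
```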

Trying to make a mountain out of a molehill here, I see.... It's pretty obvious what "good" and "better" mean when talking about graphics.... Now, if we were comparing something like Borderlands/Valheim to Metro/Resident Evil Village it would be a bit different, but we're comparing games which aim to be as close to realistic-looking as possible, and artistically Metro and Resident Evil are much the same.

If you think distinguishing between technical achievements and artistic achievements is making a mountain out of a molehill, then there is no helping you and it is pointless having this conversation.
As I hinted at previously, artistic achievements have nothing/very little to do with VRAM usage. Look at all the artistically great-looking indie games that can run on a potato. Sorry, my bad: according to you, the developers of "RDR 2, Div, Days Gone, Metro, Tomb Raider, Batman AK" all just need to optimise their games better. Maybe you should go work for them and give them tips and tricks.

The only games that aim to be realistic are simulators. Every other game aims to be a fun and/or interactive work of art. They don't aim to be realistic; that's just you confusing your opinion for fact. Graphics-wise they aim to look good, which doesn't mean realistic. Something something end result something something.
 
@oguzsoso @LtMatt The limited frame buffer is just one of many reasons why I ditched my RTX 3070 for a 6700 XT, and I still don't regret it months later. If people want to tell themselves that 8GB is enough, then let them; while they sit there in their stutter-filled hell convincing themselves that it's the game's fault and not a poor choice of hardware spec, the rest of us can enjoy the show from the sidelines.
 