10GB VRAM enough for the 3080? Discuss...

My personal viewpoint has not changed. I was fine with the 10GB on the 3080 as it played every game at the time just fine, and since then we've had barely a handful of games that need more, and of those, any given individual may want to play what, one? Big deal... And look at the alternative: had Nvidia tried to put more VRAM on it at the time, it would have made the MSRP higher.

The VRAM issue, IMO, is only an issue for people who intend to keep the card for a very long time, and those who are inflexible and find the thought of tweaking settings unimaginable. For those people there are 16GB or 24GB cards, but even with those there will be situations where you need to tweak settings. With an AMD card you will need to tweak RT settings, for example.

Well I suppose one way round it is to only ever buy GPUs for the short term at over half a grand a pop ;)

I don't ever expect to have to use anything below the highest texture quality in a game on an xx80 card within two years of its release.

There is an argument that developers shouldn't need to optimise either; the tech available is capable, it just needs card manufacturers to put more of it on a card. E.g. I think 16GB of GDDR6 would have been better than 10GB of GDDR6X.

We also agree that I would rather have a £650 10GB 3080 than a 16GB 3080 at twice the price. I think they could have put more on it with normal GDDR6, though, which would have been the best of the three options.
 
You might be on to something there, as from what I remember the game used less than 8GB of VRAM on my laptop, as if there was some kind of limit imposed, whereas on my desktop it used almost 10GB.
I still get stuttering on my desktop, just not as bad.
You can find the specs of my desktop and laptop in my sig.

Yeah, there are people with 3090s reporting it uses about 9GB, but with a few Unreal Engine settings they can let the game off its leash, and that fixed their issues.
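For anyone wanting to try the same thing, this is the sort of tweak people mean. A minimal sketch, assuming the game is UE4-based and picks up the usual Engine.ini overrides; the exact file location and a safe pool size vary per game and per card, so treat the number below as a placeholder:

    [SystemSettings]
    ; Texture streaming pool size in MB. Raising it stops the engine
    ; throttling itself to a conservative default on big-VRAM cards.
    r.Streaming.PoolSize=9000
    ; Allow the pool to exceed what the engine auto-detects (use with care).
    r.Streaming.LimitPoolSizeToVRAM=0

On a 10GB card you'd obviously want to leave headroom for the framebuffer and whatever Windows is holding, so don't hand the whole card to the streaming pool.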

This game was originally designed to run on hardware with 16GB of VRAM, 12GB or so of it usable, so we have to bear in mind that, despite what some people want to think, the game has been tweaked to work on under-spec'd hardware. So it kind of needs re-tweaking for the few who have access to decent amounts of VRAM.

There is a guy on Nexus Mods trying to fix the stutter issues, which, if he succeeds, hopefully means the VRAM is then only a concern for texture quality and pop-in.
 
My old 1070 would do Windows 10 at 4K. Therefore I consider it a 4K card!

...Joking aside, I want to know what "4K card" means. Is it the latest games at 4K? If so, what framerate? 60Hz? 144Hz? I have a 3080 powering a 3840x1600 display, which works perfectly. It's also fine for the 4K TV it's also connected to. I'm not about to chuck it away and spend a considerable amount more on a 6900 XT or 3090, as it's not worth it to me. Can a 10GB 3080 run everything at 60Hz and 4K? Nearly, but no. Can a 2080 Ti? No. Technology moves on.

I get that this is an enthusiast forum, but sometimes a blind reliance on numbers can detract from the real-life enjoyment of a game. I've been playing Horizon Zero Dawn with everything maxed at 4K. My wife is also fully engrossed by the story (and tells me how to take out Broadheads...) and we love it. Have we noticed the times when the fps has dropped to 53? Not to my recollection. If 10GB is not enough, go for the AMD option; it's pretty impressive also. @TNA had EXACTLY the right idea changing down to a 3070 for a chunk of change.

If you have a 3080 of any type... you have no idea how good you have it. If you don't and you're complaining that 10GB isn't enough... is there REALLY a situation (gaming-wise) where it's actually a PROBLEM? If you have a 10GB 3080 and are complaining that it isn't enough... why did you buy it?!

Am I old and cross? Seemingly so...
 
Well I suppose one way round it is to only ever buy GPUs for the short term at over half a grand a pop ;)

Spot on.

My old 1070 would do Windows 10 at 4K. Therefore I consider it a 4K card!

Enthusiasts questioning simple stuff. I don't think it's down to understanding here. ;)

Turnip gets it.

He definitely does. Others try to be clever instead. Jensen said it was a 4K card. You can only take a horse to water...
 
I have never tried to be clever. This is true.

Before launch and at the reveals, both AMD and Nvidia demoed games being played at 4K with fps charts. Mr @TNA also played games on his 4K monitor seemingly well enough before he sold it (assuming he bought the 3080.. to play games at 4K!). Dropping the res will get you higher fps, but one of the key selling points was also DLSS, which, on supported games, allows for very smooth play on big titles at 4K.

You're not going to max out the VRAM if you're playing at 2K. :cry:

Yeah, and it's mostly an issue for those trying to push the cards at 4K or for VR.

Games are going to get more demanding, especially with the new engines, so people playing at higher resolutions will need to scale back settings to find playable fps, which, funnily enough, will reduce VRAM demand.

Spot on.
 
Going AMD RDNA2 is not an option for me; I like to run RT games at a playable fps. DLSS can't be beaten either. So I'm more than content with 10GB.

So for you (and me) 10GB is enough. For someone who doesn't want this and wants pure, by-the-numbers raster performance, perhaps 10GB isn't enough. Neither point of view is necessarily wrong, but the question needs reframing. Asking "Is 10GB enough?" is always met with "for what?". At this point the arguments tend to get a little granular and, mostly, pretty much irrelevant.

Using 4K 60Hz as a benchmark, older games (remakes not included) do not have increased requirements, so 10GB runs them fine. Current games MOSTLY (arguments abound over a couple of outliers) work absolutely fine. At some point 10GB won't be enough, absolutely; I didn't buy my card expecting it to last forever. It's a little ridiculous deciding that PC gaming is the way forward due to its customisation, then shunning it because you don't want to face into the customisation.
 
That's the thing: even at 4K, 10GB is still not an issue (and I have a 4K display). Also, there are a couple of games where you can max out 10GB even at 1440p (it's not "2K"), and here we are again, back to "it uses all my VRAM, therefore you NEED more VRAM"..... Have a look at games like Resident Evil Village, Horizon Zero Dawn and Godfall to see how performance compares between a 3080 10GB and a 6800 XT, even though one card uses more VRAM... Also, still waiting for proof that the extra 2GB of VRAM is benefitting the 3080 12GB over the 3080 10GB model (outside of the overall better specs)???

So not sure why you keep making out like it is an issue right now, gpuerilla??? Still trying to justify the 3090 purchase? :cry:

The only game I have encountered VRAM issues in (inconsistent fps and frame latency all over the place) was Cyberpunk, because I had added a ton of 4K-8K texture mods; all I had to do was remove a couple of the texture packs to bring VRAM usage down.
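On the "it uses all my VRAM" point, bear in mind most overlays report what the game has allocated, not what it actually needs frame to frame. If anyone wants to log it for themselves, here's a minimal sketch in Python using the pynvml bindings (this assumes an Nvidia card, a pip-installed pynvml, and GPU index 0; the one-second interval is arbitrary):

    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust if needed

    try:
        while True:
            # Device-wide view: everything allocated on the card, including
            # the desktop and browser, not just the game you're testing.
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            print(f"used {mem.used / 2**20:6.0f} MiB of {mem.total / 2**20:6.0f} MiB")
            time.sleep(1)
    except KeyboardInterrupt:
        pynvml.nvmlShutdown()

Even a log like that only shows allocation; whether the card actually chokes when the pool fills is what the fps and frametime graphs settle.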

He definitely does. Others try to be clever instead. Jensen said it was a 4K card. You can only take a horse to water...

No one is trying to be clever. What is happening is you have certain people making claims with no proof to back up said claims; then you get people who own the thing posting their experience and several pieces of proof to back it up, but all of that is disregarded because of this point again, "it uses all my VRAM, therefore you NEED more VRAM", and other reasons.

Turnip gets it.

Ding ding, winner winner chicken dinner.

Still waiting to hear what other people class as a 4K card?
 
That's the thing: even at 4K, 10GB is still not an issue (and I have a 4K display). Also, there are a couple of games where you can max out 10GB even at 1440p (it's not "2K"), and here we are again, back to "it uses all my VRAM, therefore you NEED more VRAM"..... Have a look at games like Resident Evil Village, Horizon Zero Dawn and Godfall to see how performance compares between a 3080 10GB and a 6800 XT, even though one card uses more VRAM... Also, still waiting for proof that the extra 2GB of VRAM is benefitting the 3080 12GB over the 3080 10GB model (outside of the overall better specs)???

Think you may need some help, buddy. Just read what you posted. You have just cemented that the 3080 is a 4K card... as in, it was released to be able to play games at 4K, which is what you do. Why is it that you latch onto specific words in these strange posts? :cry: If you only put in the effort on this instead of your counting-months ability, then you would be more credible! "Its been released for 1yr 9months" :rolleyes:

You still did not answer my question, which was: why did Nvidia release the same card but decide to throw an extra bit of VRAM on it? Surely if it is pointless, a waste of money, "doesn't need it"... then why not just stay at 10GB?
 
Think you may need some help, buddy. Just read what you posted. You have just cemented that the 3080 is a 4K card... as in, it was released to be able to play games at 4K, which is what you do. Why is it that you latch onto specific words in these strange posts? :cry: If you only put in the effort on this instead of your counting-months ability, then you would be more credible! "Its been released for 1yr 9months" :rolleyes:

You still did not answer my question, which was: why did Nvidia release the same card but decide to throw an extra bit of VRAM on it? Surely if it is pointless, a waste of money, "doesn't need it"... then why not just stay at 10GB?

Define how it's not a 4K card then? Come on, you have made this claim, so the onus is on you to prove it....

I guess the 6800 XT, 6900 XT and even the 3090 aren't 4K cards either? Since in certain games they too need to turn settings down.

And unlike you, I do answer questions:

As for why Nvidia have released a 12GB model... perhaps because they are a company who want to make as much money as possible? Shocker, I know. Also, Nvidia like to saturate the market with a card for every performance and price sector; this is a pretty common business practice, not to mention it also means they dominate benchmark scoreboards. Just look at the range of 3060 and 3070 variants incoming.

We could use that same logic for anything: why do Intel release so many CPUs? Why did AMD bother releasing a 5800X3D model? Why do monitor manufacturers release so many different models of monitor which use the exact same panel??? Etc. etc.

Again, still waiting on this:

Has there been any proof to show that the extra 2GB of VRAM is actually benefitting the 3080 12GB? (Outside of the "overall" specs actually being better than the 10GB version.....)

Also, I already said that was a miscount on my end, but nice deflection to show, yet again, that you never have anything of substance. Better keep mining on the 3090 to make sure you get your value for money :cry:
 
Before launch and at the reveals, both AMD and Nvidia demoed games being played at 4K with fps charts.

Review and release fps charts are for apples-to-apples comparisons, so that a metric for percentage performance increase over earlier cards can be evaluated.

Anyone who actually games at 4K knows that you don't turn some settings to max, or on at all, as they are there to enhance potato resolutions. So you turn those off, releasing GPU resources. Anyone using review fps figures to claim whether a card is 4K-capable or not doesn't know what they are doing. They're probably the same people who use graphics presets, whereas an experienced PC gamer will spend time going through all the settings available to get the best fps/IQ balance for a particular game, and that matters most at higher resolutions. Presets are for folk who don't understand all the graphical settings and what they are for; they're a choice of four or five shortcuts for people not so clued up. The added bonus of PC gaming is the plethora of settings available for fettling to give you the best gaming experience for your monitor's refresh rate and IQ. You just cut your cloth according to your setup.

It amazes me how many people go on about 4K but have never gamed at it.
 
Review and release fps charts are for apples-to-apples comparisons, so that a metric for percentage performance increase over earlier cards can be evaluated.

Anyone who actually games at 4K knows that you don't turn some settings to max, or on at all, as they are there to enhance potato resolutions. So you turn those off, releasing GPU resources. Anyone using review fps figures to claim whether a card is 4K-capable or not doesn't know what they are doing. They're probably the same people who use graphics presets, whereas an experienced PC gamer will spend time going through all the settings available to get the best fps/IQ balance for a particular game, and that matters most at higher resolutions. Presets are for folk who don't understand all the graphical settings and what they are for; they're a choice of four or five shortcuts for people not so clued up. The added bonus of PC gaming is the plethora of settings available for fettling to give you the best gaming experience for your monitor's refresh rate and IQ. You just cut your cloth according to your setup.

It amazes me how many people go on about 4K but have never gamed at it.

Yep. I have been gaming at 4K since 2014, and it has been possible due to tweaking settings and 4K being a moving target, as I mentioned a few posts ago. What it took to play games of that era at 4K and what it takes now is a huge difference.

Many people stuck to 1440p/1080p back then in order to "max out" settings, or just needed more than 60fps, which is enough for me for the games I play. I did testing at 4K, and with some settings lowered and crap like DOF, motion blur etc. turned off, it yielded a noticeable improvement in IQ vs maxed (or any) settings at 1440p, so I was unable to go back to lower resolutions.
 