
10GB vram enough for the 3080? Discuss..

Status
Not open for further replies.
Citing a 20-year-old game that got a few bells and whistles plonked into it, and still having to play it at 1440p to get good framerates, isn't exactly what I'd call impressive. Cute showpiece and all, but it's a gimmick at best. Yes, I tried it, and while it was obviously different looking, it was still a 20-year-old game with a coat of paint splattered onto it.

It didn't get a few bells and whistles plonked on. It got a full path tracing engine. Have a look at the excellent posts by @Rroff.

GPU grunt would have nothing to do with it; they would still be rendering the same image via DLSS, albeit at different framerates. If artifacts show up on one GPU, and assuming the GPU isn't faulty, there's no reason why they would be absent on faster GPUs; they're all using the same tech after all.

I believe DLSS partly uses data from previous frames. If it isn't getting enough frames then you will get artifacts.
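For what it's worth, the "data from previous frames" point can be sketched as temporal accumulation. This is not the DLSS algorithm itself (which combines a neural network with motion-vector reprojection); it's a toy illustration, with a made-up blend weight `alpha`, of why missing or sparse history tends to produce artifacts:

```python
def temporal_accumulate(history, current, alpha=0.1):
    """Blend the current frame's value with reprojected history.

    When history is unavailable (first frame, a scene cut, or frames
    arriving too infrequently to reproject usefully), the output falls
    back to the raw current value, which is noisier, hence visible
    artifacts until history rebuilds."""
    if history is None:
        return current
    return alpha * current + (1 - alpha) * history

# A stable history converges on the true value; a single outlier
# frame is heavily damped by the accumulated history.
pixel = None
for _ in range(50):
    pixel = temporal_accumulate(pixel, 0.5)
```

The design point is simply that most of each output comes from history, so the quality of the result depends on having usable previous frames.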
 
By "everything else" what you really mean is raytracing. I like how you turn one thing into everything... The 6000 cards for the most part have more VRAM; I personally decided that was a more important feature for me than better raytracing performance this gen.

No, I meant everything else, such as raytracing, DLSS (AI-based super sampling), AI, video encoding, etc.

What good is more VRAM when the GPU doesn't have the grunt in the first place? Remember they cut everything out of RDNA2 that wasn't needed for gaming, leaving a console chip.
 
It didn't get a few bells and whistles plonked on. It got a full path tracing engine. Have a look at the excellent posts by @Rroff.

It's still a 20+ year old game with a fresh coat of paint. Had it been a more recent game, it would most likely have made the GPU grind to a halt; it's no coincidence they chose a game that old, with low-resolution assets, to put up as an RTX posterchild. Had they done this with, say, Doom 3 or something along those lines, it would probably be totally unplayable.



I believe DLSS partly uses data from previous frames. If it isn't getting enough frames then you will get artifacts.

No idea if that's the case or not. What is "enough frames"? "Enough fps" is something that can vary on a game-by-game and user-by-user basis. I don't recall DLSS ever needing a certain framerate to function properly.
 
I'm a developer. Having 16GB of VRAM is very nice. 8/10GB of VRAM when dealing with raytracing in Unreal Engine is not enough. The 3090 was out of my budget.

As a developer you would realise that not many people have more than 8GB of VRAM, and that targeting more would equal less potential sales. You would also realise that the consoles, the main appeal of RDNA2, have a total of 16GB of RAM. The Xbox has physically split its RAM into 6GB and 10GB chunks, while the PS5 has one unified 16GB pool. Budgeting ~6GB for OS/housekeeping, game code and game data is not unreasonable. That leaves 10GB free for graphics.
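The budget arithmetic above can be written out explicitly. Note the ~6GB OS/game reservation is the post's own estimate, not an official figure:

```python
# Back-of-envelope console memory budget, per the post above.
XBOX_FAST_POOL_GB = 10   # GPU-optimal pool on the Xbox Series X
XBOX_SLOW_POOL_GB = 6    # slower pool, suited to OS and game logic
PS5_UNIFIED_GB = 16      # single unified pool on the PS5

RESERVED_GB = 6          # assumed OS + game code + game data (estimate)

def graphics_budget(total_gb, reserved_gb=RESERVED_GB):
    """RAM left over for graphics after the OS/game reservation."""
    return total_gb - reserved_gb

print(graphics_budget(XBOX_FAST_POOL_GB + XBOX_SLOW_POOL_GB))  # 10
print(graphics_budget(PS5_UNIFIED_GB))                         # 10
```

Either way you slice the pools, the assumed reservation leaves roughly 10GB for graphics, which is the comparison being drawn to the 3080.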
 
A gaming GPU, oddly enough. Take it you've never heard that term before? Ever heard the term gaming rig or gaming PC? This is getting incredibly cringeworthy...

'Gaming rig' I'm undecided on. I also chuckle at the use of the word 'cool' when not talking about the weather. Gaming PC is fine, or as it used to be known, PC.
 
As a developer you would realise that not many people have more than 8GB of VRAM, and that targeting more would equal less potential sales. You would also realise that the consoles, the main appeal of RDNA2, have a total of 16GB of RAM. The Xbox has physically split its RAM into 6GB and 10GB chunks, while the PS5 has one unified 16GB pool. Budgeting ~6GB for OS/housekeeping, game code and game data is not unreasonable. That leaves 10GB free for graphics.

Dunno though, the landscape can quickly change.

How many gamers had 2GB VRAM cards in 2013? I would say a fair 70-80% of them.

In just two years, all those cards became obsolete because VRAM demands quickly increased. Everyone released their 4GB cards and everyone moved on.

If Nvidia releases its own 16GB 4070s, everyone will, again, move on. The RTX 3060 already brings 12GB of VRAM at an entry level. I'm not saying it's a good GPU, but it's a huge upgrade for a GTX 1060 user.

Many GTX 1060 users will upgrade to 12GB RTX 3060/4060s. Can you imagine that? The majority of gamers will actually have 12GB VRAM cards. Many of them can't actually afford a 3070 or 3080, but they could, if there were proper stocks, easily afford a 3060.

12GB will be a mainstream VRAM amount by the end of 2022, and I suspect it will be "required" at 4K.

For 1440p, 10GB might stay relevant for a couple more years; I don't know, but 8GB is surely doomed for 1440p.

Outside of that, I'm pretty sure that even at 1080p, 12GB VRAM cards will still enjoy super-high-quality textures without any sacrifice at all.
 
As a developer you would realise that not many people have more than 8GB of VRAM, and that targeting more would equal less potential sales. You would also realise that the consoles, the main appeal of RDNA2, have a total of 16GB of RAM. The Xbox has physically split its RAM into 6GB and 10GB chunks, while the PS5 has one unified 16GB pool. Budgeting ~6GB for OS/housekeeping, game code and game data is not unreasonable. That leaves 10GB free for graphics.

That's not how it works. Firstly, when you have the unpackaged project open in the editor, it consumes more VRAM than the packaged build. Secondly, games take time to be optimized. Unless you try to optimize everything from the start (which is an impossible way of developing), you will need that VRAM. So mid-project it could consume 12GB of VRAM, but by the time you're ready for release you could have gotten it much lower.

I know someone who tried to open a FREE UE4 project from the Marketplace with raytracing on an RTX 3070, and it said there was not enough video memory to open it, even after reducing the reflection capture resolution from 2048 to 1024. That already tells me 8GB of VRAM is not enough for development with raytracing.
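Rough arithmetic shows why capture resolution matters so much here. The 8 bytes-per-texel (RGBA16F) figure and the flat six-face layout are assumptions for illustration; UE4's actual capture format and mip chain will differ:

```python
# Approximate memory of one reflection-capture cubemap:
# 6 faces, each face_res x face_res texels, at an assumed
# 8 bytes per texel (RGBA16F), ignoring mips.
def cubemap_mib(face_res, bytes_per_texel=8):
    return 6 * face_res ** 2 * bytes_per_texel / 2 ** 20

print(cubemap_mib(2048))  # 192.0 MiB per capture
print(cubemap_mib(1024))  # 48.0 MiB per capture
```

Halving the resolution quarters the memory per capture, but with many captures in a scene (plus the editor's own overhead), an 8GB card can still run out.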
 
It's still a 20+ year old game with a fresh coat of paint. Had it been a more recent game, it would most likely have made the GPU grind to a halt; it's no coincidence they chose a game that old, with low-resolution assets, to put up as an RTX posterchild. Had they done this with, say, Doom 3 or something along those lines, it would probably be totally unplayable.

I'm facepalming so hard right now....

@Rroff had a nice video showing what the path tracing engine was capable of. Need to cook dinner before having another look for it.

No idea if that's the case or not. What is "enough frames"? "Enough fps" is something that can vary on a game-by-game and user-by-user basis. I don't recall DLSS ever needing a certain framerate to function properly.

There is a video that I posted recently on low-resolution DLSS gaming that mentions it.
 
By "everything else" what you really mean is raytracing. I like how you turn one thing into everything... The 6000 cards for the most part have more VRAM; I personally decided that was a more important feature for me than better raytracing performance this gen.



Yes, that was the intention. If you just want to keep bringing up Control/CP2077, then I will just bring up Godfall, since the 90fps my 6800XT gets at ultra settings with raytracing on is very playable, thank you very much. :p

If AMD's cards don't meet Nvidia head-on with a DLSS equivalent then it's a big blow, and I find it hard to see how AMD will, given that Nvidia cards have extra hardware to facilitate DLSS, allowing a bigger performance increase than if there were no extra hardware.

DLSS is a very much needed technology at the moment if you want to play with all the bells and whistles at 4K.
 