10GB VRAM enough for the 3080? Discuss...

Metro Exodus Enhanced is skimping on RT effects, probably due to it being a free upgrade.

It's when you dial up the effects, as in CP2077, that you see AMD crumple. Even Matt admitted that you have to turn down IQ for AMD to be competitive in that title, and therein lies the problem. Why would I buy a next-gen card that just doesn't do next gen well?

At best, RDNA2's RT is comparable to Turing's, and we know how that turned out.

So back on topic, is 10GB enough? Well, why isn't it today, and why won't it be tomorrow?

Is that how we are going to cope with this now? Call everything that doesn't conform to our argument substandard and point back at Nvidia's older games as proof of AMD's bad RT performance?
 
Why is it a bad thing that RT effects are toned down?

Why do we need it maxed out? Clearly neither the 3070 nor the 3080 can handle it. The 3080 can get 80+ FPS at native 4K in RE Village. That is pristine.

The 3080 can't get 60 FPS at 1440p (in other words, 4K DLSS Quality) in Cyberpunk with all the bells and whistles. You need to use either 1440p DLSS Quality or 4K DLSS Performance, which defeats the purpose (it kills the sharpness and the overall image quality). And Cyberpunk's textures look horrible compared to RE Village's.
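As an aside on the DLSS modes: the output-vs-internal-resolution relationship is simple to work out. A rough sketch, assuming the commonly cited per-axis scale factors (Quality roughly 0.667, Balanced roughly 0.58, Performance 0.5); the exact numbers are up to Nvidia and the individual game, and the helper name is just mine for illustration:

[CODE=python]
# Hypothetical helper: internal render resolution for common DLSS modes.
# Per-axis scale factors are the commonly cited ones; the real values are
# Nvidia's/the game's, so treat these as approximations.
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440): 4K Quality is a 1440p render
    print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080): 4K Performance is a 1080p render
    print(internal_resolution(2560, 1440, "quality"))      # (1707, 960): 1440p Quality is below 1080p
[/CODE]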

It is much better to have toned-down effects with high FPS and high resolution; a game still benefits greatly from subtle RT effects. There's no need to push all out on ray tracing, because even Nvidia cards are not capable enough, and they never will be unless they deliver 4-5 times the RT performance in a single generation; games will require even more GPU power as they evolve. If you add huge RT costs on top of that, no GPU will survive two generations. If you really like to shell out money every two years, all the power to you. Not everyone is content with that kind of attitude.
 
Is that how we are going to cope with this now? Call everything that doesn't conform to our argument substandard and point back at Nvidia's older games as proof of AMD's bad RT performance?

Not at all. What we should be doing is defending RDNA2's poor RT and lack of DLSS, while blaming Nvidia :rolleyes:

Or even better, just discussing why 10GB is not enough today or why you think it won't be over the next 1-2 years with some sort of technical reasoning.
 
These GPUs are all rendering the same thing; it's not that Nvidia's GPUs become weaker when you tone down some of the effects.

You know what it is: these games have to run on consoles. That is why Nvidia having it their way in how RT is done was never going to last.

Nvidia and AMD do ray tracing differently, not better or worse.
 
Agreed, turns out it's a lot better than initially proclaimed though.

PS: I don't know the relevance of the link you posted; it appears to be a German-language site with a crap ton of advert banners and some synthetic RT benchmark.

Looks like someone doesn't use adblock :p

It basically states what we all know:

  • RDNA 2 GPUs handle more complex ray tracing calculations better than Turing
  • The more complex the ray tracing in a scene, the better Ampere performs
  • Comparing the first-gen RTX Turing TU102 with the Ampere GA102, Ampere is up to a factor of 2.7 faster
  • If games have more complex ray tracing going forward, RDNA 2 and especially Turing are going to suffer a lot more than Ampere
Yes it is but you took it off topic with tangents, so I'll leave it there. :p

That's one way to not address any of those perfectly valid points, eh ;) :D

I prefer the screenshots on the left lol

They look overblown on the right.

But the difference still shouldn't be that big; it seems like a poor attempt by the developers.

And point proven! :D There will always be people who "prefer" the look of artificial/fake lighting.

It's by far the best ray traced game we have to date:


https://www.youtube.com/watch?v=sJ_3cqNh-Ag

The CPU has NOTHING to do with the topic. We're talking about VRAM here. I'm having VRAM-RELATED issues in CYBERPUNK, GODFALL, AND RE VILLAGE at 1080p/1440p. There's no point bringing up the CPU here (nice trick to derail a discussion, though; props).

And do you play at 4K? If so, weird. I would like 10 minutes of video where you cruise around the city with a frametime graph open. I present real and solid facts; I would suggest you do the same.
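For what it's worth, once you have a frametime capture, boiling it down to numbers is trivial. A minimal sketch, assuming a plain list of frametimes in milliseconds from whatever overlay or CSV export you use; the 1%-low definition here (FPS equivalent of the mean of the slowest 1% of frames) is one common convention, and tools differ slightly:

[CODE=python]
# Minimal sketch: average FPS and 1% lows from frametimes in milliseconds.
def summarise(frametimes_ms: list[float]) -> dict[str, float]:
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # the slowest 1% of frames
    one_pct_low = 1000.0 / (sum(worst[:n]) / n)
    return {"avg_fps": round(avg_fps, 1), "one_pct_low_fps": round(one_pct_low, 1)}

if __name__ == "__main__":
    # Mostly 16.7 ms frames (60 FPS) with ten 60 ms stalls: the average barely
    # moves (~58 FPS) but the 1% lows collapse (~17 FPS), which is exactly the
    # stutter a frametime graph makes visible.
    sample = [16.7] * 990 + [60.0] * 10
    print(summarise(sample))
[/CODE]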

Emmmmmm, you're complaining about FPS drops, stuttering and other issues in Cyberpunk and other ray-traced games... I had the same issues with my 3080 and 2600; I upgraded to a 5600X and the problems were resolved, smooth as butter, no system lag etc. A 2700X is not good enough these days, especially when paired with Nvidia GPUs such as the 3xxx series, fact. You have paired a weak-ass old CPU with a pretty powerful new-gen GPU; of course you're going to have issues when your CPU can't keep up with the GPU. Either switch to a 6800, which works better with older/weaker CPUs, or upgrade your CPU.

EDIT: And yes, I have a 4K OLED where I play certain games. There is no point in me recording videos when YouTube already has plenty of footage out there for you to see.

https://www.youtube.com/results?search_query=cyberpunk+3080+4k

Also suggest you google 2700 vs 5600x cyberpunk to see why your experience is horrible....

No, not pretty much what you said, not even close. You also, ironically enough, ignored this:

I guess you just didn't understand the statement. That is also fine.

You then proceed to claim this:

How does increasing the number of unique assets mean they are making sacrifices in other areas relating to graphics? The number of unique assets used to produce a certain look is based on the concept art and/or the artistic direction of the game. That's it.

You seem to have ignored this part as well:

I see a theme going on. Then again, it is not like you care now, do you?

Eh, yes, what I said:

Oh and before you go on about things like big open world games re-using assets over and over again etc., yes I know

:D

At the end of the day, you're still forgetting that it's the end result that matters. No one cares if a game is reusing assets if it looks stunning, as is the case with RDR 2, Div, Days Gone, Metro, Tomb Raider, Batman AK etc., so why does it matter if the developer is reusing assets???

So we are just going to ignore that certain games, genres and artistic directions lend themselves more to reusing certain assets to help build the environment than other games, which may require more unique assets per scene to help build the environment.

Examples?

How does increasing the number of unique assets mean they are making sacrifices in other areas relating to graphics? The number of unique assets used to produce a certain look is based on the concept art and/or the artistic direction of the game. That's it.

For the exact reason we stated they reuse assets in the first place.... to reduce VRAM usage and optimise for performance (and, obviously for the developer, to save time). With that VRAM saved, they can use the spare resources for something else which will make a substantial impact on the end graphics result....
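To put a rough number on the reuse point: texture memory scales with the number of unique textures in memory, not with how many times each one is placed in the scene. A back-of-the-envelope sketch with assumed sizes (4096x4096 BC7 textures at about 1 byte per pixel, plus roughly a third extra for mips); real budgets obviously vary per game and per format:

[CODE=python]
# Back-of-the-envelope: VRAM cost follows *unique* textures, not placements.
# Assumed: 4096x4096 BC7 (~1 byte/pixel) with a full mip chain (~1.33x).
BYTES_PER_TEXTURE = int(4096 * 4096 * 1.33)   # roughly 21 MB per unique texture

def texture_vram_mb(unique_textures: int) -> float:
    return unique_textures * BYTES_PER_TEXTURE / (1024 ** 2)

if __name__ == "__main__":
    # A scene of 500 props built from 50 unique textures vs 500 unique ones:
    print(round(texture_vram_mb(50)))    # ~1064 MB - reuse-heavy scene
    print(round(texture_vram_mb(500)))   # ~10640 MB - every prop unique
[/CODE]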

Define "good" and "better"? Are we talking technically or artistically? Because one of those has nothing to do with VRAM.

Trying to make a mountain out of a molehill here, I see.... It's pretty obvious what "good" and "better" mean when talking about graphics.... Now if we were comparing something like Borderlands/Valheim to Metro/Resident Evil Village it would be a bit different, but we're comparing games which are aiming to be as close to realistic-looking as possible, and artistically Metro and Resident Evil are much the same.
 
The 3080 can't get 60 FPS at 1440p (in other words, 4K DLSS Quality) in Cyberpunk with all the bells and whistles. You need to use either 1440p DLSS Quality or 4K DLSS Performance, which defeats the purpose (it kills the sharpness and the overall image quality). And Cyberpunk's textures look horrible compared to RE Village's.

It is much better to have toned-down effects with high FPS and high resolution; a game still benefits greatly from subtle RT effects. There's no need to push all out on ray tracing, because even Nvidia cards are not capable enough, and they never will be unless they deliver 4-5 times the RT performance in a single generation; games will require even more GPU power as they evolve. If you add huge RT costs on top of that, no GPU will survive two generations. If you really like to shell out money every two years, all the power to you. Not everyone is content with that kind of attitude.
Totally agree. I didn't even use RT in CP2077 on my 3080 as the frame rate hit was too much, and I play at 1440p.
 
Why is it a bad thing that RT effects are toned down?

Maybe because higher-quality effects can be turned down when needed?
That makes a game more future-proof, so it can later be enjoyed more on a new generation of hardware.
I don't know why so many people are obsessed with maxing out every setting in a game.
 
Why is it a bad thing that RT effects are toned down?

Why do we need it maxed out? Clearly neither the 3070 nor the 3080 can handle it. The 3080 can get 80+ FPS at native 4K in RE Village. That is pristine.

The 3080 can't get 60 FPS at 1440p (in other words, 4K DLSS Quality) in Cyberpunk with all the bells and whistles. You need to use either 1440p DLSS Quality or 4K DLSS Performance, which defeats the purpose (it kills the sharpness and the overall image quality). And Cyberpunk's textures look horrible compared to RE Village's.

It is much better to have toned-down effects with high FPS and high resolution; a game still benefits greatly from subtle RT effects. There's no need to push all out on ray tracing, because even Nvidia cards are not capable enough, and they never will be unless they deliver 4-5 times the RT performance in a single generation; games will require even more GPU power as they evolve. If you add huge RT costs on top of that, no GPU will survive two generations. If you really like to shell out money every two years, all the power to you. Not everyone is content with that kind of attitude.
Fair points there tbf.
 
That has nothing to do with the topic at hand though; you just moved the goalposts to talk about something else.
He blatantly refuses to accept the clear evidence in front of him.

https://www.youtube.com/watch?v=v74fd_w_AxI

https://www.youtube.com/watch?v=RUdK3W6bYWc

I'm done arguing with him. There's no point. Look at the two screenshots above. Look at the average frames, GPU usage, and frametime graphs. It is the 3070's frail VRAM crumbling because of a couple of background apps. Why is it so hard to understand? When the game starts using RAM as VRAM, it clearly stutters and suffers in performance. What does this have to do with the CPU?

[attached screenshots: JrNsBrE.png, Hmnms0o.png]

He doesn't even want to accept that these frame drops happen because of VRAM. Instead, he keeps insulting me and my CPU because that's the only argument he has against me.

I'm getting smooth frames with this combination of CPU and GPU across a variety of games. Only when I use HW acceleration on BACKGROUND APPS do problems arise. I CAN clearly see the UPLIFT in VRAM usage. Frametimes suffer, 1% lows suffer, SHARED MEMORY USAGE goes up.

And somehow, even if he accepts this problem, he will bail out saying it will never happen on 10GB (which it will, eventually).
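If anyone wants to measure rather than argue, dedicated VRAM use is easy to log over time while you alt-tab between the game and the background apps. A minimal sketch using the pynvml NVML bindings (assumes an Nvidia card and that pynvml is installed); note that NVML only reports dedicated VRAM, so the spill into shared system memory that actually causes the stutter still has to be read off Task Manager or a capture tool:

[CODE=python]
# Minimal sketch: log dedicated VRAM usage once per second.
# Assumes an Nvidia GPU and the pynvml bindings (pip install nvidia-ml-py).
# NVML reports dedicated VRAM only; WDDM "shared GPU memory" spill has to be
# checked in Task Manager or a capture tool instead.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{time.strftime('%H:%M:%S')}  "
              f"used {mem.used / 2**30:.2f} GiB / {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
[/CODE]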
 
Totally agree. I didn't even use RT in CP2077 on my 3080 as the frame rate hit was too much, and I play at 1440p.

I'm old enough to remember when a Phenom II X6 1100T would run PhysX faster than a GTX 480, and Nvidia reacted to that by gimping PhysX calculations on the CPU.

I'm sure everyone here remembers Nvidia deliberately hurting their own performance, their own users' experience, in some way if it meant it hurt the competitor's performance more.

People have done very deep analysis of Ampere and RDNA2 ray tracing performance; it's mostly Linux guys who do this, as Windows-based tech reviewers are a bit dumb and only do face-value analysis. The fact is that Ampere is better in some aspects of RT and RDNA2 is better in other aspects.

With that in mind remember Nvidia's history.
 
He blatantly refuses to accept the clear evidence in front of him.

I'm done arguing with him. There's no point.
I know. I've seen this before when I had an 8GB GPU.

No need to do any more; you proved your point.

Some people are very entrenched in their views and nothing, not even owners of said GPUs with first-hand experience, will change their viewpoint.

It is what it is. :)
 
Maybe because higher-quality effects can be turned down when needed?
That makes a game more future-proof, so it can later be enjoyed more on a new generation of hardware.
I don't know why so many people are obsessed with maxing out every setting in a game.
Unless we're developers ourselves, we may never know how easy or hard it is to make a game scalable in terms of ray tracing. Cyberpunk is a clear example: it has four RT knobs, but that's not enough to scale for non-DLSS cards. It is an all-out attempt at RT which fails to function without the aid of DLSS, for example.

Most games scale very badly between low and high settings, for a minuscule amount of performance gain, for example.

It is really hard to satisfy both sides for some specific games. You do have a point, but sadly this is what we have now. If AMD can be successful with their own FSR, I guess we may see more all-out RT effects across more games. FSR should have been ready at the release of the RDNA 2 cards.
 
Maybe because higher-quality effects can be turned down when needed?
That makes a game more future-proof, so it can later be enjoyed more on a new generation of hardware.
I don't know why so many people are obsessed with maxing out every setting in a game.

Exactly this.

Why not give the ability/options to dial up effects for people who have the hardware to do so?

He blatantly refuses to accept the clear evidence in front of him.

https://www.youtube.com/watch?v=v74fd_w_AxI

https://www.youtube.com/watch?v=RUdK3W6bYWc

I'm done arguing with him. There's no point.

snip

He doesn't even want to accept that these frame drops happen because of VRAM. Instead, he keeps insulting me and my CPU because that's the only argument he has against me.

I'm getting smooth frames with this combination of CPU and GPU across a variety of games. Only when I use HW acceleration on BACKGROUND APPS do problems arise. I CAN clearly see the UPLIFT in VRAM usage. Frametimes suffer, 1% lows suffer, SHARED MEMORY USAGE goes up.

Arguing???? :cry: Bud, you're the one posting about how crap a game is and how you're having issues, blaming it on VRAM, when there is far more at play in your system than just VRAM... If you don't like that then there is no helping you.

Again...... I had the exact same issues with very similar hardware, and guess what, it was the CPU.... Hardware Unboxed have gone over the Nvidia driver overhead with older/weaker CPUs, especially with ray tracing....

What do you expect when trying to run a next-gen ray-traced game at higher settings than your hardware is capable of....

It's almost like me trying to run Cyberpunk maxed out on a Vega 56 and Ryzen 2600; even at 1080p, I knew what the limitations were, thus I dialled the settings back....

That has nothing to do with the topic at hand though; you just moved the goalposts to talk about something else other than the original topic.

Fair, but then again, the points you made didn't have anything to do with the thread title either ;)



So back on topic then:

What other games is a 3080 having issues with because of VRAM at 1440p/4K? So far we have:

- Godfall - an outlier, and even then, as per the YouTube videos etc. posted before, a 3080 looks to be performing very well here

- HZD - debunked, since the one person who did a comparison was using a 5700 XT/Vega IIRC and an older version of the game, which had issues rendering textures

- RE Village - as above, there don't seem to be any issues with performance, and a 3080 actually performs better than a 6800 XT (with ray tracing on)

Anything else?
 
Exactly this.

Why not give the ability/options to dial up effects for people who have the hardware to do so?



Arguing???? :cry: Bud, you're the one posting about how crap a game is and how you're having issues, blaming it on VRAM, when there is far more at play in your system than just VRAM... If you don't like that then there is no helping you.

Again...... I had the exact same issues with very similar hardware, and guess what, it was the CPU.... Hardware Unboxed have gone over the Nvidia driver overhead with older/weaker CPUs, especially with ray tracing....

What do you expect when trying to run a next-gen ray-traced game at higher settings than your hardware is capable of....

It's almost like me trying to run Cyberpunk maxed out on a Vega 56 and Ryzen 2600; even at 1080p, I knew what the limitations were, thus I dialled the settings back....



Fair, but then again, the points you made didn't have anything to do with the thread title either ;)



So back on topic then:

What other games is a 3080 having issues with because of VRAM at 1440p/4K? So far we have:

- Godfall - an outlier, and even then, as per the YouTube videos etc. posted before, a 3080 looks to be performing very well here

- HZD - debunked, since the one person who did a comparison was using a 5700 XT/Vega IIRC and an older version of the game, which had issues rendering textures

- RE Village - as above, there don't seem to be any issues with performance, and a 3080 actually performs better than a 6800 XT (with ray tracing on)

Anything else?

Do you have a motto in life? Is it "that'll do"? Do you still think 4 cores are all you need?
 
Do you have a motto in life? Is it "that'll do"? Do you still think 4 cores are all you need?

Ok, so when we hit a point where the "majority" of games are "literally" unplayable even with "appropriate" settings (as in not having to drop 90% of the settings below "high"), what do you think is going to be the cause of that? Lack of VRAM or lack of grunt?

PS. We all know that 4 cores isn't enough; do you think 6 cores is enough? Bearing in mind a 6-core 5600X performs better in games than an 8-core 3700X....

In 1-2 years' time, we'll be on the next gen of GPUs, and not because of more VRAM but because of far better ray tracing performance and better performance all round....

Or.... are you telling me that a 3090 or a 6900 XT is going to perform better than Nvidia's 4080 and AMD's 7800 XT (or whatever they will be called)? If so, why isn't an RTX Turing Titan performing better than a 3080???

His specs are well above the recommended for Cyberpunk according to the developer.

[attached image: bp6iGkG.gif]

You mean the specs where they completely lied? :cry: According to them, I should have had no issues with my Vega 56 and 2600, but boy did I...
 
His specs are well above the recommended for Cyberpunk according to the developer.


If its VRAM is failing at 4K, there's no reason it shouldn't fail at 1440p or 1080p. The graphs don't lie; there's not a huge reduction in VRAM usage between 4K and 1440p. He somehow accepts that the 8 gigs is failing at 4K but is unable to accept that the same fate can happen at 1440p/1080p with background apps.

Please don't let him derail the discussion. I guess I need to get a 5600X (which I don't need to, because I get smooth, nice frames across a variety of games with GPU-bound settings) and prove the same VRAM-bound frame drops, and then he will say 8GB sucks. The discussion shall go on forever.
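On the 4K vs 1440p point: the only part of VRAM that actually scales with resolution is the framebuffer and render targets, and that is a small slice of the total, which is why dropping resolution frees up less than people expect. A rough sketch with an assumed per-pixel render-target budget (the real figure depends entirely on the engine and settings):

[CODE=python]
# Rough sketch: render targets scale with resolution; textures, geometry and
# BVH data do not. The per-pixel budget is an assumption for illustration
# (several G-buffer targets, depth, HDR and post-process buffers, etc.).
ASSUMED_BYTES_PER_PIXEL = 60   # engine-dependent, illustrative only

def render_target_mb(width: int, height: int) -> float:
    return width * height * ASSUMED_BYTES_PER_PIXEL / (1024 ** 2)

if __name__ == "__main__":
    print(round(render_target_mb(3840, 2160)))   # ~475 MB at 4K
    print(round(render_target_mb(2560, 1440)))   # ~211 MB at 1440p
    # Stepping down from 4K to 1440p frees a couple of hundred MB here,
    # while the multi-GB texture pool stays exactly the same size.
[/CODE]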
 