Imagine getting 5 fps and saying 10GB is enough when 16GB cards get 10 times that.
Yeah man, and imagine spending around 2K on a GPU and having to turn down settings. Not sure which is funnier.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
I will take a break from the VRAM stuff; we're just going to go round and round and round.
Few understand.
Considering the 4070 Ti is largely marketed as a 1440p card while being 20% faster than a 3080, then no, I'd say the 3080 is no longer a 4K card unless you're willing to compromise on settings.
Sigh, is a 3080 a 4K, DLSS-on-its-highest-settings, RT-also-on-its-highest-settings card anyway? Not really, would be my answer.
You won't get through to some; it's a waste of time and they will continue to spam their narrative. The answer is right in front of them but they refuse to accept it.
I will take a break from the VRAM stuff; we're just going to go round and round and round.
I believe the correct term is FOMO. What's with everyone acting like a two-year-old card is obsolete?
Considering the 4070 Ti is largely marketed as a 1440p card while being 20% faster than a 3080, then no, I'd say the 3080 is no longer a 4K card unless you're willing to compromise on settings.
I tested by playing games at 4K and, using the Afterburner beta etc., looked at allocated vs actual VRAM usage in the games I played. I didn't release YouTube videos about it, sorry, as I decided it was a non-story in 2020, and I still think it pretty much is in 2023. Rather than VRAM issues, my "issue" is that I would like more GPU grunt than my 3080 can give me.
It cannot be overcommitted, meaning allocated is effectively the same as used.
You also failed to describe the nature of your testing. It's a very vague post.
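(For anyone who wants to make that sort of testing more concrete, here is a minimal sketch of reading card-wide and per-process VRAM figures through NVML. It assumes the pynvml bindings, which nobody in the thread actually mentioned, and NVML still reports what a process has reserved rather than its frame-to-frame working set, so it complements Afterburner's per-process counter rather than settling the allocated-vs-used question.)

```python
# Minimal sketch: card-wide vs per-process VRAM figures via NVML.
# Assumes the pynvml bindings (pip install nvidia-ml-py); per-process
# numbers may be unavailable on some OS/driver combinations.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used/total: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

# What each graphics process has reserved on the card. This is still
# "allocated" memory, not what the game touches every frame.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    label = f"{used / 2**30:.1f} GiB" if used is not None else "n/a"
    print(f"pid {proc.pid}: {label}")

pynvml.nvmlShutdown()
```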
But you drop texture quality until you don't run out of VRAM, just like you drop RT quality until you don't run out of RT performance.
There is a big difference. If you run out of VRAM, a game typically becomes unplayable even if it's a very slight deficit.
If you have a slight deficit on grunt, you get a proportional drop in framerate.
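(To put rough numbers on that asymmetry: the sketch below uses approximate spec-sheet bandwidths and a made-up 5% spill, and it assumes the spilled data crosses PCIe every frame, which real engines soften with streaming, so treat it as an illustration rather than a measurement.)

```python
# Back-of-the-envelope comparison: a small VRAM shortfall vs a small
# compute ("grunt") shortfall. Approximate spec-sheet values, not measurements.
vram_bw_gbs = 760.0   # ~GDDR6X bandwidth of a 3080 (GB/s)
pcie_bw_gbs = 32.0    # ~PCIe 4.0 x16 peak to system RAM (GB/s)

spill_fraction = 0.05  # hypothetical: 5% of the working set doesn't fit in VRAM

# Relative memory-traffic cost per frame, normalised to the all-in-VRAM case:
# the spilled 5% is fetched roughly 24x more slowly over PCIe.
relative_cost = (1 - spill_fraction) + spill_fraction * (vram_bw_gbs / pcie_bw_gbs)
print(f"~{relative_cost:.1f}x memory-traffic cost from a 5% VRAM spill")

# A 5% compute deficit, by contrast, costs roughly 5% of the frame rate.
print(f"~{1 / (1 - spill_fraction):.2f}x frame-time cost from a 5% grunt deficit")
```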
Have you ever run out of VRAM?
What is right in front of us? I have a 4090 and a 3060 Ti. With both cards I have to make compromises to play at over 60 fps. What exactly is the issue?
You won't get through to some; it's a waste of time and they will continue to spam their narrative. The answer is right in front of them but they refuse to accept it.
Well, I could argue the exact same back at you; without something more substantial, all it does is make me realize I'm just right, lol.
The way you have described the situation means you don't understand the problem.
Thinking about it rationally, it's quite obvious; the only question is why we still have people defending it and trying to pretend it's something else.
It's OK saying "it doesn't affect me in the games I play with my preferences", but saying "it's overblown" or "it's a narrative" is disrespectful and shows a lack of understanding.
No different than people jumping in to defend AMD with the "VRAM is not enough" stuff.
Not enough grunt either, though, in the long term. Just look at Harry Potter: seems like you need a 4090 to max it out, and even then you still need to turn down settings. Lol
The grunt depends on the review or video you look at. In some, at 1080p and 1440p it does seem to have enough grunt but not enough VRAM, while in others the numbers seem way different. In the Hub review the numbers looked decent until the VRAM ran out, so the grunt was there. Really don't know what's what with this game.
I bet you a paycheck not a single person went from a 3080 to a 4090 because of VRAM. Again, I'm asking for the 99th time: what's different between me dropping RT settings because my 3090 can't get 60 fps, and dropping texture settings because my 3080 can't get 60 fps? Why is one okay and the other not?
Quite common, at least in threads like this.
This is basically on point, condensing pages of waffle. What's comical is that some of the highly active defenders have moved on from a 3080 to a 3080 Ti or a 4090. This is exactly what we predicted, as it avoids the problem.
The game might need optimizations, but that won't change the fact that 10GB is not enough to run the game maxed out with RT. The question is: who cares? You have to drop settings on a two-and-a-half-year-old card in the latest great-looking RT game; that's just freaking NORMAL. I'm sitting here with my brand new, just-released 2000€ GPU and have to drop settings as well, so I really, really can't fathom what the flying duck people are complaining about. Especially considering the actual competitors of the 3080 are doing much worse, why are we even talking about it? Why is it an issue? I also have to drop settings on my laptop with a 1650, so what? What am I missing? It seems to me people have an agenda; there is no way they legitimately consider it a problem that a 2.5-year-old 699€ MSRP card needs to drop some settings in the latest game. That's freaking nuts. They can't actually mean what they say; I think they have an axe to grind.
But I thought we established that the game is not optimised and the results make no sense? Do you have any other examples or just Harry Potter?