Why - "Is 'x' amount of VRAM enough" seems like a small problem in the long term for most gamers

I disagree that consoles made a difference to that degree.

I could just as easily refer to the 980Ti vs Fury X, for example. My point is that drastically unbalanced GPUs hit a VRAM limit before they hit a performance limit. If we look at the RTX 2080, it has 8GB of VRAM and honestly it's not an issue, because it doesn't have the grunt to play modern games in 4K at full settings. A 3070Ti or 3070, on the other hand, can do 4K gaming right now with the same VRAM, but I would bet they won't be great for future proofing compared to a 6800, for example.
 
I will stick to 1440p with a 3090 and higher FPS, as it helps a crappy LCD look a bit more fluid, like CRT/Plasma (in addition to Adaptive Sync). I do not want lower FPS at 4K.

60 FPS was the golden number back then, but not now on LCD; ideally you need higher.

Maybe the 4080/90 will be a true 4K card; we may even have some decent screen tech by then (lol).
 
If we are talking about upgrading a year or two sooner, then I guess people are wanting to keep cards for many years, in which case one might question why they are scrimping in the first place. To have a good probability of longevity you really have to be looking at the top end of the market, rather than saving the pennies on the midrange and maybe falling foul of bandwidth and, yes, VRAM on occasion.

The 7950 did benefit from FineWine, as GCN wasn't that well optimised when it came out despite being good hardware. I was reading/watching some modern benchmarks on this the other day, as I recently put my 7950 back in my son's PC. The 7950 is actually beating the 780 in a lot of cases, and the 780 is also a 3GB card, so the reason the 7950 has aged well is more than just VRAM, as it beats later-generation GPUs with the same VRAM. A nice vintage red, that's for sure, and yes the 3GB helps, but let's be honest: I bought the card in 2012, and it's not like any card bought then can handle modern demanding games. 2GB or 3GB, any card from that era is too slow by modern standards.

As for a 3070 running out of VRAM at 4K, it's a sub-£500 list price GPU; I think people have unrealistic expectations if they are expecting to be sat here in a few years running 4K smoothly on that sort of tech. I bet the 3060 with 12GB VRAM will not last any longer, apart from maybe the odd niche case that will be at least counter-balanced by other games that run faster on the 3070. I mean, when has a sub-£500 GPU EVER been able to push 4K for many years, regardless of how much VRAM it has?

The 6800 is also in the same price bracket and comes with 16GB. The idea that a £500 GPU is now considered mid-tier and that should be a "throwaway" item is frankly ridiculous. It doesn't change the fact it is a woefully unbalanced GPU with VRAM that will be an issue before relative performance is. The vast majority would consider a £500 GPU an item they should get more than 1 or 2 years use out of.
 
There's a difference between being a throwaway GPU, and one that can't handle extreme resolution many years down the line. You can absolutely get more than 1-2 years of use out of a 3070, if you are realistic about render resolution in new AAA games.

It's horses for courses, if people want super long lasting GPUs, and want to play ultra high definition years down the line, then they need to reset their expectations of how much money they need to spend.

I mean, look at it this way: if you could buy a sub-£500 GPU today and still play smoothly at 4K in 4 years' time, then what's the point in all the higher end GPUs that will have been superseded by then anyway?
 
I wouldn't really consider a 3070 or even 3070ti to be a 4k card tbh, even at the time of release.

They both give roughly 2080Ti performance or better, and the 2080Ti was hailed as the first true 60 FPS 4K card. The 3070 holds 60 FPS in most current 4K games in rasterisation, and most reviews call it a very capable 4K GPU. You also seem to keep missing the point that some buy a GPU to last more than 1 or 2 years. So in 2 years, do you say "I wouldn't consider a 3070 a 1440p card"?
 
I remember a photo of a guy who put his ATI (yes ATI) GPU in the crapper and took a pee pee on it, so Kap is doing it all wrong.
 
There's a difference between being a throwaway GPU, and one that can't handle extreme resolution many years down the line. You can absolutely get more than 1-2 years of use out of a 3070, if you are realistic about render resolution in new AAA games.

It's horses for courses, if people want super long lasting GPUs, and want to play ultra high definition years down the line, then they need to reset their expectations of how much money they need to spend.

I mean, look at it this way: if you could buy a sub-£500 GPU today and still play smoothly at 4K in 4 years' time, then what's the point in all the higher end GPUs that will have been superseded by then anyway?

Nobody is arguing that their GPU should be able to maintain high FPS at high settings for the next 5 years. Those who buy a GPU to last 3-5 years know they will be having to compromise on settings as new AAA games push GPU limits. What I am trying to point out is that some unbalanced GPUs over the years didn't even allow that and it was purely down to VRAM limits. So the OP suggesting that GPU grunt is always the deciding factor is categorically wrong.

So when I could get playable FPS on a 3GB 7950 at settings that were destroying my 2GB GTX 680, that was 100% down to the fact it had an unbalanced amount of VRAM. The same thing for those who had similar issues on a Fury or Fury X GPU with 4GB VRAM limits compared to a 980Ti with 6GB.
 
The VRAM argument has only started since AMD released their 68/69 cards with 16GB. Now anything lower than that is not enough. It's insane that we've gone from the day before those cards came out, when our VRAM was fine, to the day after they came out, when suddenly it's not fine, we ain't got enough, and our cards have been ****** in one single day. And what's funny is, we have a card with 24GB that hasn't rendered the 16GB of VRAM on those cards obsolete, and not one single post saying that it's not enough, and they can still run games fine to this day. Bizarre!
 
They both give roughly 2080Ti performance or better, and the 2080Ti was hailed as the first true 60 FPS 4K card. The 3070 holds 60 FPS in most current 4K games in rasterisation, and most reviews call it a very capable 4K GPU. You also seem to keep missing the point that some buy a GPU to last more than 1 or 2 years. So in 2 years, do you say "I wouldn't consider a 3070 a 1440p card"?

The 2080Ti was released in 2018; that's a 3-year-old card... The main selling point of a 3070 was that it offered similar performance in rasterisation but was a considerably better card for ray tracing, along with a considerably cheaper MSRP compared to a 2080Ti.

Also, this largely comes down to the end user and what they are happy with. A 3070 will still be a great card for 1440p 2 years down the line, and even at 4K, as long as said user doesn't expect to keep settings whacked to max and still get 100+ FPS at 1440p or 60+ at 4K, the same way any 6800XT/3090/6900XT/3080 users will also need to adjust settings, be that ray tracing related settings or other graphical settings, in 2+ years' time.

For myself, I knew I wouldn't be satisfied with the 3070's lack of VRAM, nor that it would have the grunt to satisfy my gaming needs for at least a good 2-3 years.
 
The VRAM argument has only started since AMD released their 68/69 cards with 16GB. Now anything lower than that is not enough. It's insane that we've gone from the day before those cards came out, when our VRAM was fine, to the day after they came out, when suddenly it's not fine, we ain't got enough, and our cards have been ****** in one single day.


Not really. The Titan X (Maxwell), Titan X (Pascal) and Titan Xp all had 12GB and the 1080Ti had 11GB, so the next Nvidia cards should have been 16GB, not 10-11GB.
 
The GTX 770 2GB was EOL 2 years after its release PURELY because of its VRAM.

- Stutters
- Frame time issues even with the lowest possible textures there are to choose, which make the game look like a pre-Xbox 360 title (this is for the "you can always reduce the textures" argument: it rarely works, because most games do not scale back much in VRAM consumption. Don't make assumptions if you've never experienced a VRAM bottleneck before. Mostly you're people who upgrade every 2 years, so you're just speculating, whereas I played on a 2GB card in 2013 and it became obsolete within 2 years, while the 4GB variants kept much better performance and still had the grunt)
- Much lower average framerate compared to the 4GB variant (which makes it seem like it doesn't have the grunt, yet it does, similar to how the RTX 3070 falls flat to 43 FPS in RE: Village. From a normal perspective you only see the 3070 averaging 43 FPS at 4K, while in reality, without VRAM constraints, its roughly equal 2080Ti renders 77 frames. Running into a VRAM wall can hamper performance by up to 2x; see the rough sketch after this list)
- Low 1% and 0.1% lows
- Not being able to max out textures, which are practically a visual upgrade at no rendering performance cost, a huge sacrifice you're expected to make solely because Nvidia wanted you to do so
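For anyone wondering how a card with plenty of shader grunt can lose roughly half its frame rate purely to a VRAM shortfall, here's a deliberately crude back-of-the-envelope sketch in Python. Every bandwidth and spill figure in it is an assumption picked for illustration, not a measurement from any of the games above: the idea is simply that once textures no longer fit in VRAM, part of them has to be pulled back over PCIe every frame, and PCIe is an order of magnitude slower than the card's local GDDR6.

```python
# Crude, illustrative model of why textures spilling out of VRAM tank frame
# rates: resident textures are read from GDDR6 at ~448 GB/s on a 3070, but
# anything evicted to system RAM comes back over PCIe 3.0 x16 at roughly
# 16 GB/s. All figures below are assumptions for illustration only.

PCIE3_X16_GBPS = 16.0  # approximate PCIe 3.0 x16 bandwidth (GB/s)

def fps_with_spill(base_fps: float, spilled_gb_per_frame: float) -> float:
    """FPS if 'spilled_gb_per_frame' of texture data must be streamed over
    PCIe every frame instead of residing in VRAM (no overlap assumed)."""
    base_ms = 1000.0 / base_fps
    extra_ms = spilled_gb_per_frame / PCIE3_X16_GBPS * 1000.0
    return 1000.0 / (base_ms + extra_ms)

base_fps = 77.0  # a 2080Ti-class result with no VRAM pressure (from the post)
for spill_mb in (0, 50, 100, 200):
    print(f"{spill_mb:3d} MB/frame spilled -> "
          f"{fps_with_spill(base_fps, spill_mb / 1000):5.1f} FPS")
```

In this toy model even ~100-200 MB of per-frame spill drags a 77 FPS result down into the 40-50 FPS range, which is the same ballpark as the RE: Village numbers quoted above. Real engines overlap transfers with rendering and cache aggressively, so treat it as intuition for why the wall is so steep, not as a benchmark.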
 
Isn't the 3070 already struggling in some titles? Not much of a prediction now, is it? :p
The 3070 actually falls flat even at 1440p.

(For reference, the two "GPU Mem Game" values in the overlay are 1) dedicated memory usage and 2) shared memory usage, i.e. data spilling into normal system RAM because the card has run out of VRAM. Observe how frames tank proportionally to the shared memory usage. Again, you can see that turning everything down to High does not solve the issue, and again, the 2080Ti can push 1440p 60 FPS at High settings while the 3070 suffers at 20-30 FPS. You're welcome to GRUNT city. I don't care if it's a ****** game or not; it exposes that the 3070's VRAM can suffer/fail even at 1440p with RT enabled, this early in the generation.)

https://www.youtube.com/watch?v=sJ_3cqNh-Ag
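If you'd rather watch for this yourself than trust an overlay, below is a minimal sketch using the pynvml bindings (this assumes an Nvidia card and `pip install pynvml`; treat it as a starting point, not a polished tool). NVML only reports dedicated VRAM, so actual spill into shared system memory still has to be read from something like Task Manager's "Shared GPU memory" counter, but dedicated usage pinned at the card's limit while frametimes spike is the usual tell that spilling has started.

```python
# Minimal VRAM monitor using NVML (Nvidia only). Prints dedicated memory
# usage once a second; usage sitting at ~100% of the card's total while a
# game is running is the usual sign that data is spilling to system RAM.
import time
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values in bytes
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        print(f"{name}: {used_gb:.2f} / {total_gb:.2f} GB dedicated VRAM "
              f"({100 * mem.used / mem.total:.0f}%)")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```

Run it in a second window while playing: if usage sits at the 8GB ceiling at the exact moments the frametime graph spikes, you're almost certainly looking at a VRAM spill rather than a raw grunt limit.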

Its days are numbered

It's only a good 1080p card from my perspective. I don't expect it to perform well with next-gen textures in next-gen 1440p games. Maybe it can barely push next-gen textures at 1080p, but only maybe.
 
The 6800 is also in the same price bracket and comes with 16GB. The idea that a £500 GPU is now considered mid-tier and that should be a "throwaway" item is frankly ridiculous. It doesn't change the fact it is a woefully unbalanced GPU with VRAM that will be an issue before relative performance is. The vast majority would consider a £500 GPU an item they should get more than 1 or 2 years use out of.

That sounds great until you turn on RT and realise RDNA2 is already outdated.

I wouldn't really consider a 3070 or even 3070ti to be a 4k card tbh, even at the time of release.

It's not.
 
That sounds great until you turn on RT and realise RDNA2 is already outdated.

It's not.

You don't half spout a lot of Nvidia shill nonsense sometimes. Enabling RT kills even a 3090, which needs DLSS to get anything close to playable FPS. If FSR does the same thing for RDNA2, then both are relying on a crutch to remain playable. Not having RT does not make a GPU obsolete, considering the vast majority of games still rely on rasterisation performance.
 
You don't half spout a lot of Nvidia shill nonsense sometimes.

How is that Nvidia shill nonsense?


You're the guy complaining that 10GB is not enough at 4K when you tried to run two 4K panels in VR on a 3080. Real smart...
 