Did we really need another thread on VRAM? There was only one way it would go.
The other one is at 208 pages now. Plus, before anyone says that's only about the 3080, it's long since lost that subject.
This is not an "is this enough VRAM" thread. This is a thread about why people keep asking this question, because the OP thinks it has never been an issue in the past.
I reminded him and everyone else that there have been very unbalanced GPUs in the past with low VRAM. These GPUs hit VRAM limits in many cases before they ran out of GPU grunt. For example, when Dragon Age: Inquisition came out, the 7950 was getting a reasonably playable 40 FPS average with lows around 30 at medium/high settings, while the GTX 680 was getting 25 FPS (with lows of 5).
So will this impact the 3070, which has plenty of GPU grunt but only 8GB of VRAM, an amount many consider on the cusp of not enough for 4K and even 1440p? Time will tell.
Putting lots of (slow) memory on a low-end GPU or a renamed outdated model is certainly one selling trick.

I've always been firmly in the grunt-over-VRAM camp; if your card is too slow, no amount of VRAM can compensate for that. VRAM limits can generally be worked around by lowering certain settings. I don't think I've ever had a situation where I've felt screwed by lack of VRAM, but I've definitely been screwed by lack of raw horsepower. That doesn't mean I've never hit a VRAM limitation; it just means that whenever I have, it's always been easy to resolve.
I dispute that running low on VRAM is inherently harder to resolve than running out of horsepower, though. If you lack the fill rate, you're screwed; it doesn't matter what settings you choose, the only option is to reduce resolution, which is catastrophic in the modern era of LCD screens running at native resolution.
That's not to deny that you can have a genuinely severe VRAM shortage, but more fool you if you're buying, say, a 4GB card in the modern era.
TL;DR
What I was getting at is that I don't think people need to worry much about buying graphics cards with 8GB of VRAM this year; you can just upgrade next year or in 2023, likely at little to no extra cost (the caveat being the price you bought your graphics card for, which this year has been almost entirely decided by whether you got an AIB or reference model).
Maybe this is slightly optimistic, and I know not everyone likes to upgrade their GPU every year or so. Before getting an RTX 3070 FE, I was using an R9 390 bought in 2015. So I waited 5-6 years and got a card with the same VRAM capacity, lol. Would I have bought an RTX 3070 with 12/16GB of VRAM for an extra £50? Probably, but not for much more than that.
But the resale price of graphics cards is very high, so why not just upgrade every year? The more frequently you do this, generally, the higher the resale price. The rule for buying reference/FE models seems to be one per generation at the moment (per household).
Upgrading a graphics card is a piece of p*ss (if you have a half-decent power supply). I think production capacity will improve a bit next year too; it will be interesting to see how switching to 6/5nm GPU dies affects this.
Yes, this is the point I was making about the RE Village benchmark that someone cited earlier; you can construct a scenario where a 3060/6700 XT with 12/16GB of VRAM will outperform a 3070 with 8GB. But to do so you have to enable RTX and max settings, and you end up unable to push the high frame rates you want anyway. So it would basically suit people with 60Hz monitors.

As you load up on VRAM usage in modern games, you also load up the GPU, and the GPU only has so much raw grunt before it stops giving you playable frame rates. We have numerous examples where you can get near 10GB of usage on a 3080, but the performance is unplayable at the settings required to do that, and so the point is moot.
Yeah, I wouldn't call 30 FPS reasonable on a PC; in my view that's definitely running out of grunt.
I remember there was this same discussion/argument between the GTX 970 (3.5GB) / GTX 980 (4GB) and the 390/390X (8GB) over whether having 8GB matters; fast forward a couple of years and there are games with high-res texture packs which the 390/390X has no problem using, while the 970/980 simply don't meet the requirements for enabling them.
I think the discussion isn't about whether X amount is enough, but more that once people are paying above a certain amount of money, they expect it to be clear-cut that VRAM will not be the factor that bottlenecks the graphics card's performance, especially now that SLI and CrossFire have been discontinued. With even older sub-£200 cards from a few years back already having 8GB, it does make people question why something as expensive as the 3070 and the 3080 only has 8GB and 10GB respectively. It's like building a new system now: one could argue that 8GB of system memory would still be "sufficient" for gaming, but nobody is comfortable with less than 16GB in this day and age.
In the testing done in this video, it seems the 2GB and the 4GB 680 perform pretty much the same.
Also, from what I remember, there were loads of people on here saying back in the day that the older 4GB cards / Intel CPUs with HT weren't worth the extra cost over the 2GB cards / Intel CPUs without HT.