Why - "Is 'x' amount of VRAM enough" seems like a small problem in the long term for most gamers

I'd be more concerned with a graphics card's total memory bandwidth, as this will affect all games you play to some extent (regardless of VRAM usage).
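If you want to put rough numbers on that, peak memory bandwidth is basically just the memory's effective data rate multiplied by the bus width. A quick back-of-the-envelope sketch in Python (the data rates and bus widths below are approximate, from memory, so treat them as illustrative rather than spec-sheet quotes):

```python
# Rough peak memory bandwidth = effective data rate per pin (Gbps) x bus width (bits) / 8.
# Figures below are approximate and illustrative, not official spec-sheet quotes.
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "RTX 3070 (GDDR6 ~14 Gbps, 256-bit)":   (14.0, 256),  # ~448 GB/s
    "RTX 3080 (GDDR6X ~19 Gbps, 320-bit)":  (19.0, 320),  # ~760 GB/s
    "RX 6700 XT (GDDR6 ~16 Gbps, 192-bit)": (16.0, 192),  # ~384 GB/s raw, before Infinity Cache
}

for name, (rate, width) in cards.items():
    print(f"{name}: ~{peak_bandwidth_gb_s(rate, width):.0f} GB/s")
```

(Worth noting the raw figure isn't the whole story either: RDNA2's Infinity Cache lifts effective bandwidth above what the bus alone suggests.)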

The reason I think it's not the most relevant question to ask about performance (especially if you managed to buy a Founders Edition / reference graphics card) is that if you bought a graphics card in 2021 with 8GB of VRAM, you should be able to upgrade next year or in 2023.

Sure, you can buy a high end Nvidia graphics card with 10-12GB of VRAM this year, but you will pay through the nose if you do.

Many of these cards should still be quite valuable next year, perhaps even selling for the same as, or more than, you bought them for. That's because prices aren't just driven by Ethereum GPU mining (a craze that is now on the decline), but also by the massive shortfall in semiconductor and VRAM manufacturing capacity this year, which isn't likely to disappear any time soon.

You will likely be able to upgrade to a graphics card with 12/16GB next year or in 2023, then sell your current-gen GPU and upgrade at little to no extra cost. This is one good thing AMD has accomplished: all of their RX 6000 series cards have 12/16GB of VRAM, 4GB more than the RX 5000 series, which puts a lot of pressure on Nvidia to include higher quantities of VRAM on future graphics cards (which will likely all use GDDR6X VRAM, except for the low end).

Also, on most 8GB cards like the RTX 3060 Ti/3070, you don't tend to get VRAM performance issues at 1440p. You can also use the various DLSS modes (and FSR soon too) at 4K, which will also reduce VRAM usage.
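To illustrate why upscaling helps with memory: DLSS/FSR render internally at a fraction of the output resolution, so the big per-frame render targets shrink, even though textures still take the same space. A rough sketch (the per-axis scale factors are the commonly quoted ones, and the 16 bytes-per-pixel figure is a deliberately crude assumption standing in for one fat render target, not a real engine's total usage):

```python
# Internal render resolution and a crude per-buffer size estimate under upscaling.
# Per-axis scale factors are the commonly quoted ones; 16 bytes per pixel is an
# arbitrary stand-in for a single fat render target, not real total engine usage.
MODES = {"Native": 1.0, "Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def internal_target(out_w: int, out_h: int, scale: float, bytes_per_pixel: int = 16):
    w, h = int(out_w * scale), int(out_h * scale)
    return w, h, w * h * bytes_per_pixel / 1024**2  # MiB

for mode, s in MODES.items():
    w, h, mib = internal_target(3840, 2160, s)
    print(f"{mode:<12} {w}x{h}  ~{mib:.0f} MiB per 16-byte-per-pixel buffer")
```

Textures are the same size whichever mode you use, so the saving is real but it's not a silver bullet.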
 
I'd be more concerned with a graphics card's total memory bandwidth, as this will affect all games you play to some extent (regardless of VRAM usage).
I wouldn't. A Fury X has memory bandwidth out the wazoo, yet it's still pretty much garbage these days because it's saddled with 4GB of VRAM. The 980 Ti has only widened its advantage over it over time, because those extra couple of gigabytes were far more useful than the additional memory bandwidth; the latter does you absolutely zero good if you end up swapping into system memory anyway. There's also the fact that a core only needs so much memory bandwidth before something else becomes the limiting factor, meaning not all GPUs are bandwidth-starved.
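To put numbers on why spilling into system RAM hurts so much, compare the bandwidths involved. A rough order-of-magnitude sketch (all figures are ballpark):

```python
# Rough time to pull 1 GB of data from VRAM vs over the PCIe bus.
# Ballpark figures: Fury X HBM peak is ~512 GB/s, PCIe 3.0 x16 is ~16 GB/s.
VRAM_BW_GB_S = 512.0
PCIE3_X16_GB_S = 16.0
FRAME_BUDGET_MS = 1000 / 60          # ~16.7 ms per frame at 60 FPS

overflow_gb = 1.0                    # hypothetical 1 GB that no longer fits in VRAM

print(f"From VRAM:         {overflow_gb / VRAM_BW_GB_S * 1000:5.1f} ms")
print(f"Over PCIe 3.0 x16: {overflow_gb / PCIE3_X16_GB_S * 1000:5.1f} ms")
print(f"Frame budget @60:  {FRAME_BUDGET_MS:5.1f} ms")
# Even a partial spill blows the frame budget, no matter how fast the VRAM itself is.
```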

It's a non-issue created by people who can't get hold of a certain card and end up trying to convince themselves that they don't want it anyway.
This is cope from somebody who shelled out for a 3080 and is trying to convince themselves that its gimped VRAM won't become an issue a year down the line. ;)
 
The Fury X was the worst card AMD produced in the last 10 years; it was struggling for memory from day one at 2160p.

I was so disgusted with the four I bought that I threw them in the dustbin in the end.

Having said that, if the Fury X was used at a lower resolution and not in mGPU, it was a pretty good card.
 
It's a non-issue created by people who can't get hold of a certain card and end up trying to convince themselves that they don't want it anyway.

I don't want a card (or any PC component) that can't be produced at scale, that hardly anyone can actually buy at reference-model price. It's a very good graphics card, but I can get good enough performance from an RTX 3070 for now.
 
I don't want a card (or any PC component) that can't be produced at scale, that hardly anyone can actually buy at reference-model price. It's a very good graphics card, but I can get good enough performance from an RTX 3070 for now.

All true, and it just reaffirms the point that there are plenty of reasons not to buy a 3080 right now, but the amount of VRAM certainly isn't one of them.
 
The problem the OP is missing is that a VRAM limit doesn't arrive suddenly across every game, and the majority buy a GPU to last more than one or two years. VRAM issues are a gradual thing: initially you find the odd game where you have to lower settings even though the GPU grunt seems to be there. It will be a rare event at first, with some random modded game or niche sim, but the list of games you have issues with grows.

For example, the 2GB GTX 680 hit VRAM limits long before the competing 3GB AMD HD 7970. A 7970 could still get playable FPS using medium/high settings at 1080p 3-5 years after release, while a GTX 680 was having to use low/medium in most newer games due to VRAM limits. That may not sound like much, but it's like running a 2060 at medium against a 2080 at high and declaring them competing GPUs. So when you hear "you run out of GPU grunt long before VRAM", it is usually from someone who upgrades every year or two.

So when some enthusiast on this forum says "x amount of VRAM is perfectly fine", it's worth asking whether they upgrade their GPU every year, and then reminding them that the majority buy a GPU to last them 3-5 years.

8GB on the RTX 3070 is going to be perfectly adequate for the majority of games and the majority of owners for a while yet, especially if it's paired with a 1080p monitor. Most will never even encounter any issues, as they will move on in a year or two at most. Yet for the majority of gamers who buy a GPU to last for years, this will become a problem before GPU grunt does.
 
Nvidia have always skimped on VRAM compared to AMD over the years.

My 1070 was the 8GB version; I believe there were also 4GB versions that would be struggling now. Nvidia have historically had two versions of some cards, one with double the VRAM at an obvious premium, which is a bit cheeky really considering the AMD cards just shipped with the higher amount as stock, and were usually cheaper.

Even when I got the card four and a half years ago, Fallout 4 with a couple of mods was hitting 7.5GB of VRAM.
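If anyone wants to see what their own games and mods are actually allocating, something along these lines works on Nvidia cards. It just shells out to nvidia-smi (which ships with the driver); note that the "used" figure is everything allocated on the card, not just the game:

```python
# Quick VRAM usage check on an Nvidia card by shelling out to nvidia-smi.
# Assumes the Nvidia driver is installed so that nvidia-smi is on the PATH.
import subprocess

out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    text=True,
)

# One line per GPU, e.g. "3512, 8192"; just report the first card.
used_mib, total_mib = (int(v) for v in out.strip().splitlines()[0].split(","))
print(f"VRAM in use: {used_mib} / {total_mib} MiB ({used_mib / total_mib:.0%})")
```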

I think Nvidia do it to skimp a bit but also so people have to upgrade more often.

If you look at the 6700 XT vs the 3070, OK, the 3070 is what, 11% faster on average.

But give it a few years, or certain games, and the 6700 XT might be better in the long run.
 
The only real problem is for people like me who will keep a card a few years (3 or 4) and play at 4K.

People like you who hold on to GPUs for three or more years are by far the majority. I work in IT and know dozens of PC gamers. Most of them are rocking GTX 1070s, 1080s or even the odd GTX 970 or R9 390, a few have 5700 XTs, and one will probably hold on to his 1080 Ti forever. What I found telling was that most of them thought Turing was a joke of a release. Only myself and one other person upgrade every year or two at most; he has a 3070, I have a 3080, and we both game at 4K.

This is an enthusiast forum full of enthusiasts who would have us believe that upgrading every 6 months and paying £1000 - £1500 for a GPU is normal.
 
The debate is as old as the stars: VRAM vs memory bandwidth vs just general grunt. Take the GeForce4 Ti 4200 nearly 20 years ago: you had a choice of 64MB or 128MB, but the 64MB version had faster memory, and even back then people were pretty opinionated about it.

I've always been firmly in the grunt-over-VRAM camp; if your card is too slow, no amount of VRAM can compensate for that. VRAM limits can generally be worked around by lowering certain settings. I don't think I've ever had a situation where I've felt screwed by lack of VRAM, but I've definitely been screwed by lack of raw horsepower. That doesn't mean I've never hit a VRAM limitation, it just means that whenever I have, it's always been easy to resolve.
 
Most people keep a card and replace it every 3-5 years, hence the 1060 still being the most popular card in use.

As for your current 30xx card holding its value into the next couple of years, who knows, but by all reports the next-gen cards will all be chiplets, and the talk is 2x the performance of the current gen.

With Intel also on the horizon, the market could be shaken up.

Not to mention AMD being in all the consoles; we could see Nvidia fall behind performance-wise.
 
People like you who hold on to GPUs for three or more years are by far the majority. I work in IT and know dozens of PC gamers. Most of them are rocking GTX 1070s, 1080s or even the odd GTX 970 or R9 390, a few have 5700 XTs, and one will probably hold on to his 1080 Ti forever. What I found telling was that most of them thought Turing was a joke of a release. Only myself and one other person upgrade every year or two at most; he has a 3070, I have a 3080, and we both game at 4K.

This is an enthusiast forum full of enthusiasts who would have us believe that upgrading every 6 months and paying £1000 - £1500 for a GPU is normal.

Lots of people buy a prebuilt system, run it until it no longer plays the games they want, then buy a new one.
 
The debate is as old as the stars: VRAM vs memory bandwidth vs just general grunt. Take the GeForce4 Ti 4200 nearly 20 years ago: you had a choice of 64MB or 128MB, but the 64MB version had faster memory, and even back then people were pretty opinionated about it.

I've always been firmly in the grunt-over-VRAM camp; if your card is too slow, no amount of VRAM can compensate for that. VRAM limits can generally be worked around by lowering certain settings. I don't think I've ever had a situation where I've felt screwed by lack of VRAM, but I've definitely been screwed by lack of raw horsepower. That doesn't mean I've never hit a VRAM limitation, it just means that whenever I have, it's always been easy to resolve.

No it isn't, unless GPUs have been around for 13.8 billion years? ;)

The problem with GPUs is finding a balance, and when you skimp on VRAM you end up having to upgrade a year or two sooner than you would have needed to had the GPU had a bit more VRAM. I quite literally gave you an example of a GPU hitting VRAM limits before GPU grunt was an issue. So even if a GPU has the grunt, it can still hit VRAM limits.

My mid-range 3GB HD 7950 outlasted my high-end 2GB GTX 680 by a few years. Ironically the 7950 was in the spare PC at the time, and I upgraded the 680 in my main PC to a 980. I had the GTX 680 for a few years longer than I would have liked, as GTX 780 prices were a joke. I was getting 40-50 FPS at mid/high settings on the 7950 while the GTX 680 had to go low/medium to keep up with (then) modern games' VRAM demands.

So yes, it is far more likely a GPU will run out of grunt before it runs out of VRAM but every now and again one crops up that is so unbalanced it is the opposite.

GTX 580
GTX 680 (2GB)
Fury X

I predict the 3070 will do the same at 4K.
 
My mid-range 3GB HD 7950 GPU outlasted my high-end 2GB GTX 680 by a few years. I was getting mid/high settings on the 7950 when the GTX 680 had to go low/medium.

I think that is more down to Nvidia abandoning older-gen cards, i.e. not optimising for them as much with newer games, or perhaps intentionally crippling them... My Vega 56, before I sold it in Jan/Feb, was matching and even beating a 1080 Ti, despite the fact that at launch a Vega 56 was matching/beating a 1070. Or of course, it could also be a case of AMD "fine wine".....
 
It was mostly VRAM limits, because when a game did not hit VRAM limits the 680 was a tad faster. Maybe 30% of games were hitting VRAM limits at 1080p, and it was usually the latest new AAA game. I finally had enough when Dragon Age: Inquisition came out and the 7950 was getting a reasonably playable 40 FPS average with lows of 30-ish at medium/high settings, while the GTX 680 was getting 25 FPS (and lows of 5).

I upgraded to a GTX 980.
 
The comparison between the 2GB GTX 680 and the 3GB 7950 needs to be examined alongside the arrival of new consoles, which had 8GB available. Although this is unified RAM and not all of it was available to the game, it's still a lot more than the amount available on the 680, so it makes sense that it suffered compared to less well-performing 3/4GB cards.

The issue this time is less clear. The best of the new consoles only have 16GB, which compared to the 8GB cards in question is nowhere near the disparity between cards and consoles of old. The problem is, console game development is quite a different affair now, with games being released across multiple models, and there are more techniques available to reduce the issue (DLSS, maybe FSR, etc.), which muddy the waters further. I appreciate people might find the console talk irrelevant, but it is a major factor in PC hardware of the time becoming VRAM limited.

My point is, it's easy to say in hindsight that one's GTX 680 was gimped compared to a weaker 3GB card, and that therefore current VRAM concerns are valid, but we're talking ~10 years between these; the circumstances just aren't the same. I'm not saying VRAM concerns aren't valid, just that the 2GB 680 vs 3GB 7950 isn't a great example.
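Putting very rough numbers on the disparity described above (the OS-reservation figures are my own assumptions for illustration, since the console makers don't publish exact splits):

```python
# Rough ratio of console memory available to a game vs a contemporary GPU's VRAM.
# OS-reservation figures are assumptions for illustration, not official numbers.
def game_memory_ratio(console_total_gb: float, os_reserved_gb: float, gpu_vram_gb: float) -> float:
    return (console_total_gb - os_reserved_gb) / gpu_vram_gb

# Last gen: 8GB unified console vs a 2GB GTX 680
print(f"PS4-era console vs 2GB card: ~{game_memory_ratio(8, 3, 2):.1f}x")
# This gen: 16GB unified console vs an 8GB RTX 3070
print(f"PS5-era console vs 8GB card: ~{game_memory_ratio(16, 3, 8):.1f}x")
```

On those (assumed) figures the gap has shrunk from roughly 2.5x to well under 2x, which is the point about the old 680 comparison not mapping cleanly onto today's 8GB cards.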
 
If we are talking about upgrading a year or two sooner, then I guess people are wanting to keep cards for many years, in which case one might question why they are scrimping in the first place. To have a good probability of longevity you really have to be looking towards the top end of the market, rather than saving the pennies on the mid-range and maybe falling foul of bandwidth and, yes, VRAM on occasion.

The 7950 did benefit from FineWine, as GCN wasn't that well optimised when it came out despite being good hardware; I was reading/watching some modern benchmarks on this the other day, as I recently put my 7950 back in my son's PC. The 7950 is actually beating the 780, which is also a 3GB card, in a lot of cases, so the reason the 7950 has aged well is about more than just VRAM, since it beats later-generation GPUs with the same VRAM. A nice vintage red, that's for sure, and yes the 3GB helps, but let's be honest: I bought the card in 2012, and it's not like any card bought then can handle modern demanding games, so 2GB or 3GB, any card from that era is too slow by modern standards.

As for a 3070 running out of VRAM at 4K, it's a sub-£500 list price GPU; I think people have unrealistic expectations if they are expecting to be sat here in a few years running 4K smoothly on that sort of tech. I bet the 12GB 3060 will not last any longer, apart from maybe the odd niche case, which will be at least counter-balanced by other games that run faster on the 3070. I mean, when has a sub-£500 GPU EVER been able to push 4K for many years, regardless of how much VRAM it has?
 
The Fury X was the worst card AMD produced in the last 10 years; it was struggling for memory from day one at 2160p.

I was so disgusted with the four I bought that I threw them in the dustbin in the end.

Having said that, if the Fury X was used at a lower resolution and not in mGPU, it was a pretty good card.



:cry:
 
No it isn't, unless GPUs have been around for 13.8 billion years? ;)

The problem with GPUs is finding a balance, and when you skimp on VRAM you end up having to upgrade a year or two sooner than you would have needed to had the GPU had a bit more VRAM. I quite literally gave you an example of a GPU hitting VRAM limits before GPU grunt was an issue. So even if a GPU has the grunt, it can still hit VRAM limits.

My mid-range 3GB HD 7950 outlasted my high-end 2GB GTX 680 by a few years. Ironically the 7950 was in the spare PC at the time, and I upgraded the 680 in my main PC to a 980. I had the GTX 680 for a few years longer than I would have liked, as GTX 780 prices were a joke. I was getting 40-50 FPS at mid/high settings on the 7950 while the GTX 680 had to go low/medium to keep up with (then) modern games' VRAM demands.

So yes, it is far more likely a GPU will run out of grunt before it runs out of VRAM but every now and again one crops up that is so unbalanced it is the opposite.

GTX 580
GTX 680 (2GB)
Fury X

I predict the 3070 will do the same at 4K.

Yes, it was more common a few years ago that you would run out of VRAM; the 960 2GB vs 4GB, the 1060 3GB vs 6GB and the 480/580 4GB vs 8GB come to mind. And now DLSS/FSR will help make the 8GB cards last longer.
 