
Is the extra VRAM really that beneficial?

Status
Not open for further replies.

mrk

Man of Honour
Joined
18 Oct 2002
Posts
101,018
Location
South Coast
Upscaling VRAM differences will be small but do vary between games. I have 5 games installed currently and did a quick test, reloading each game after applying the change so the baseline is fair. All at 3440x1440 at their respective "max" settings. I won't use Cyberpunk as an example as I have mods that use up to 4GB of extra VRAM, so it's not a logical baseline. All of these use DLSS 3.7.10 and all are at Preset E for reference.

Alan Wake 2:
DLSS: [screenshot]
DLAA: [screenshot]

HZFW:
DLSS: [screenshot]
DLAA: [screenshot]

Still Wakes the Deep:
DLSS: [screenshot]
DLAA: [screenshot]
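For anyone wanting to reproduce this kind of before/after check without an overlay, here is a minimal sketch using NVIDIA's NVML Python bindings (the nvidia-ml-py package is an assumption about your setup; NVML reports whole-device usage, not per-game, so close other GPU-heavy apps for a fair baseline):

```python
# Minimal VRAM snapshot helper using NVML (assumes the nvidia-ml-py
# package: pip install nvidia-ml-py). NVML reports whole-device usage,
# not per-game, so close other GPU-heavy apps for a fair baseline.
import pynvml

def vram_used_mib(gpu_index=0):
    """Return the GPU's current dedicated VRAM usage in MiB."""
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        return info.used / (1024 ** 2)
    finally:
        pynvml.nvmlShutdown()

if __name__ == "__main__":
    input("Load the game with setting A (e.g. DLSS), then press Enter...")
    a = vram_used_mib()
    input("Switch to setting B (e.g. DLAA), reload, then press Enter...")
    b = vram_used_mib()
    print(f"A: {a:.0f} MiB, B: {b:.0f} MiB, delta: {b - a:+.0f} MiB")
```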
 
Associate
Joined
31 Dec 2010
Posts
2,487
Location
Sussex
The GPU's x16 PCIe bus on the PC is massively underutilised. This is easy to test: dropping from Gen 4 to Gen 3, or from x16 to x8 (a 50% bandwidth cut), hardly affects performance (<5%). Asset streaming will have to become a must-have, as VRAM will always be limited and the cost of adding more will increase, since node shrinks don't do much for RAM.
Of course, node shrinks won't be great for system RAM either!
Although if memory ends up as a higher proportion of the cost of a build, then duplicating assets in both system RAM and VRAM could be wasteful. Even PCIe 7.0 probably wouldn't be enough for fast asset streaming. Plus, PCIe 4/5/etc. has already demonstrated that interconnect power keeps going up.
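As a rough sanity check on that, a sketch of the theoretical per-direction bandwidth arithmetic; the per-lane figures are approximate published rates and the ~760 GB/s is the 3080's quoted GDDR6X bandwidth:

```python
# Back-of-envelope PCIe bandwidth per direction, ignoring protocol
# overhead. Per-lane rates are approximate GB/s after line encoding.
PER_LANE_GBS = {3: 0.985, 4: 1.969, 5: 3.938, 7: 15.754}

def pcie_gbs(gen, lanes):
    return PER_LANE_GBS[gen] * lanes

for gen, lanes in [(4, 16), (4, 8), (3, 16), (7, 16)]:
    print(f"Gen{gen} x{lanes}: ~{pcie_gbs(gen, lanes):.0f} GB/s")

# A 3080's GDDR6X delivers ~760 GB/s on-card, so even Gen4 x16
# (~32 GB/s) is over 20x slower than local VRAM, which is why
# streaming over the bus can't fully substitute for VRAM capacity.
```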

Solution: APUs for everyone!
 
Soldato
Joined
16 Aug 2009
Posts
7,834
Soldato
Joined
28 May 2007
Posts
10,100
That chart is taken from a launch-day review of AW2; the game has had many patches since to improve performance, etc.
The difference is your site is using a benchmark, whereas TPU are actually playing the game. In-game metrics matter more than baked-in benchmarks, which can often differ from actual gameplay. Apart from that, only the 4090, and possibly the 4080, is worth playing on at those settings. The 3080 should have had more VRAM and I said it from day one. It's still a plenty capable card, but I can see it suffering more than it should.
 
Soldato
Joined
27 Feb 2015
Posts
12,635
Thread still going on? Sadly only DF (Digital Foundry) seem to accurately report the implications of low VRAM. Almost everyone else just checks the frame rate.
 
Caporegime
Joined
4 Jun 2009
Posts
31,338
The difference is your site is using a benchmark, whereas TPU are actually playing the game. In-game metrics matter more than baked-in benchmarks, which can often differ from actual gameplay. Apart from that, only the 4090, and possibly the 4080, is worth playing on at those settings. The 3080 should have had more VRAM and I said it from day one. It's still a plenty capable card, but I can see it suffering more than it should.

Wrong.


For the benchmarks, we rely on a demanding, above-average test sequence. We settled on it after some exploration of Bright Falls and Cauldron Lake as Saga Anderson. Since we don't want to reveal anything about the story, we will leave it at this information and the reference to the video: we are on the way to the first "boss", along a forest passage that is well worth seeing. The frame rates found here are significantly lower than in most other scenes in Alan Wake 2. Thus, what applies to PCGH benchmarks in principle applies here: if a graphics card achieves fluid frame rates in this scene, there is nothing to worry about in the rest of the game, and that is good to know.

Also, regardless of the scene or how it was benchmarked, the point still stands: it's the same old selective quoting to fit a narrative, i.e. using (whether intentionally or not) a launch-day benchmark to present an out-of-date, no-longer-relevant picture of the game's performance... If anything, this just further proves that games need a patch or two to properly address issues, as opposed to players just brute-forcing past them. Next thing, we'll have people citing the likes of TLOU and Hogwarts to show VRAM issues when those were also solved with patches. A bit like humbug putting the texture issues in Forza down to VRAM when the devs themselves stated that changing drivers solves that issue.

I'm also not really seeing any evidence from anyone showing the 3080 suffering more than it should because of VRAM, and going by Daniel Owen's videos, you can see dedicated VRAM not coming close to 10GB either, unless again: you play at 4K without upscaling, and/or add loads of texture packs, and/or look at games on launch day that were widely regarded as poorly optimised, only to be fixed with a couple of patches.

No doubt it would have been nice to have had more VRAM from the get-go to cover all bases, but again, what alternatives were there?

- RDNA 2 GPUs at the time, which as evidenced have aged considerably worse by not having consistently good upscaling tech and by suffering big time in RT workloads (which are becoming far more common now due to the RT evolution)
- pay the extra £750 for the only other Ampere GPU with more VRAM at the time? As evidenced by the lack of answers to the question "has the extra VRAM proved beneficial enough to justify the extra cost?" (except hum, who said he didn't go for the 2080 Ti over the 2070S and 5700 XT because it was too expensive...), it's safe to guess what the answer will be.
 
Soldato
Joined
28 May 2007
Posts
10,100
Wrong.




Also, regardless of the scene or how it was benchmarked, the point still stands: it's the same old selective quoting to fit a narrative, i.e. using (whether intentionally or not) a launch-day benchmark to present an out-of-date, no-longer-relevant picture of the game's performance... If anything, this just further proves that games need a patch or two to properly address issues, as opposed to players just brute-forcing past them. Next thing, we'll have people citing the likes of TLOU and Hogwarts to show VRAM issues when those were also solved with patches. A bit like humbug putting the texture issues in Forza down to VRAM when the devs themselves stated that changing drivers solves that issue.

I'm also not really seeing any evidence from anyone showing the 3080 suffering more than it should because of VRAM, and going by Daniel Owen's videos, you can see dedicated VRAM not coming close to 10GB either, unless again: you play at 4K without upscaling, and/or add loads of texture packs, and/or look at games on launch day that were widely regarded as poorly optimised, only to be fixed with a couple of patches.

No doubt it would have been nice to have had more VRAM from the get-go to cover all bases, but again, what alternatives were there?

- RDNA 2 GPUs at the time, which as evidenced have aged considerably worse by not having consistently good upscaling tech and by suffering big time in RT workloads (which are becoming far more common now due to the RT evolution)
- pay the extra £750 for the only other Ampere GPU with more VRAM at the time? As evidenced by the lack of answers to the question "has the extra VRAM proved beneficial enough to justify the extra cost?" (except hum, who said he didn't go for the 2080 Ti over the 2070S and 5700 XT because it was too expensive...), it's safe to guess what the answer will be.
I can only go off what the charts show, and it says "Benchmark Nightingale". No need to brute-force when you are not close to the VRAM limit. Only a select few think 10GB was acceptable, and plenty of owners have said it's not enough. When getting a flagship card, the last thing on your mind should be whether you have enough VRAM. I spent the same £650 a few years later and have no worries whatsoever, as I get twice the amount the flagship 3080 got. Had Nvidia given it 12GB, I would have said that should be enough.

Anyhow, as long as you are happy with the 3080, nothing else really matters. The VRAM debate is so mixed it will go on forever.
 
Caporegime
Joined
4 Jun 2009
Posts
31,338
I can only go off what the charts show, and it says "Benchmark Nightingale". No need to brute-force when you are not close to the VRAM limit. Only a select few think 10GB was acceptable, and plenty of owners have said it's not enough. When getting a flagship card, the last thing on your mind should be whether you have enough VRAM. I spent the same £650 a few years later and have no worries whatsoever, as I get twice the amount the flagship 3080 got. Had Nvidia given it 12GB, I would have said that should be enough.

Yes, because it's a benchmark, like TPU's is also a benchmark... That quote is straight from their article: it's gameplay, it's not a canned benchmark scene. They even link a video to showcase the area they benchmarked, which is in-game.

The fact that the questionable games with VRAM issues got fixed, because there were memory leaks and other problems causing those issues, proves that the likes of the 3090 were just avoiding core issues present in the games at launch; otherwise said issues would not have been fixed and the games optimised. As you know... I go based on evidence to back up claims, and outside of the areas I stated, 10GB has not proved to be the car crash that some are insinuating.

As touched upon earlier, yes, it would always be great to have bigger/better, but at what cost? The 3080 12GB showed up costing £1k+... What is your answer to this question?

Has the extra VRAM proved beneficial enough to justify the extra cost?

Also, regarding this statement:

When getting a flagship card, the last thing on your mind should be whether you have enough VRAM

That applies to everything when you are considering a purchase.

In your case, yes, you waited and saved until you could get a better GPU, i.e. the same way people who waited to get a 3080 at MSRP saved themselves from spending £750+, and can get a brand-new GPU a few years down the line when more performance is required. Personally, I would have been ****** spending the extra £750 only to find it doesn't do much better than a £650 GPU in 95+% of gaming cases.
 
Soldato
Joined
28 May 2007
Posts
10,100
Yes, because it's a benchmark, like TPU's is also a benchmark... That quote is straight from their article: it's gameplay, it's not a canned benchmark scene. They even link a video to showcase the area they benchmarked, which is in-game.

The fact that the questionable games with VRAM issues got fixed, because there were memory leaks and other problems causing those issues, proves that the likes of the 3090 were just avoiding core issues present in the games at launch; otherwise said issues would not have been fixed and the games optimised. As you know... I go based on evidence to back up claims, and outside of the areas I stated, 10GB has not proved to be the car crash that some are insinuating.

As touched upon earlier, yes, it would always be great to have bigger/better, but at what cost? The 3080 12GB showed up costing £1k+... What is your answer to this question?



Also, regarding this statement:



That applies to everything when you are considering a purchase.

In your case, yes, you waited and saved until you could get a better GPU, i.e. the same way people who waited to get a 3080 at MSRP saved themselves from spending £750+, and can get a brand-new GPU a few years down the line when more performance is required. Personally, I would have been ****** spending the extra £750 only to find it doesn't do much better than a £650 GPU in 95+% of gaming cases.
My answer to the question is: Nvidia were taking the **** with the VRAM and the 12GB version's pricing. Again, the 3080 was classed as a flagship card and should be equipped like one. In all metrics it was but one, which is why you needed patches to sort it for you. I run what I would call an upper-mid to high-end card and I have no worries. You were one tier down from the top and needed games patched because the VRAM was too low. The 3080 is a fine card, just lacking the VRAM to be a great card. You like RT; I like peace of mind that the only bottleneck my card will ever face is the GPU itself.
 
Caporegime
Joined
4 Jun 2009
Posts
31,338
My answer to the question is: Nvidia were taking the **** with the VRAM and the 12GB version's pricing. Again, the 3080 was classed as a flagship card and should be equipped like one. In all metrics it was but one, which is why you needed patches to sort it for you. I run what I would call an upper-mid to high-end card and I have no worries. You were one tier down from the top and needed games patched because the VRAM was too low. The 3080 is a fine card, just lacking the VRAM to be a great card. You like RT; I like peace of mind that the only bottleneck my card will ever face is the GPU itself.

That's not answering the question:

Has the extra VRAM proved beneficial enough to justify the extra cost?

It's either a yes or no. If I read into your post, it sounds like you are saying the pricing was too high for the benefit the extra VRAM provided? (At least on Nvidia's side, anyway?)

Why focus only on VRAM and nothing else in a GPU purchase? Are grunt and features not worth considering, or could they perhaps be viewed as lacking in a similar way to how you consider Nvidia to lack VRAM?

If you consider the 3080 to be a flagship GPU, then surely the 7900 XT should be considered a flagship GPU too, given it's one tier down from the best, i.e. the 7900 XTX, the same way the 3080 was with the 3090.

Personally, I would argue that your GPU is already the bottleneck (given it's matching 3+ year old Nvidia tech), unless you never intend to use RT (which you won't have a choice about in a lot of games going forward, although the saving grace is that most UE5 titles are still just using software Lumen).
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,306
Location
Greater London
My answer to the question is: Nvidia were taking the **** with the VRAM and the 12GB version's pricing. Again, the 3080 was classed as a flagship card and should be equipped like one. In all metrics it was but one, which is why you needed patches to sort it for you. I run what I would call an upper-mid to high-end card and I have no worries. You were one tier down from the top and needed games patched because the VRAM was too low. The 3080 is a fine card, just lacking the VRAM to be a great card. You like RT; I like peace of mind that the only bottleneck my card will ever face is the GPU itself.

I would agree that they were taking the **** if not for the price: £650 for 10GB, £1,000 for 12GB, or £1,400 for 24GB. I know which one I would take.

The 3080 was the real deal imo.

If they release a 5080 with 4090 performance but only 12GB and charge £650 for it, I swear I would ******* buy it, even though I think 16GB should be the minimum for such a card.

It isn't that I don't value VRAM. My point is that performance matters more if you are price-constrained. If price matters not, then just get a 5090.
 
Soldato
Joined
18 May 2010
Posts
22,484
Location
London
This was the image I was trying to find earlier.

It clearly shows that a game such as Alan Wake 2, with FG and PT on, even at a modest 1440p, uses 14GB of VRAM.

Even just RT at 1440p uses 13GB.

So yes, in my opinion, just as 32GB of RAM is the new 16GB, 16GB of VRAM should be the de facto standard for any GPU moving forward.

[screenshot: VRAM usage figures]
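To see why FG and PT add gigabytes rather than megabytes, here is a hypothetical back-of-envelope sketch; the buffer counts, formats and the ~1 GiB BVH figure are illustrative assumptions, not the game's actual allocations:

```python
# Hypothetical back-of-envelope for why FG and PT push VRAM up; the
# buffer counts, formats and BVH size are illustrative assumptions,
# not Alan Wake 2's actual allocations.
W, H = 2560, 1440  # 1440p

def buf_mib(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel / (1024 ** 2)

# Frame generation keeps extra full-res frames in flight plus motion /
# optical-flow data; assume four RGBA16F surfaces (8 bytes per pixel).
fg_overhead = 4 * buf_mib(W, H, 8)

# Path tracing adds denoiser history buffers, radiance caches and a BVH
# for the scene; assume six more surfaces plus ~1 GiB for the BVH.
pt_overhead = 6 * buf_mib(W, H, 8) + 1024

print(f"FG overhead estimate: ~{fg_overhead:.0f} MiB")   # ~112 MiB
print(f"PT overhead estimate: ~{pt_overhead:.0f} MiB")   # ~1193 MiB
# Over a gigabyte combined, before textures or game data grow at all,
# which is the right ballpark for the jump seen in the screenshot.
```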



Now that that is crystal clear, let's talk about the elephant in the room: Nvidia.

We all know Nvidia extort their customers, which is why they made the 4080 so expensive.

Because it was the only card in the lineup with both the performance and the vram to match.

You could get 16GB of VRAM from competitors, but it wouldn't have all the features and RT performance of the Nvidia cards, so they priced it as such.
 
Caporegime
Joined
4 Jun 2009
Posts
31,338
This was the image I was trying to find earlier.

It clearly shows that a game such as Alan Wake 2, with FG and PT on, even at a modest 1440p, uses 14GB of VRAM.

Even just RT at 1440p uses 13GB.

So yes, in my opinion, just as 32GB of RAM is the new 16GB, 16GB of VRAM should be the de facto standard for any GPU moving forward.

[screenshot: VRAM usage figures]

If VRAM is there to be used, it can be used; just because a game might use more does not necessarily mean that GPUs with less VRAM will suffer. This has been shown in multiple videos, including the 12GB video above by DO. If a game's VRAM management is poorly optimised, what you'll get is noticeable texture pop-in, severe stutter, or FPS plummets, such as what happened with Hogwarts and TLOU at launch. BTW, just as a note on Hogwarts: in one of bang4buck's recent gameplay videos, he noted how slowly textures loaded in, and that was with a 4090, which just goes to show that in some cases having more VRAM won't "always" resolve things like texture pop-in.
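Since genuine VRAM pressure shows up as stutter rather than in the allocation counter, one way to put a number on it is percentile lows from a frametime capture. A minimal sketch, assuming a CSV with a single frametime-in-milliseconds column named `ms` (file name and column are assumptions about your capture tool; CapFrameX / PresentMon exports differ):

```python
# Sketch: quantify stutter via percentile lows from a frametime capture.
# Assumes a CSV with one frametime column (ms); adjust names for your
# capture tool, and note that exact "1% low" definitions vary by tool.
import csv

def lows(frametimes_ms):
    """Return (avg fps, 1% low fps, 0.1% low fps)."""
    worst_first = sorted(frametimes_ms, reverse=True)  # longest frames first
    def low(pct):
        n = max(1, int(len(worst_first) * pct / 100))
        return 1000.0 / (sum(worst_first[:n]) / n)     # avg of worst pct%
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    return avg_fps, low(1.0), low(0.1)

with open("capture.csv", newline="") as f:
    times = [float(row["ms"]) for row in csv.DictReader(f)]

avg, p1, p01 = lows(times)
print(f"avg {avg:.1f} fps, 1% low {p1:.1f}, 0.1% low {p01:.1f}")
# Healthy: lows sit near the average. VRAM-starved: a large gap opens
# up even while the average fps still looks fine.
```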
 
Soldato
Joined
28 May 2007
Posts
10,100
That's not answering the question:



It's either a yes or no. If I read into your post, it sounds like you are saying the pricing was too high for the benefit the extra VRAM provided? (At least on Nvidia's side, anyway?)

Why focus only on VRAM and nothing else in a GPU purchase? Are grunt and features not worth considering, or could they perhaps be viewed as lacking in a similar way to how you consider Nvidia to lack VRAM?

If you consider the 3080 to be a flagship GPU, then surely the 7900 XT should be considered a flagship GPU too, given it's one tier down from the best, i.e. the 7900 XTX, the same way the 3080 was with the 3090.

Personally, I would argue that your GPU is already the bottleneck (given it's matching 3+ year old Nvidia tech), unless you never intend to use RT (which you won't have a choice about in a lot of games going forward, although the saving grace is that most UE5 titles are still just using software Lumen).
I did: Nvidia took the ****. No way 2GB and a few percent more performance were worth the extra cash on the later-released 12GB version; Nvidia were screwing people, that's the answer. VRAM, from my understanding, does not cost that much. The 3080, with its overall design, was either a 10GB or a 20GB card. It should have had 20GB, or they should have upped the design to a 12GB card, but that would have brought it closer to the 3090. The underdog seems to give out plenty of VRAM while being cheaper, so I am sure Nvidia could as well. The 3080 deserved more VRAM, but Nvidia wanted more 3090 buyers, as most knew 10GB was a joke.
 
Caporegime
Joined
4 Jun 2009
Posts
31,338
I did: Nvidia took the ****. No way 2GB and a few percent more performance were worth the extra cash on the later-released 12GB version; Nvidia were screwing people, that's the answer. VRAM, from my understanding, does not cost that much. The 3080, with its overall design, was either a 10GB or a 20GB card. It should have had 20GB, or they should have upped the design to a 12GB card, but that would have brought it closer to the 3090. The underdog seems to give out plenty of VRAM while being cheaper, so I am sure Nvidia could as well. The 3080 deserved more VRAM, but Nvidia wanted more 3090 buyers, as most knew 10GB was a joke.

How about the 3080 10GB vs the 3090, in the context of the question?

That may well be the case, but it didn't stop certain current 3090 owners originally wanting the 3080; they couldn't get one at MSRP so just went with the 3090.

Also, I could be wrong, but I'm pretty sure someone confirmed this before: it wasn't possible to have more than 10GB or less than 24GB of GDDR6X at the time of release.
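For reference, the module arithmetic that claim rests on, as a sketch assuming launch-era GDDR6X only came in 8Gb (1GB) parts:

```python
# The module arithmetic: at Ampere's launch GDDR6X only shipped in
# 8 Gb (1 GB) parts, one per 32-bit channel, or two in clamshell mode.
def capacities(bus_width_bits, module_gb=1):
    chips = bus_width_bits // 32        # one chip per 32-bit channel
    return chips * module_gb, 2 * chips * module_gb  # normal, clamshell

for name, bus in [("3080 (320-bit)", 320), ("3090 (384-bit)", 384)]:
    normal, clamshell = capacities(bus)
    print(f"{name}: {normal} GB, or {clamshell} GB clamshell")
# -> 10/20 GB and 12/24 GB: with 1 GB modules there was no way to give
#    the 320-bit 3080 anything between 10 GB and a pricey 20 GB.
```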
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
28,306
Location
Greater London
I did: Nvidia took the ****. No way 2GB and a few percent more performance were worth the extra cash on the later-released 12GB version; Nvidia were screwing people, that's the answer. VRAM, from my understanding, does not cost that much. The 3080, with its overall design, was either a 10GB or a 20GB card. It should have had 20GB, or they should have upped the design to a 12GB card, but that would have brought it closer to the 3090. The underdog seems to give out plenty of VRAM while being cheaper, so I am sure Nvidia could as well. The 3080 deserved more VRAM, but Nvidia wanted more 3090 buyers, as most knew 10GB was a joke.

If only it were that easy though. The underdog has other things missing instead, unfortunately.

How much VRAM a card has has little to do with the cost of manufacturing in this case. A bit like storage on phones or tablets.
 
Soldato
Joined
19 Sep 2009
Posts
2,770
Location
Riedquat system
Daniel Owen could have tested the 3080 10GB vs the 3080 12GB, though it might be hard to get hold of both :p I suspect the 12GB would be fine, with some occasional problems on the 10GB version (at settings where playable FPS would be achieved if VRAM were not a consideration). I don't consider benchmarks where titles are getting well under 60 fps with VRAM to spare at all useful.
 