
Is the extra VRAM really that beneficial?

Status
Not open for further replies.
If only it were that easy though. Unfortunately, the underdog has other things missing instead.

How much VRAM a card has has little to do with manufacturing cost in this case. It's a bit like storage on phones or tablets.
Actually, in this case it was not hard. The 3080, 3080 12GB and 3090 all use the same chip, so manufacturing doesn't really explain why the 12GB model had such a large mark-up.

The underdog doesn't really have things missing though. It can do pretty much everything the Nvidia card can do, like RT and upscaling. I think you meant it doesn't do them as well.

Still, you're not telling me you believe that unlocking a bit of the chip and adding 2GB of VRAM was costing Nvidia hundreds of pounds, hence the high price of the 12GB model.
 
How about the 3080 10GB vs the 3090 in the context of the question?

That may well be the case, but it didn't stop certain current 3090 owners who originally wanted the 3080: they couldn't get one for MSRP, so they just went with the 3090.

Also, I could be wrong, but I'm pretty sure someone confirmed this before: at the time of release it wasn't possible to fit more than 10GB or less than 24GB of GDDR6X.
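For context, that constraint follows from how GDDR6X was packaged at launch: each chip has a 32-bit interface, launch-era chips were 1GB each, and capacity doubles in "clamshell" mode (two chips per 32-bit channel). A rough sketch of the arithmetic, with the chip density as an assumption:

```python
# Rough sketch: VRAM capacities implied by bus width with launch-era GDDR6X.
# Assumes 1GB per 32-bit chip; figures are illustrative, not an authoritative BOM.

def vram_options_gb(bus_width_bits: int, chip_gb: int = 1) -> dict:
    chips = bus_width_bits // 32           # one chip per 32-bit channel
    return {
        "normal": chips * chip_gb,         # e.g. 320-bit -> 10GB (3080-style)
        "clamshell": 2 * chips * chip_gb,  # e.g. 384-bit -> 24GB (3090-style)
    }

print(vram_options_gb(320))  # 3080-style bus: {'normal': 10, 'clamshell': 20}
print(vram_options_gb(384))  # 3090-style bus: {'normal': 12, 'clamshell': 24}
```

Note the 384-bit "normal" case gives exactly 12GB, which is consistent with how the later 3080 12GB slotted in.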
Most likely because the 3090 was a rip-off at the time, so their only option was a 3080 on the Nvidia side. Then with no availability of the 3080, they opted to pay the man.

On the VRAM side that doesn't make much sense at all, but if so it could explain Nvidia's choice to lowball the VRAM.
 

I don't think you're reading what I'm saying if that is your response.
 

That's true, and RDNA 3 is certainly a much more usable/appealing RT solution than RDNA 2. With FSR 3.1, hopefully that will remove Nvidia's advantage here too and things will be much more competitive, although they still can't do ray reconstruction or RTX HDR, both of which provide a big boost to IQ.


So do you still agree that the extra VRAM wasn't really beneficial enough to justify the extra cost (on the Nvidia side)?

I can't remember who it was, but they basically debunked humbug, as he said the same thing; at the time there was just no way to do it, and if the 3080 were a similar price to the 3090, there would be no point in the 3080. Nvidia learned this too and also wanted more of a gap in performance, hence why they made the 4090 leagues above the 4080, and it worked: loads of people jumped to the 4090 (although it didn't help having an xx80 GPU priced at £1,200...)
 
A lot of games at 4K seem to be hovering around 10GB to just over 12GB, so in 2024 16GB really should be the minimum you're going for. It looks like Nvidia really needs to up the VRAM across all the cards for the next gen; I wonder if the 5090 will get more than 24GB. I'll be surprised if they keep it the same for the third time.

For 1080p gaming, it seems 12GB is now the minimum you really need for current and newer games going forward.
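For a sense of scale behind those numbers: the raw render targets themselves are tiny at any resolution, and the bulk of VRAM goes to texture pools, with RT/FG buffers layered on top, which is why settings matter more than resolution alone. A back-of-the-envelope sketch (the RGBA16F format and single-buffer count are illustrative assumptions):

```python
# Back-of-the-envelope: raw render-target memory at common resolutions.
# bytes_per_pixel=8 assumes an RGBA16F colour target; real engines keep
# many such buffers (G-buffer, history, FG/denoiser state), so totals are
# far higher -- but still dwarfed by multi-GB texture pools.

def render_target_mb(width: int, height: int, bytes_per_pixel: int = 8) -> float:
    return width * height * bytes_per_pixel / (1024 ** 2)

print(round(render_target_mb(1920, 1080), 1))  # ~15.8 MB at 1080p
print(round(render_target_mb(3840, 2160), 1))  # ~63.3 MB at 4K
```

Going from 1080p to 4K quadruples these buffers, yet that is still well under 1GB of the totals being discussed, so the rest of the footprint comes from assets and effect state.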
 
The 10GB version landed before the mining boom took off, so it didn't go down well with Nvidia when crypto mining saw AIBs charging £1,200+ for it while probably paying Nvidia £400 for the BOM. So they launched the 12GB and the Ti with much higher BOM costs to get their cut of the pie and also to dilute the number of dies available for the 10GB model.
 
Shame they didn't show any actual side-by-side comparisons of gameplay. It's pointless just looking at VRAM usage, especially for the likes of Avatar, where the devs did a lot of work/optimisation around VRAM allocation and the buffer system.
It is helpful as a ballpark, at least, for someone thinking about what to buy. It would be nice if Steve did a follow-up with more comparisons like he did at the start. That said, time played and time since a restart can influence the figures as well, so they will never translate literally to a given gamer's experience.
 
This was the image I was trying to find earlier.

It clearly shows that a game such as Alan Wake 2, with FG and PT on, even at a modest 1440p, uses 14GB of VRAM.

Even just RT at 1440p uses 13GB.

So yes, in my opinion, just as 32GB of RAM is the new 16GB, 16GB of VRAM should be the de facto standard for any GPU moving forward.

[Attachment: Screenshot-2024-06-24-195212.png]

Now that that is crystal clear, let's talk about the elephant in the room: Nvidia.

We all know Nvidia extorts its customers, which is why they made the 4080 so expensive.

Because it was the only card in the lineup with both the performance and the VRAM to match.

You could get 16GB of VRAM from competitors, but it wouldn't have all the features and RT performance of the Nvidia cards, so they priced it as such.

Seems I do speak some sense after all.
 
"If you're interested in RT moving forward, I'd recommend a minimum of 16GB of VRAM."

Yep. And it is a good recommendation that I agree with.

Next-gen cards are around the corner now, and those should have 16GB minimum from the 5070 up.

Some of us have been saying 16GB should have been the base standard for years^

Lol. You can look back at posts from years ago about Final Fantasy 15 where I said it too, back when it would eat up all 12GB of my Titan. But alas, it is what it is. Short of going YOLO on a 4090, it is what it has been.

Had AMD done a better job of being competitive we would not have to put up with it.
 
Is 16GB the reason you went 4080 by any chance? :cry:

Yeah, for me it was the only option (other than the 4090), because I knew, or at least had a hunch, that anything less than 16GB, especially when spending upwards of £800+, was a slight issue considering the AMD cards have 20GB+.

Of course, the truth is we don't have to turn everything up to 11. If one is OK turning one or two things down, then 12GB is plenty.

It's only when we talk about maxing out the graphics that 16GB becomes part of the conversation.

I will say though, as someone who generally only buys Nvidia (my last AMD card was a 5870), textures have always been the one compromise I've had to make with Nvidia. Nvidia GPUs generally never had as much VRAM as the AMD cards (at least those I've owned), so I've always had to turn the textures down.
 
Had AMD done a better job of being competitive we would not have to put up with it.
I'm sorry, but no. It is not up to AMD to do anything. If they don't want to be competitive and just do mid-range, for example, that is their prerogative. If Nvidia wants to ship GPUs with limited VRAM, that has nothing to do with AMD. It seems like some people like to spin Nvidia negatives into AMD negatives to deflect from the fact that Nvidia doesn't care if it shafts you. "Nvidia did bad... ooooh, must have been because AMD didn't do good." Some of the comments in this thread are just lol. We desperately need some new GPUs fast, because going over this same old ground is getting tedious.
 