
10GB vram enough for the 3080? Discuss..

True if looking at it that way; suppose the same could then be said about an overclocked 3080 Ti achieving similar performance to a stock 3090.
The 3090 has never been sensible to buy for gaming performance, except possibly in the crazy times when it was the only card in stock.
You're buying it for the 24GB of VRAM, and you're not doing so if you're a rational gamer. You should buy it for content creation, where it's actually pretty good value for money.
 
The 3090 has never been sensible to buy for gaming performance, except possibly in the crazy times when it was the only card in stock.
You're buying it for the 24GB of VRAM, and you're not doing so if you're a rational gamer. You should buy it for content creation, where it's actually pretty good value for money.


According to kaapstad it's not good for productivity/content either; he says the old Titans are better.
 
The 3090 has never been sensible to buy for gaming performance, except possibly in the crazy times when it was the only card in stock.
You're buying it for the 24GB of VRAM, and you're not doing so if you're a rational gamer. You should buy it for content creation, where it's actually pretty good value for money.
I use my 3090 purely for gaming. It's a great performer at 4K 120Hz and I have never regretted buying it for that reason.

Then again, when has sensible and PC gamer ever been in the same sentence;)
 
According to kaapstad it's not good for productivity/content either; he says the old Titans are better.

That's not true unless you need Titan drivers for the application. I use my 3090 SLI/NVLink setup for work and they destroy the Titan cards; we replaced all our Titan and Quadro cards at work with dual-3090 setups too. It depends on the application you use and whether it needs Titan drivers, and most applications are dropping the Titan drivers now anyway, so it makes no difference. Best work cards in a long time, the 3090s, and you can pool the VRAM too, so I see 48GB of VRAM in my work apps. Nothing beats them for performance for the money currently.


My setup is here to see and has recently been updated with 5 x 20TB drives and 3 x larger NVMe SSDs:-
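For anyone wondering where a pooled figure like 48GB comes from, here's a minimal sketch, assuming a PyTorch install with both cards visible to CUDA, that just enumerates the devices and sums the VRAM they report. Whether an application can actually pool it over NVLink is down to the app and driver; this only shows what's physically there.

```python
# Minimal sketch: list the CUDA devices and sum their reported VRAM.
# Assumes PyTorch is installed and both 3090s are visible to CUDA.
# Actual NVLink memory pooling is up to the application/driver; this
# just shows the physical total an app could report as "pooled".
import torch

total_gib = 0.0
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    gib = props.total_memory / 1024**3
    total_gib += gib
    print(f"GPU {i}: {props.name}, {gib:.1f} GiB")

print(f"Combined VRAM across devices: {total_gib:.1f} GiB")
```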

 
And then you overclock the 3090 and you have the same performance gap again (give or take a few percent due to the silicon lottery).
Pretty much; the same could also be said for the 3060 Ti compared to the 3070.

The 3090 is a terrible-value GPU though; you're paying far more for very little uplift in performance.
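One rough way to put a number on that is cost per average frame. A minimal sketch below; the prices and FPS figures are placeholders, plug in a real benchmark average and current street prices.

```python
# Quick cost-per-frame comparison. The prices and FPS averages below
# are placeholders for illustration -- substitute real street prices
# and your own benchmark numbers.
cards = {
    "RTX 3080 10GB": {"price_gbp": 650, "avg_fps": 100},   # hypothetical
    "RTX 3090 24GB": {"price_gbp": 1400, "avg_fps": 112},  # hypothetical
}

for name, c in cards.items():
    cost_per_fps = c["price_gbp"] / c["avg_fps"]
    print(f"{name}: £{cost_per_fps:.2f} per average FPS")
```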
 
Pretty much; the same could also be said for the 3060 Ti compared to the 3070.

The 3090 is a terrible-value GPU though; you're paying far more for very little uplift in performance.
Some would say you can’t put a price on the extra VRAM…

 
Some would say you can’t put a price on the extra VRAM…


In my case the VRAM is just as important as the grunt, if not more. For my work I need the VRAM; even 16GB cards are not enough now.

I have a 3080 Ti laptop with 16GB VRAM that I got from the MM. It was meant for my niece but she found it too large and so gave her my 15" Razer with a 3080 and 8GB VRAM for her university use. The 16GB VRAM is not enough for many of my work projects, but I can do smaller projects on it, and I'm happy with the 17" screen. I lost resolution, as the 15" was a 1440p screen and the 17" is a 360Hz 1080p screen, but I'm OK with that for work and some gaming once I get it set up. I had some issues activating Windows 10/11 Pro on it with an OEM licence, purchased again from the MM, as the laptop comes with Windows 11 Home and I need Pro.

Anyway, once I get it all set up, the Alienware x17 R2 with an Intel 12700H, 32GB of DDR5-4800 RAM and the 3080 Ti with 16GB VRAM, I want to do some benchmarks comparing it to my desktop with dual 3090s in SLI/NVLink, an AMD 5950X and 64GB RAM.

Really curious how they compare, but I know the laptop 3080 Ti is about a desktop 3070 Ti in performance; the laptop uses the larger GA103 GPU with more CUDA cores than a desktop 3070 Ti. Will see, and hopefully I'll post up some benchmarks when I'm done setting it up.
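For anyone wanting to run the same quick check, here's a minimal sketch assuming PyTorch with CUDA on both machines: time a big FP16 matrix multiply on each visible GPU and compare throughput. Purely synthetic, real applications and games will scale differently, but it gives a quick like-for-like number.

```python
# Minimal sketch: time a large FP16 matrix multiply on every visible GPU.
# Assumes PyTorch with CUDA. Run the same script on the laptop and the
# desktop and compare the TFLOPS figures. Purely synthetic -- real
# applications and games will scale differently.
import time
import torch

N = 8192        # matrix size; reduce if a GPU runs short of memory
ITERS = 50

for i in range(torch.cuda.device_count()):
    dev = torch.device(f"cuda:{i}")
    a = torch.randn(N, N, device=dev, dtype=torch.half)
    b = torch.randn(N, N, device=dev, dtype=torch.half)
    torch.cuda.synchronize(dev)

    start = time.perf_counter()
    for _ in range(ITERS):
        a @ b
    torch.cuda.synchronize(dev)
    elapsed = time.perf_counter() - start

    tflops = (2 * N**3 * ITERS) / elapsed / 1e12
    print(f"{torch.cuda.get_device_properties(i).name}: "
          f"{tflops:.1f} TFLOPS (FP16 matmul)")
```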
 
In my case the VRAM is just as important as the grunt, if not more. For my work I need the VRAM; even 16GB cards are not enough now.

I have a 3080 Ti laptop with 16GB VRAM that I got from the MM. It was meant for my niece but she found it too large and so gave her my 15" Razer with a 3080 and 8GB VRAM for her university use. The 16GB VRAM is not enough for many of my work projects, but I can do smaller projects on it, and I'm happy with the 17" screen. I lost resolution, as the 15" was a 1440p screen and the 17" is a 360Hz 1080p screen, but I'm OK with that for work and some gaming once I get it set up. I had some issues activating Windows 10/11 Pro on it with an OEM licence, purchased again from the MM, as the laptop comes with Windows 11 Home and I need Pro.

Anyway, once I get it all set up, the Alienware x17 R2 with an Intel 12700H, 32GB of DDR5-4800 RAM and the 3080 Ti with 16GB VRAM, I want to do some benchmarks comparing it to my desktop with dual 3090s in SLI/NVLink, an AMD 5950X and 64GB RAM.

Really curious how they compare, but I know the laptop 3080 Ti is about a desktop 3070 Ti in performance; the laptop uses the larger GA103 GPU with more CUDA cores than a desktop 3070 Ti. Will see, and hopefully I'll post up some benchmarks when I'm done setting it up.
Yeah, that’s a completely different use case. That sounds like 3090/Titan stuff. We are talking about gaming primarily :)

Oh and the way you structured your text here gave me a chuckle:

was meant for my niece but she found it too large and so gave her my 15"

:eek::cry:
 

He's right, the hypocrisy is about brand: when Nvidia do it it's not a problem, but if AMD do it, suddenly it is.

And yes, the "oh, that's because it's allocating memory" line: what people who say that are implying is that it's only doing it because it can, not because it needs to. Well... to some extent that is true. However, it's going to allocate that data to some form of memory; if it doesn't fit in the GPU buffer it will be allocated to RAM instead, and your GPU buffer is WAY faster than your RAM. That's why you don't get those random stutters or hitches in the frame rate nearly as much on the RTX 3090 as you do on the RTX 3080: streaming from RAM is far more likely to cause a frame stall than streaming from the GPU's buffer.
It's not rocket science.
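You can see the allocated-versus-actually-needed distinction directly in anything built on a caching allocator. A minimal PyTorch sketch, not how a game engine manages VRAM but the same idea: the "reserved" number an overlay or tool reports can sit well above what the application currently needs.

```python
# Minimal sketch of "allocated/reserved" vs "actually in use", using
# PyTorch's caching allocator. Games manage VRAM differently, but the
# principle is the same: memory reserved from the driver is not the
# same as memory the app needs right now.
# Assumes a CUDA GPU with at least ~5 GB free.
import torch

x = torch.empty(2048, 2048, 1024, dtype=torch.uint8, device="cuda")  # ~4 GiB
print(f"in use:   {torch.cuda.memory_allocated() / 1024**3:.2f} GiB")
print(f"reserved: {torch.cuda.memory_reserved() / 1024**3:.2f} GiB")

del x  # the tensor is gone, but the allocator keeps the block cached
print(f"in use after del:   {torch.cuda.memory_allocated() / 1024**3:.2f} GiB")
print(f"reserved after del: {torch.cuda.memory_reserved() / 1024**3:.2f} GiB")
```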
 
It's just beating a dead horse at this stage; however, I'm sure it's just a local system issue / using the wrong settings / he has ReBAR enabled / the GPU slamming into the power limit / the games don't count because reason x.

Did I miss any?
Tom:
Anytime I talk about my 3090 using 16 or 12GB of video memory in, say, Battlefield or something, there's a few comments saying Tom doesn't understand allocation. I do, (expletive); there was stuttering when I used those settings with the 3070. It was allocating lower FPS and stuttering to my screen. So yes, 8GB has limitations and so does 10GB. I've tested 10GB 3080s; there's stuttering at some settings that my 3090 doesn't have. It's not just allocating RAM. You have people dying on a hill to justify their purchase, claiming that the 3080 with 10GB was plenty and that 8GB was just marketing.
Does that sound familiar?
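For anyone who would rather measure it than argue about it: here's a minimal sketch that reads a frame-time log (the kind CapFrameX or PresentMon can export to CSV) and counts hitches, defined here as any frame taking more than twice the median frame time. The column name and the 2x threshold are assumptions; adjust them for whatever your capture tool writes out.

```python
# Minimal sketch: count frame-time hitches in an exported capture log.
# Assumes a CSV with one frame time in milliseconds per row, in a column
# whose name you pass on the command line (it varies by capture tool).
# "Hitch" here means a frame taking more than `factor` x the median.
import csv
import statistics
import sys

def count_hitches(path, column, factor=2.0):
    with open(path, newline="") as f:
        times = [float(row[column]) for row in csv.DictReader(f)]
    threshold = factor * statistics.median(times)
    hitches = sum(1 for t in times if t > threshold)
    return hitches, len(times), threshold

if __name__ == "__main__":
    # e.g.  python hitches.py capture.csv msBetweenPresents
    n, total, thr = count_hitches(sys.argv[1], sys.argv[2])
    print(f"{n} of {total} frames over {thr:.1f} ms ({100 * n / total:.2f}%)")
```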

He's right, the hypocrisy is about brand: when Nvidia do it it's not a problem, but if AMD do it, suddenly it is.

And yes, the "oh, that's because it's allocating memory" line: what people who say that are implying is that it's only doing it because it can, not because it needs to. Well... to some extent that is true. However, it's going to allocate that data to some form of memory; if it doesn't fit in the GPU buffer it will be allocated to RAM instead, and your GPU buffer is WAY faster than your RAM. That's why you don't get those random stutters or hitches in the frame rate nearly as much on the RTX 3090 as you do on the RTX 3080: streaming from RAM is far more likely to cause a frame stall than streaming from the GPU's buffer.
It's not rocket science.
One vocal user everyone has on ignore said there are hitching/frame-time spikes in a lot of games; the quote is still here on this forum, and you can see the stuttering on the overlay in most of his videos, so at this stage it's not really a surprise any more, is it?

EDIT - Just noticed @Wrinkley got perma-banned, @gpuerrilla. One of the thread's most active users gone, RIP.
 