Agree, 3070 reference for his mate. In mining circles the 3070 was poor value; the 3060Ti was the better performer for its cost.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
> 3060ti overclocked gives 3070 stock performance so I would add that to the value cost/frame proposition

True if looking at it that way, suppose the same could then be said about a 3080ti overclocked achieving similar performance to a stock 3090.
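The cost/frame proposition is simple arithmetic, and a quick sketch makes the comparison concrete. Every price (GBP) and average-FPS figure below is a made-up placeholder for illustration, not a real benchmark result:

```python
# Toy cost-per-frame comparison. Every price (GBP) and average-FPS
# figure below is a made-up placeholder, not a real benchmark result.
cards = {
    "3060 Ti": {"price": 370.0, "avg_fps": 60.0},
    "3070":    {"price": 470.0, "avg_fps": 68.0},
    "3080 Ti": {"price": 1050.0, "avg_fps": 95.0},
    "3090":    {"price": 1400.0, "avg_fps": 100.0},
}

def cost_per_frame(price, avg_fps):
    """Pounds paid per frame-per-second of average performance."""
    return price / avg_fps

for name, c in cards.items():
    print(f"{name}: £{cost_per_frame(c['price'], c['avg_fps']):.2f} per fps")
```

With numbers like these, overclocking a cheaper card to close the gap to the next tier up only improves its cost/frame further.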
> True if looking at it that way, suppose same could then be said about a 3080ti overclocked achieving similar performance to a stock 3090.

The 3090 has never been sensible to buy for gaming performance, except possibly in the crazy times when it was the only card in stock. You're buying it for the 24GB of VRAM, and you're not doing so if you're a rational gamer. You should buy it for content creation, where it's actually pretty good value for money.
> According to kaapstad it's not good for productivity/content either, he says old titans are better

What makes the old Titans better?
> What makes the old Titans better?

They are old and still have more than 10GB?
> 3090 has never been sensible to buy for gaming performance, except possibly in the crazy times when it was the only card in stock.

I use my 3090 purely for gaming. It's a great performer at 4K 120Hz and I have never regretted buying it for that reason.
> I use my 3090 purely for gaming. It's a great performer at 4K 120Hz and I have never regretted buying it for that reason.

Will you run out of vram before grunt, or vice versa?
Then again, when have "sensible" and "PC gamer" ever been in the same sentence?
> Will you run out of vram before grunt, or vice versa?

I'll almost certainly run out of grunt, but that won't be for a little while yet. Texture mods can certainly push up usage, mind.
> True if looking at it that way, suppose same could then be said about a 3080ti overclocked achieving similar performance to a stock 3090.

And then you overclock the 3090 and you have the same performance gap again (give or take a few per cent due to the silicon lottery).
> And then you overclock the 3090 and you have the same performance gap again (give or take a few per cent due to the silicon lottery).

Pretty much, and the same could also be said for the 3060ti comparison to the 3070.
> Pretty much, and the same could also be said for the 3060ti comparison to the 3070.

That's if you are in it for the value, I guess. I just knew it was really the only card that could do what I wanted, so I had to have it.
3090 is a terrible value GPU though, you're paying far more for the very little uplift in performance you get.
> Pretty much, and the same could also be said for the 3060ti comparison to the 3070.

Some would say you can’t put a price on the extra vram…
> In my case the VRAM is just as important, if not more so, than the grunt. For my work even 16GB cards are not enough now. I have a 3080ti laptop with 16GB of VRAM that I got from the MM. It was meant for my niece, but she found it too large, so I gave her my 15" Razer with a 3080 and 8GB of VRAM for her university use. The 16GB of VRAM is not enough for many of my work projects, but I can do smaller projects on it, and I'm happy with the 17" screen. I lost resolution, as the 15" was a 1440p screen and the 17" is a 360Hz 1080p screen, but I'm OK with that for work and some gaming once I get it set up. I had some issues activating Windows 10/11 Pro on it with an OEM licence (also bought from the MM), as the laptop comes with Windows 11 Home and I need Pro. Anyway, once I have it all set up (the Alienware x17 R2, with an Intel 12700H, 32GB of DDR5 4800MHz RAM and the 3080ti with 16GB of VRAM) I want to do some benchmarks comparing it to my desktop with dual 3090s in SLI/NVLink, an AMD 5950X and 64GB of RAM.
>
> Really curious how they compare, but I know the 3080ti laptop GPU performs at about the level of a desktop 3070ti, although it uses the larger GA103 die with more CUDA cores than the desktop 3070ti. Will see, and hopefully I'll post up some benchmarks when I'm done setting it up.

Yeah, that's a completely different use case. That sounds like 3090/Titan stuff. We are talking about gaming primarily.
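For benchmark comparisons like that, average FPS alone hides hitching; the 1% lows from a frame-time capture (the sort of data tools like CapFrameX or PresentMon export) show it. A rough sketch of the calculation, with made-up sample data:

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from a list of frame times in ms."""
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # 1% lows: average FPS over the slowest 1% of frames.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    one_pct_low = 1000.0 * n / sum(worst[:n])
    return avg_fps, one_pct_low

# Made-up capture: mostly ~8 ms frames with a few 40 ms hitches.
sample = [8.0] * 297 + [40.0] * 3
avg, low = fps_stats(sample)
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")
```

A run averaging ~120 fps can still have 1% lows of 25 fps if a handful of frames stall, which is exactly the stutter-versus-average distinction argued about later in the thread.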
It's just beating a dead horse at this stage; however, I'm sure it's just a local system issue / using the wrong settings / he has ReBAR enabled / the GPU slamming into the power limit / the games don't count because reason X. Does that sound familiar?

Tom:
> Anytime I talk about my 3090 using 16 or 12GB of video memory in, say, Battlefield or something, there's a few comments saying Tom doesn't understand allocation. I do, (expletive). There was stuttering when I used those settings with the 3070; it was allocating lower FPS and stuttering to my screen. So yes, 8GB has limitations and so does 10GB. I've tested 3080s with 10GB; there's stuttering at some settings that my 3090 doesn't have. It's not just allocating RAM. You have people dying on a hill to justify their purchase, insisting that the 3080 with 10GB was plenty and that 8GB was just marketing.
> He's right, the hypocrisy is about brand: when Nvidia do it it's not a problem, but if AMD do it suddenly it is.

One vocal user everyone has on ignore said that there is hitching/frame-time spikes in a lot of games; the quote is still here on this forum, and you can see the stuttering on the overlay in most of his videos, so at this stage it's not really a surprise anymore, is it?
And yes, the "oh, that's because it's allocating to memory" line. What people who say that are implying is that the game only allocates that much because it can, not because it needs to. To some extent that is true; however, that data is going to be allocated to some form of memory, and if it doesn't fit in the GPU buffer it will be allocated to system RAM instead. Your GPU buffer is WAY faster than your RAM, and that's why you don't get these random stutters or hitches in the frame rate nearly as much on the RTX 3090 as you do on the RTX 3080: streaming from RAM is far more likely to cause a frame stall than streaming from the GPU's buffer.

It's not rocket science.
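The buffer-versus-RAM argument can be put into a toy model: whatever part of a frame's working set spills out of VRAM has to stream over PCIe from system RAM, and the fetch time balloons. The bandwidth figures below are rough order-of-magnitude assumptions, not measurements of any particular card:

```python
# Toy model of VRAM spill-over. Bandwidth figures are rough
# order-of-magnitude assumptions (GB/s), not measurements.
VRAM_BW_GBPS = 900.0   # on-card GDDR6X, ballpark
PCIE_BW_GBPS = 25.0    # PCIe 4.0 x16 to system RAM, ballpark

def frame_fetch_time_ms(working_set_gb, vram_capacity_gb):
    """Time to stream a frame's working set, assuming anything that
    doesn't fit in the GPU buffer streams over PCIe from system RAM."""
    in_vram = min(working_set_gb, vram_capacity_gb)
    spilled = max(0.0, working_set_gb - vram_capacity_gb)
    return 1000.0 * (in_vram / VRAM_BW_GBPS + spilled / PCIE_BW_GBPS)

# A 12 GB working set fits on a 24 GB card but spills 2 GB on a 10 GB card.
print(f"24GB card: {frame_fetch_time_ms(12, 24):.1f} ms")
print(f"10GB card: {frame_fetch_time_ms(12, 10):.1f} ms")
```

Even a small spill dominates the fetch time because the PCIe path is tens of times slower than the on-card buffer, which is the mechanism behind the hitches described above.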