Different types of memory though. GDDR6X costs more than GDDR6, and has twice the per-pin throughput, though in actual performance terms it doesn't seem to have made much impact compared to good ol' 6. I've never seen any quantifiable evidence that 6X does anything over 6, which is a shame. Dunno if devs didn't make use of it or it just doesn't translate into any difference. Will probably die the same way HBM did for AMD.
Higher memory bandwidth can help with performance. Gamers Nexus touched upon it in their 3080 review when comparing it to the 2080 Ti:
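As a back-of-envelope check, peak memory bandwidth is just the per-pin data rate times the bus width divided by 8. A quick sketch using Nvidia's published specs for the two cards compared in that review (this is raw bandwidth, not measured game performance):

```python
def bandwidth_gbps(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: bits per pin per second * pins / 8."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

# RTX 3080: GDDR6X at 19 Gbps effective on a 320-bit bus
print(bandwidth_gbps(19, 320))   # 760.0 GB/s
# RTX 2080 Ti: GDDR6 at 14 Gbps effective on a 352-bit bus
print(bandwidth_gbps(14, 352))   # 616.0 GB/s
```

So the 3080 has roughly 23% more raw bandwidth despite the narrower bus, which is exactly the gap that review was probing.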
Good article explaining the difference too:
Comparison and Difference between GDDR5, GDDR5X, HBM, and HBM2 memory types. These are all high-speed and high bandwidth memories that are used in graphics cards, high-end servers, and advanced hardware units. GDDR5 is the most widely used high-speed memory that you see in the current generation...
graphicscardhub.com
Will be interesting to see if DirectStorage does better with, say, GDDR6 vs GDDR6X.
So are we just going to pretend Nvidia didn't come on stage and tell everyone the 3080 is twice as fast as the 2080, and use Doom Eternal as one of the benchmarks to prove it? Or did you misread the post you quoted because you were too busy thinking up a joke about smoking weed?
Nobody said that, maybe you need to lay off the ganja ;). People said as time goes on, more games with higher VRAM requirements will come out. Stop acting like 2 years is a long time in the context of AAA game development.
Obviously when those games come out you will start saying everyone who owns a 3080 should have upgraded to the 4000 series anyway, thereby proving what everyone said was right. It is an artificial limitation to get people to upgrade GPUs.
I can't remember the exact claims nor the "conditions" of said claims, but where am I disputing this?
pretend like Nvidia didn't come on stage and tell everyone the 3080 is twice as fast as the 2080
The bit I am disputing is this:
crippling 2080 to make the 3080 look better than it really is
Shock horror, brands might lie in their PR slides/footage to try and promote their new products
There are plenty of benchmarks out there for you to look at. Let me know what you think of the 3080 vs 2080 performance when you have checked them out, especially in RT workloads, and again, don't forget the key point: price...
You're kidding me, right... go and look in the "is 10GB enough" thread and you will see plenty of posts from the usual suspects making out like we would be seeing loads of games suffering because of lack of VRAM, and even had them cherry-picking results to suit their narrative, i.e. HZD, Forza 5, RE Village, Halo, Godfall, Dishonored, Deathloop etc., and all were "debunked".
As per my earlier post, do you then consider RDNA 2 RT performance to also be an "artificial limitation"?
And nope, I won't be. If people don't want to upgrade so soon to get a better experience "overall", I'll be telling them to reduce 1 setting, same as we tell RDNA 2 users to reduce RT settings or turn them off entirely if they want more performance. I've stated why I am upgrading to a 40xx series card or RDNA 3 (if the RT perf is there), and that's because I need/want more "grunt" for 2 reasons:
- to get as close to 175hz/fps @ 3440x1440
- better RT performance, even an extra 25/30fps will be a sizeable jump as it'll bring for example DL 2 fps from 70-80fps to my ideal fps of 100/110+
My 3080 is just fine for my 4K gaming as my 4K TV is only 60Hz, so it's easy to hit a locked 60 with a 3080, especially with DLSS. Obviously if I had a 120Hz+ 4K screen then I would also be wanting more grunt here too.
Everyone will have different needs/wants and what they deem acceptable.
I think the argument that extra VRAM is wasted money goes out the window when everyone buying an RTX card is forced to buy RT cores, which 95% of people won't be using.
I have yet to play a game that utilises ray tracing.
With VRAM it will be utilised at some point unless you're only playing a very narrow range of games and keeping the card short term, whilst with RT cores there is a big chance lots of people will never use them.
Would I swap an RTX 3080 10GB for a GTX 3080 12/16GB? 100%, any day of the week. Likewise I would take more GDDR6 over less GDDR6X. The grunt vs VRAM balance was too lopsided.
Yup, that's what we have all been saying: if you don't care for RT then obviously this won't be a factor when it comes to buying said hardware. At the time of my 3080 purchase I didn't care "that" much for RT nor DLSS tbf; it was only really CP 2077 and Control which had worthy RT and really "needed" DLSS. But a 3080 FE worked out cheaper and easier than trying to get a 6800 XT at MSRP, not to mention the 3080 matches a 6800 XT in non-RT workloads anyway, so the RT/tensor cores + DLSS were just a bonus. Given that I ended up playing more RT titles than pure rasterization ones, along with DLSS, it worked out very well and a much better buy than I was expecting. If I had picked up a 6800 XT, I would personally have regretted it massively and probably tried to switch to a 3080 asap after seeing all the games I'm interested in getting RT and DLSS.
The VRAM debate has been hotly contested over one or two games. You can load up mods to make even a 3090 fall over. So yes, over time games will use more VRAM, but let's take FC6 as an example. On release it was widely reviewed as having terrible textures; "PS2-era" I think was the comparison some reviewers made. Then they issued a fix (called an HD texture pack) that required loads more VRAM, and aside from RT it looked no different to FC5, which incidentally also had an HD texture pack. Cards with >10GB of VRAM were £850+, so effectively people are saying you need to spend £850+ on a card to game. Bye bye PC gaming, get a console.
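To illustrate why HD texture packs balloon VRAM requirements: an uncompressed texture costs width x height x bytes-per-texel, plus roughly a third more for the full mipmap chain. A rough sketch (real games use block compression such as BC7, which cuts this by 4-8x, so treat these numbers as upper bounds):

```python
def texture_mib(width: int, height: int, bytes_per_texel: int = 4,
                mipmaps: bool = True) -> float:
    """Approximate VRAM cost of one uncompressed RGBA texture, in MiB."""
    size = width * height * bytes_per_texel
    if mipmaps:
        size = size * 4 // 3   # full mip chain adds ~1/3 on top of the base level
    return size / 2**20

print(texture_mib(2048, 2048))   # ~21.3 MiB
print(texture_mib(4096, 4096))   # ~85.3 MiB per uncompressed 4K texture
```

At those sizes it only takes a few hundred "HD" textures resident at once to eat multiple gigabytes, which is why a texture pack alone can push a 10GB card over the edge.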
You make the game for the cards people have, and most still play on 8GB cards, though enthusiast forums like this one will have more people with higher-end cards, so the picture here will seem skewed.
Load up any mod (which is usually a better version of the original) and you can run out of VRAM. The point is that vanilla games on release often ship in unplayable, shocking states. Claiming buying a card with more VRAM is the way to go is madness. If the graphics were glaringly better then I could understand it; aside from RT, they ain't. Crysis was the first game not made for the cards of its day, as it made everything fall over, but the graphics were glaringly better.
To play the latest games you WILL get a better experience on the most expensive cards. If you wanna pay MSRP to be an alpha tester, and 2x the cost of a card to get the best experience but only 15% extra raster perf, then crack on. But that was more to do with the supply situation than the games themselves.
Insert the Joker meme about "everyone losing their mind" when it comes to having to potentially sacrifice 1 setting in a VRAM-heavy game at 4K without FSR, yet when it comes to sacrificing RT settings almost entirely (which makes a much bigger difference to visuals) across numerous games from day 1, no one complains.
I do think DLSS is a much more useful feature than RT, and ironically it makes the rasterization performance vs VRAM balance even more lopsided, as DLSS reduces the need for pure grunt.
The problem early on was of course that DLSS required a game to support it, so it was limited to specific new titles only (most likely just AAA ones). DLDSR of course changed that, and now DLSS is properly useful.
DLSS/FSR also reduce VRAM usage.
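That follows from the internal render resolution: the GPU renders at a fraction of the output resolution and upscales, so render targets shrink accordingly. A sketch using the commonly cited DLSS 2 per-axis scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5); actual VRAM savings vary per game, since textures are still loaded at full quality:

```python
def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given per-axis upscaler scale factor."""
    return round(out_w * scale), round(out_h * scale)

# 3440x1440 output at DLSS Performance (0.5 per axis):
print(internal_res(3440, 1440, 0.5))     # (1720, 720) -> 1/4 of the pixels
# 4K output at DLSS Quality (~0.667 per axis):
print(internal_res(3840, 2160, 0.667))   # roughly 1440p internally
```

Quartering or halving the pixel count shrinks every per-pixel buffer (G-buffer, depth, intermediate render targets) by the same ratio, which is where the VRAM saving comes from.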
Once RT doesn't require upscaling to max out my monitor's refresh rate, then I'll be interested. I can see that those who like single-player games and don't mind lower frame rates might want it on. DLSS etc. only exists because the cards aren't fast enough to run RT at native. I see it as more useful at the low end, where it's a choice of DLSS or not playing at a sensible framerate.
There are rasterization titles where my 3080 isn't even fast enough for 3440x1440 at 175Hz. I use DLSS all the time as it produces better IQ overall compared to native/TAA, especially where temporal stability and motion clarity are concerned.