• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD RX 7900XT, 90% to 130% faster than 6900XT, MCM, Q4 2022.

No-one should be spending a lot of money on 8GB in 2022; that's low-end 1080p territory now. Really, everything current-gen from the 60-class cards up should have had a minimum of 12GB of VRAM, not this 8-10GB nonsense. Every 80-class card should have had 16GB.
 
£450 for an 8GB card in 2020 was pushing it.

 
I never knew VRAM was a driver issue. Alas, FC6 is the only game that craps over my 8GB of VRAM; I just don't use the HD textures in that game, as noted above^.

Fallout 4, plus mods, plus the HD textures, maxed the 8GB of VRAM on my 1070... five years ago.

I appreciate it's a bit of an extreme example.

Like people who say you don't need more than 32GB of RAM for gaming; my Cities: Skylines save hits 40+.
 
RDNA 3 (especially MCM) sounds really great except for one thing: I still haven't heard anything about RT improvements. For me that's the make-or-break; heck, I wouldn't upgrade at all this next go if not for RT, as there's plenty of raster performance to go around even with the current crop of cards. Especially since this year I'm most looking forward to Avatar, which will rely on RT for lighting (sort of like Metro Exodus EE, though they have a software fallback too, and we know how that went with CryEngine...), and they've pioneered a lot of RT techniques for vegetation as well, which is currently a weak spot for the tech in current games. I really don't want to keep playing games with a 6800 but end up closer to a 3060 when RT is on; it's just meh. They need to bridge that gap hard, and it's going to be even more crucial in '22.

 
I'm quite confident AMD's RT performance will more than double: doubling simply from it being physically more GPU, and perhaps up to 0.5x on top of that from improving per-RT-core performance.

I don't think they will beat Nvidia this next round, perhaps not even match them, but I think they will cut Nvidia's lead to something much less meaningful.
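The projection above is simple arithmetic, and a tiny sketch makes it explicit. Note the 2x hardware figure and the up-to-0.5x per-core bonus are the poster's speculation, not confirmed specs:

```python
# Back-of-envelope RT speedup: 2x from physically doubled GPU resources
# (MCM), plus up to an extra 0.5x from better per-RT-core performance.
# All figures are speculative, taken from the post above.
def projected_speedup(hw_factor: float, per_core_bonus: float) -> float:
    """Total projected speedup = hardware scaling + per-core bonus."""
    return hw_factor + per_core_bonus

print(projected_speedup(2.0, 0.0))  # plain doubling from more silicon
print(projected_speedup(2.0, 0.5))  # optimistic case with per-core gains
```

Even the pessimistic end of that range (2.0x) would roughly halve Nvidia's current RT lead; the optimistic end (2.5x) is what "much less meaningful" is banking on.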
 
I'm sure AMD will do well with RT; it has no choice. More and more games now use RT, so it's becoming a more important aspect of GPU design.

When Nvidia's Turing launched, I'd have said raster was 70% important and ray tracing 30% important. But now I'd say raster is a maximum of 50% important and the rest is ray tracing, in 2022 going into 2023.

Nvidia and AMD need to allocate enough die space to ray tracing that turning it on has little or no performance hit.
 
Very entertaining thread; some of the quips had me laughing. I've always had Nvidia cards, up to when a family member won an MSI 6900 XT Z in a competition and let me have it for £800, which proved too tempting to resist. Got it pegged to 100 FPS and it runs very cool; I'll look to get a higher-refresh monitor to stretch its legs a little.
 
I have no confidence in AMD's driver team to make such a new architecture on MCM work reliably, without BSOD, black screens, flickering and the usual terrible software that AMD provides. Hopefully I'm wrong!

Happy to let others be the guinea pigs and stick to a good old fashioned monolithic die from Nvidia, with their old dated but stable Nvidia control panel, that just works. Don't have enough time outside work to be pestering AMD driver support to fix issues that I've had with AMD drivers in the past.
 
What the market needs is competition, so I for one am hoping that Intel do indeed make a half-decent GPU and build on it into the future, so that fanboys of all sides can get their particular flavour at a non-insane price.
 

Not had an issue with AMD drivers in years. The drivers fail when you have an unstable overclock.
 

LMAO, here we go with the old AMD driver nonsense. Yawn.

PS - AMD's software urinates all over Nvidia's; if you can't work with it, that's on you.

If you're continually having "driver issues" in this day and age, I suspect it's a YOU problem also.
 
Well said.

It's funny: people keep going on about one very questionable game and Nvidia's supposed VRAM issues, ignoring the fact that even 3090 owners are reporting FPS drops/stutter in said game... :cry: Yet here we are with loads of RT games where AMD owners just can't enjoy RT to the same extent, and not just from a performance POV (especially since FSR 1 can't deliver the same performance gain as DLSS with RT turned on), but also with RT not even being rendered properly. But nope, "zOMG FC 6 and VRAM!!!!!!!" :cry: :D

If RDNA 3 can match/beat Nvidia on the RT front, with FSR 2 being as good as DLSS and getting into games that "matter", I'll be all over it. But as it is, that is looking very unlikely... after all, how long did it take AMD to catch up with Nvidia just for rasterization? And we're still waiting for FSR 2...
 