AMD have shown you don't need dedicated RT hardware to use RT. If it were required I would agree, but as it stands the dedicated hardware is just a modest framerate boost in RT games, not a requirement to play them.
Which card aged better: the 16GB AMD 6800 XT without dedicated RT cores, or the 10GB 3080 with dedicated RT cores?
Except for the fact that their latest and greatest is only just matching two-year-old GPUs in RT performance, and at times not even matching the two-year-old flagship. If you care about RT, you're going to have to upgrade an AMD GPU sooner rather than later due to the lack of RT performance compared to the competition, and RT is only gaining momentum as time goes on; so far nearly every major title released or due this year uses some form of RT.
Well, based on the games I have played over the past 2 years, the 3080 has aged a million times better for my "needs":
- having DLSS since the start; AMD not having FSR 2+ for so long meant even light RT was a complete no-go for RDNA 2, and even now FSR is still very hit and miss
- having the RT grunt there has meant I haven't had to make the same sacrifices to graphical settings as the 6800 XT has
Daniel Owen did a good video on this recently, although sadly it was the 12GB 3080 model; however, the games he showcased wouldn't have benefited much from the extra 2GB anyway (as shown by other tech reviewers):
Of course, if you're someone who mods games with high-res texture packs and never uses RT, then a 6800 XT will have aged better; each to their own and all that.
Either 10GB is enough at 4K or it isn't, and as you allude to, blaming lazy devs is a cop-out and always was. I was always of the opinion that while 10GB was just enough at 4K a few years ago, it would become a problem for those on a 3-4 year upgrade cycle.
I also strongly disagreed with the nonsense that the 3080 would run out of GPU grunt long before VRAM became an issue at 4K.
I have been using DLSS where possible as the grunt isn't there for gaming at 3440x1440 175Hz. Funnily enough, it's been less of an issue for my 4K display as it's only 60Hz, so I lock the fps to 60 there; had I had a 4K 144Hz screen, I would definitely be buying a 4090 for the grunt it has at that resolution in newer games. I've had to use DLSS performance mode a bit more at 4K due to the lack of grunt, and even with a 3090 I would also have had to use performance mode, since it didn't have the grunt for certain games at 4K native or even DLSS quality, e.g. Dead Space:
Until AMD release a card with dedicated RT, that's a clear no. I think they won't release a card with dedicated RT, as it's extra cost for something with only moderate benefit. It's the same reason they proved you don't need expensive G-Sync modules to use VRR. Nvidia always seem to go the route that costs the most.
Intel have had no issues here, offering decent pricing for a dGPU with dedicated RT hardware...
And we have been through this a million times with the G-Sync module: there were many reasons Nvidia had to go with a module, and it proved, and arguably still does offer, some big advantages over FreeSync. Don't take my word for it, though:
A detailed look at variable refresh rates (VRR) including NVIDIA G-sync, AMD FreeSync and all the various versions and certifications that exist
tftcentral.co.uk