So a tacked-on feature, like I said originally. Of everything you've listed that I am aware of, Cyberpunk is the only game I can think of that might have a "next gen/proper" implementation of RT (no, I don't count Minecraft) rather than just a tacked-on feature, but we just don't know.
It is also a huge game that has been in development for a very long time. So it may very well be a tacked-on feature that is heavily overused at launch to make Ampere look good, then scaled back a few months later. Only time will tell.
My mistake, there are two games in 4 years with a full RT experience. One or both of which require DLSS to run properly? Control has the full RT experience. Cyberpunk 2077 is another full RT experience. There are lots of photorealistic games coming that use RT.
I would disagree with this, since the performance penalty with RT on the new cards was pretty much the same percentage as with Turing; Ampere was faster with RT simply because it was faster to begin with. AMD's first attempt at ray tracing beat Nvidia's first attempt, but of course Nvidia's second attempt is better.
Where does that leave the 3090 ?
You mean when Cyberpunk 2077 releases on the 10th of December. RT matters now: World of Warcraft is an RT game (shadows), Fortnite is RT, and most games are patching in RT.
Atomic Heart is another RT title I am watching.
Let's be real. By the time any game that people are somewhat excited for comes out with ray tracing, we will have the next series of GPUs.
4K gaming is still a niche and only for the top few % of consumers.
6800XT gets my money all day long
They won't use RT heavily if the consoles can't handle it. Devs aren't known for wasting a lot of effort on niche audiences in recent years. That's not the point; the point is that the RT performance when RT is used heavily is garbage. Why am I even bothering? You'll see it yourself in future games that heavily use RT.
But if you are not playing at 4k, the memory advantage of the 6800 XT is completely negated.
Wouldn't say it destroys it, lol. Isn't 4K still a gen too early for high FPS anyway? 1440p seems the sweet spot.
You realise you keep talking about RT as though any usage of it will be at huge levels that will hurt AMD badly. You realise the reason people are patching games is that consoles are getting RT. The existing games were all Nvidia dev games, where Nvidia pushed to add RT targeted at their hardware. Most of the games coming that will add RT will target consoles and the RT performance levels they have, a level which is targeted at and optimised for RDNA2.
If 100 games all add RT as console titles, you realise they'll be lighter in RT and not targeting 3090-level ray-tracing hardware. If they added performance settings and designs that demanded a 3090 level of RT performance, then every console would crawl to a halt.
Nvidia has always tended to do stuff like push devs to make the default lighting trash (Metro) to make RT look better, and to use inefficient methods and overuse said effects for no extra visual gain, pushing the hardware harder to drive sales and make their hardware look better. With tessellation they both added a stupidly big tessellation unit and then pushed devs to over-tessellate far beyond any visual gain, because they knew it would hurt AMD's performance more than their own.
The RT coming in most games will in no way be targeting the need for 3080/3090-level RT performance. It is much more likely to be subtle, smaller, heavily optimised additions that minimise performance loss and work great on consoles, and this will benefit AMD, not hurt them.