
The thread which sometimes talks about RDNA2

Status
Not open for further replies.
Associate
Joined
9 May 2007
Posts
1,283
So a tacked-on feature, like I said originally. Of everything you've listed that I'm aware of, Cyberpunk is the only game I can think of that might have a "next-gen/proper" implementation of RT (no, I don't count Minecraft) rather than just a tacked-on feature, but we just don't know.

It is also a huge game that has been in development for a very long time, so it may well be a tacked-on feature that is heavily overused at launch to make Ampere look good, then scaled back a few months later. Only time will tell.

Control has the full RT experience. Cyberpunk 2077 is another full RT experience. There are lots of photorealistic games coming that use RT.

This is a small developer.
 
Soldato
Joined
12 May 2014
Posts
3,328
Control has the full RT experience. Cyberpunk 2077 is another full RT experience. There are lots of photorealistic games coming that use RT.
My mistake, there are two games in four years with the full RT experience. One or both of which require DLSS to run properly?

Please name the ones that are coming.
 
Associate
Joined
8 Oct 2020
Posts
1,154
Ultimately seems like either side is a good buy. I feel like AMD will get a bit better on the software side as things progress, but who knows when that'll be as they're still working on so many other features.
 
Associate
Joined
4 Nov 2020
Posts
16
Let's be real: by the time any game people are somewhat excited for comes out with ray tracing, we will have the next series of GPUs.
4K gaming is still a niche, only for the top few % of consumers.
The 6800 XT gets my money all day long.
 
Soldato
Joined
15 Oct 2019
Posts
8,465
Location
Uk
AMD beat Nvidia's first attempt at ray tracing with their first attempt, but of course Nvidia's second attempt is better.
I would disagree with this, since the percentage performance penalty with RT on the new cards was pretty much the same as with Turing; Ampere was faster with RT simply because it was faster to begin with.

Compare RT on with the 3070 vs the 2080 Ti and you will find very little difference.

Also, when looking at AMD's RT performance, in most cases it drops frames by a much larger percentage than Ampere or Turing do.

Where does that leave the 3090?

The same place it was at release compared to a 3080.
 
Caporegime
Joined
18 Oct 2002
Posts
33,188
You mean when Cyberpunk 2077 releases on the 10th of December? RT matters now: World of Warcraft is an RT game (shadows), Fortnite is RT, and most games are patching in RT.

Atomic Heart is another RT future-tech title I am watching.

You realise you keep talking about RT as if any usage of it will be at huge levels that will hurt AMD badly. The reason developers are patching games is that consoles are getting RT. The existing RT games were all Nvidia dev games, where Nvidia pushed to add RT targeted at their hardware. Most of the games coming that will add RT will target consoles and the RT performance level they offer, a level which is targeted at and optimised for RDNA2.

If 100 games all add RT for console titles, they'll be lighter in RT and not targeting 3090-level ray-tracing hardware. If they shipped performance settings and designs that demanded 3090-level RT performance, every console would crawl to a halt.

Nvidia has always tended to do things like push devs to make the default lighting trash (Metro) to make RT look better, and to use inefficient methods and overuse an effect for no extra visual gain, to push the hardware harder, drive sales, and make their own hardware look better. With tessellation they both added a stupidly big tessellation unit and pushed devs to over-tessellate far beyond any visual gain, because they knew it would hurt AMD's performance more than their own.

RT coming in most games will in no way be targeting the need for 3080/3090-level RT performance; it is much more likely to come as subtle, smaller, heavily optimised additions that focus on less performance loss and work well on consoles. This will benefit AMD, not hurt them.
 
Associate
Joined
25 Apr 2017
Posts
848
Let's be real: by the time any game people are somewhat excited for comes out with ray tracing, we will have the next series of GPUs.
4K gaming is still a niche, only for the top few % of consumers.
The 6800 XT gets my money all day long.

But if you are not playing at 4K, the memory advantage of the 6800 XT is completely negated.
 
Caporegime
Joined
17 Feb 2006
Posts
28,680
Location
Cornwall
That's not the point; the point is that RT performance when RT is used heavily is garbage. Why am I even bothering? You'll see it yourself in future games that use RT heavily.
They won't use RT heavily if the consoles can't handle it. Devs aren't known for wasting a lot of effort on niche audiences in recent years ;)

If the consoles can handle it then the 6000 series cards will also be able to handle it.

I think it's too early to draw any conclusions. AMD clearly need some time to work on this.

But in general I think most people won't really care about RT if the consoles don't use it much.
 
Associate
Joined
4 Nov 2020
Posts
16
But if you are not playing at 4K, the memory advantage of the 6800 XT is completely negated.

It still outright beats the 3080 at 1080p and 1440p. If you truly care about ray tracing and the limited showings it's given us so far, then yes, go for the 3080/3090. But for bang for your buck, the 6800 takes the cake.
 
Soldato
Joined
1 May 2013
Posts
8,571
Location
M28
Wouldn't say it destroys it, lol. Isn't this gen still too early for high-FPS 4K anyway? 1440p seems the sweet spot.

Just coining a phrase I see often on here :)

Why is it too early :confused: I certainly would not be going back to lower resolutions, as a 3080 paired with an OLED at 4K with G-Sync is glorious.
 
Associate
Joined
9 May 2007
Posts
1,283
You realise you keep talking about RT as if any usage of it will be at huge levels that will hurt AMD badly. The reason developers are patching games is that consoles are getting RT. The existing RT games were all Nvidia dev games, where Nvidia pushed to add RT targeted at their hardware. Most of the games coming that will add RT will target consoles and the RT performance level they offer, a level which is targeted at and optimised for RDNA2.

If 100 games all add RT for console titles, they'll be lighter in RT and not targeting 3090-level ray-tracing hardware. If they shipped performance settings and designs that demanded 3090-level RT performance, every console would crawl to a halt.

Nvidia has always tended to do things like push devs to make the default lighting trash (Metro) to make RT look better, and to use inefficient methods and overuse an effect for no extra visual gain, to push the hardware harder, drive sales, and make their own hardware look better. With tessellation they both added a stupidly big tessellation unit and pushed devs to over-tessellate far beyond any visual gain, because they knew it would hurt AMD's performance more than their own.

RT coming in most games will in no way be targeting the need for 3080/3090-level RT performance; it is much more likely to come as subtle, smaller, heavily optimised additions that focus on less performance loss and work well on consoles. This will benefit AMD, not hurt them.

Nope, the PC version will not have the same limits with an RTX 3080; on a 3080 it will look better. There are videos on the development of the console Spider-Man game that explain what they had to do to get good RT performance on a console; those limits can be tweaked or removed on PC, giving a far better-looking game. You would know I am right if you knew what you were talking about.
 