
The thread which sometimes talks about RDNA2

Status
Not open for further replies.
If anything, it's Nvidia who should be performing far better at RT than they actually are with how much additional experience they have had. They are now into their second generation and have managed only a 20% or so increase in real-world performance according to the benchmarks? That's hardly impressive.

We are discussing AMD's implementation?
 
A versus video without an FPS counter, how pointless. And where did he get an RX 6000 card to benchmark?

Granted, it's a sneakily titled video, as there isn't really any comparison of AMD vs nVidia in RT, but the underlying message is that games like Legion use DXR, which is universally compatible and usable to its full extent on any RT hardware it is played on, as long as that hardware supports DX12 Ultimate. You should expect the exact same ray tracing visuals regardless of which team's GFX card you are running.
 
Not cheating, no... they are using DLSS as a crutch to support ray tracing, but it's extremely clever of them to do so; I have zero issues with it. It does show that hardware is still not really at the point where it can use RT without workarounds to make it truly playable. Again, I will point out that AMD have seemingly beaten Nvidia's first attempt at RT in Turing with their method in RDNA2. I feel this deserves more applause than it is getting; people seem to be overlooking it.

Yes, they don't match Nvidia's second go at RT, but they beat Nvidia's first go. That can only bode well for the future, no? Especially if they bring out a DLSS competitor, it will aid the uptake of RT as a whole. Sure, many will still turn it off (myself included, unless I want to take screenshots).
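For anyone curious about the arithmetic behind the "crutch": DLSS-style upscaling helps because ray tracing cost scales roughly with the number of shaded pixels. A back-of-envelope sketch, assuming cost is simply proportional to internal resolution (real scaling is messier, and the frame rates here are made-up illustrations, not benchmarks):

```python
# Rough sketch: why rendering internally at a lower resolution and
# upscaling (DLSS-style) makes ray tracing playable.
# Assumption: RT cost scales ~linearly with shaded pixels (simplified).

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # output resolution
internal  = pixels(2560, 1440)   # "Quality"-style internal resolution

workload_ratio = internal / native_4k
print(f"Internal render shades {workload_ratio:.0%} of the 4K pixel count")

# If a card managed, say, 40 FPS ray tracing at native 4K (hypothetical
# number), the same frame budget at the lower internal resolution gives:
native_fps = 40
estimated_fps = native_fps / workload_ratio
print(f"~{estimated_fps:.0f} FPS before upscaling overhead")
```

Under that simplification, a 1440p-to-4K upscale shades about 44% of the pixels, which is most of the performance headroom DLSS buys back.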

It's not hard to find benchmarks showing Quake 2 RTX at 1440p running greater than 60FPS on a 3080. That's a full scene denoised path tracing example. So pretending that hardware is not ready makes no sense.

I can't find any demos of AMD's RT performance, but as it uses shared hardware it may not even be meeting the disappointing 2080 Ti levels.
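For a sense of scale on that Quake 2 RTX claim, here is a quick ray-throughput estimate. The per-pixel ray counts are illustrative assumptions, not measurements from the game, but they show why full-scene path tracing at 1440p/60 needs dedicated hardware:

```python
# Back-of-envelope ray throughput for full-scene path tracing.
# Assumed numbers (illustrative): 1440p at 60 FPS, 1 primary ray per
# pixel plus 2 bounce rays and 2 shadow rays (one per bounce).

width, height, fps = 2560, 1440, 60
rays_per_pixel = 1 + 2 + 2   # primary + bounces + shadows (assumption)

rays_per_frame = width * height * rays_per_pixel
rays_per_second = rays_per_frame * fps
print(f"{rays_per_second / 1e9:.1f} billion rays/second")
```

Even this modest ray budget lands above a billion rays per second before denoising, which is the workload the dedicated RT units are there to absorb.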
 
The way I see it, RT is a bit like the icing on the cake: nice to eat but doesn't do you any good (i.e. it looks nice, but the performance hit is tangible no matter how you spin it), and going by what lots of guys here are saying, they'll just turn it off in fast-paced FPS games (looking around at the purdy trees and grass and then boom, a bullet comes your way and it's back to square one again).
Next, if AMD can implement a sort of "ecosystem" by using a 5000-series CPU to augment performance, then why not? Loads of people here are going to bite the bullet through choice, not 'cos they have to, and upgrade to a 5000-series CPU (I know I am) and get whatever increase in performance; that's some extra "icing" on the cake with tangible benefits (sorry, see what I did there).
My thoughts on the launch info: AMD have completely torpedoed (for now, subject to YouTube reviews) Nvidia's complete line-up, as punters now have the biggest benefit of this launch... "choice". I mean, $500 between the RX 6900 XT and 3090; if that's not a goodun, I don't know what is.
This brings me to "what are Nvidia going to do in reply?" The 3080 Ti, f'rinstance, will drop sooner rather than later, I reckon (but I might be, and probably am, wrong), but where is it gonna sit? There's only 10-15% difference between the 3080 and 3090, so it's gonna cut into 3090 sales, as it's got to be cheaper than the 3090 I'd have thought, and it'll go up against the RX 6900 XT at $999. Really not sure what'll happen there, to be honest, but I've got a coupla shrewd ideas.
I for one am completely over the moon that AMD have done this well "in one generation", going from RDNA1 to RDNA2, and at last given us, the enthusiasts, a real choice. Just, AMD, get your drivers sorted on day one and have stock available to purchase.
Okies, flame away if I'm wrong, stupid, or an AMD fanboy (which I am, to a lesser degree).
 
Granted, it's a sneakily titled video, as there isn't really any comparison of AMD vs nVidia in RT, but the underlying message is that games like Legion use DXR, which is universally compatible and usable to its full extent on any RT hardware it is played on, as long as that hardware supports DX12 Ultimate. You should expect the exact same ray tracing visuals regardless of which team's GFX card you are running.
Yup.

I do wonder how this is going to play out going forward. I get the impression that AMD's implementation, built directly on DXR, is more broadly applicable than Nvidia's homebrewed RTX stack sitting on top of the DXR API. I.e. will Nvidia cards need better per-game tuning to take advantage, while AMD's just works more out of the box?
 
It's not hard to find benchmarks showing Quake 2 RTX at 1440p running greater than 60FPS on a 3080. That's a full scene denoised path tracing example. So pretending that hardware is not ready makes no sense.

I can't find any demos of AMD's RT performance, but as it uses shared hardware it may not even be meeting the disappointing 2080 Ti levels.
All leaks so far show RT performance better than the 2080 Ti but worse than the 3080, which is not bad at all for anyone who cares about RT.
 
How is it that two companies spend years building their own architecture, independently from each other, and they're pretty much the same speed?

Did AMD have enough time to check the speed of Nvidia's cards, and add just enough compute units to equal them? Or is something else going on?
 
We are discussing AMD's implementation?
No worries, clearly I misread or misunderstood your post.

How is it that two companies spend years building their own architecture, independently from each other, and they're pretty much the same speed?
Laws of physics, I guess. They both have experienced engineers working within the same limitations of their field.
 
Did AMD have enough time to check the speed of Nvidia's cards, and add just enough compute units to equal them? Or is something else going on?
Internal leaks. The two companies know a shocking amount about each other and what goes on internally. That's what the rumour mills say, anyway.
 
When will we see reviews, at release date?
Nobody knows. Although Jay said he's been notified to expect deliveries soon, and also that he does not have much time "at all" (his words) to do testing, run benchmarks and prepare a video. So I imagine we'll see reviews in a week or so, tbh.

Or maybe Ryzen 5000 in one week, then 6000 Radeons the week after.

I'm guessing/speculating though.
 
I know we haven't even had RDNA2 yet, but RDNA3 vs the 40x0 series is going to be awesome, because I doubt Nvidia will want to get Intel'ed by AMD, so they will probably be quite aggressive with their pricing and performance.
 
Yup, this is the first gen of the next gen, and we now know where both AMD and nV are and what their focus is. Intel are releasing their own gaming GFX cards with RT support too, so 2021/2022 will definitely be very interesting times for gamers wanting cost-effective, high-performance cards, since competition will be so strong, especially with Intel entering the market.

Yup.

I do wonder how this is going to play out going forward. I get the impression that AMD's implementation, built directly on DXR, is more broadly applicable than Nvidia's homebrewed RTX stack sitting on top of the DXR API. I.e. will Nvidia cards need better per-game tuning to take advantage, while AMD's just works more out of the box?

Knowing nVidia and past driver-update optimisations, I imagine they have been riding the RTX wave all this time while working on DXR optimisations behind the scenes. When the time comes to switch (because game devs will naturally use DXR just to keep everything consistent between next-gen consoles and PC, making their own lives easier and less costly), nV will release a new driver that ramps up performance, just like they have done many times in the past.

AMD are saying that their RT will work without the need for any driver software update. This is curious to me, because it kind of suggests they have not done, or will not do, much about RT support themselves, leaving it to Microsoft and game devs to implement while AMD simply provide the hardware to make use of it, whereas nVidia have been actively promoting and developing RT for the last two years in the form of RTX, so when switchover-to-DXR time comes, they simply release a new driver.

Maybe this has been their long-term plan all along?

We will have to wait and see.
 
We will have to wait and see.

Indeed.

I haven't seen AMD themselves say it's just "out of the box", and tbh I'd find that hard to believe anyway. But AMD are good at releasing driver updates that optimise performance for popular titles; maybe they'll include RT optimisations with their driver releases? That makes the most sense to me.

Use DXR so it's out of the box, then AMD further optimise it through a driver update on a per-game basis.

*pepehands thinking*
 
Such a good time to be upgrading, I was going to wait another couple of years however a new AMD CPU & GPU is looking very tempting. The competition can only be good for us consumers!
 
Knowing nVidia and past events in driver updates with optimisations, I imagine they have been riding the RTX wave all this time whilst behind the scenes been working on DXR optimisations and when the time comes to switch because game devs will naturally be using DXR just to keep everything consistent between next gen consoles and PC etc and making their own lives easier/less costly, nV will release a new driver which ramps up performance just like they have done many times in the past.

NV has its own devkit (100% DXR-compatible and mostly open source) which may have some algorithmic optimisations best suited to its architecture.
AMD has its work cut out.
 
I know we haven't even had RDNA2 yet, but RDNA3 vs the 40x0 series is going to be awesome, because I doubt Nvidia will want to get Intel'ed by AMD, so they will probably be quite aggressive with their pricing and performance.

AMD are going to steamroll the complacent companies if they keep going like this. I see a lot of potential in how they're integrating both CPU and GPU.
 
NV has its own devkit (100% DXR-compatible and mostly open source) which may have some algorithmic optimisations best suited to its architecture.
AMD has its work cut out.

I get what you are saying, but AMD are using DirectX, which is the standard implementation, plus I presume console developers will use DXR over RTX to avoid maintaining two different implementations.
 