The thread which sometimes talks about RDNA2

This has been happening for a long, long time.

Imagine a scenario where you have a game build, you build optimizations for your driver stack into the build you have access to. Things are looking good.

Then, a week or two before you launch you receive a new build, with some additional black box code that's been added.

Suddenly the performance has completely changed.

Yeah, that used to happen a lot. Less so now, thankfully. All-AMD hardware in the consoles, and the eventual bad rep of GameWorks, help. You don't see much GameWorks now. :)
It does look like they are back to their old tricks, using DXR this time.

I know AMD say the devs wanted an open-source DLSS competitor, but what hasn't been discussed is why the devs didn't ask Nvidia to do the same with DLSS and other proprietary tech.
AMD really should not have let devs dictate what they can or can't do when we all know those same devs have always been perfectly happy to take Nvidia's money to add their stuff at the expense of AMD performance.
1000% agree with this. It certainly comes off as subterfuge, and AMD fell for it.

If anything it should be AMD's way or the highway on console, as there is nothing else devs should be concerned with regarding console games. Make Nvidia pay for the resources to implement their features. We all know Nvidia would if the roles were reversed (if not cut AMD out flat). AMD can be too passive at times.
 
It's a bit faster than a 2080 Ti in RT; the 3080 is 29% faster than the 6800 XT.

It's AMD's first go at RT and Nvidia's second. If 78% of the RT performance (100 / 1.29 ≈ 78%) is unplayable, then 100% RT performance is barely playable.

I don't understand this huge put-down of AMD's RT performance. Apart from a couple of outliers that probably need fixing on AMD's GPUs, and given it's a week-old architecture, their RT performance is not much worse than Nvidia's. It's a little better than Nvidia's previous generation, and Nvidia have only moved it up 30%. Is it really so bad?

The AMD card is faster at 1440P and 1080P without RT, it uses less power, it's cheaper and it has a lot more VRAM, which makes it the better card IMO. You have a choice between that and 30% better RT performance. It's all good, no need to try and knock AMD down. What's that all about?

The RT performance is decent; it's the lack of a DLSS-like feature in the games themselves, so that you can reach the higher resolutions and lose only a little quality, even if it's just sharpening and upscaling like Death Stranding. I already showed that the AMD 6800 XT is not ahead at 1440p. Basically 1080p looks faster, but 1440p is where the 6800 XT falls behind, though it's close enough in some games that you can argue the 6800 XT is ahead if you pick the right mix of games and benchmarks. Once you look at most of the benchmarks and games tested across many different websites, you start to see that the 3080 is likely faster at 1440p overall. The SAM feature does matter in some games: you have games that are close draws on many setups, and SAM tips the balance from Nvidia to AMD. If Nvidia add this feature to the 3080 and get the same gains as the 6800 XT, then the 3080 will pull further ahead at 1440p.

The thing is that the 6800 XT is not bad performance-wise; it's just the RT games that hit it hard. Nvidia is pushing heavy RT games to undermine AMD by providing better RT performance than consoles on PC. AMD need to get their Super Resolution into games as fast as possible. AMD will do okay in console ports, but if those console ports take advantage of DLSS and Nvidia's extra RT performance, then AMD will suffer. Any game that has DLSS is basically going to be massively faster on Nvidia cards, with no argument that it hurts image quality too much to be counted in benchmarks. DLSS 2.x is a game changer. AMD has to stop it becoming a standard in every game.
 
As RT gets more popular and I do intend to use it, the 6800XT (vs 3080) is going to be to RT what the 2900XT (vs 8800GTX) was to AA
 
Nvidia use their own black-box libraries for ray tracing; it wouldn't surprise me if some of these games don't actually work well, if at all, in DXR. Proper DXR, that is.

The consoles, at least the Microsoft ones, are a blessing as they use that agnostic DXR API. As for the PS5, well, games for that are still developed for RDNA2.

Control is a DXR game, all the features except DLSS work on AMD hardware. DXR is based on nvidia technology.
 
As RT gets more popular and I do intend to use it, the 6800XT is going to be to RT what the 2900XT was to AA

Doubtful; performance can improve as drivers get more optimized. The 2900 XT was essentially broken at the hardware level due to the way AA was handled.
 
Comparison with the 3080 at 1440p on an Intel system if anyone is interested (I can't recall seeing it on here);



edit - Holy **** they're comparing it with the eVGA FTW3 Ultra too ($700 my ass!)!!! Not bad AMD, not bad at all!
 
Nvidia has shared its DLSS model with Microsoft, which will be unveiled as a DX12 feature, but AMD now has to match specs. I guess it's called Super Resolution. In future, devs will have end-to-end responsibility to implement DLSS.

MS are developing their own version of DLSS that runs on DirectML https://www.guru3d.com/news-story/microsoft-eying-directml-as-dlss-alternative-on-xbox.html

NVIDIA uses dedicated hardware for DLSS, running it through their Tensor cores, whereas AMD would need to run it over the compute engine.
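For what it's worth, the vendor split here lives below the API: DirectML targets any DX12 device, and whether the work lands on Tensor cores (via driver metacommands) or on plain compute shaders is the driver's business. A minimal sketch of standing up a DirectML device over whatever GPU is present (just an illustration, not code from any shipping upscaler):

```cpp
// Minimal sketch: create a DirectML device on top of whatever D3D12 GPU is present.
// DirectML itself doesn't care whether the backend is Tensor cores or generic compute;
// that mapping is handled by the vendor's driver/metacommands.
#include <d3d12.h>
#include <DirectML.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> d3dDevice;
    // nullptr adapter = default adapter; feature level 12.0 is enough for DirectML.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&d3dDevice))))
        return 1;

    ComPtr<IDMLDevice> dmlDevice;
    // The same call succeeds on GeForce, Radeon or integrated GPUs alike.
    if (FAILED(DMLCreateDevice(d3dDevice.Get(), DML_CREATE_DEVICE_FLAG_NONE,
                               IID_PPV_ARGS(&dmlDevice))))
        return 1;

    return 0; // dmlDevice is ready for building and compiling ML operators.
}
```

The same code path runs on GeForce, Radeon or integrated graphics; the performance gap people are arguing about comes from what the driver does underneath, not from a different API.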
 
Control is a DXR game, all the features except DLSS work on AMD hardware. DXR is based on nvidia technology.

DXR is based on nvidia technology.

DXR is based on technology that's been around longer than Nvidia. Nvidia call ray tracing 'RTX' and brand their GPUs like that to make you think they invented it, and clearly you do.

I have some news for you: they didn't.
 
DXR is based on technology that's been around longer than Nvidia. Nvidia call ray tracing 'RTX' and brand their GPUs like that to make you think they invented it, and clearly you do.

I have some news for you: they didn't.

Nvidia were first to market and worked with MS on DX12u. Everything in DX12u is supported by Nvidia in hardware. DirectML is supported via Tensor cores; AMD cards have to use compute, which is slower.

https://www.nvidia.com/en-gb/geforce/technologies/directx-12-ultimate/

GeForce RTX is the first and only PC platform with support for these game-changing features.

The article is dated but you can get the point.
 
Nvidia were first to market and worked with MS on DX12u. Everything in DX12u is supported by Nvidia in hardware. DirectML is supported via Tensor cores; AMD cards have to use compute, which is slower.

https://www.nvidia.com/en-gb/geforce/technologies/directx-12-ultimate/



The article is dated but you can get the point.

And Microsoft have worked with AMD, integrating its hardware into the new Xbox (which uses a derivative of DX12_2)

You seem to think Nvidia are the be-all and end-all of computer graphics... Remember where Vulkan originated and what was the catalyst for moving the DirectX API closer to the hardware...
 
DXR is based on technology that's been around longer than Nvidia. Nvidia call ray tracing 'RTX' and brand their GPUs like that to make you think they invented it, and clearly you do.

I have some news for you: they didn't.

And Microsoft have worked with AMD, integrating its hardware into the new Xbox (which uses a derivative of DX12_2)

You seem to think Nvidia are the be-all and end-all of computer graphics... Remember where Vulkan originated and what was the catalyst for moving the DirectX API closer to the hardware...

Ray tracing has been around for longer, but only for non-real-time rendering. RTX hardware and DXR software are real-time RT. DXR is a real-time standard for RT on Windows 10 PCs. Offline RT is very old and not designed for real-time rendering at all; it requires a lot of performance that wasn't possible on PC systems until now. Control is a DX12 DXR game. The 20 series were the first GPUs to support DXR, and every game that supports DXR will run on Nvidia hardware.

DXR was released with the Windows 10 October update (version 1809) on October 10th, 2018, while the GeForce RTX 2080 launched September 20, 2018, so the GeForce RTX 2080 was released before DXR support. Nvidia stated that the 20 series cards were based on 10 years of research into real-time ray tracing. Real-time RT is very new on PCs; it's not an old technology. Nvidia is the first on PC systems to have real-time RT (there could be some supercomputers that had real-time RT, but that's another story https://dl.acm.org/doi/10.5555/772249.772256). Given Nvidia had been working on RT for 10 years, Nvidia RTX had been under development well before DX12u was conceived.

Remedy is known to have started work on the engine for Control around the same time DXR and the 20 series were in development. Nvidia teamed up with Remedy.

By teaming up with Nvidia, Remedy have managed to bring some of the most impressive cutting-edge visual effects to Control.
https://www.techradar.com/uk/news/h...-to-push-pc-gaming-to-its-limits-with-control

The ray tracing technology used in the RTX Turing GPUs was in development at Nvidia for 10 years. https://en.wikipedia.org/wiki/GeForce_20_series

Remedy has spent an abundance of time working on DXR Ray Tracing for quite some time, even releasing a very interesting powerpoint from their GDC 2018 presentation showing regarding how DXR reflections and shadows would be implemented into their engine, so it only made sense when the RTX lineup from NVIDIA came along that this engine would be prime for showcasing. https://wccftech.com/control-pc-performance-explored-all-in-on-rtx/

The DXR functional spec here https://microsoft.github.io/DirectX-Specs/d3d/Raytracing.html outlines how a DXR game should work.

The latest DXR version, 1.1, is outlined here: DirectX Raytracing (DXR) Tier 1.1 https://microsoft.github.io/DirectX-Specs/d3d/Raytracing.html#overview.
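On the tier question: an engine doesn't ask for "RTX", it asks D3D12 which raytracing tier the device exposes, via the standard feature-support check. A minimal sketch of that query (the printed strings are just illustrative):

```cpp
// Minimal sketch: ask D3D12 which DXR tier the current GPU/driver supports.
// This is the vendor-agnostic path a DXR title uses; there is no RTX-specific call here.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return 1;

    switch (opts5.RaytracingTier)
    {
    case D3D12_RAYTRACING_TIER_NOT_SUPPORTED: std::puts("No DXR support"); break;
    case D3D12_RAYTRACING_TIER_1_0:           std::puts("DXR Tier 1.0");   break;
    case D3D12_RAYTRACING_TIER_1_1:           std::puts("DXR Tier 1.1");   break;
    default:                                  std::puts("Newer DXR tier"); break;
    }
    return 0;
}
```

RDNA2, Turing and Ampere all report Tier 1.1 with up-to-date drivers, which is why a properly written DXR title runs on any of them.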
 
I think the only thing we can surely all agree on is that NVidia were the first to market with a hardware solution for consumer ray tracing.

Both AMD and NVidia made founding contributions to the DXR APIs even though AMD had no hardware support for it, or at least none that was publicly known at the time.

Here's one article from early 2018 for example:

AMD and Nvidia working closely with Microsoft on DXR API - Graphics - News - HEXUS.net
 
Ray tracing has been around for longer but only for non-real-time rendering. RTX hardware and DXR software is real time RT. ..
snip
...
This, along with the other partisan Nvidia politics you've posted in this thread, is wholly inaccurate and usually off topic to the subject you've replied to, and most of the time is used to campaign for Nvidia.
Real-time rendering of ray tracing goes all the way back to the late 1970s, and guess what it was used on... a computer :eek:.

Ray tracing has been around for the last 40-something years. It doesn't require a GPU and it isn't particular to any API. All of which you will ignore for some random googled search topic in an attempt to continue your logical fallacies.

This is another prime example of your partisan posting for Nvidia in a newly released Radeon GPU thread, attempting to contradict and downplay Radeon products and services with contradictions and falsehoods. Yet you have continued to post in this thread anyway.

 
This, along with the other partisan Nvidia politics you've posted in this thread, is wholly inaccurate and usually off topic to the subject you've replied to, and most of the time is used to campaign for Nvidia.
Real-time rendering of ray tracing goes all the way back to the late 1970s, and guess what it was used on... a computer :eek:.

Ray tracing has been around for the last 40-something years. It doesn't require a GPU and it isn't particular to any API. All of which you will ignore for some random googled search topic in an attempt to continue your logical fallacies.

This is another prime example of your partisan posting for Nvidia, which is not actually true, yet you have continued to post in this thread.


Problem is that these Nvidia trolls always come to try and downplay AMD products. Must have sad lives I guess.
 