RDNA 3 rumours Q3/4 2022

AMD are just gonna come in a couple of hundred lower for equal perf, and I'm OK with paying a little more for the Nvidia card: better industry support generally, plus the DLSS bonus.

Don't get me wrong, I've had a multitude of AMD cards and love them; just sold my 6900 XT and that was one of the best :)

But but but.... AMD's feature set is just as good as Nvidia's, why would you pay more when you can buy AMD, who provide the feature set for "free".... ;) :D :cry: :p

I'd wonder about that - the majority of high-end games coming out now are console ports, which use AMD tech in them. It's not worth developers' time to produce a multi-platform game (which are the big sellers) that locks out one manufacturer. Games are always, as things stand, going to be fine on AMD. The future may change things, but not for this gen.

We have heard this for years now, ever since AMD got into consoles (I also thought the same), that the benefits of this would show on PC, and yet here we are, still waiting. If anything, they still seem to love Nvidia hardware more, even when no PC-specific changes are implemented, and if RT is added, Nvidia still trounce AMD - see all the Sony ported games for example. And before people say that's because they're DX11: Spider-Man is DX12 ;)
 
I'd wonder about that - the majority of high-end games coming out now are console ports, which use AMD tech in them. It's not worth developers' time to produce a multi-platform game (which are the big sellers) that locks out one manufacturer. Games are always, as things stand, going to be fine on AMD. The future may change things, but not for this gen.
Lockout, no. But I can see Nvidia offering engineer time/support to help implement paths that make use of features where they are present, just like I can see AMD and Intel doing the same.

I think the main advantage from cross platform development is going to be that games are fun to play, regardless of having the latest top vendor hardware or not.
 
We have heard this for years now, ever since AMD got into consoles (I also thought the same), that the benefits of this would show on PC, and yet here we are, still waiting. If anything, they still seem to love Nvidia hardware more, even when no PC-specific changes are implemented, and if RT is added, Nvidia still trounce AMD - see all the Sony ported games for example. And before people say that's because they're DX11: Spider-Man is DX12 ;)
I think a lot of people seem to have this idea that the consoles helping AMD must mean AMD would thrash Nvidia. AMD was broke and on their last legs. Their GPU division was starved of resources, and unlike Intel, Nvidia did not rest on their laurels and kept their foot on the gas. Yet Nvidia could not dominate AMD the way Intel had. AMD managed to just about hang on.
 
I don't know if you have read the Ada whitepaper, but right now it looks like Nvidia is going to dominate RT. They have added hardware acceleration structures for:
- recursive raycasting
- transparency mapping - this can literally cripple AMD if Nvidia goes full steam with GameWorks or similar kinds of encouragement
and then other claims like a higher number of triangle intersection tests per core, which have now become standard fare,
while AMD would perhaps be building their first dedicated RT core. I am not holding my breath.

Though there's a 50:50 chance, going by the kind of leaks we have seen, that AMD might match or exceed Nvidia's raster performance.

Ada may fine wine, but by the time those features get used we will be onto Hopper/RDNA 4.
 
We have heard this for years now ever since AMD got into console

I think a lot of people seem to have this idea that the consoles helping AMD must mean AMD would thrash Nvidia
I'm relatively new to this, didn't realise it was a long running theme. I certainly don't think that AMD is going to thrash nVidia purely based on console leanings, but the proportion of console ports these days is undeniably higher. That will only bode well for AMD.

Turn it the other way: if the console hardware was Nvidia-based, and they were constantly introducing new tech, then it would seem like game over for AMD from a top performance point of view. The consoles mean AMD will be, at the very, very least, permanently relevant.
 
I think a lot of people seem to have this idea that the consoles helping AMD must mean AMD would thrash Nvidia. AMD was broke and on their last legs. Their GPU division was starved of resources, and unlike Intel, Nvidia did not rest on their laurels and kept their foot on the gas. Yet Nvidia could not dominate AMD the way Intel had. AMD managed to just about hang on.

Well, a lot of people did think that (as said, including myself), as it seemed logical in the sense that AMD powers the consoles, consoles are developers' focus given they have the market share, games are generally ported to PC (and poorly, a lot of the time), and we all know consoles can get more from their hardware than a similarly specced PC. So it was perfectly reasonable to expect to see AMD doing better than Nvidia, but it seems not to be the case, for whatever reason that may be, who knows. Maybe, as you said, Nvidia kept pushing and never let their guard down and their drivers really are just that much better? Which would make sense given their sole focus/income is the discrete GPU space (although maybe that has changed, I haven't looked at what their income breakdown is in a long time).

Ada may fine wine, but by the time those features get used we will be onto Hopper/RDNA 4.

So another year or two of Nvidia having the lead then? :p

I'm relatively new to this, didn't realise it was a long running theme. I certainly don't think that AMD is going to thrash nVidia purely based on console leanings, but the proportion of console ports these days is undeniably higher. That will only bode well for AMD.

Turn it the other way: if the console hardware was Nvidia-based, and they were constantly introducing new tech, then it would seem like game over for AMD from a top performance point of view. The consoles mean AMD will be, at the very, very least, permanently relevant.

Yup, it's been happening ever since AMD got into the PS4 and Xbox, and I can't think of any/many games where I would say it has benefited AMD. The most recent one is Assassin's Creed Valhalla, but I think that is more SAM/ReBAR benefiting AMD than it being a console port thing.

That is true though; if Nvidia got back into consoles, we probably would be seeing a very big difference....
 
It's relative... some of those bits, like the recursive structure used for SER, are outside the DX12 spec and will be enabled through NVAPI, so it's not a matter of crippling the competitor in absolute terms, but if devs are encouraged to use new features, AMD will look crippled in relative terms.
I had the same thought when I read about these features. They are custom, so they will only work with NVAPI.
Ada may fine wine, but by the time those features get used we will be onto Hopper/RDNA 4.
I don't think it will take a lot of time to see these features added in games. They don't need to be in every game; they only need to be in the 2-3 games that get used for benchmarks. If you put them inside CP2077 and another game, that will show how much faster Ada is in RT.
 
I'm relatively new to this, didn't realise it was a long running theme. I certainly don't think that AMD is going to thrash nVidia purely based on console leanings, but the proportion of console ports these days is undeniably higher. That will only bode well for AMD.

Turn it the other way, if the console hardware was nvidia based, and they were constantly introduced new tech then, it would seem like game over for AMD from a top performance point of view. The consoles mean AMD will be, at the very, very least, permanently relevant.
Well a lot of people did think that (as said, including myself) as it seemed logical in the sense, amd powering consoles, consoles are developers focus given they have the market share, games are generally ported to pc and poorly a lot of the time and we all know consoles can get more from their hardware than a similar specced pc so it was perfectly reasonable to expect to see amd doing better than nvidia but seems to not be the case for whatever reason that may be who knows. Maybe as you said, nvidia kept pushing and never let their guard down and their drivers really are just that much better? Which would make sense given their sole focus/income is the discrete gpu space (although maybe that has changed, haven't looked at what their income breakdown is in a long time)
One thing that isn't explored in reviews is the strengths and weaknesses of the architectures. I seem to remember someone mentioning that a game came out on consoles (I think it was last gen, because I seem to remember some reference to the RX 480) and it had amazing particle effects, but these had to be scaled back in the PC port because GPUs weren't able to handle them properly.

Imagine if reviewers could do a deep dive into the architecture to find these kinds of things out. Maybe a reviewer finds out that GPU A is better at handling explosion effects than GPU B. This could sway a person who likes to spam grenades and rockets in games but hates how the FPS drops.


Side Note: AMD might start leaking stuff tomorrow
 
I don't know if you have read the Ada whitepaper, but right now it looks like Nvidia is going to dominate RT. They have added hardware acceleration structures for:
- recursive raycasting
- transparency mapping - this can literally cripple AMD if Nvidia goes full steam with GameWorks or similar kinds of encouragement
and then other claims like a higher number of triangle intersection tests per core, which have now become standard fare,
while AMD would perhaps be building their first dedicated RT core. I am not holding my breath.

Though there's a 50:50 chance, going by the kind of leaks we have seen, that AMD might match or exceed Nvidia's raster performance.

It can get ugly for AMD if they choose to ignore RT. Seeing Hardware Unboxed's review of CP2077 at 4K: RT Ultra native is 45 fps on the RTX 4090 vs 13 fps on the RX 6950 XT, while at 1440p it is 86 fps on the 4090 vs 27 fps on the 6950 XT.
So AMD is getting destroyed right now when it comes to RT (WITHOUT DLSS!), and even if they manage to double current-gen performance, those 7xxx GPUs will get left behind. They kinda need a miracle at this point.
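
As a quick sanity check on that "even if they double it" point, here's a minimal sketch using only the figures quoted above (the 2x uplift is just an assumed number for illustration, not a leak or a claim):

```python
# Rough check of the "even if they double current-gen performance" point,
# using the CP2077 RT Ultra (native, no upscaling) figures quoted above.
results = {
    "4K":    {"RTX 4090": 45, "RX 6950 XT": 13},
    "1440p": {"RTX 4090": 86, "RX 6950 XT": 27},
}

assumed_uplift = 2.0  # hypothetical gen-on-gen RT gain, for illustration only

for res, fps in results.items():
    doubled = fps["RX 6950 XT"] * assumed_uplift
    ratio = fps["RTX 4090"] / doubled
    print(f"{res}: 6950 XT x2 = {doubled:.0f} fps vs 4090 at {fps['RTX 4090']} fps "
          f"-> still {ratio:.2f}x behind")
```

Even a straight 2x uplift leaves the quoted 4090 figures roughly 1.6-1.7x ahead in those two tests, before DLSS is even factored in.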

One thing that isn't explored in reviews is the strengths and weaknesses of the architectures. I seem to remember someone mentioning that a game came out on consoles (I think it was last gen, because I seem to remember some reference to the RX 480) and it had amazing particle effects, but these had to be scaled back in the PC port because GPUs weren't able to handle them properly.

Imagine if reviewers could do a deep dive into the architecture to find these kinds of things out. Maybe a reviewer finds out that GPU A is better at handling explosion effects than GPU B. This could sway a person who likes to spam grenades and rockets in games but hates how the FPS drops.
You mean like RT? :D

Probably the biggest culprit right now would be a CPU bottleneck - wasn't it Nvidia that had an issue relatively recently with this?

The other issue would be the lack of next-gen games.
 
You mean like RT? :D

Probably the biggest culprit right now would be a CPU bottleneck - wasn't it Nvidia that had an issue relatively recently with this?

The other issue would be the lack of next-gen games.
There's more to RT than just the toggle switch in the settings. People have recently learnt that transparency is a big performance hitter (*cough* called it months ago *cough*) and are now talking about Nvidia's new method to minimise this. Has anyone tried to test it to see how much of an impact it has? What about the other aspects of RT?

All we know is RT on makes reflections shiny and FPS goes down.
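
To put some toy numbers on why transparency hurts: with opaque geometry the hardware can accept a hit without calling back into shader code, while alpha-tested surfaces need an alpha evaluation at every candidate hit along the ray. This is only a counting sketch under that assumption, not a model of any particular GPU; the layer counts are made up for illustration.

```python
# Toy count of per-ray shader evaluations for opaque vs alpha-tested geometry.
# Layer counts and the 1-vs-N costs are illustrative only.

def shader_evaluations(candidate_hits: int, alpha_tested: bool) -> int:
    if not alpha_tested:
        # Opaque: the closest hit can be accepted without extra per-hit shading.
        return 1
    # Alpha-tested: every candidate hit needs an alpha test (any-hit style work)
    # before it can be accepted or ignored.
    return candidate_hits

for layers in (1, 4, 8):  # e.g. a ray passing through overlapping foliage/fences
    print(f"{layers} candidate hits -> opaque: {shader_evaluations(layers, False)}, "
          f"alpha-tested: {shader_evaluations(layers, True)}")
```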
 
There's more to RT than just the toggle switch in the settings. People have recently learnt that transparency is a big performance hitter (*cough* called it months ago *cough*) and are now talking about Nvidia's new method to minimise this. Has anyone tried to test it to see how much of an impact it has? What about the other aspects of RT?

All we know is RT on makes reflections shiny and FPS goes down.

It's the type of stuff Digital Foundry does, but nowadays they're being frowned upon and called "pro Nvidia". But definitely a good point! ;)
 
Using the best AA on offer vs DLSS Quality - screenshots from CP2077, Spider-Man, DL2, and RDR2 (can't change the DLSS file to the newest one, so not getting the best from it here).

Which ones look best then? Even though it's a bit pointless, as you are only seeing standing-still shots, i.e. you can't get a real sense of shimmering, aliasing and jaggies (which, again, just to make clear, do count as part of image quality). Whereas in motion, the issues with the good/native AA are far worse/more noticeable. The only time MSAA looks good in RDR2 is when standing still, and even then some parts look worse.

In addition to the above post on this topic of native plus best AA vs DLSS Quality, and further backing up my claims of where DLSS Quality is better than native + AA (that isn't TAA), here are two quick videos: Spider-Man using SMAA, and RDR2 using 8x MSAA (sadly the RDR2 one is locked to an old version of DLSS since the Social Club launcher is locked down, so results are greatly improved with a newer version) - AA methods that have been quoted as being the "best for IQ".....



EDIT: RDR 2 4k one still processing, should be done soon though.....

Regardless of the frame rate, I think it's pretty obvious how so many reviewers and consumers come to the conclusion of "DLSS being better than native". But again, some aren't as sensitive to shimmering, aliasing, jaggies etc., and some aren't as sensitive to softness in the image, so pick your poison. Based on my own experience in many games now, it's why I will always use DLSS over native + AA (of any kind), even when there's no need for extra performance :)
 
RT that scales linearly with compute could be a big thing if true... but then AMD is also introducing dedicated RT cores this time around, so that sounds a bit contradictory.
They have dedicated RT cores; what you mean is RT cores that can run simultaneously with other aspects of the GPU core.

It's the type of stuff Digital Foundry does, but nowadays they're being frowned upon and called "pro Nvidia". But definitely a good point! ;)
To the best of my knowledge they don't deep dive into GPU architectures to find their strong points.

P.S. They can do deep dives and be shills.
 
RT cores are supposed to be fixed-function blocks, so the correlation between compute and RT doesn't sound logical; in fact, if AMD is going the dedicated route, there should be no (or minimal) correlation with compute.
Eventually, though, whatever values the RT cores return have to go through the rendering pipeline and be written to a pixel.
Given the sequential and diverging nature of RT, this causes threads to stall and GPU utilisation to fall off a cliff, because neither vendor is keen on load balancing both aspects.
It's something like two workers working in parallel to create different components for a finished good, where worker 1 is taking 10x the time while worker 2 dozes off.
The only way you can address this is by load balancing both RT and non-RT throughput, but given the diverging nature of RT, and the fact that RT is still in a transition phase, people are holding off on making big architectural bets.
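
To put that worker analogy into a toy model: if both parts of the frame have to finish before the next one starts, the frame time is set by the slower stage and the faster units just idle. A minimal sketch with made-up stage times:

```python
# Toy model of the "two workers" point above: the frame is gated by the slower
# stage, so the faster stage's units sit idle. Stage times are made-up values.

def utilisation(rt_ms: float, shading_ms: float) -> dict:
    frame_ms = max(rt_ms, shading_ms)  # frame can't finish until both stages do
    return {
        "frame_ms": frame_ms,
        "rt_busy": rt_ms / frame_ms,            # fraction of the frame RT units work
        "shading_busy": shading_ms / frame_ms,  # fraction the shading units work
    }

print(utilisation(rt_ms=5.0, shading_ms=5.0))   # balanced: both stages ~100% busy
print(utilisation(rt_ms=10.0, shading_ms=1.0))  # "worker 1 takes 10x": shading busy only 10%
```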
 
You've misread/misunderstood. The jittered frames are one of the inputs to the AI model - it then outputs the AI generated pixels.

Well yes, but that's because you can't get motion vectors from one frame. All techniques that use motion vectors (TAA, SMAA multi, DLSS, FSR2.0 etc.) need input from more than one frame.

No, that's incorrect. What you're describing is basically TAA type upscale without use of a neural net model. DLSS2.0 absolutely reconstructs/hallucinates - they show that quite clearly - but it does so with many more inputs to the model than DLSS1.

That's also incorrect. FSR2 is a TAA type upscale, not DLSS 2.0 (and XESS for completeness) type. Remember DLSS doesn't do AA, unless you downsample afterwards (and then it's called DLSSAA) - but you might get better pixels from the model that look AA'd.
I do not agree. I believe you're mistaken, especially as your words directly contradict what the document states (especially the words I quoted from it) - DLSS 2 does NOT improve textures and it DOES do AA. The document states that clearly. Aside from textures there's only (largely) geometry (AA, thin-line improvement etc.), and most other effects are added after the DLSS 2 pass - no room to "generate" pixels; it could only do that on textures, which NVIDIA states clearly it doesn't do. We'll have to agree to disagree, then.
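
For anyone losing track of the terminology being argued over, here's a minimal sketch of what a generic TAA-style temporal upscale does: a jittered low-res frame, per-pixel motion vectors and an accumulated history buffer get combined into a higher-res output. This is not DLSS itself; the fixed blend weight and nearest-sample fetch below stand in for whatever heuristic (or, in DLSS's case, a neural network) actually decides how to weight history against the new samples.

```python
import numpy as np

def temporal_upscale(low_res, motion, history, scale=2, blend=0.1):
    """Combine a jittered low-res frame with reprojected history.

    low_res: (H/scale, W/scale, 3) current jittered colour samples
    motion:  (H, W, 2) per-pixel motion vectors, in output-resolution pixels
    history: (H, W, 3) accumulated output from previous frames
    """
    h, w, _ = history.shape
    out = np.empty_like(history)
    for y in range(h):
        for x in range(w):
            # Reproject: fetch where this output pixel was last frame.
            px = int(np.clip(x - motion[y, x, 0], 0, w - 1))
            py = int(np.clip(y - motion[y, x, 1], 0, h - 1))
            prev = history[py, px]
            # Nearest current sample from the jittered low-res frame.
            cur = low_res[min(y // scale, low_res.shape[0] - 1),
                          min(x // scale, low_res.shape[1] - 1)]
            # Accumulate: mostly history, refreshed by the new sample.
            out[y, x] = (1 - blend) * prev + blend * cur
    return out
```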
 
They have dedicated RT cores; what you mean is RT cores that can run simultaneously with other aspects of the GPU core.


To the best of my knowledge they don't deep dive into GPU architectures to find their strong points.

P.S. They can do deep dives and be shills.
They indeed do not. What is more, they did underline in their DLSS 3 FAQ video that they aren't engineers and have no clue how exactly these things work. They only know what NVIDIA told them and what they've seen themselves in tests, but that's where it ends. A deep dive into architecture seems to be beyond their ability, as per their own words.
 