AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Comparing the PS5's RDNA 2 GPU to the RX 5700 is interesting: it has the same core/render config as the RDNA-based 5700. The only differences in the specifications are the clock rate, which has increased from 1725 MHz to 2233 MHz, and the VRAM, which has increased to 16GB. The TDP is apparently 180 W for both GPUs, though the PS5's TDP might actually be a bit higher or lower than this (specs from TechPowerUp).

Working out a possible PS5-like RDNA 2 desktop GPU:

If we assume each additional 4 CUs uses ~40 W of power, then 36 + 16 CUs would give a total of 52 CUs @ 2233 MHz, with a TDP of 340 W. The 40 W figure is reduced from 45 W, the TDP difference between the RX 5700 and RX 5700 XT, since the hypothetical GPU has no clock rate or TMU differences to account for.

Each 4 CUs should give roughly a 10% performance increase, so +40% for 16 CUs.
This is based on comparing the performance of the RX 5700 and RX 5700 XT (36 vs 40 CUs; I reduced the figure from 13% to 10% to account for their differences in TMUs and clock rate).

There's a 29.4% increase in clock rate from 1725 MHz to 2233 MHz.

So, around a 69.4% total increase in performance compared to an RX 5700?

Maybe a little more for IPC improvements?

Hypothesis: the PS5 RDNA 2 GPU seems like a more power-efficient RX 5700, which allows for much higher GPU clocks at about the same TDP.
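
Here's that napkin maths as a quick Python sketch, using only the rough assumptions above (the 40 W and 10% per 4 CUs figures come from the 5700 vs 5700 XT comparison, not from measurements):

```python
# Napkin maths for a hypothetical PS5-like RDNA 2 desktop GPU.
# All inputs are the rough assumptions from the post above, not measured data.
BASE_CUS = 36          # PS5 GPU / RX 5700 CU count
EXTRA_CUS = 16         # hypothetical extra CUs
WATTS_PER_4_CUS = 40   # assumed, from the 5700 vs 5700 XT TDP delta
PERF_PER_4_CUS = 0.10  # assumed ~10% performance per extra 4 CUs
BASE_TDP = 180         # W (TechPowerUp figure)
BASE_CLOCK = 1725      # MHz, RX 5700
NEW_CLOCK = 2233       # MHz, PS5 GPU

total_cus = BASE_CUS + EXTRA_CUS                     # 52 CUs
tdp = BASE_TDP + (EXTRA_CUS // 4) * WATTS_PER_4_CUS  # 340 W
cu_uplift = (EXTRA_CUS // 4) * PERF_PER_4_CUS        # +40%
clock_uplift = NEW_CLOCK / BASE_CLOCK - 1            # +29.4%

# The post adds the two uplifts (~69.4%); compounding them would give ~81%.
print(f"{total_cus} CUs, ~{tdp} W, ~{cu_uplift + clock_uplift:.1%} over an RX 5700")
```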
 
The ray tracing performance of Ampere really hasn't improved all that much over Turing, as shown in Hardware Unboxed's new video; nowhere near the improvement one would expect with the extra RT and tensor cores. Most of the performance uplift over the 2080 Ti can be attributed to the raw increase in performance of the 3080.

As far as we know, AMD doesn't have the infrastructure NVIDIA has to train AI for a DLSS-type solution. But what if they don't need it?

If AMD's hybrid ray tracing approach is a flat 20% hit to fps, would they even need a DLSS solution?

Would you buy RDNA2 if the performance impact with RT on was 10-20%, even if it meant that NVIDIA could achieve higher effective fps when using DLSS?
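
To make that trade-off concrete, a toy comparison (every number here is a placeholder assumption, not a benchmark):

```python
# Purely hypothetical illustration of the RT-hit vs DLSS trade-off above.
base_fps = 100            # assumed raster-only frame rate for both cards
amd_rt_hit = 0.20         # the flat 20% hit posited for AMD's hybrid RT
nv_rt_hit = 0.45          # assumed heavier native RT hit on NVIDIA
dlss_uplift = 1.5         # assumed frame-rate multiplier from DLSS

amd_fps = base_fps * (1 - amd_rt_hit)               # 80 fps, native
nv_fps = base_fps * (1 - nv_rt_hit) * dlss_uplift   # 82.5 fps, upscaled
print(f"AMD native RT: {amd_fps:.0f} fps | NVIDIA RT + DLSS: {nv_fps:.1f} fps")
```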
 
Comparing the PS5's RDNA 2 GPU to the RX 5700 is interesting: it has the same core/render config as the RDNA-based 5700. The only difference in the specifications is the clock rate, which has increased from 1725 MHz to 2233 MHz. The TDP is apparently 180 W for both GPUs, although I wonder if this is correct for the PS5 GPU.

Whilst the CPU and GPU specs for the PS5 have been disclosed, they did say it is a hybrid technology mix of RDNA1 and RDNA2. Given that Sony can pull tricks to squeeze out performance, it leads me to believe that a pure RDNA2 dedicated GPU should have some room in the tank for improvements over the PS5 hardware.
 
If AMD's hybrid ray tracing approach is a flat 20% hit to fps, would they even need a DLSS solution?

Would you buy RDNA2 if the performance impact with RT on was 10-20%, even if it meant that NVIDIA could achieve higher effective fps when using DLSS?

No. As we have said on here before, driver enhancements like RIS combined with the hybrid RT approach are going to give good graphics with hardly a hit to performance.

Yes. I think the thing people keep overlooking is that DLSS is not in many games, and as in your HU video, they state that old DLSS 1.0 implementations, like in Metro, are garbage.
 
The 2080 Ti is ~45% better than the 5700 XT at 1440p, not 35%.

https://www.3dmark.com/compare/spy/13929556/spy/13515205

That gap increases to 55% at 4K.

https://www.3dmark.com/compare/spy/13923063/spy/13908297

Even TPU get the same results as me across their game suite, although their results aren't overclocked for both cards like mine are.

https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html

All this napkin maths is built on an incorrect starting point.

3DMark has always favoured Nvidia cards, we all know that. I'm not saying you're wrong, although I believe the 2080 Ti's relative performance is about 42% higher over a set of games. But yeah, I've kinda lost faith in 3DMark as it's not representative anymore...
 
The ray tracing performance of Ampere really hasn't improved all that much over Turing, as shown in Hardware Unboxed's new video; nowhere near the improvement one would expect with the extra RT and tensor cores. Most of the performance uplift over the 2080 Ti can be attributed to the raw increase in performance of the 3080.

As far as we know, AMD doesn't have the infrastructure NVIDIA has to train AI for a DLSS-type solution. But what if they don't need it?

If AMD's hybrid ray tracing approach is a flat 20% hit to fps, would they even need a DLSS solution?

Would you buy RDNA2 if the performance impact with RT on was 10-20%, even if it meant that NVIDIA could achieve higher effective fps when using DLSS?

Depends a bit. In games that heavily use traditional techniques with a bit of ray tracing for specific features, the uplift isn't huge, but in games built around ray tracing techniques the uplift is much higher, and in games/applications built around ray tracing with the latest version of DLSS the increase over the 2080S can be up to 4x.

Personally I'm not a fan of AMD's approach, as it keeps one leg too firmly in the past and slows the pace of adoption of more advanced ray tracing. Once people get used to it, they will find going back to older rendering is like stepping back to a 20-year-old game now. Too many people are writing it off without seeing it in action.
 
The 2080 Ti is ~45% better than the 5700 XT at 1440p, not 35%.

https://www.3dmark.com/compare/spy/13929556/spy/13515205

That gap increases to 55% at 4K.

https://www.3dmark.com/compare/spy/13923063/spy/13908297

Even TPU get the same results as me across their game suite, although their results aren't overclocked for both cards like mine are.

https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/28.html

All this napkin maths is built on an incorrect starting point.

You're ignoring the actual game results I posted, dismissing games and pushing synthetic benchmarks instead. You're not Ryan Shrout by day, are you?
 
The PS5 has a 350 W PSU (official specs).
The XSX has a 315 W PSU (you can see it in the Digital Foundry teardown): 12 V @ 21.25 A plus 12 V @ 5 A, i.e. 255 W for the SoC and 60 W for the rest of the system. Of course the APU will not pull the full 255 W; it has some power reserve.

The PS5 is overclocked and overvolted: the weaker system, yet consuming more power at the same time. Sony kicked power efficiency out of the window in an act of desperation to mitigate the lack of performance vs the XSX.
A Sony principal engineer confirmed that the PS5 is based mostly on RDNA 1.
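
For reference, the rail maths behind that 315 W figure (P = V × I, using the rail values quoted above):

```python
# Sanity check of the XSX PSU rail figures quoted above (P = V * I).
soc_rail_w = 12 * 21.25   # 255 W rail, said to feed the APU and memory
aux_rail_w = 12 * 5.0     # 60 W rail for the rest of the system
print(f"SoC rail: {soc_rail_w:.0f} W + aux rail: {aux_rail_w:.0f} W "
      f"= {soc_rail_w + aux_rail_w:.0f} W total")
```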
 
The PS5 has a 350 W PSU (official specs).
The XSX has a 315 W PSU (you can see it in the Digital Foundry teardown): 12 V @ 21.25 A plus 12 V @ 5 A, i.e. 255 W for the SoC and 60 W for the rest of the system. Of course the APU will not pull the full 255 W; it has some power reserve.

The PS5 is overclocked and overvolted: the weaker system, yet consuming more power at the same time. Sony kicked power efficiency out of the window in an act of desperation to mitigate the lack of performance vs the XSX.
A Sony principal engineer confirmed that the PS5 is based mostly on RDNA 1.

The Xbox Series X GPU has a 200 W TDP: https://www.techpowerup.com/gpu-specs/xbox-series-x-gpu.c3482
The PS5 GPU, 180 W: https://www.techpowerup.com/gpu-specs/playstation-5-gpu.c3480

Digital Foundry lost their credibility the moment they took Nvidia's money and published misleading Ampere benchmarks.

Sorry but in my mind they are done :)
 
I don't know why people think AMD's stock situation will be better (lol); it can really only be worse given how limited 7nm supply is... even if they deliver a good chip, which is not a given. Plus no AIB models until next year.
 
Yes. I think the thing people keep overlooking is that DLSS is not in many games, and as in your HU video, they state that old DLSS 1.0 implementations, like in Metro, are garbage.

Even with the latest incarnation of DLSS there are still way too many instances of temporal artefacts, where it doesn't handle motion well for my liking (which static screenshots don't show). The only time I'd accept that compromise is if it made advanced ray tracing feasible at higher framerates.
 
Would you buy RDNA2 if the performance impact with RT on was 10-20%, even if it meant that NVIDIA could achieve higher effective fps when using DLSS?
For only a 20% hit, yeah. It never interested me before, mostly because of how devastating it was on performance, and a little because it didn't seem to make the games look much better (BF5). I'll admit I can now see some big improvements, and the games are looking more impressive with it. If you could still get 60-100 fps at 1080p with RT, you'd have my attention for sure.
Right now I'm only playing some of the triple-A games, but I'm more into VR at the moment and there's no way those games could handle much ray tracing. Not too fussed about that; it'll happen when it does. But yeah, for a 20% knock on frames I think anyone would consider one.
 
The Xbox Series X GPU has a 200 W TDP: https://www.techpowerup.com/gpu-specs/xbox-series-x-gpu.c3482
The PS5 GPU, 180 W: https://www.techpowerup.com/gpu-specs/playstation-5-gpu.c3480

Digital Foundry lost their credibility the moment they took Nvidia's money and published misleading Ampere benchmarks.

Sorry but in my mind they are done :)

Ehhh...
I'm stating facts, not lies.

The PS5 is rated by Sony at 350 W; those are official specs from two days ago.
The XSX has a 315 W power supply; you can see it when they zoom in on it. That's 255 W for the APU + memory and 60 W for the rest of the system.

36 CUs @ 2.23 GHz, 350 W PSU (PS5)
52 CUs @ 1.825 GHz, 315 W PSU (XSX)

The Xbox looks promising if you are looking for RDNA 2 performance, but the PS5 is just an overvolted and overclocked box. They did it in an act of desperation, nothing more; the PS5 has nothing to do with RDNA 2.
The PS5 is missing lots of RDNA 2 features.
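
For what it's worth, those CU/clock figures translate into raw compute like this (a sketch using the usual RDNA arithmetic of 64 shaders per CU and 2 FLOPs per shader per clock; raw TFLOPS, not game performance):

```python
# Rough FP32 throughput from the CU/clock figures above:
# 64 shaders per CU, 2 FLOPs per shader per clock (FMA).
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"PS5: {tflops(36, 2.23):.2f} TFLOPS")    # ~10.28
print(f"XSX: {tflops(52, 1.825):.2f} TFLOPS")   # ~12.15
```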
 
You're ignoring the actual game results I posted, dismissing games and pushing synthetic benchmarks instead. You're not Ryan Shrout by day, are you?

Looks like you didn't even read the post. TPU shows larger differences than the synthetic benchmarks I posted :)

Your 35% number is BS. Apply 45% and your calculations change completely.

All your posts in this thread are hopelessly optimistic. Always taking the worst data you can find for Nvidia and applying optimistic assumptions for AMD.
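
To show how much that starting point matters, here's the sensitivity of a projection to the disputed gap (the +50% uplift over the 5700 XT is a placeholder for whatever next-gen card is being projected, not a real figure):

```python
# Sensitivity of a projection to the disputed 2080 Ti vs 5700 XT gap.
# The +50% uplift over the 5700 XT is a placeholder, not a benchmark.
projected_uplift = 1.50   # hypothetical card, relative to the 5700 XT

for gap in (0.35, 0.45):  # the two disputed starting points
    vs_2080ti = projected_uplift / (1 + gap) - 1
    print(f"with a {gap:.0%} gap the card lands {vs_2080ti:+.1%} vs the 2080 Ti")
```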
 
Well, the spec of the PS5 GPU is very similar to an RX 5700, but with more VRAM and a much higher clock rate. I think it probably has most of the features people will want, e.g. ray tracing.
 
Looks like you didn't even read the post. TPU shows larger differences than the synthetic benchmarks I posted :)

Your 35% number is BS. Apply 45% and your calculations change completely.

All your posts in this thread are hopelessly optimistic. Always taking the worst data you can find for Nvidia and applying optimistic assumptions for AMD.

TPU are recycling the release benchmarks. The reason I used TechStop is that they use the results from Hardware Unboxed, who have just spent the last two weeks retesting every GPU with the latest drivers, and since release the 5700 XT's relative performance has gone up 10%+.

Ehhh...
I'm stating facts, not lies.

The PS5 is rated by Sony at 350 W; those are official specs from two days ago.
The XSX has a 315 W power supply; you can see it when they zoom in on it. That's 255 W for the APU + memory and 60 W for the rest of the system.

36 CUs @ 2.23 GHz, 350 W PSU (PS5)
52 CUs @ 1.825 GHz, 315 W PSU (XSX)

The Xbox looks promising if you are looking for RDNA 2 performance, but the PS5 is just an overvolted and overclocked box. They did it in an act of desperation, nothing more; the PS5 has nothing to do with RDNA 2.
The PS5 is missing lots of RDNA 2 features.

I think you must be misunderstanding what Sony said, or I would really like to see where they said the PS5 is 350 watts, because if it is, there is something seriously wrong with it. It's a 36 CU RDNA2 GPU; a 40 CU RDNA1 GPU (the 5700 XT) is 225 watts. So a smaller GPU on the updated RDNA2 architecture uses nearly 100 watts more? And that's after taking off 30 or 40 watts for the Zen 2 CPU.
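
Spelling that objection out as arithmetic (the Zen 2 allowance is the rough 30-40 W guess from the post, not a measured figure):

```python
# The implausibility check from the post above, spelled out.
ps5_claimed_psu = 350   # W, the disputed "official" figure
zen2_cpu_guess = 35     # W, rough mid-point of the 30-40 W allowance
rx_5700xt_tdp = 225     # W, 40 CU RDNA1 desktop card

implied_gpu_power = ps5_claimed_psu - zen2_cpu_guess  # ~315 W
print(f"Implied PS5 GPU budget: ~{implied_gpu_power} W for 36 RDNA2 CUs, "
      f"vs {rx_5700xt_tdp} W for 40 RDNA1 CUs")
```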
 