
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Based on this:
"With RDNA 2 we get basically a 25 percent performance uplift over GCN with no work by developers at all"

Based on (some) evidence:
I think it's safe to assume that a 60 CU RDNA 2 GPU would get a +25% performance improvement over GCN GPUs like the Radeon VII.

If clocked 23.9% higher than the Radeon VII, at 2233 MHz (the PS5 GPU frequency), we would probably get another 12% (at least) of performance. The 5700 XT's performance gains from higher frequencies are about half the overclock percentage, based on 3DMark Time Spy GPU scores.

So, a minimum of 37% faster than the (GCN-based) Radeon VII if RDNA 2 is clocked at 2233 MHz with 60 CUs.

That would put it ahead of the RTX 2080 Ti.
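For anyone who wants to check the arithmetic, here's a quick Python sketch of the same estimate. The ~1802 MHz Radeon VII clock is back-calculated from the 23.9% figure above rather than taken from a spec sheet, and the two uplifts are added rather than compounded, matching how the post reaches 37% (compounding would give roughly 40% instead).

```python
# Sketch of the estimate above. Assumptions: the Radeon VII clock is inferred
# from the quoted "23.9% higher at 2233 MHz", and uplifts are added, not
# compounded, to mirror the post's arithmetic.

RDNA2_IPC_UPLIFT = 0.25       # "25 percent performance uplift over GCN"
TARGET_CLOCK_MHZ = 2233       # PS5 GPU frequency
RADEON_VII_CLOCK_MHZ = 1802   # implied by the quoted 23.9% delta

clock_delta = TARGET_CLOCK_MHZ / RADEON_VII_CLOCK_MHZ - 1  # ~23.9%
clock_uplift = clock_delta / 2  # rule of thumb: gains are about half the OC%

total_uplift = RDNA2_IPC_UPLIFT + clock_uplift
print(f"clock uplift: {clock_uplift:.1%}, total vs Radeon VII: {total_uplift:.1%}")
# -> clock uplift: 12.0%, total vs Radeon VII: 37.0%
```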

Speculation and observations:
An 80 CU RDNA 2 GPU could be up to 33.3% faster than a 60 CU RDNA 2 GPU (similarly, a 90 CU GPU could be up to 50% faster).

So, a maximum of 70.3% higher performance for an 80 CU RDNA 2 GPU vs the Radeon VII.

Comparing the 5700 to the 5700 XT, the performance is about 12.7% higher in 3DMark (GPU score), the clocks are about 10% higher, and the CU count is 11% higher. The point is, I think AMD would need to combine the extra 20 CUs with ~33.3% higher GPU clocks (2233 MHz + 33.3%) to see a proportionate increase in performance; see the sketch below.
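As a quick sanity check on that scaling point, here's a small sketch using the 5700 vs 5700 XT numbers above; the "realised fraction" is just my shorthand for how much of the theoretical throughput gain shows up as measured performance.

```python
# How much of the 5700 XT's theoretical gain over the 5700 shows up in
# practice, using the figures quoted above.
cu_delta = 0.11      # 40 CUs vs 36 CUs, ~11% more
clock_delta = 0.10   # ~10% higher clocks
perf_delta = 0.127   # ~12.7% higher 3DMark GPU score

theoretical_delta = (1 + cu_delta) * (1 + clock_delta) - 1  # ~22.1% more raw throughput
realised_fraction = perf_delta / theoretical_delta          # ~0.57

print(f"theoretical: {theoretical_delta:.1%}, realised: {realised_fraction:.0%} of that")
# -> theoretical: 22.1%, realised: 57% of that
```

In other words, only a bit over half of the raw throughput increase turned into actual performance, which is why extra CUs alone probably won't scale an 80 CU part proportionately.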

One of the main benefits of RDNA 2 seems to be lower energy consumption. The RX 5700 XT used 45W more power just for higher clocks and 4 extra CUs vs the RX 5700. The new consoles have significantly lower TDP per CU.
 
I hope AMD beat Nvidia on rasterisation, but I can't see how they can get close with raytracing.

I've got a 3080 on order just for RT. How important is RT to those who are waiting on AMD?
 
That leak looks promising ... while I don't think I'll be drawn to AMD's stuff unless they're crazy cheap+powerful or also have the other bits like DLSS (and maybe something that'll use DirectStorage)... hoooopefully the larger VRAM numbers might make Nvidia release stuff like the 16GB 3070 (Super?) earlier on...

No doubt AMD are working flat out to get a product out there that enables you to buy more Nvidia cards.
 
Not power efficient? Mmmm, mine pulls 192W on maximum overclocks... the 3000 series, granted, is twice as quick, but it's pulling 380W when overclocked... so, no, I believe the 5700 XT IS actually pretty darn efficient considering... at stock mine never goes above 170W!
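For what it's worth, taking those overclocked figures at face value (the reply below disputes the AMD reading), performance per watt on those numbers comes out almost identical:

```python
# Perf-per-watt using only the overclocked figures claimed in the post above.
cards = {
    "5700 XT (OC)": (1.0, 192),   # baseline performance at a claimed 192 W
    "RTX 3000 (OC)": (2.0, 380),  # "twice as quick" at a claimed 380 W
}
for name, (perf, watts) in cards.items():
    print(f"{name}: {perf / watts * 100:.3f} perf units per 100 W")
# -> 5700 XT (OC): 0.521 / RTX 3000 (OC): 0.526 -- essentially a wash
```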

It pulls 225W at stock. If you are using software to monitor power consumption, then that reading is wrong. For AMD cards, the reported power consumption is the main chip only. That is why at stock it will appear to use only 180W according to GPU-Z; the other 45W is from VRAM and other components.

Nvidia cards, on the other hand, report the total board power via software.

So before extolling the virtues of the power efficiency of the 5700 XT, get your measurements right. For AMD cards, that means monitoring the 12V cables themselves.
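As a rough illustration of the reporting difference (the ~45W board overhead is this post's estimate for the 5700 XT, not an official figure):

```python
# AMD software sensors (e.g. GPU-Z) report chip power only; total board power
# adds VRAM, VRM losses, fans, etc. The 45 W overhead is the post's estimate.
def estimated_board_power(chip_watts: float, overhead_watts: float = 45.0) -> float:
    """Approximate total board power from an AMD chip-only software reading."""
    return chip_watts + overhead_watts

print(estimated_board_power(180))  # -> 225.0, matching the stock figure above
```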
 

It's going to be good whichever card you buy, tbh. AMD have been working with Sony and Microsoft:

The whole quote from David Wang of AMD is as follows "We have developed an all-new hardware-accelerated ray tracing architecture as part of RDNA 2. It is a common architecture used in the next-generation game consoles. With that, you will greatly simplify the content development -- developers can develop on one platform and easily port it to the other platform. This will definitely help speed up the adoption [of ray tracing]."

As well as the plural 'consoles' giving a nod to the main next-gen players, the quote also includes a nod to developers being able to port between platforms. The latter is rather handy if you're creating content for multiple game consoles and systems.
 
Anyone who believes AMD is holding off on a release because the architecture looks weak is at best very misinformed and at worst a bad-faith actor.

Original launch window: Q4 2020

Console launch: Q4 2020

Every 2020 launch that isn't salvaged dies (Ryzen 3 3100), unavailable (Ryzen 3 3300X), or OEM-only (Ryzen 4000 APUs): Q4 2020

The time window in which Apple moves to 5nm: Q4 2020

I ordered a LAPTOP and it got cancelled because they couldn't get an R5 (150mm) CPU for it. I'm pretty sure the supply constraints are very, very real leading up to the end of the month, especially since TSMC had to delay Apple's 5nm products a few weeks to finish Huawei's orders around now. That same laptop is relisted with a mid-October ship date, telling me there will be an influx of AMD products coming out of TSMC in early October.
 
That's a lovely internet picture, but I can tell you now, right at this moment in time and over 4 hours of gaming in Warzone, MY 5700 XT maxed at 192W running a constant overclock at 2164 MHz core (2200 MHz set) and 1900 MHz memory... so, that bar chart is not what I'm seeing, either through GPU-Z OR, more importantly, my actual power supply monitor at the wall. Hey ho, we shall differ here.

At stock it's WAYYY lower... and it's around 42% slower than a 2080 Ti in performance terms, based on the same site... so, no, I don't believe it's inefficient in the slightest for the juice it sucks.

[Attached chart: power-gaming-average.png — gaming average power consumption]

I can tell you the people working at TPU know what they are talking about and you don't. The bar chart isn't what you are seeing because you don't have a clue. See my previous post.
 
Loving the new rumours. So now either Navi 21 is better than a 3090 or worse than a 3070 :p

In other news, the ground is either under your feet or floating somewhere above the clouds.

Or anywhere in between!
 
You mean the same guy who said the 3080 would have a co-processor on the board?
:D

Now I would have thought that when he caught himself being wrong the first time, he would re-evaluate his "sources", take a seat, and think twice before posting more "rumors". If, and I mean a BIG IF, he was told that a top-end RDNA 2, Big Navi, was just as fast as a 3070, something should have clicked in his thought processes, and he should have filed it away until he could actually verify that the information was valid. <--- Most important part here, as he can't validate that the information is true.

However, let's look at what he does do. He doubles down on those same "sources" and now claims that a 6900 XT is no faster than a 3070 or thereabouts, claiming that he's getting a "card" when it was rumored that AMD hasn't released those SKUs to AIBs. All within 'hours' of being wrong about the co-processor. Now, I don't know about you, but I find his "sources" highly suspect, as it's starting to look like he's just making it up.

But with any of this, we will see once AMD release their SKUs.

To be fair, this is going to happen at some point. Lots of ray-tracing work can be separated from other graphics work. If what it does is some really clever compression, same thing. If it's just a general chiplet, then heat concerns aside, there's no reason they MUST be on the same side of the board, especially with a shroud like Nvidia chose. Chances are, since development takes 3 or 4 years, some people could have seen at least conceptual information on the next GPU series 16-24 months out.
 
Nothing he predicted was accurate about Ampere. End of discussion.

He was the only person claiming the co-processor. No other leak I've come across ever mentioned a separate co-processor on the card. He even had diagrams of it!! LOL.
He claimed far better RT performance than what's shown in reviews, for example.

Now he's trying to use his platform to downplay RDNA 2? Am I supposed to ignore what he said about Ampere, which amounted to nothing more than cheerleading, and take him downplaying RDNA 2 as valid? When all I have to do is look at his prior 4 or so videos to see he was wrong about Ampere? Nah :p He's off the list...

Edit:
Wait, wait, wait... let's see about this magical card he says he's getting before everyone else.
:D
 

Not that I believe him to be particularly accurate, but in all fairness I’m fairly sure he qualified the coprocessor as his own speculation, not sourced info.

Still wrong though!
 

You're going way overboard. A second chip like that is possible. A two-sided shroud and fan, on the 7nm TSMC process (as was expected then), would not be necessary for cooling and would absolutely be there for another purpose.

It's also true that AIBs only have an AMD GPU that performs 15% faster than the 2080 Ti in their hands.


If you think Coreteks assuming that's the flagship is malicious, then you must have watched a maximum of 2 other videos he's ever released.
 

Haha, this guy is something else. In the first vid after Ampere released, he starts off by basically saying 'yeah, I got the coprocessor wrong, but I got lots right too! I don't really care whether I get things right, but I got stuff right!'

Anyway, word on the street is AMD are seeding this 2080 Ti +15% card to AIBs, and that it's the two-fan design that's floating around. AIBs only know about that. The three-fan design, Big Navi, is made in-house only by AMD, so no one knows anything about it until it releases. So, it might not be entirely his fault he's bad. Still not worth anyone's time though.
 