AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Just want to say that AMD have been working on this for at least 3 years - their first patent was filed at the end of 2017.

But I take it from your post that you think AMD's RT solution and RDNA 2 will be rubbish?

AMD Radeon Technologies Group do exceptionally well for their size. In fact it confuses me that Nvidia, with their huge market share, can only beat them by about 25%. That doesn't add up in my opinion.

The situation won't improve until the majority vote with their wallets and buy Radeon cards. Then maybe Nvidia will be less complacent and try harder, leaving AMD with more cash to compete. I stopped adding to Nvidia's market share in the Radeon 9500 days, so 2002, I think.

Instead it will be a repeat of last gen - people will pre-order a 3080Ti and then ask for a refund when reviews are released.

So, if that's your attitude, how did they come up with Ryzen?

Intel must have been capable of today's performance 5 years ago if they had tried. Intel stopped caring a decade ago, got lazy and milked the industry.

Ryzen didn't impressively overtake some kind of space-age technology. Ryzen overtook obsolete tech, and AMD could have done it much sooner if people hadn't let Intel's market share get out of hand. I stopped buying Intel in 1999.
 
I bet the engineers building the cards are reading this thread and loving life right now. So much drama. It'll be worth all the broken keyboards though.

I'm selling Cornish grass-fed popcorn for £10 per 100 grams.
 
They're probably wondering what the tech "rumour" sites have been smoking with their wild OTT "insider info". Everything from "Nvidia killer" to the alleged performance reeks of a fanboy's ultimate wet dream. We'll find out one way or the other, but it'll be funny to see the usual people slating AMD, because the BS posted on tech sites was taken as an official statement, if "Big Navi" doesn't stomp on the next NV high-end offering.
 
Ray tracing is so nonexistent in the games I like to play that I won't be hard pressed for it any time soon. I don't believe I'm missing immersion in a game without ray tracing, to be honest.
Personally, 1440p/4K HDR is something I've been keeping my eyes on as of late.
Refreshing, thanks! :D
 
Ray tracing is so nonexistent in the games I like to play that I won't be hard pressed for it any time soon. I don't believe I'm missing immersion in a game without ray tracing, to be honest.
Personally, 1440p/4K HDR is something I've been keeping my eyes on as of late.

Refreshing, thanks! :D

what games do you play?
 
what games do you play?
Excellent question. I'm glad you asked. I've been playing Assetto Corsa Competizione. The funny thing about that game is that it was slated to use RTX ray tracing. However, the developer decided to work on the game itself:
Our priority is to improve, optimize, and evolve all aspects of ACC. If after our long list of priorities the level of optimization of the title, and the maturity of the technology, permits a full blown implementation of RTX, we will gladly explore the possibility, but as of now there is no reason to steal development resources and time for a very low frame rate implementation.
It warms your heart, doesn't it? :D
 
In fact it confuses me that Nvidia, with their huge market share, can only beat them by about 25%. That doesn't add up in my opinion.

They are all working with the same physical material (depending on generation) - nVidia does not manufacture its own semiconductors, and AMD has divested itself of its manufacturing arm - which largely licensed processes from Samsung, IBM, etc. anyway, rather than developing its own from the ground up. AMD's latest hardware is on 7nm while nVidia is still using an ultra-optimised "12nm" process (and some incidental 14nm).

A lot of the gains that nVidia has over AMD come from architecture development and there is only so much you can do there when you still have the same transistor and power/thermal budget to work with.
 
Excellent question. I'm glad you asked. I've been playing Assetto Corsa Competizione. The funny thing about that game is that it was slated to use RTX ray tracing. However, the developer decided to work on the game itself.

It warms your heart, doesn't it? :D

So only car racing games?

I can't think of any right now, but I know the upcoming Gran Turismo has RT and it's highly likely the next Forza will too - Project Cars 3 is coming and will probably also use it
 
AMD have nobody to blame but themselves. They made some terrible choices that nearly ruined them. Not helped by several very lacklustre GPU releases.

If you make a good product for a few generations, like AMD have done with Ryzen, people will buy them and people will promote their product. People didn't wake up one morning and decide to buy AMD, they made the choice to buy Ryzen CPUs because they were good CPUs.

People will start buying AMD's GPUs when they start making good GPUs consistently. It really is that simple, Ryzen has proved that.

But all this is just deflection on your part. You didn't answer the question I asked. You mentioned AMD's R&D budget - is that you saying that AMD's ray-tracing solution will be bad? Or is that you making excuses for them already in case it isn't great?
 
Sure, it's a lot better than having no hardware acceleration at all.
But if possibly up to half of the rasterization hardware's resources aren't doing anything useful when ray tracing is enabled, maybe the design is far from optimal?
We don't have infinite transistor budgets, and transistors also have a power cost.
So the transistors put into a chip should be utilized as much and as efficiently as possible.
IIRC that's what's been behind AMD's gaming performance (and power efficiency) lagging behind Nvidia's, like Vega having lots of teraflops but no corresponding fps.


Out of Navi 10's 250 mm², not everything is processing units.
There's some amount of fixed-function hardware, which doesn't need to be doubled when doubling the number of processing units.
Also, with Navi 10's relatively small size, clocks were probably pushed up to compensate, which easily hurts power efficiency.
Fewer transistors give better power efficiency only when run at the optimal clocks for a particular manufacturing tech.

So if Nvidia can supposedly bring a major amount of performance out of current Turing by improving the architecture, why couldn't AMD double the high end's performance over the 5700 XT?

Where do you get the idea that the rasterization HW is doing nothing when ray tracing is enabled? In current games the bottleneck is actually the other way around: the RT cores are waiting on the rasterization HW.

You mention power cost. Nvidia's 12nm node, with dedicated RT hardware and Tensor cores, is more power efficient than AMD's 7nm process. If you compare the 5700 (not the XT version) to the 2060 Super, both have the same power consumption despite the 2060 Super having a larger die, more transistors and a less efficient node.

As I said in one of my previous posts, for AMD it all comes down to their performance-per-watt claim. If it's accurate, then they might be able to hit that double-the-5700-XT performance figure.

But this is an aside from the original discussion - I still can't see the next-gen consoles beating the 2080 Ti in ray-tracing performance.
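The clocks-versus-efficiency argument a few posts up can be sketched with the usual first-order dynamic-power relation, P ∝ C·V²·f: raising frequency on a given process usually also requires raising voltage, so power climbs much faster than performance. All the numbers below are illustrative assumptions, not measured GPU figures:

```python
# Illustrative sketch (made-up numbers, not real GPU data) of why
# pushing clocks past the sweet spot hurts perf/watt.
def dynamic_power(cap: float, volts: float, freq_ghz: float) -> float:
    """Relative dynamic power, P proportional to C * V^2 * f."""
    return cap * volts ** 2 * freq_ghz

base = dynamic_power(cap=1.0, volts=1.00, freq_ghz=1.80)
# Assume ~20% more clock needs ~10% more voltage to stay stable:
pushed = dynamic_power(cap=1.0, volts=1.10, freq_ghz=2.16)

print(f"performance gain: {2.16 / 1.80 - 1:.0%}")   # 20%
print(f"power increase:   {pushed / base - 1:.0%}")  # ~45%
```

A 20% clock bump costing ~45% more power is exactly the kind of trade-off a small die pushed hard (or a 400W overclock) runs into.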
 
Some very interesting stuff in the latest MLID video. He's certainly sticking his neck out.

https://youtu.be/qMMm9nHFe0Y

Ampere potentially 400W overclocked. 12GB VRAM (18 Gbps), 2.0 GHz boost, OC to 2.2 GHz but then hitting that 400W. 40% increase in rasterization and a 3-4x increase in RT performance. Increased IPC. 8nm Samsung.

The interesting bit is that RDNA 2 could be 50-60% over a 2080 Ti. Sounding like a close fight to me, with software and price deciding it.
From the Ampere thread.
60%, any advance! Choo choo :p
 
From the Ampere thread.
60%, any advance! Choo choo :p
So you read Ampere is 40% rasterisation and 4x RT over Turing and you don't bat an eyelid, but you read Big Navi is 60% over the 2080 Ti and it's "choo choo" hype train?

I fail to see why you think the latter is a "choo choo hype train". The 2080 Ti was only 35-40% or so faster than the 5700 XT, which has only 40 RDNA 1 CUs.
 
So you read Ampere is 40% rasterisation and 4x RT over Turing and you don't bat an eyelid, but you read Big Navi is 60% over the 2080 Ti and it's "choo choo" hype train?

I fail to see why you think the latter is a "choo choo hype train". The 2080 Ti was only 35-40% or so faster than the 5700 XT, which has only 40 RDNA 1 CUs.
The latter means competition.

Edit: Also the other stuff has already been stated and discussed.
 

That would make Navi X2 48% faster than a 2080 Ti.

Normalized:

5700 XT: 100%
2080 Ti: 152%

Navi X2: 225%
225 / 152 = 1.48, i.e. 48% faster.

To check: 152 × 1.48 = 224.96.
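The normalisation above is easy to sanity-check in a couple of lines. All the figures are the thread's rumoured/assumed numbers (2080 Ti ~52% faster than a 5700 XT, "Navi X2" at 2.25x a 5700 XT), not benchmarks:

```python
# Relative performance, normalised to the 5700 XT = 100.
xt_5700 = 100.0   # baseline
ti_2080 = 152.0   # assumed ~52% faster than the 5700 XT
navi_x2 = 225.0   # rumoured 2.25x the 5700 XT

factor = navi_x2 / ti_2080   # ~1.48
print(f"Navi X2 vs 2080 Ti: {(factor - 1) * 100:.0f}% faster")

# Cross-check: growing 152 by 48% should land back near 225.
print(f"check: {ti_2080 * 1.48:.2f}")   # 224.96
```

Whether the 2.25x input is realistic is of course the whole argument; the arithmetic itself holds.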

 
The latter means competition.

Edit: Also the other stuff has already been stated and discussed.
Oh, I know it means competition. But the "choo choo" hype-train insinuations just seem childish and unnecessary, especially juxtaposed with equally big numbers from the Nvidia camp that are taken as read, it seems.
 