
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
People are reading a lot into this next-gen 4K 60fps talk. At the end of the day these are little mini PCs running APUs, so they will have to trade quality against frame rate!

Would people be happy if next gen looked the same as what we have now, just at 4K and 60fps, with no advancement in graphical quality? No, people would still complain it doesn't look good.

We in the PC player base expect nothing less than 60fps, but we have the desktop power to demand it. Consoles, for as long as they remain in a small form factor, will always need to balance quality against frame rate.
 
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
Cheap :confused: So you know the price, please do spill.

I can get 4K@30 from my XBX now, so much for this next-gen :D

Only in a very small number of games, and with aggressive dynamic resolution scaling based on system load. Red Dead 2 ran at 4K below 30fps at times, and at quality settings equivalent to very low/low/medium on PC. It's a perfect example here: what would Red Dead 2 be on the new Xbox? 4K 30fps? 4K 60fps at the same quality? Or something in between, like 4K 30fps with maxed-out PC settings?

Something must give way.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Their AnvilNext 2.0 engine has drawn a lot of complaints about performance issues in open-world sandbox games for years now - Wildlands, Breakpoint, Origins, etc., to name a few. IMO, they are shoehorning an unproven engine onto a next-gen console. If the engine were taking advantage of the bandwidth made available, as well as the other key features, the game shouldn't be having issues on the XBSX.

I do find it odd that I've not found any solid details on what game engine AC: Valhalla is using. Any links on that yet? Since I've not found any press coverage on the game engine, I wouldn't hold this game up as a poster child of next-gen gaming. ;)

A bit out there, but could Navi 23 be dual-chipped on a single card? (AMD have form for dual-GPU cards - the Pro Vega II for the Mac Pro, for example.) AMD are moving towards a chiplet-type setup in the same way their CPUs have been going. It may explain the smaller size listed above, but also why it's the 'Nvidia killer'. AMD do seem to be heading towards an overall Infinity Architecture of linking components in a scalable manner - why not apply that to the GPU?
I can only see that as a possibility if Navi 23 is on 5nm. Then the number of CUs per chip could be higher than 40. As it stands now, I wouldn't expect more than 40 at that size on N7+.

If that's even close to being true, I wonder if AMD will segment Navi 23 against the 5700 range? The 5700 will surely be around for a while, as it's only been out for a year and it still needs to pay back the investment in producing the masks for those chips.
It might be the Navi 14 replacement. But why? They would do better harvesting dies from Navi 21 - whatever doesn't make 80 CUs would trickle down to a 40 CU part, for example. So to me, Navi 23 is a bit of a mystery right now.
 
Soldato
Joined
6 Aug 2009
Posts
7,070
I'm getting more optimistic that AMD will have a decent upgrade available for my 5700 XT, and that it will be competitive enough to keep me from buying an Nvidia card. The pricing is the deciding factor...
 
Soldato
Joined
6 Feb 2019
Posts
17,466
Nvidia has achieved massively higher transistor density at 7nm than AMD, and they aren't finished yet - they've said the 7nm yields still aren't great and it can get better.

Hopefully AMD has made its own gains with RDNA2 on that front.

For reference, the densest 7nm part AMD has released is the 5700 XT, which has 10.3 billion transistors over 251mm², while Nvidia has so far achieved 54 billion over 820mm².

At AMD's 7nm density, they would need a roughly 1,300mm² die to match the transistor count Nvidia has achieved.
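
If anyone wants to sanity-check that figure, here's a rough back-of-the-envelope sketch in Python, using only the transistor counts and die sizes quoted above (purely illustrative):

# Rough density comparison using the figures quoted above (illustrative only).
amd_transistors = 10.3e9      # 5700 XT
amd_die_mm2 = 251
nvidia_transistors = 54e9     # Nvidia's biggest 7nm part so far
nvidia_die_mm2 = 820

amd_density = amd_transistors / amd_die_mm2            # transistors per mm²
nvidia_density = nvidia_transistors / nvidia_die_mm2

# Die area AMD would need at its current density to hit Nvidia's transistor count.
required_area = nvidia_transistors / amd_density

print(f"AMD density:    {amd_density / 1e6:.1f} M transistors/mm²")     # ~41.0
print(f"Nvidia density: {nvidia_density / 1e6:.1f} M transistors/mm²")  # ~65.9
print(f"Die needed at AMD's density: {required_area:.0f} mm²")          # ~1316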
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Nvidia has achieved massively higher transistor density at 7nm than AMD, and they aren't finished yet - they've said the 7nm yields still aren't great and it can get better.

Hopefully AMD has made its own gains with RDNA2 on that front.

For reference, the densest 7nm part AMD has released is the 5700 XT, which has 10.3 billion transistors over 251mm², while Nvidia has so far achieved 54 billion over 820mm².

At AMD's 7nm density, they would need a roughly 1,300mm² die to match the transistor count Nvidia has achieved.
As a consumer I'm more concerned with price to performance, so I'm fairly apathetic about transistor density, die size, etc.
 
Soldato
Joined
6 Feb 2019
Posts
17,466
As a consumer I'm more concerned with price to performance, so I'm fairly apathetic about transistor density, die size, etc.

Fair enough, though it's a tech forum :p

But if one product has significantly more transistors than another, it's usually faster, so it does have some relevance for the end consumer products.

I mean, if I take the most recent leaks of the Ampere gaming stuff and apply them to today's announced products, I can use that density extrapolation to roughly say a 3080 Ti would be between 32b and 36b transistors, whereas the 2080 Ti is 18b. So if Navi 2 (RDNA2) is to compete with it, it too needs to double its transistor count.

So hopefully RDNA2 has some 500 to 600mm² chips up its sleeve - come on, announce them soon, we want it!
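
Just to show how that rough extrapolation works - the density comes from the 54 billion / 820mm² part above, and the 500-550mm² gaming die sizes are only assumptions based on the leaks, not confirmed specs:

# Illustrative only: extrapolating a hypothetical gaming Ampere die from the
# 54b-transistor / 820mm² figure above. The die sizes are assumptions.
ampere_density = 54e9 / 820          # ~65.9M transistors per mm²
for die_mm2 in (500, 525, 550):
    transistors = ampere_density * die_mm2
    print(f"{die_mm2} mm² at that density -> {transistors / 1e9:.0f}b transistors")
# Comes out at roughly 33-36b, versus ~18b for the 2080 Ti.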
 
Associate
Joined
21 Apr 2007
Posts
2,483
Fair enough, though it's a tech forum :p

…where PC gamers are far and away the principal consumers of GPUs, and are primarily interested in their hobby/entertainment, with the GPU simply being a means to that end. This is why I'm adamantly against the whole luxury-goods argument, and why I'm not opposed to DLSS when/if it applies to lots of titles - i.e. if the results look good and perform well, I don't really care how they got there, though the tech is mildly interesting.
 
Soldato
Joined
20 Apr 2004
Posts
4,365
Location
Oxford
As a consumer I'm more concerned with price to performance, so I'm fairly apathetic about transistor density, die size, etc.

Transistor density, die size, etc. do affect price for sure (the lower the density, the bigger the die has to be for the same number of transistors, and a bigger die means a higher cost per die). They also have a strong influence on design, and therefore on performance.
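
A very rough sketch of why a bigger die costs more per die - the 300mm wafer, the notional wafer price, and the area-only dies-per-wafer estimate are all illustrative assumptions (real calculations account for edge loss and yield):

import math

WAFER_DIAMETER_MM = 300
WAFER_COST = 10_000  # assumed wafer price, purely for illustration

def cost_per_die(die_area_mm2):
    # Crude area-only estimate: ignores edge loss, scribe lines and defects.
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    dies_per_wafer = wafer_area / die_area_mm2
    return WAFER_COST / dies_per_wafer

for area in (251, 500, 820):
    print(f"{area} mm² die -> roughly ${cost_per_die(area):.0f} per die")
# The bigger the die, the fewer per wafer, and the higher the cost per die.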
 
Soldato
Joined
8 Jun 2018
Posts
2,827
Fair enough, though it's a tech forum :p

But if one product has significantly more transistors than another, it's usually faster, so it does have some relevance for the end consumer products.

I mean, if I take the most recent leaks of the Ampere gaming stuff and apply them to today's announced products, I can use that density extrapolation to roughly say a 3080 Ti would be between 32b and 36b transistors, whereas the 2080 Ti is 18b. So if Navi 2 (RDNA2) is to compete with it, it too needs to double its transistor count.

So hopefully RDNA2 has some 500 to 600mm² chips up its sleeve - come on, announce them soon, we want it!

But are we seeing the forest for the trees?

For the first time in decades, the console is now a viable upgrade path. And one of the technical elephants in the room that isn't being discussed is one of the consoles' greatest strengths: bandwidth.

I personally don't subscribe to 'business as usual' between consoles and GPUs this time around, and to me the writing is on the wall that bandwidth-sensitive next-gen games will be a considerable factor this time, i.e. more open-world than in prior iterations.

Prior rumours suggest next-gen game engines like SlipStream etc. will capitalise on the consoles' strengths to further enhance gaming, which leaves PC gaming a bit hamstrung. That's something I, on the technical side, cannot ignore until it "happens" (i.e. being reactive). I do believe it will happen, as it's been rumoured for a long time now, which would leave gaming on the PC in a bit of a quandary. DDR5, which is hardly the answer to these bandwidth concerns, won't even reach infancy until late 2021/2022, giving consoles roughly a 2+ year head start. And who is going to pay the RAM "tax" for 32GB of DDR5-8000/9000? I'm sure that will cost about as much as a mid-range GPU, LOL.

Am I supposed to wait that long? :p

There are other factors too, like the rumour that Nvidia is crippling Cyberpunk 2077 like they did with The Witcher 3, and other noise that I personally don't want to be bothered with. If I cannot play the games I want on the hardware I have, I can now just go console and call it a day!
Which is a delight. :D

Edit:
Oh, I almost forgot to mention that the era of paying a "tax" to have a GPU is over for me. If the price/performance isn't aligned to my satisfaction... console it is. And I'm not the only one who thinks that. ;)
 
Associate
Joined
30 Jan 2016
Posts
75
But are we seeing the forest for the trees?

For the first time in decades, the console is now a viable upgrade path. And one of the technical elephants in the room that isn't being discussed is one of the consoles' greatest strengths: bandwidth.

I personally don't subscribe to 'business as usual' between consoles and GPUs this time around, and to me the writing is on the wall that bandwidth-sensitive next-gen games will be a considerable factor this time, i.e. more open-world than in prior iterations.

Prior rumours suggest next-gen game engines like SlipStream etc. will capitalise on the consoles' strengths to further enhance gaming, which leaves PC gaming a bit hamstrung. That's something I, on the technical side, cannot ignore until it "happens" (i.e. being reactive). I do believe it will happen, as it's been rumoured for a long time now, which would leave gaming on the PC in a bit of a quandary. DDR5, which is hardly the answer to these bandwidth concerns, won't even reach infancy until late 2021/2022, giving consoles roughly a 2+ year head start. And who is going to pay the RAM "tax" for 32GB of DDR5-8000/9000? I'm sure that will cost about as much as a mid-range GPU, LOL.

Am I supposed to wait that long? :p

There are other factors too, like the rumour that Nvidia is crippling Cyberpunk 2077 like they did with The Witcher 3, and other noise that I personally don't want to be bothered with. If I cannot play the games I want on the hardware I have, I can now just go console and call it a day!
Which is a delight. :D

Edit:
Oh, I almost forgot to mention that the era of paying a "tax" to have a GPU is over for me. If the price/performance isn't aligned to my satisfaction... console it is. And I'm not the only one who thinks that. ;)

I'd maybe wait until they release before gobbling up all the pre-release marketing hype we get every generation - it's already gone from 4K/120Hz to 4K/30Hz. No doubt the UE5 tech demo was really pretty to look at, but let's not forget it's not a game, and it was running on an RDNA2 APU.

I'm excited for what's to come though - it's going to be a great year for tech. We have new consoles and also Zen 3, RDNA2 and Ampere GPUs. If consoles are pulling that off on an APU slower than a 2080 Ti, these new GPUs should be monsters.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
I'd maybe wait until they release before gobbling up all the pre-release marketing hype we get every generation - it's already gone from 4K/120Hz to 4K/30Hz. No doubt the UE5 tech demo was really pretty to look at, but let's not forget it's not a game, and it was running on an RDNA2 APU.

I'm excited for what's to come though - it's going to be a great year for tech. We have new consoles and also Zen 3, RDNA2 and Ampere GPUs. If consoles are pulling that off on an APU slower than a 2080 Ti, these new GPUs should be monsters.
Oh, Epic has already started. UE5 will be in Fortnite by next year - I posted that in the UE5 thread. Yet that's only the tip of the iceberg; there is more to come as we draw closer to the next-gen console release.

Out of all the games Epic could have used (some of which would have been easily budgeted), why would they choose an open-world game like Fortnite? Something to ponder...
 
Last edited:
Caporegime
Joined
8 Jul 2003
Posts
30,062
Location
In a house
Is everyone still on the 'Nvidia Killer!' train, now that we know games on the new consoles are also going to be 4K30, and we've seen a demo running at only 1440p30 that didn't have RT because of the impact it would have had on that already pathetic performance?
 