AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

The speculated CU count of 60/64 is interesting. I wonder what made some people in the rumour mill reduce their estimate. It's not even double the CU count of the PS5 / RX 5700.

I agree with Nablaoperator's sentiment about the rumour, which seems to involve box-ticking of specs...

The CUs shown in the slide are larger than the broader consensus right now (64 ALUs per CU).
 
What broader consensus? I thought the (rather insistent) rumours were that the top product would have 80 CUs.

Just watching the MLID video; not sure why he thinks AMD would know anything about Nvidia's unreleased Quadro GPU, so this comparison to RDNA 2 seems invalid.
 
What broader consensus? I thought the (rather insistent) rumours were that the top product would have an 80 CU GPU.

Yes, so 64 ALUs per CU across 80 CUs are going to net you 5120 shaders or stream processors or whatever it is they prefer to call them... the numbers in the slide suggest a much larger CU, but that too is speculation.
 
OK, I'm with you now :D

64 ALUs x 64 CUs = 4096 Shader Units.

Maybe AMD can increase the ALUs per CU for RDNA 2?

This is kind of similar to what NV has done by effectively doubling the shaders per core.
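If you want to sanity-check those numbers, here's a quick back-of-the-envelope sketch; the CU counts and the 64-ALUs-per-CU figure are all rumour at this point, not confirmed specs:

```python
# Back-of-the-envelope stream processor counts for the rumoured configs.
# The ALUs-per-CU figure and CU counts below are thread speculation, not specs.

ALUS_PER_CU = 64  # assumed: same 64 ALUs per CU as RDNA 1

rumoured_configs = {
    "Navi 23 (60 CU rumour)": 60,
    "Navi 23 (64 CU rumour)": 64,
    "Top part (80 CU rumour)": 80,
}

for name, cus in rumoured_configs.items():
    shaders = cus * ALUS_PER_CU
    print(f"{name}: {cus} CUs x {ALUS_PER_CU} ALUs = {shaders} shaders")
# 60 -> 3840, 64 -> 4096, 80 -> 5120 (the figures quoted above)
```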
 
Congrats on buying a card that costs 110% more than a 3080 and performs only 10% better, while being significantly bigger and heavier. FOMO marketing at its finest. :p
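Putting that in numbers, just for fun; the prices below are the UK Founders Edition launch figures and the 10% is the rumoured gap, so treat this as a sketch rather than a benchmark:

```python
# Rough value-for-money comparison using rumoured perf and launch pricing.
# Prices are the assumed UK FE MSRPs; the 1.10x perf figure is the rumour.

price_3080, price_3090 = 649, 1399   # assumed GBP launch prices
perf_3080, perf_3090 = 1.00, 1.10    # 3090 rumoured ~10% faster

price_premium = price_3090 / price_3080 - 1
perf_per_pound = (perf_3090 / price_3090) / (perf_3080 / price_3080)

print(f"3090 price premium over 3080: {price_premium:.0%}")   # ~116%
print(f"3090 perf-per-pound vs 3080:  {perf_per_pound:.0%}")  # ~51%
```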

But it's the world's first 8K gaming card. Nvidia cards are cheap this generation, as we're getting a Titan-class card for only £1400.

(Let's see how much fake news you can identify in this comment! I'm awarding one silver star per point.)
 
The only fake news is the rumoured perf difference between the RTX 3080 and RTX 3090. They're built from the same die, so it's not particularly surprising.

Hmm, maybe the comment above wasn't a serious one :D
 
Based on the dual CU config of the Series X RDNA 2 GPU, it looks like it's still 64 ALUs per CU (32 x 2 ALUs for CU 0 and CU 1).
Link here:
https://www.techpowerup.com/img/m7WjdguDI7kJLI8C.jpg

What do you chaps think?

Does 'SIMD32' on the diagram refer to the ALU count or something else, e.g. something to do with the graphics pipeline?

Edit: 'SIMD32' relates to something else, it seems; guess they'd like to keep the ALU count a mystery.
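For what it's worth, if 'SIMD32' did just mean a 32-lane vector unit (as it does on RDNA 1), the arithmetic would land right back on 64 ALUs per CU. A guess about the diagram, not a confirmed reading:

```python
# How the ALU count falls out if 'SIMD32' means a 32-lane vector unit,
# as on RDNA 1. This reading of the Series X diagram is an assumption.

lanes_per_simd32 = 32  # one SIMD32 unit = 32 ALU lanes
simd32_per_cu = 2      # RDNA 1 puts two SIMD32 units in each CU
cus_per_wgp = 2        # CUs are paired into dual-CU work group processors

alus_per_cu = lanes_per_simd32 * simd32_per_cu   # 64
alus_per_wgp = alus_per_cu * cus_per_wgp         # 128

print(f"ALUs per CU: {alus_per_cu}")
print(f"ALUs per WGP (dual CU): {alus_per_wgp}")
```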
 
I should have been clearer. I was poking fun at Nvidia's fake news surrounding 8K gaming and fake MSRPs. If there was ever a lol, it would have to be the Ampere launch. It wasn't enough for them to simply market the 3090 and 3080 as the "most powerful gaming-focused GPUs in history".

It would be disappointing if AMD also end up dropping some fake news in their own launch announcements, but we'll have to wait and see.
 
Okay, well 32 GB of VRAM is pointless for gamers (guess I have to choose my wording more carefully on these forums :p). It just adds unnecessary cost. I don't deny it would be useful for some non-gaming tasks, but it looks like a straight upgrade from the other speculated GPUs in the series. AMD is supposed to be about offering decent bang for buck.

8K gaming has not 'arrived'; you'd be lucky to get 30 fps. You can't just add more VRAM to increase performance.

If sold as a professional, limited-run card, 32 GB is perfectly reasonable.

Compare the 3080 and 3090 performance at 8K: the extra VRAM takes it from 10 fps to 35 fps, and that's mostly down to not hitting the VRAM bottleneck. But yeah, 8K gaming isn't really here yet.
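Rough illustration of where the memory goes at 8K; the buffer counts and sizes below are illustrative assumptions, not measurements from any real game:

```python
# Why 8K leans so hard on VRAM: render targets alone approach a gigabyte,
# and high-res texture assets pile on top. All figures are assumptions.

width, height = 7680, 4320   # 8K resolution
bytes_per_pixel = 4          # assumed 32-bit RGBA per target
render_targets = 8           # assumed: G-buffer, depth, post-process chain

render_target_gb = width * height * bytes_per_pixel * render_targets / 1024**3
print(f"Render targets alone: ~{render_target_gb:.2f} GB")  # ~0.99 GB

# Textures and geometry dominate the rest: a 10 GB card can start paging
# over PCIe at 8K (tanking fps) while a 24 GB card keeps it all resident.
```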
 
Without any doubt, eh? It's a good job we have tech gurus like you on the forum.
You gotta be absolutely stupid at this point to make that claim. Navi will trade blows with the 3080 and be cheaper (even if it only comes close to the 3080, it wins). Like I said, pound for pound it will be the best value for money this time around, for sure. And what have Nvidia done to earn such zealot-like loyalty? Hiked prices astronomically, lied outright and botched completely.
 
For me, the deal-sealer isn't the ray tracing or DLSS promise, it's the power consumption and the VRAM. Yes, I love gaming, I think we all do, but what else do you do besides gaming? I use my rig these days for work (especially now I'm working from home, it's got way more impact), some productivity with media/encoding, watching Prime Video/Netflix, development work, and I also mine (overnight).

I don't know many guys from the mining scene who chip in on here apart from @Vince, but I bet he'll second that when purchasing a high-end GPU, the first thing he checks is its VRAM and compute potential. The next card I get is absolutely going to have this at the front of its requirements: gaming is its main job when I want to play games, but it also has to double up as a useful tool.
 
Ah, maybe you're an exception to the rule.

I pretty much just play video games and watch films on my PC via madVR.

Otherwise no time for much else.

Maybe I'll get into video editing at some point, and photo editing, but they're not that tied to VRAM even with 4K footage.
 