
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Percentage-wise relative to revenue, AMD may have spent more than NVIDIA; dollar-wise, though, NVIDIA spent considerably more than AMD.

AMD's R&D has been steadily rising since 2016, though, which is great to see. They have great engineers and a seemingly great management team. I have high hopes for what they're about to deliver.
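To make the percentage-versus-dollars distinction concrete, here's a minimal sketch with entirely hypothetical revenue and R&D figures (not AMD's or NVIDIA's actual financials): a smaller company can have the higher R&D intensity while still spending fewer absolute dollars.

```python
# Illustrative only: these revenue/R&D figures are hypothetical,
# not actual AMD or NVIDIA financials.
companies = {
    "Smaller company": {"revenue_musd": 6_500, "rnd_musd": 1_500},
    "Larger company":  {"revenue_musd": 11_000, "rnd_musd": 2_400},
}

for name, fin in companies.items():
    intensity = fin["rnd_musd"] / fin["revenue_musd"] * 100  # R&D as % of revenue
    print(f"{name}: ${fin['rnd_musd']}M R&D, {intensity:.1f}% of revenue")

# The smaller company shows ~23.1% R&D intensity vs ~21.8%, even though
# the larger company spends $900M more in absolute terms.
```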
 
Jay has been ******* me off recently with his constant dumping on AMD CPUs.

It's consistent, though; I felt the same a long while back. Dipped in to check recent stuff (his capture and bouncing off the other guy are actually much better than the old stuff) and it's not changed. NVIDIA/Intel shill.
 
Percentage-wise relative to revenue, AMD may have spent more than NVIDIA; dollar-wise, though, NVIDIA spent considerably more than AMD.

AMD's R&D has been steadily rising since 2016, though, which is great to see. They have great engineers and a seemingly great management team. I have high hopes for what they're about to deliver.

Surprisingly no, AMD has spent more $

[attached chart: nNfn63S.png]
 
Unless their card beats out the 3080 by a decent amount and is cheaper, I see no reason to go AMD unless power consumption actually matters to you. DLSS is absolutely amazing in the games I've tried, and it's the main thing putting me off AMD since they have no answer to it.
 
Really, Steve? Those 4 games, hahaha, some guys I tell ya.

I mean, look at how you start the sentence:

"Unless their card beats out the 3080 by a decent amount and is cheaper". Yeah, OK then, not a tall order at all. Just buy the 3080.
 
Who'd get one if it was equal to a 3080 in rasterisation but only had Turing-level RT performance? If so, how much less would you want to pay for it?
 
Really, Steve? Those 4 games, hahaha, some guys I tell ya.

I mean, look at how you start the sentence:

"Unless their card beats out the 3080 by a decent amount and is cheaper". Yeah, OK then, not a tall order at all. Just buy the 3080.

It's the normal go-to for an NVIDIA fan who tries to act neutral. I will buy it if it wins on power, performance, and tech, and is cheaper.
 
Right, like the weird and wonderful PS3 architecture that game devs had so much trouble wrestling with...
I've read a little bit about Cell/CBE, and it certainly has its admirers.

It was probably a bit ahead of its time.

Funnily enough, that's something AMD is also accused of.
 
Why are people so obsessed with RDNA 2 being 80 CUs?

That probably wouldn't double the performance of, say, a 40 CU GPU like the 5700 XT. We know that because doubling the shader count alone did not double the performance of the RTX 3080 vs the RTX 2080 Ti.

Instead, there is around a 1/3 increase in performance vs the RTX 2080 Ti. That's because increasing the core count only boosts the TFLOP count, not other areas like texture rate and pixel rate.
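As a rough sketch of that sub-linear scaling (the ~1.33x gain is the 3080 vs 2080 Ti figure above; carrying the same efficiency over to a hypothetical 80 CU RDNA 2 card is purely an assumption):

```python
# Rough sketch of sub-linear scaling from doubling shader/CU count.
# The ~1.33x observed gain is the 3080 vs 2080 Ti figure from above;
# applying that efficiency to a hypothetical 80 CU RDNA 2 card is an
# assumption, not a measurement.

def scaled_performance(unit_ratio: float, scaling_efficiency: float) -> float:
    """Relative performance when shader count grows by unit_ratio.

    scaling_efficiency = 1.0 would be perfect linear scaling; real GPUs
    land well below that once other units stop keeping pace.
    """
    return 1 + (unit_ratio - 1) * scaling_efficiency

# 2080 Ti -> 3080: ~2x FP32 units for roughly a 1.33x performance gain
efficiency = (1.33 - 1) / (2 - 1)  # ~0.33

# Hypothetical 40 CU -> 80 CU jump at the same efficiency
print(f"Estimated gain: {scaled_performance(2.0, efficiency):.2f}x")  # ~1.33x, not 2x
```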

Assuming the Xbox Series X GPU die size is 170-200mm² (not the whole APU), we could see an RDNA 2 GPU with double the die size, so around 400mm².

The question is, would increasing the die size of a Series X-like GPU by 100% give a similar boost in performance? We know the Series X GPU is much more powerful than the Series S, and the whole APU is about 89.4% larger (both have practically the same CPU).
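And the back-of-the-envelope area arithmetic, for what it's worth (the 170-200mm² GPU slice and the 89.4% APU figure are the rough numbers above, not confirmed measurements):

```python
# Back-of-the-envelope die-area arithmetic using the rough figures above.
# The 170-200 mm^2 GPU-portion estimate and the "about 89.4% larger" APU
# figure are assumptions from the post, not confirmed measurements.

gpu_portion_mm2 = (170, 200)                      # estimated GPU slice of the Series X APU
doubled = tuple(2 * x for x in gpu_portion_mm2)   # hypothetical "double Series X" RDNA 2 die
print(f"Doubled GPU area: {doubled[0]}-{doubled[1]} mm^2")  # 340-400 mm^2

apu_ratio = 1.894  # Series X APU ~89.4% larger than Series S APU
print(f"Series X APU is ~{(apu_ratio - 1) * 100:.1f}% larger than Series S APU")
```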
 