AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Yeah, who gives a sh_t if it's AMD or Nvidia, just buy the best spec you can afford.

People who only buy a GPU based on whether it's NV or AMD and ignore the specifications are ignorant or lazy.

To be fair - buying a graphics card isn't as easy as buying a CPU, motherboard, SSD or memory. There are multiple factors to consider, plus the fact that GPUs are often the most expensive component.

Releases are often delayed or soft-launched, and we are left wondering when (if ever) a GPU will come out that is 'good enough'.
 
However, with Nvidia launching at marked-down prices this time around, here's hoping AMD will come in a bit cheaper as well.
To be honest, AMD may not need to come in much cheaper than the FE, if cheaper at all. If Uncle Jensen's masterplan comes to pass, you won't be able to get an FE anyway, and the equivalently-priced AIB cards will be junk, so the decent ones will cost a good chunk more. So on paper, yes, the FE may be good value for money, but compared with the card you can actually buy, the Radeon equivalent would be a chunk cheaper.

Nvidia may set the RRP at X, but if you're forced into paying Y because of availability or build quality, why does the RRP even matter?

It's going to be the same as when people tried to argue the 2070S was cheaper than the 5700XT because a boggo-basic KFA was £20 less than a top-end Sapphire.
 
Triple fan is not new for AMD - the Radeon VII had a triple-fan design, and was 295W, less than Nvidia's 320W 3080 and 350W 3090.

I'm hoping the RX6000 are a good 50W lower TDP than Nvidia, with similar performance. Fingers crossed!

Errr, actually...

Peak power consumption for the Radeon VII was actually 350W :rolleyes::mad::D that's 3090 level.
Vega 64 was 310W, that's 3080 level.
[image: power_maximum.png (peak power consumption chart)]

I would expect Big Navi to land in between these two bad boys. 330W, possibly.
 
So long as it competes with Nvidia it doesn't matter.

It doesn't really need to compete with the 3090, imo. The 3080, absolutely. The top-end 6xxx has to be in the same ballpark performance-wise.

Triple fan is not new for AMD - the Radeon VII had a triple-fan design, and was 295W, less than Nvidia's 320W 3080 and 350W 3090.

I'm hoping the RX6000 are a good 50W lower TDP than Nvidia, with similar performance. Fingers crossed!

VII was hardly cool and quiet despite the triple fan cooler :p
 
I'd expect a 72 CU graphics card to consume between 300W and 360W, twice that of the PS5 GPU, which has 36 CUs. The PS5 GPU is rumoured to consume 180W of power, but I think it could be more like 150W.

Maybe it could have 80 CUs if AMD can improve the power efficiency further.
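Purely to illustrate that arithmetic, here is a minimal Python sketch of the scaling; the CU counts, the linear scaling assumption, and the 150-180W PS5 figure are all rumours from this post, not confirmed specs:

```python
# Back-of-the-envelope estimate: scale the rumoured PS5 GPU power draw
# (36 CUs) linearly with CU count. All figures here are rumours or
# assumptions from the post above, not confirmed specs.
PS5_CUS = 36
PS5_POWER_RANGE_W = (150, 180)  # rumoured 180W; the post suspects ~150W

def scaled_power(cus: int) -> tuple[float, float]:
    """Scale the PS5 GPU power range linearly to a given CU count."""
    factor = cus / PS5_CUS
    return (PS5_POWER_RANGE_W[0] * factor, PS5_POWER_RANGE_W[1] * factor)

low, high = scaled_power(72)  # 72 CUs = exactly 2x the PS5 GPU
print(f"72 CUs: {low:.0f}-{high:.0f}W")  # 300-360W, the range quoted above
```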
 
Errr, actually...

Peak power consumption for the Radeon VII was actually 350W :rolleyes::mad::D that's 3090 level.
Vega 64 was 310W, that's 3080 level.
[image: power_maximum.png (peak power consumption chart)]

I would expect Big Navi to land in between these two bad boys. 330W, possibly.

The 2080 Ti is supposedly 250W, and comes in at 285W. Peak wattage is always going to be higher, so it's going to be interesting to see just how much the 3090 sucks up at peak...
 
I'd expect a 72 CU graphics card to consume between 300W and 360W, twice that of the PS5 GPU, which has 36 CUs. The PS5 GPU is rumoured to consume 180W of power, but I think it could be more like 150W.

Maybe it could have 80 CUs if AMD can improve the power efficiency further.

The PS5 has curiously high clocks though. Might be the reason for the yield rumours... AMD GPUs never fare well out of their sweet spot, and their binning has been super conservative in the last few generations.
 
Indeed, that's why I want to see the specs or an indication of performance ASAP. AMD can release in November if they want, but they had better say something pretty soon if they want me to buy it.

Well said, I am with you on this.
We now need a nice leaked spec sheet on Navi: transistor count, die size, clock speed, anything.


Not another cooler render in Fortnite, please.
 
Edit - Just remembered that I tried to work this out before for the Xbox Series X RDNA 2 GPU.
Dividing 15.3 billion transistors by an assumed GPU die area of 171 mm² (the whole SoC is 360 mm²) gives a transistor density of 89.4M / mm².
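For what it's worth, the arithmetic reproduces like this (a quick sketch; note it counts all 15.3 billion SoC transistors against only the assumed 171 mm² GPU slice, which is this post's assumption rather than a confirmed figure):

```python
# Reproduce the density estimate above. Assumption from the post:
# the full 15.3B Series X SoC transistor count is divided by the
# assumed ~171 mm² GPU portion of the 360 mm² die.
transistors = 15.3e9
assumed_gpu_area_mm2 = 171

density_m_per_mm2 = transistors / assumed_gpu_area_mm2 / 1e6
print(f"{density_m_per_mm2:.1f}M / mm²")  # ~89.5M / mm², the ~89.4M figure above
```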


Incorrect...
 
The Radeon VII draws nowhere near that under load in gaming; that's extremely misleading to say the least. 180W undervolted, 200-225W at stock under load, from what I see with my Radeon VII.

Yeah, I don't get these numbers that many reviewers have reported, because my own measurements at the wall and with Wattman tell another story. If we are talking spikes, sure, maybe, but not average sustained...
 
Yeah, I don't get these numbers that many reviewers have reported, because my own measurements at the wall and with Wattman tell another story. If we are talking spikes, sure, maybe, but not average sustained...

They're probably loading up FurMark and getting that kind of reading as if it's indicative of actual gaming load.
 
They're probably loading up FurMark and getting that kind of reading as if it's indicative of actual gaming load.

Nope, I think they use the peak power consumption to show how far the cards can be pushed with overclocking and under heavy stress loads, and how much power draw the PCB and its components can withstand.

Gaming load can be anywhere between near the peak and 35-40% lower.
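To make that spread concrete, a tiny sketch with a hypothetical 350W peak reading (the 35-40% figure comes from this post; the "near the peak" margin is an arbitrary illustration):

```python
# Illustrate the claimed spread: gaming draw sits anywhere between
# "near the peak" and 35-40% below the peak reading.
peak_w = 350  # hypothetical peak figure, e.g. a Radeon VII-class reading

gaming_low_w = peak_w * (1 - 0.40)   # 40% below peak
gaming_high_w = peak_w * (1 - 0.05)  # "near the peak" (5% is arbitrary)
print(f"Expected gaming draw: {gaming_low_w:.0f}-{gaming_high_w:.0f}W")  # 210-333W
```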
 
They're probably loading up FurMark and getting that kind of reading as if it's indicative of actual gaming load.
That's the thing... The ONLY way I can get my Vega 64 to drink 300+ watts is by using FurMark and going from stock driver settings to overclocked with a 50% power slider... I'm pretty ffing sure that's not how they test Nvidia GPUs. At stock settings I'm seeing 200-210W in FurMark (not at the wall though, so it would be slightly more).
 