
RDNA 3 / Navi 31 rumored to be 250% more powerful than 6900 XT and will release in 2022

You think this news is good? Well, I've got sources that are telling me the 8900XT is going to DESTROY the 7900XT!
 
Oh man it's still sad what happened

We went from "omg it's gonna be 2.5 times faster than 6900xt" to "hey guys, is rdna3 even better than rdna2?"


Navi 33 isn't full RDNA3:

The RX 7600 provides a very interesting look at a low end, low cost RDNA 3 implementation. Like the prior generation’s RX 6600 XT, the RX 7600 implements 16 WGPs. It also has a small 32 MB Infinity Cache, a 128-bit memory bus, and eight PCIe lanes. This small configuration is necessary to reduce costs and power consumption compared to higher end cards.


But the RX 7600 goes even further to cut costs. It loses big RDNA 3’s larger vector register file. It uses TSMC’s 6 nm process, which won’t provide the performance and power savings that the cutting edge 5 nm process would. In prior generations, AMD fabbed x60 and x600 series cards on the same cutting edge node as their higher end counterparts, and used the same architecture too. However, they clearly felt a sharp need to save costs with this generation, compromising the RX 7600 a bit more than would be expected for a midrange GPU.


On top of that, the RX 7600 doesn’t fully benefit from RDNA 3’s decoupled frontend and shader clocks. The two clock domains end up running at around the same frequency, possibly because the RX 7600’s smaller shader array can be well fed at lower shader clocks. In any case, the RX 7600 typically doesn’t reduce power by clocking down the shader array when the frontend’s the bottleneck.


But everything comes back to cost. These cost saving measures prevent the RX 7600 from being a compelling product at its launch price. Typically, a strong product launch will drive down prices for similarly placed products from the previous generation, because the new product delivers better value, performance, and power consumption. The RX 7600 largely fails to do that. It’s barely faster than the RX 6600 XT while offering similar value. Power efficiency didn’t improve either, which isn’t a surprise because TSMC’s 6 nm node doesn’t offer a performance or efficiency gain over 7 nm.


In AMD’s favor, the RX 7600 does give you RDNA 3’s architecture improvements and features. AV1 support and better raytracing performance count for something. But these value adds should be icing on the cake because they only apply to specific situations. A new product needs to provide value, performance and power improvements across the board. It needs to be so good that it pushes down prices for previous generation products in the same segment. The RX 7600 doesn’t do that. Considering all of the cost cutting measures, the RX 7600 really should be cheaper.
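
To put that 128-bit bus in perspective, here's a quick back-of-the-envelope bandwidth comparison. This is just a sketch using the commonly reported memory speeds (16 Gbps GDDR6 on the RX 6600 XT, 18 Gbps on the RX 7600); peak bandwidth is simply bus width in bytes times the per-pin data rate.

```python
# Back-of-the-envelope GDDR6 bandwidth: (bus width in bits / 8) * data rate in Gbps.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

# Commonly reported configurations; treat the data rates as assumptions.
cards = {
    "RX 6600 XT": (128, 16.0),  # 128-bit bus, 16 Gbps GDDR6
    "RX 7600":    (128, 18.0),  # 128-bit bus, 18 Gbps GDDR6
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
# RX 6600 XT: 256 GB/s
# RX 7600: 288 GB/s
```

Only about 12.5% more raw bandwidth gen on gen, which fits with the RX 7600 being barely faster than the RX 6600 XT.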
 
The 40 series architecture is actually really good, but Nvidia decided to gimp the cards and place them in the wrong tiers
Yes, a different thing than AMD's strategy this gen. Still not sure what AMD's strategy is - and it's possible that AMD don't know either - but Nvidia's is plain: make as much money as possible. When their halo card is the only card which offers any "value" (in terms of increase vs the previous gen in performance and transistors), then we know we are being played. Next gen they might not even bother with their halo card - if only they could be 100% sure of AMD's plan - except Nvidia likes to be at the top of the charts at almost any cost.
 
Power efficiency is superb; I just don't understand the memory limitations placed on some of them. It makes zero sense.

It makes perfect sense:
1.) It costs them less to make, so higher margins. Less VRAM, etc. = more profit!
2.) Enough PCMR will defend the VRAM amounts and buy them. Blame consoles, blame devs, blame AMD, etc. when it's clear there are potential problems.
3.) Release the next generation with more VRAM.
4.) Suddenly a few big games will need a bit more VRAM.
5.) People will upgrade. OFC, absolutely zero to do with VRAM! :p
6.) Profit!!
 
That's true. I'm just miffed because I game at 3440x1440. So if I could get something like a 4070/4070 Ti with 16GB for a reasonable price I'd be golden, as I want lower power draw than the AMD cards and ideally a smaller form factor.

Not sure I want to compromise on VRAM though, with the way things are going. Might just wait and look at the second hand market in 6 months, or go for something like the 6950 XT, but it's a hot one!
 
That's true. I'm just miffed because I game at 3440x1440. So if I could get something like a 4070/4070 Ti with 16GB for a reasonable price I'd be golden, as I want lower power draw than the AMD cards and ideally a smaller form factor.

Not sure I want to compromise on VRAM though, with the way things are going. Might just wait and look at the second hand market in 6 months, or go for something like the 6950 XT, but it's a hot one!
You know you can just slightly downclock them, right? AMD usually pushes their GPUs to the very limit. My 6950XT consumes 200-210 watts at full tilt after a downclock, and I only lost about 8-10% performance from it. And I paid the same for a big Red Devil from PowerColor as for the cheapest (at the time of purchase) plastic 4070 with a .... color and only 12 gigs of VRAM. So I got a better GPU (outside of RT) for the same money, and if I ever need those 8-10% I'll just remove the downclock. Hotspot is around 80-85C with almost no fan speed; using a normal fan curve, hotspot drops to around 70-75C. That is hotspot temp - edge is in the low 60s by then, on air no less.
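
For anyone wondering what that trade works out to, here's a rough perf-per-watt estimate in Python using the numbers above. The 335 W figure is the 6950 XT's reference board power; the rest are midpoints of the ranges I quoted, so treat it as a sketch, not a benchmark.

```python
# Rough perf-per-watt gain from the downclock described above.
# Assumptions: ~335 W at stock (reference board power) for 100% performance,
# ~205 W after the downclock for ~91% performance (midpoints of quoted ranges).
stock_perf, stock_watts = 1.00, 335.0
down_perf, down_watts = 0.91, 205.0

gain = (down_perf / down_watts) / (stock_perf / stock_watts) - 1
print(f"Power saved:               {stock_watts - down_watts:.0f} W")  # 130 W
print(f"Perf-per-watt improvement: {gain:.0%}")                        # ~49%
```

Giving up ~9% performance for ~130 W less heat is why these cards respond so well to a downclock.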
 
Wow, that's a heck of a drop - wasn't expecting it to be so significant. Might need to look into those again; they seem like solid bang for buck these days.
 
Oh man it's still sad what happened

We went from "omg it's gonna be 2.5 times faster than 6900xt" to "hey guys, is rdna3 even better than rdna2?"

Tbf, the 7600/Navi33 is produced on TSMC 6nm (which is a revised 7nm process) rather than the cutting edge 5nm that Navi31 is partly made on, so you can't expect to see massive improvements in performance. I think AMD did the right thing by not branding it with the XT moniker; hopefully Navi32 can be used to fill the 7600XT void at some point.
 
That's true. I'm just miffed because I game at 3440x1440. So if I could get something like a 4070/4070 Ti with 16GB for a reasonable price I'd be golden, as I want lower power draw than the AMD cards and ideally a smaller form factor.

Not sure I want to compromise on VRAM though, with the way things are going. Might just wait and look at the second hand market in 6 months, or go for something like the 6950 XT, but it's a hot one!

You would think, but just skim the last few pages of this thread:

12GB is perfectly fine on an RTX4070TI, and to think otherwise is speculation. If this is on a tech forum, then I hate to think what people think elsewhere. I think we are screwed.

Tbf, the 7600/Navi33 is produced on TSMC 6nm (which is a revised 7nm process) rather than the cutting edge 5nm that Navi31 is partly made on, so you can't expect to see massive improvements in performance. I think AMD did the right thing by not branding it with the XT moniker; hopefully Navi32 can be used to fill the 7600XT void at some point.

If you check my reply, Navi 33 is cut down too.
 
You would think, but just skim the last few pages of this thread:

12GB is perfectly fine on an RTX4070TI, and to think otherwise is speculation. If this is on a tech forum, then I hate to think what people think elsewhere. I think we are screwed.



If you check my reply, Navi 33 is cut down too.
Yeah, a quick check on Google and Navi 33 is 14% smaller than Navi 23. It certainly looks like AMD put cost cutting front and centre when designing the new chip.
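
That 14% checks out if you run the commonly quoted die sizes through the arithmetic - a quick sketch, assuming ~237 mm² for Navi 23 and ~204 mm² for Navi 33:

```python
# Die area shrink from Navi 23 (RDNA 2, 7nm) to Navi 33 (RDNA 3, 6nm).
# Sizes are the commonly quoted figures; treat them as approximate.
navi23_mm2 = 237.0  # Navi 23 (RX 6600 XT)
navi33_mm2 = 204.0  # Navi 33 (RX 7600)

shrink = (navi23_mm2 - navi33_mm2) / navi23_mm2
print(f"Navi 33 is {shrink:.0%} smaller than Navi 23")  # ~14%
```

A smaller die on the cheaper, mature 6nm node is exactly what you'd expect if cost was the priority.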
 
Yeah, a quick check on Google and Navi 33 is 14% smaller than Navi 23. It certainly looks like AMD put cost cutting front and centre when designing the new chip.
If you check the article I linked to, it's not a full implementation of RDNA3:

It should be very cheap to make.
 
It makes perfect sense:
1.) It costs them less to make, so higher margins. Less VRAM, etc. = more profit!
2.) Enough PCMR will defend the VRAM amounts and buy them. Blame consoles, blame devs, blame AMD, etc. when it's clear there are potential problems.
3.) Release the next generation with more VRAM.
4.) Suddenly a few big games will need a bit more VRAM.
5.) People will upgrade. OFC, absolutely zero to do with VRAM! :p
6.) Profit!!

:)

3, 4 and 5 we generally wrap up into planned obsolescence. We have seen it too many times now; the only defence is to stop buying them.
 
Nvidia's continued lack of VRAM on entry level and mid range cards has nothing to do with cost it seems, as GDDR6 is dirt cheap. But the pessimist in me says it has everything to do with planned obsolescence.

That's exactly what it is. The problem with my 2070S is not the horsepower - it still does fine at 1440p - it's the VRAM that's causing problems. The 3070 is a more powerful GPU; I would hate myself right now if I'd bought one of those.

If my GPU had 16GB of VRAM there is a good chance I might keep it for another generation, but as it is I'm forced to upgrade.
 
It's a shame because it's a good GPU and I like it, if not for the VRAM that's making it increasingly worthless. It's just making e-waste out of something that was once expensive; quite a beautifully engineered thing becomes junk before its time.

And 12GB also isn't enough for a 3 year, or even a 2 year+, life cycle.
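
If anyone wants to see how close their own card is to the limit, here's a small sketch that polls VRAM usage while you play. It assumes an Nvidia card with nvidia-smi on the PATH; note that allocated VRAM isn't always the same as what a game strictly needs.

```python
# Poll VRAM usage via nvidia-smi (assumes an Nvidia GPU and nvidia-smi on PATH).
import subprocess
import time

def vram_usage(gpu: int = 0) -> tuple[int, int]:
    """Return (used_mib, total_mib) for the given GPU index."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = out.splitlines()[gpu].split(", ")
    return int(used), int(total)

while True:
    used, total = vram_usage()
    print(f"VRAM: {used}/{total} MiB ({used / total:.0%})")
    time.sleep(5)
```

Run it during a heavy scene; if you're pinned near the total, the card is VRAM-limited rather than horsepower-limited.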
 