It looks like the 'real'/affordable RDNA3 + next-gen NV desktop cards won't launch until September. Thoughts?

Soldato
Joined
9 Nov 2009
Posts
24,929
Location
Planet Earth
Yeah, some cards were launched during the pandemic, so complaints were muted. You shouldn't treat that period (the high prices during the covid/mining boom) as the baseline for the general picture just because it favours your darling company, as some fans do.

The issue is more complex. For instance, the 4070 is about twice as fast as the 2070 and 65% faster than the 2080 (TPU numbers). The 2070 was $500 and the 2080 was $700. Looking at the whole picture like that, with the 4070 at $600, you'd say it's a great buy 5 or so years later. Why would people complain? The poor company has to make some money, and 12GB is plenty!

Technology can overcome inflation and BOM costs. Some people don't seem to get that; they seem more keen on pumping their favourite company's marketing and profits than on protecting their own pockets (or, who knows, maybe they have some interest in holding such views).

The worst thing is Turing was another price rise, because of the mining boom during the Pascal days and Nvidia wanting to sell off Pascal stock at RRP. So even the RX5700XT (originally an RX690) was pushed up in price a bit to meet the RTX2070. Yet the RX5000 series was using expensive TSMC 7NM and GDDR6. But it shows you how much worse things have gotten now in comparison. TSMC 6NM is a cost-optimised TSMC 7NM, so it is a relatively old node. So with a smaller die, cheaper GDDR6 costs, and the ability to re-use last-generation PCBs/coolers, it wouldn't surprise me one bit if the RX7600 is cheaper than an RX6600/RX6600XT to make, especially with all the design work that can be re-used.
 
Last edited:
Caporegime
Joined
20 May 2007
Posts
39,832
Location
Surrey
Ah, my bad, I saw Canada and didn't notice he was talking in USD.

$295 works out to £240. I don't think that's bad at all for an AIB card of that tier. Cheaper than I expected.

Yeh, but VAT.

It looks like it is going to have a $299 RRP from all these pricing leaks, which means here it will likely come in at just under £300.
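The rough conversion being discussed works out as follows. This is just a back-of-envelope sketch: the ~1.25 USD/GBP exchange rate is an assumption for illustration, not an official figure.

```python
# Rough sketch of how a pre-tax US RRP maps to a UK shelf price.
USD_PER_GBP = 1.25   # assumed exchange rate
UK_VAT = 0.20        # UK VAT rate

def uk_price(usd_rrp: float) -> float:
    """Convert a pre-tax US RRP to an approximate UK price including VAT."""
    return usd_rrp / USD_PER_GBP * (1 + UK_VAT)

print(uk_price(299))  # ≈ £287, i.e. "just under £300"
```

With these assumptions the $299 RRP lands at roughly £287 including VAT, which matches the "just under £300" expectation.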
 
Associate
Joined
26 Jun 2015
Posts
719
That's Canada though.

Translates to just under $270 USD, which is about £220. That's not a bad price, if it translates correctly.

Wasn't long ago OcUK were selling 6600s for £300-£500.
It needs to be well under £200; GPU sales have tanked across the board, and there needs to be decent performance at under £200.

I wonder if both Nvidia and AMD can cope with next to no GPUs selling. OEMs like Dell won't sell many either if the GPU makes up 2/3 of the cost and is priced high, so it impacts the whole industry.
 
Associate
Joined
18 Oct 2002
Posts
304
Location
Norwich
People defending a card that can just about do 1080p, without all the bells and whistles, in 2023 for £300, and saying that represents good value? Yeah, sure, if you compare it to what was available 10 years ago, but come on. The absolute cheapest barrier to entry for PC gaming cannot cost that much.
 
Last edited:
Caporegime
Joined
20 May 2007
Posts
39,832
Location
Surrey
People defending a card that can just about do 1080p, without all the bells and whistles, in 2023 for £300, and saying that represents good value? Yeah, sure, if you compare it to what was available 10 years ago, but come on. The absolute cheapest barrier to entry for PC gaming cannot cost that much.

I don't know about that, it is tricky.

As an example, the GTX 760 back in 2013 had an MSRP of $250. Accounting for inflation, that is about $325 in today's money.

Of course, the real kicker in this country is that we were at ~$1.60 to the pound back then, making it cheaper for us.
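The exchange-rate kicker can be sketched quickly; the current ~1.25 USD/GBP rate here is an assumption for comparison, and the VAT-inclusive figures are illustrative only.

```python
# Illustration of how the exchange rate changes the UK price of the
# same US MSRP. The 2023 rate (~1.25 USD/GBP) is an assumption.
def gbp_inc_vat(usd: float, rate: float, vat: float = 0.20) -> float:
    """Convert a pre-tax USD price to GBP including UK VAT."""
    return usd / rate * (1 + vat)

# GTX 760 in 2013: $250 MSRP at ~$1.60/£
print(gbp_inc_vat(250, 1.60))  # ≈ £187.5 inc VAT
# The same card's inflation-adjusted $325 at ~$1.25/£
print(gbp_inc_vat(325, 1.25))  # ≈ £312 inc VAT
```

So even holding the dollar price constant in real terms, the weaker pound alone adds well over £100 to the notional UK price.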
 
Associate
Joined
18 Oct 2002
Posts
304
Location
Norwich
I don't know about that, it is tricky.

As an example, the GTX 760 back in 2013 had an MSRP of $250. Accounting for inflation, that is about $325 in today's money.

Of course, the real kicker in this country is that we were at ~$1.60 to the pound back then, making it cheaper for us.
We just live in a world where an Xbox Series S can do all your low end gaming needs for £250 all in. Obviously a PC can never fully compete on price but how many hundreds of pounds is the ability to run Word and load up a few Skyrim mods worth to the low end gamer?
 
Caporegime
Joined
20 May 2007
Posts
39,832
Location
Surrey
We just live in a world where an Xbox Series S can do all your low end gaming needs for £250 all in. Obviously a PC can never fully compete on price but how many hundreds of pounds is the ability to run Word and load up a few Skyrim mods worth to the low end gamer?

True.

For me, I have to have an up-to-date/powerful machine with a good CPU and RAM for other tasks, so for me personally it is just the cost of the GPU when it comes to gaming. That is probably where PC gaming appeals the most (i.e. to those who already have a decent machine).
 
Associate
Joined
18 Oct 2002
Posts
304
Location
Norwich
True.

For me, I have to have an up-to-date/powerful machine with a good CPU and RAM for other tasks, so for me personally it is just the cost of the GPU when it comes to gaming. That is probably where PC gaming appeals the most (i.e. to those who already have a decent machine).
It's the same for me. But kids today are writing their essays on a touch screen. I'm an old man and can't do that! :D
 
Caporegime
Joined
17 Mar 2012
Posts
48,356
Location
ARC-L1, Stanton System
Yeah, some cards were launched during the pandemic, so complaints were muted. You shouldn't treat that period (the high prices during the covid/mining boom) as the baseline for the general picture just because it favours your darling company, as some fans do.

The issue is more complex. For instance, the 4070 is about twice as fast as the 2070 and 65% faster than the 2080 (TPU numbers). The 2070 was $500 and the 2080 was $700. Looking at the whole picture like that, with the 4070 at $600, you'd say it's a great buy 5 or so years later. Why would people complain? The poor company has to make some money, and 12GB is plenty!

Technology can overcome inflation and BOM costs. Some people don't seem to get that; they seem more keen on pumping their favourite company's marketing and profits than on protecting their own pockets (or, who knows, maybe they have some interest in holding such views).

And there you have it, the 4070 is a great buy. This is what you wanted, people; stop complaining.
 
Associate
Joined
3 May 2021
Posts
1,232
Location
Italy
So what AMD card from that generation was higher than the 480?
None.
The 480 was pretty much a respin of the 380, which then evolved into the 580 and finally the 590.
The closest higher card in time would be the Vega 56/64, but I'm not sure that counts as the same gen.
 
Soldato
Joined
9 Nov 2009
Posts
24,929
Location
Planet Earth
None.
The 480 was pretty much a respin of the 380, which then evolved into the 580 and finally the 590.
The closest higher card in time would be the Vega 56/64, but I'm not sure that counts as the same gen.
The RX580/RX590 were the respins of the RX480.

The RX480 was more or less targeting R9 290/R9 290X/R9 390/R9 390X performance from launch.

It was pretty close to the R9 290X/R9 390X, and the R9 390X wasn't that much slower than a GTX980 either.

So basically it was pretty much the performance of the previous-generation AMD 2nd-tier/3rd-tier dGPU (or close to the second-tier Nvidia dGPU, as the AMD Fury non-X was faster than a GTX980).

Also, it was quite clear the RX480 could have been faster too. Due to the WSA, AMD was forced to use Global Foundries 14NM instead of TSMC 16NM. If Polaris had been made on TSMC 16NM, it would probably have clocked higher. The RX590, made on Global Foundries 12NM, could barely hit 1.6GHZ overclocked, whereas the earlier GTX1060 could hit over 1.7GHZ when overclocked.

This is where all the "Beast Mode" RX480 rumours came from, I suspect, as they implied Fury non-X matching performance. The RX590 matched a Fury non-X, which was the second-fastest AMD dGPU of the previous generation, on an improved Global Foundries 14NM-type process node. I suspect the RX480 was meant to clock higher initially, but Global Foundries 14NM sucked. Hence why the stock RX480 cooler was inadequate and the card drew a bit too much power from the PCI-E main connector.

AMD probably missed their performance targets for the RX480.

So to put it in context, a modern Polaris would have been matching the 2nd- or 3rd-fastest AMD dGPUs of the previous generation. So if we say the RX6900XT and RX6950XT are functionally the same, it would be the RX6800 or RX6800XT, with a relatively big framebuffer for the time.

The RX480 8GB was $239. That would be about $302 in 2023. So with VAT, that would be roughly £290 here.
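That arithmetic can be checked as a quick sketch. The inflation multiplier and USD/GBP rate below are assumptions back-derived to match the figures quoted, not official data.

```python
# Back-of-envelope check of the RX480 pricing claim.
# Inflation multiplier and exchange rate are assumptions chosen to
# match the figures in the post, not official statistics.
INFLATION_2016_TO_2023 = 302 / 239   # ~1.26, implied by the quoted figures
USD_PER_GBP = 1.25                   # assumed exchange rate
UK_VAT = 0.20

launch_rrp_usd = 239                 # RX480 8GB launch price
adjusted_usd = launch_rrp_usd * INFLATION_2016_TO_2023
uk_price_gbp = adjusted_usd / USD_PER_GBP * (1 + UK_VAT)

print(adjusted_usd)   # ≈ $302 in 2023 money
print(uk_price_gbp)   # ≈ £290 inc VAT
```

Under those assumptions the numbers in the post hold up: ~$302 inflation-adjusted, which is roughly £290 here once VAT is added.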

Then merely three years later, the RX5600XT/RX5700 essentially doubled performance over an RX480.

The RX5600XT/RX5700 matched or even exceeded the performance of the Vega 64 based dGPUs, which were the second-fastest AMD dGPUs of that generation. This was with AMD upselling the RX480 replacement as the RX5700XT (originally it was called the RX690). Even with inflation, that would price the RX5600XT at just over £300. But there were offers on the RX5700 reference model for less than the RX5600XT. The RX5700XT was almost the same performance as the Radeon VII for nearly half the price. These dGPUs used a cutting-edge TSMC 7NM process node and relatively new GDDR6.

So even if we add inflation into all of this, where are we seeing dGPUs around the £300 mark with RX6800/RX6800XT-level performance and 16GB of VRAM from either AMD or Nvidia?

We are not. People are accepting the abnormal combined pandemic and mining pricing as the new normal. This is with things such as GDDR6 prices going down a lot compared to back then.
 
Last edited:
Associate
Joined
13 Jun 2012
Posts
353
So what AMD card from that generation was higher than the 480?
They didn't, but I think the point was that it was competing in the mid-range,

with a mid-range GPU die, a mid-range memory config, and a mid-range price.

Like the 5700XT, it was never attempting to compete at the high-end, because at the time AMD lacked the resources to do so.
 
Last edited:
Soldato
OP
Joined
30 Jun 2019
Posts
7,876
Comparing RDNA3 to the RTX 4000 series, it seems like only the RTX 4080 and 4090 have much of a lead in terms of watts per frame:

[Attached chart: watt-per-frame.png, watts-per-frame comparison]

There's really not that much in it; I think AMD will probably catch up in energy efficiency with RDNA4.
 
Last edited: