
What would you regard as fair current-gen prices?

It’s reckoned that AMD are making single-digit dollars of profit per console, across both CPU and GPU. They make their profit on volume.
I'm sure I posted about that here before (possibly in one of the financial results threads), but AMD let it slip what % of their total sales Sony are, and we were sort of able to guesstimate what sort of margins that entailed. It was a lot more than a few dollars per console SoC. Not the kind of margins a Zen 3 chip can command (TSMC die cost ~$20, retail price up to $300), but still healthy.
 
Fair to assume they would get faster over time and not slower. How much faster could really depend on a multitude of factors. I know one thing for sure: the 24GB of VRAM would certainly help, especially at 1440p and 4K, with some games using in excess of 20GB already.

At a higher rate than Nvidia?
 
I think the flagship 4090 price is OK compared to last gen: a ~13% increase in an era of inflation for ~60% more FPS at 1440p and ~70% more at 4K, according to an old TechPowerUp review I just scanned. Perhaps more with the latest CPUs/platforms.

If we'd had that through the rest of the range (a ~15% price increase for a 50-60% FPS increase at relevant resolutions by tier, plus more reasonable amounts of VRAM: 12GB on the 4060 and up, 16GB on the 4070 and up, 20GB on the 4080 and up) I'd be fine with it. The days of £200 mid-range cards have long gone, especially since Nvidia made the die much larger to add all the RTX fluff that we all have to pay for.
A 1060 6GB was £240 at launch; add in 40% inflation and a 10% deterioration of the exchange rate and it would cost ~£370 today. The cheapest 4060 Ti at OC is £380, so pretty much in the ballpark. I'd match the cut-down 1060 3GB with the base 4060.
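The inflation/exchange-rate adjustment above is easy to sanity-check. A minimal sketch, using only the figures quoted in the post (£240 launch price, ~40% inflation, ~10% weaker pound):

```python
# Back-of-envelope check of the 1060 price comparison above.
launch_price_2016 = 240   # GTX 1060 6GB UK launch price (£), per the post
inflation_factor = 1.40   # ~40% cumulative inflation since launch
fx_factor = 1.10          # ~10% exchange-rate deterioration

adjusted = launch_price_2016 * inflation_factor * fx_factor
print(f"Adjusted 1060 price today: ~£{adjusted:.0f}")  # ~£370
```

Which lands right on the ~£370 figure, a few pounds under the £380 4060 Ti street price.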

On the cost of the 4090... if you get the use out of it, then £1600 at launch and a £600+ used sale two years later when you replace it works out to roughly £10/week, £40/month. Lots of people burn that sort of cash on Sky, ultrafast broadband, gym memberships they barely use, daily Costa, mobile phone contracts with a 'free phone', football, golf, beer, cars in all-the-toys trim that they barely drive 8,000 miles/year in, etc. The list is near endless, so for the performance and uplift it provides, I can live with the price.
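The ownership maths in that paragraph can be checked the same way. A quick sketch, assuming the post's numbers (£1600 purchase, £600 resale after two years):

```python
# Net cost of ownership for the 4090 scenario above.
buy_price = 1600   # launch price (£)
resale = 600       # assumed used sale price two years later (£)
weeks = 2 * 52
months = 2 * 12

net_cost = buy_price - resale        # £1000 over the ownership period
print(f"~£{net_cost / weeks:.2f}/week")    # ~£9.62/week
print(f"~£{net_cost / months:.2f}/month")  # ~£41.67/month
```

So the quoted £10/week and £40/month are fair rounded figures for that scenario.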

What I do strongly disagree with is the deliberate removal of value from the lower end, where GPUs are basically the same price for the same performance and VRAM as 3 years ago. I don't have the link, but I watched a YouTube video some time ago where they quoted Jensen saying that 'People pay $500 for a gaming console; a GPU is a gaming console in their PC, so we'll target our average selling price at $500.' That's exactly what they have done: all but the lowest-tier cards are north of $500, while the cards below $500 are not much below that price point and are fairly undesirable, being only really suited to 1080p.

I'd want a 50%+ gen-on-gen upgrade on my 3070, so that would be a 4080 at 2.5x the MSRP. That's a big fat no... not only do I not game enough to warrant it, a 4080 / 4090 likely doesn't even fit in my case.
I usually skip a gen anyway, but the 3070 was a bit of a stopgap in the GPU drought and it's a bit of a compromise for VR.

Still, at least the 50xx or 80xx should be an actual generational leap for a 200-250W card next time I get my pants pulled down.
 
I'm sure I posted about that here before (possibly in one of the financial results threads), but AMD let it slip what % of their total sales Sony are, and we were sort of able to guesstimate what sort of margins that entailed. It was a lot more than a few dollars per console SoC. Not the kind of margins a Zen 3 chip can command (TSMC die cost ~$20, retail price up to $300), but still healthy.
At retail that may be true, but you can be sure Sony and Microsoft aren’t paying $100+ to AMD per console built. Not sure if it’s the case now, but consoles are usually sold at a loss, in some cases in the past a significant one. They’d be pushing AMD hard for their best price.
 
I think Nintendo are the exception to the rule of consoles selling at a loss.

Yeah they basically always* launch hardware that is a few years out of date but make up for it with a gimmick/novel value add that their competitors wouldn't take the risk on. The PS5 etc. will definitely have sold at a loss given the chip shortages. Bit of a kick in the teeth for Sony and Microsoft considering how much scalping went on around launch.

*After the GameCube at least.
 
Yeah they basically always launch hardware that is a few years out of date but make up for it with a gimmick/novel value add that their competitors wouldn't take the risk on. The PS5 etc. will definitely have sold at a loss given the chip shortages. Bit of a kick in the teeth for Sony and Microsoft considering how much scalping went on around launch.

Tbh I think Nintendo have a smart model: focus on fun, collaborative gameplay and hardware UX innovation rather than trying to compete on graphics and computing power (which they can't). The Wii and Switch have been brilliant.
 
Tbh I think Nintendo have a smart model: focus on fun, collaborative gameplay and hardware UX innovation rather than trying to compete on graphics and computing power (which they can't). The Wii and Switch have been brilliant.

Yeah, plus I think Nintendo are masters at perfectly tailoring the art style of their first party games to the hardware they have available so that they look very pleasing even if the graphics aren't as technically advanced as games on rival systems.
 
Yeah, plus I think Nintendo are masters at perfectly tailoring the art style of their first party games to the hardware they have available so that they look very pleasing even if the graphics aren't as technically advanced as games on rival systems.

Yes, my wife has a Switch; given its hardware limitations, games on it look amazing.
 
That's not really worked well for AMD though; they are behind on performance and efficiency, and their costs are not really much cheaper.
Not today, but for future generations I see that changing. They had to bite the bullet at some point, and it's better to iron out the teething problems now rather than 2-3 generations from now. The whole GPU chiplet thing is currently where Zen 1 was when it first launched: it's got its oddities, bugs and inefficiencies, but I suspect it's only going to improve as time goes by.

The simple fact is that, based on current technology, we can't go on making ever-smaller nodes at ever-increasing die sizes without costs getting out of hand.
 
Not today, but for future generations I see that changing. They had to bite the bullet at some point, and it's better to iron out the teething problems now rather than 2-3 generations from now. The whole GPU chiplet thing is currently where Zen 1 was when it first launched: it's got its oddities, bugs and inefficiencies, but I suspect it's only going to improve as time goes by.

The simple fact is that, based on current technology, we can't go on making ever-smaller nodes at ever-increasing die sizes without costs getting out of hand.
It’s great that AMD are trying something new, but as a consumer, where is the benefit right now? They use more power, their 7900 series only matches Nvidia’s 70/80-class cards (which are really rebranded 60/70-class parts) and they still want £800-1000.

At least with early Zen you got more cores at a cheaper price, but I see no real price advantage over Nvidia right now.

Also, costs are already out of hand despite the dies not costing much more than they did a few years back. Again, I don’t see AMD's switch to chiplets as a benefit for consumers; rather, they will pocket the savings themselves and keep charging even more, especially if they do get a performance advantage, as with Zen 3, where they were happy to charge more than Intel despite their chips costing much less to produce.
 
AD102 is 73.3 billion transistors
Navi31 (GCD and MCDs) is 57.7 billion transistors.

So less than 80% of AD102's transistor budget. In raster, a ~20% larger design would probably have beaten AD102.
It depends how well the architecture scales, as a 57-billion-transistor 7900 XTX only sits around a 46-billion-transistor 4080 in performance. Also, don't forget the 4090 is using a cut-down 102 die, and if Nvidia decides to release a 90 Ti/Titan then that would probably be another 15-20% faster than a 4090.
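The transistor-budget ratios being argued over here are quick to verify. A small sketch using the counts quoted in the thread (AD102 73.3bn, Navi31 57.7bn) plus the 4080's AD103 die at ~45.9bn, which is my figure rather than one from the post:

```python
# Transistor-budget comparison from the discussion above.
ad102 = 73.3e9   # RTX 4090's full die, per the post
navi31 = 57.7e9  # 7900 XTX (GCD + MCDs), per the post
ad103 = 45.9e9   # RTX 4080 die (assumed figure, not from the post)

print(f"Navi31 vs AD102: {navi31 / ad102:.1%}")   # ~78.7%, i.e. "less than 80%"
print(f"Navi31 vs AD103: {navi31 / ad103:.2f}x")  # ~1.26x the 4080's budget
```

So both claims hold on these numbers: Navi31 spends under 80% of AD102's budget, yet about a quarter more transistors than the 4080 it roughly matches in raster.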
 
It’s great that AMD are trying something new but as a consumer where is the benefit right now?
Erm, there isn't one, as I pointed out when I said "Not today". I would guess the reason early Zen was cheaper is that they had to sell the idea not only to customers but to motherboard makers; that, and if Zen had not been a success it would've ended AMD as a company.

The only reason we got more cores is that Intel had become complacent and more cores was probably a good selling point; as a recent GN video says, they literally bet the company on Zen. They're not in that position any more, so whether chiplet GPUs are a success today or in 5-10 years' time is no longer a life-or-death situation; in fact, in terms of reputational damage and hassle, they'd probably prefer it not to be.
 
It depends how well the architecture scales, as a 57-billion-transistor 7900 XTX only sits around a 46-billion-transistor 4080 in performance. Also, don't forget the 4090 is using a cut-down 102 die, and if Nvidia decides to release a 90 Ti/Titan then that would probably be another 15-20% faster than a 4090.
With all respect, you seem to have put a lot of eggs in next gen's basket? Surely shaky ground considering manufacturers' contempt for consumers? Although I suspect the 3080 might even get you through another gen after this one in terms of pure graphical grunt, which is frankly amazing value; I'm surprised Nvidia gave it away so cheaply lol
 
With all respect, you seem to have put a lot of eggs in next gen's basket? Surely shaky ground considering manufacturers' contempt for consumers? Although I suspect the 3080 might even get you through another gen after this one in terms of pure graphical grunt, which is frankly amazing value; I'm surprised Nvidia gave it away so cheaply lol
£650 isn't cheap for a GPU when you consider that's what most of the top cards cost over the last 10 years. Nvidia and AMD want you to think it's cheap, but really it's normal pricing.
 
£650 isn't cheap for a GPU when you consider that's what most of the top cards cost over the last 10 years. Nvidia and AMD want you to think it's cheap, but really it's normal pricing.
Fair. We have all become accustomed to £650 as a bargain price for second-from-top-tier cards.

It’s a lot for one component.

Suspect though it'll never come down to the £400 it was 10 years ago.
 
Halo cards (and hardware products in general, the AMD FX-51 etc.) have existed for decades; it's only recently that pricing across the range has gone to crap.
Ah, but therein lies the trap, the smoke-and-mirrors gotcha... Products like the 2080 Ti and 4090 aren't halo cards; they were the upgrade for their generation. Halo cards gave you small increases, features or benefits (to gamers) for a very high price; they are not the same thing. Look at the die being used: there is a reason the 3080 performed well for its generation. It's all segmentation designed to leverage the highest prices. It's not the same die with minor benefits at a huge price; that's the difference, and that is the halo product you can rightly ignore.
 