• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

What would you regard as fair current-gen prices?

What do cost reductions amount to if the savings are not passed on to the end user? If AMD are able to make their GPUs cheaper, they are either pocketing the extra as profit or passing it along to their customers.

If you expect the 4090 to be £650 and they are not benefiting from chiplets, what do you think the 7900 XT should cost?
One possibility:
Overall profit in the GPU division is total profit from sales less R&D and all the other fixed costs.
Meanwhile, total profit from sales is volume * margin.
With AMD's market share of dGPU sales now around the 10% level, their volumes are pitiful. So they try to increase margin per sale instead.
Result? Next year AMD might be down to 5% market share... and increased margins can only make up for dwindling volume for so long!
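To make the volume * margin arithmetic concrete, here's a toy sketch in Python - every figure below is made up purely for illustration, not an actual AMD number:

```python
# Toy model of the argument above: profit = volume * margin - fixed costs.
FIXED_COSTS = 1_500_000_000  # R&D plus other fixed costs (hypothetical)

def division_profit(units: int, margin_per_unit: float) -> float:
    """Overall division profit, per the formulas in the post."""
    return units * margin_per_unit - FIXED_COSTS

# 10% market share at one margin vs 5% share at double the margin:
print(division_profit(5_000_000, 400))  # 500,000,000
print(division_profit(2_500_000, 800))  # 500,000,000 - a wash, for now
# Margins can't keep doubling forever, though; once volume keeps halving,
# the fixed costs eat whatever margin is left.
```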
 
One possibility:
Overall profit in the GPU division is total profit from sales less R&D and all the other fixed costs.
Meanwhile, total profit from sales is volume * margin.
With AMD's market share of dGPU sales now around the 10% level, their volumes are pitiful. So they try to increase margin per sale instead.
Result? Next year AMD might be down to 5% market share... and increased margins can only make up for dwindling volume for so long!

Agreed. Seems stupid. But maybe we don't have the full picture.

They should be trying to break even, or close to it, and gain some market share.
 
Sure, and if they sold a 4090 at £650, and the rest of the line below that, then they would not be recouping much from the gamer market.
Depends how many they sell.
But it can also be around 900-1000 bucks, and the rest, probably, about half of what they are now.
 
Agreed. Seems stupid. But maybe we don't have the full picture.

They should be trying to break even, or close to it, and gain some market share.

I don't think they really care about domestic PC market share; they already have 99.94% of the console market, which is probably their biggest revenue source besides the commercial GPU/server market.
 
The whole die size issue shouldn't be relevant; if Nvidia wanted to maximise efficiency and performance, they would be going down the chiplet design path that AMD have chosen. Not only does it become more efficient, but you end up with similar performance numbers and a significant cost reduction, as your yield rates improve dramatically.
That's not really worked well for AMD though; they are behind on performance and efficiency, and costs are not really much cheaper.
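For what it's worth, the "yield rates improve dramatically" claim in the quote can be illustrated with the simple Poisson die-yield model; the defect density and die areas below are hypothetical round numbers, not actual TSMC figures:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Simple Poisson yield model: Y = exp(-area * defect_density)."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

D0 = 0.1  # defects per cm^2 (hypothetical)

# One big monolithic die vs a chiplet of half the area:
print(f"600 mm^2 monolithic: {poisson_yield(600, D0):.1%}")  # ~54.9%
print(f"300 mm^2 chiplet:    {poisson_yield(300, D0):.1%}")  # ~74.1%
# Smaller dies lose exponentially fewer candidates to defects,
# which is where the chiplet cost advantage is supposed to come from.
```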
 
That's not really worked well for AMD though; they are behind on performance and efficiency, and costs are not really much cheaper.

That really depends on where/what you're looking at.

The 7900 XTX vs the 4090, for example: yeah, no contest, and to be expected considering the price difference of nearly £550. Versus the 4080, the performance gap is around 10% in favour of the 4080, and the price difference reflects it, with the 4080 costing around £1100 vs £970 for the 7900 XTX. From a future-proofing standpoint I'd pick the 7900 XTX for its 24GB of VRAM over the 4080, because VRAM is king in high-resolution gaming and it's become clear that nowadays 8GB ain't enough even at 1080p for some games. Obviously, if you have the disposable income, then the 4090 is probably worth the extra cost, but it is a big cost difference for a ~40% performance increase.
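To put rough numbers on that trade-off, here's a quick pounds-per-unit-of-performance sketch using the prices from the post above; treating the ~10%/~40% gaps as flat averages is of course an oversimplification:

```python
# Back-of-the-envelope value comparison, 7900 XTX normalised to 1.00.
cards = {
    "7900 XTX": {"price_gbp": 970, "relative_perf": 1.00},
    "RTX 4080": {"price_gbp": 1100, "relative_perf": 1.10},       # ~10% ahead
    "RTX 4090": {"price_gbp": 970 + 550, "relative_perf": 1.40},  # ~40% ahead, ~£550 dearer
}

for name, c in cards.items():
    print(f"{name}: £{c['price_gbp'] / c['relative_perf']:.0f} per unit of performance")
# 7900 XTX: £970, RTX 4080: £1000, RTX 4090: £1086 -
# the 4090's extra performance costs proportionally more than it adds.
```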
 
That really depends on where/what you're looking at.

The 7900 XTX vs the 4090, for example: yeah, no contest, and to be expected considering the price difference of nearly £550. Versus the 4080, the performance gap is around 10% in favour of the 4080, and the price difference reflects it, with the 4080 costing around £1100 vs £970 for the 7900 XTX. From a future-proofing standpoint I'd pick the 7900 XTX for its 24GB of VRAM over the 4080, because VRAM is king in high-resolution gaming and it's become clear that nowadays 8GB ain't enough even at 1080p for some games. Obviously, if you have the disposable income, then the 4090 is probably worth the extra cost, but it is a big cost difference for a ~40% performance increase.

What about future-proofing on the 4090? It still looks limited in some real-world benchmarks, like there's sometimes still potential in it that's not being exploited in the test. Its RT abilities probably help with future-proofing too?
 
I don't think they really care about domestic PC market share; they already have 99.94% of the console market, which is probably their biggest revenue source besides the commercial GPU/server market.

I don't buy that. It's like saying their shareholders don't care about money... :)

Depends how many they sell.
But it can also be around 900-1000 bucks, and the rest, probably, about half of what they are now.

I think £999 would be fine for that card personally. Not that I would pay that, but that is what I think. £650 just does not compute :D
 
What about future-proofing on the 4090? It still looks limited in some real-world benchmarks, like there's sometimes still potential in it that's not being exploited in the test. Its RT abilities probably help with future-proofing too?
It will be about as future-proof as something like the 2080 Ti was.

Personally, I think the 4090 and 4080 will age best. The 7900 XTX lags too far behind in RT and upscaling.
 
It will be about as future-proof as something like the 2080 Ti was.

Personally, I think the 4090 and 4080 will age best. The 7900 XTX lags too far behind in RT and upscaling.

I think I agree. Tougher call with the 4080 but I think the 4090 will certainly age better. Over £500 in it though!
 
That's not really worked well for AMD though; they are behind on performance and efficiency, and costs are not really much cheaper.
If AMD truly only care about consoles now, then the real thing they need to be ahead at, or at least even on, is not dGPU perf/efficiency/cost but rather custom SoC perf/efficiency/cost. This is where they have to be competitive.

As it happens:
AD102 is 73.3 billion transistors.
Navi31 (GCD and MCDs) is 57.7 billion transistors.

So that's less than 80% of AD102's transistor budget. In raster, a 20% larger design would probably have beaten AD102. Perf/watt is poorer, and that is a weakness of chiplets, but a few changes might address that, e.g. lower clocks and the ability, while idle (but when do consoles run at idle?), to do everything out of the MCDs' cache and run main memory at 1 MHz or so.
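Quick sanity check on that arithmetic (transistor counts as quoted above; the 20% scaling is the poster's own assumption):

```python
ad102 = 73.3e9   # AD102 transistor count
navi31 = 57.7e9  # Navi31 GCD + MCDs transistor count

print(f"Navi31 / AD102 = {navi31 / ad102:.1%}")     # ~78.7%, i.e. under 80%
print(f"Navi31 x 1.2 = {navi31 * 1.2 / 1e9:.1f}B")  # ~69.2B, still smaller than AD102
```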

Yes, they could be complacent in the knowledge that they can provide both the GPU and the CPU, and that Nvidia are in both Sony's and Microsoft's bad books from previous console contracts - and that Intel is unlikely to get their act together to win the next console round (in terms of perf/watt/cost (area), Arc is way behind Radeon, and Intel's CPUs could only really win this with some kind of E-core monster with at most 2 P-cores, as their CPUs run hot and the P-cores are massive).

However, I think the decision to go all chiplets for the true RDNA3 (Navi33 is still on 6nm, after all) might be to demonstrate to their semi-custom partners that AMD will be able to provide them a chiplet design for the next-gen consoles.
 
I think I agree. Tougher call with the 4080 but I think the 4090 will certainly age better. Over £500 in it though!
The difference could be higher than that as you'll need a beefier PSU and a more powerful system to actually take full advantage of that power.

If the 4080 delivers sufficiently for one's needs, then it's best to save the money for future upgrade(s).
 
It will be about as future-proof as something like the 2080 Ti was.

Personally, I think the 4090 and 4080 will age best. The 7900 XTX lags too far behind in RT and upscaling.

AMD cards have always aged like fine wine. As nice as RT is currently, it's still only a gimmick and not really a mainstream feature; once the next gen of consoles comes out with ray tracing as a baseline feature, then it would be something worth considering as a requirement.

Of the games I play, currently only a few have RT, and none of them have any real reason for it other than that it looks a bit nicer. And well, I grew up playing Pong and Space Invaders, so graphics aren't exactly important to me :)
 
I've heard this before; what do people mean by it? Driver maturity.

Basically yes; over time the drivers often improve performance to the point where the cards sometimes outpace their direct competitor by a fair margin.

It's basically been that way since forever. Partly because NV are notorious for neglecting older-generation hardware, and partly because AMD drivers often start out a bit lacking and mature over time.

For example, the AMD R9 290X at launch was about 10% behind the GTX 780 Ti, but over the course of time its performance improved to the point where it's now ~15% ahead.
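As a rough illustration of how large that swing actually is (normalising the 780 Ti to 100 at each point in time, and attributing the whole change to drivers, as the post does):

```python
# Relative performance vs the GTX 780 Ti, per the figures claimed above.
gtx_780ti = 100.0
r9_290x_launch = 0.90 * gtx_780ti  # ~10% behind at launch
r9_290x_now = 1.15 * gtx_780ti     # ~15% ahead now

swing = r9_290x_now / r9_290x_launch - 1
print(f"Relative gain over the 780 Ti: {swing:.0%}")  # ~28%
```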
 
It will be about as future-proof as something like the 2080 Ti was.

Personally, I think the 4090 and 4080 will age best. The 7900 XTX lags too far behind in RT and upscaling.
This is assuming all games actually release with either tech.

So I don't consider this future-proofing if games don't actually have these features.

The big games on PC either have them or don't at all.
 
I've heard this before; what do people mean by it? Driver maturity.
People always say maturity, but I think it's just AMD drivers being more compatible with games. From what I understood, a lot of games were developed on Nvidia hardware, so driver compatibility with Nvidia was usually better performance-wise.
 
I don't think they really care about domestic PC market share; they already have 99.94% of the console market, which is probably their biggest revenue source besides the commercial GPU/server market.
It's reckoned that AMD are making single-digit dollars of profit per console - as in, on both the CPU and GPU combined. They make their profit on volume.
 
I'm more focused on Nvidia due to CUDA, so personally I'm OK with the 4090 being in the 1,500 range (if I ignore the potential fire risk from the power connector...), though the 4080 should really be 1,000 tops, imo.

The issue is when you go lower than that: once you start considering the direction games are going with VRAM requirements, I'm not even sure the lower cards are suitable as current-gen products anymore, especially considering their prices. Maybe the 4070 Ti, if you're doing high-fps 1080p/1440p and it were available for around 500 or less.
 
Basically yes; over time the drivers often improve performance to the point where the cards sometimes outpace their direct competitor by a fair margin.

It's basically been that way since forever. Partly because NV are notorious for neglecting older-generation hardware, and partly because AMD drivers often start out a bit lacking and mature over time.

For example, the AMD R9 290X at launch was about 10% behind the GTX 780 Ti, but over the course of time its performance improved to the point where it's now ~15% ahead.

So there is an expectation that the 79s will get noticeably faster?
 