• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Any news on the 7800 XT?

The problem with that is that Nvidia cards are currently overpriced by 50%, so AMD coming in 20% cheaper would still leave them 30% overpriced. AMD should have matched Nvidia's last-gen MSRP pricing.
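For what it's worth, the exact figure depends on how you read "20% cheaper"; here is a quick sanity check using a hypothetical fair-price baseline (the 100 is arbitrary, only the ratios matter):

```python
# Hypothetical fair-price baseline to check the percentages in the post above
fair = 100.0
nvidia = fair * 1.50                     # "overpriced by 50%"
amd = nvidia * (1 - 0.20)                # "20% cheaper" than Nvidia's actual price
print(f"{amd / fair - 1:.0%}")           # 20% -> still a fifth over the fair baseline

# Reading "20% cheaper" as 20 percentage points instead gives the post's 30%:
amd_points = fair * (1.50 - 0.20)
print(f"{amd_points / fair - 1:.0%}")    # 30%
```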

Exactly. One launches at very high prices, the next launches at slightly lower prices. Then the other does a refresh with slightly lower prices, and so on. Then the next generation arrives, and another price hike. It's like comparing one energy company with another: some might offer slightly better deals than the others, but the electricity/gas is still overpriced.

It reminds me of Apple and Samsung pushing out minor phone updates with barely better cameras, etc. People offered numerous explanations for why it happened, when quite clearly they were doing it because they could. When the Chinese companies started entering the high end, things suddenly changed very quickly.
 
Zen 3 was faster than Coffee Lake.
Yeah, I meant 4 generations. Ryzen 1000-3000 was behind in gaming but had far better price/performance, sometimes 50% cheaper for equal cores. Do you think Ryzen would have done well if AMD had priced it only 10-20% below Intel's HEDT when it first released?

Intel was selling 6 cores for $450 and 8 cores for $1,000; AMD came in with 6 cores for $218 and 8 for $330. Now imagine AMD had looked at Intel's pricing and thought, let's sell our Ryzen 7 1700 for $800 and our Ryzen 5 1600 for $400. Do you think they would have gained any market share? Because that's pretty much what they're doing with GPUs.
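A quick back-of-the-envelope check of the price-per-core gap, using only the launch prices quoted above (a rough sketch; real value also depended on clocks, platform cost, etc.):

```python
# Launch pricing from the post: Intel HEDT vs first-gen Ryzen, $ by core count
intel = {6: 450, 8: 1000}
amd = {6: 218, 8: 330}

for cores in (6, 8):
    discount = 1 - amd[cores] / intel[cores]
    print(f"{cores} cores: Intel ${intel[cores] / cores:.0f}/core vs "
          f"AMD ${amd[cores] / cores:.0f}/core ({discount:.0%} cheaper)")

# 6 cores: ~52% cheaper, 8 cores: ~67% cheaper -- in line with the
# "sometimes 50% cheaper for equal cores" claim above
```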
 
The fact they are bowing out of the high end for RDNA4 says it all: they don't think it's financially worthwhile to chase the high end (perhaps due to production costs/R&D?).

Hopefully chiplets will be a Zen 2 to Zen 3 moment for RDNA5, and we'll all get the performance we so badly crave at the cheapest possible price.
 

It's all rumours as to exactly what AMD is doing. Don't people consider it weird that we have a Navi 44 next generation when there is no Navi 34? For all we know, Navi 43 might be a chiplet-based design too, just less ambitious than Navi 42 or Navi 41, and Navi 44 might be a laptop and desktop OEM-orientated design. If AMD literally shrunk the Navi 33 GCD down to 3nm with RDNA4 improvements, it would be considered midrange; but if, instead of one GCD, they used two GCDs on a single card, then it wouldn't be midrange. They have already done the work on the 6nm MCDs. I find it weird they would just suddenly jump from one GCD to nine!

You would need to test a two-GCD design at the very least to validate that the system works.

If you look at Zen 2 and Zen 3, the CPU chiplets are tiny.
 

I don't think CPUs are comparable to GPUs. You need a CPU to make your system work; beyond that it just sits in the background doing its thing, unnoticed.


The reason Intel can't put AMD back in the box is that they can't compete with AMD's chiplet technology. Intel can't make chips cheap enough to cause a problem for AMD's bottom line, and not for lack of trying.

The way Intel almost finished AMD off in the 2000s was to give chips away and then pay people not to use AMD CPUs, denying AMD sales of their better CPUs; they had an enormous cash stash to do it. If you look at Intel's financials you can see they are trying something similar again, and failing. Intel are making zero margin on their Xeon product line while AMD are making around 30 to 40%, because to this day Intel are still incentivising people to buy Xeons instead of EPYC where they can, while gradually losing market share. They have been doing it for a while and the money is gone: Intel sacked 30% of its workforce, is closing down many of its businesses, and is burning billions every quarter.
And they are still trying, stubbornly, or maybe just to maintain some sort of grip in the datacentre.

If AI kicks off, Intel will have even more problems, because the fight there will be between Nvidia and AMD. Those two have by far the best products, and some of the stuff AMD has in development is ##### astonishing; Nvidia will struggle against it, but again they have the mindshare. Intel are nowhere. AI is the sort of thing Nvidia and AMD are particularly specialised at: data and number crunching, very, very high-performance number crunching. Outside of fabs I don't see a long-term future for Intel.

AMD need chiplets because Nvidia will not be able to compete with that using monolithic dies. Without chiplets AMD have no advantage over Nvidia, and Nvidia can always price-match AMD out of the market, a bit like Intel did to AMD all those years ago. That's why AMD are all in on chiplets, perhaps even pushing too hard too fast, which is why they have had to back off for Navi 4.
 
This was Navi 4C

[attached images]
 
I don't think CPUs are comparable to GPUs. You need a CPU to make your system work; beyond that it just sits in the background doing its thing, unnoticed.
Maybe not technically, though both are processing units built in a similar way; but the concept of a smaller competitor overturning a larger competitor's mindshare stranglehold on the market by offering comparable products at less than half the price still stands.
 

People don't care beyond performance. With GPUs it's all about features, and AMD don't have good RT or DLSS. Even though that's only really true for Cyberpunk, that's what everyone references, including tech journalists. Wherever AMD do quite well it's always "this is an AMD game"; with Cyberpunk it's always "this is the baseline for RT, look how bad AMD are".
 

The FSR situation is AMD's own fault - RDNA3 has been out since last year and they still haven't launched FSR3. It only needs FSR3 to get close enough for AMD to have the same tick-box "value added" features as Nvidia.
 

FSR3 is only one aspect; Nvidia are still 3X faster with RT, at least in one Nvidia RT showcase title, but that's all that matters.
 

3X faster, when it's 5 FPS on one and 15 FPS on the other. Just use upscaling from 720p to 1080p with fake frames, and you can get a soft, latency-ridden mess. That applies to both DLSS and FSR3. I wonder if Ampere will suck in this game too? Kepler sucked in The Witcher 3, which was ever so convenient because Maxwell didn't (and even the AMD cards could handle some effects better, due to the ability to control tessellation).
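To put some numbers on that (the 5/15 FPS figures are the illustrative ones from this post; the rest is just arithmetic):

```python
# Relative RT performance using the illustrative FPS figures above
amd_fps, nvidia_fps = 5, 15
print(f"speedup: {nvidia_fps / amd_fps:.0f}x")  # "3X faster"
print(f"frame times: {1000 / amd_fps:.0f} ms vs {1000 / nvidia_fps:.0f} ms")
# 200 ms vs 67 ms per frame -- neither is remotely playable

# The 720p -> 1080p upscale means rendering well under half the pixels
print(f"1080p/720p pixel ratio: {(1920 * 1080) / (1280 * 720):.2f}x")  # 2.25x
```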

I wonder if all the people who say games are poorly optimised (I expect that will be levelled at Starfield) will say the same when Phantom Liberty ends up worse optimised than the original game! :p

What's worse, it appears the next CDPR games will use UE5, which seems to recommend more VRAM too.
 
All the 4070/4070 Ti/3080/3080 Ti owners will probably claim poor game optimisation! :D:D
 
Damn, this is irritating. Is the 7800 XT going to be worth waiting for, or should I just buy a 6800 XT for £500?

I don't think it is, because I don't think it's going to beat a 6800 XT for £500.

But there's no way of knowing aaaaaarrrgggghhhhhh!!!

:cry:
 
I think it'll be over £500, more like £550, with similar raster and a bit better RT. No doubt AMD are planning to use FSR3 to make it seem like an upgrade over the 6800 XT.
 
My tip is to buy something and enjoy it now. Any cost/performance difference is offset by the hours of fun you'll be having in the meantime. I strongly suspect the 7800 XT will be underwhelming, so much so that I just bought a 6950 XT months ago.
 
Maybe not technically, though both are processing units built in a similar way; but the concept of a smaller competitor overturning a larger competitor's mindshare stranglehold on the market by offering comparable products at less than half the price still stands.
A bit hard to do when both are using the same foundry, and if anything Nvidia have the perf/transistor advantage (mainly because nobody can explain what all the extra transistors on Navi 31 actually accomplished that a straight port of Navi 21 to 5nm wouldn't have done far better).

So humbug's point stands: until they can crack chiplets, they won't be able to compete with Nvidia. What they really need is Nvidia being way behind in chiplets, getting desperate, and throwing area at the problem (but with AD102 already being over 600mm², that is hard).

In other words, they need Nvidia to paint themselves into a corner like Intel did, when all Intel could think of was to make their P cores huge.

Nvidia seldom mis-step, and even chiplet dGPUs will have lower margins per wafer than consumer Ryzen, but that's AMD's only real chance in dGPUs IMO.
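For intuition on why splitting a big die helps, here is a minimal sketch using the textbook Poisson yield model Y = exp(-D*A). The defect density below is an assumed illustrative figure, not a real foundry number; only the >600mm² size comes from the AD102 comparison above:

```python
import math

D = 0.1  # assumed defect density, defects per cm^2 (illustrative only)

def yield_frac(area_mm2):
    """Poisson model: fraction of dies of this size that are defect-free."""
    return math.exp(-D * area_mm2 / 100.0)  # area converted mm^2 -> cm^2

def wafer_mm2_per_good_product(die_mm2, dies_per_product):
    """Wafer area consumed per working product, scrapping defective dies."""
    return dies_per_product * die_mm2 / yield_frac(die_mm2)

mono = wafer_mm2_per_good_product(600, 1)  # one monolithic ~AD102-sized die
chip = wafer_mm2_per_good_product(300, 2)  # same silicon as two chiplets
print(f"monolithic: {mono:.0f} mm^2 of wafer per product")    # ~1093
print(f"chiplets:   {chip:.0f} mm^2 of wafer per product")    # ~810
print(f"silicon saving from chiplets: {1 - chip / mono:.0%}") # ~26%
```

Packaging and the inter-chiplet interconnect eat into that saving, which is presumably part of why the margins-per-wafer point above still holds.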
 