• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Any news on 7800 XT?

Soldato
Joined
15 Oct 2019
Posts
12,048
Location
UK
The name is meaningless. The performance and cost are what matter.
The name is important and influences people's buying decisions. How many people who bought a 4070 Ti would have done so if it was called a 4060 for £800? Giving lower-end GPUs higher-tier names to hide much-larger-than-inflation price increases seems to be the latest fad from both manufacturers.
 
Last edited:
Associate
Joined
7 Nov 2017
Posts
1,923
The name is important and influences people's buying decisions. How many people who bought a 4070 Ti would have done so if it was called a 4060 for £800? Giving lower-end GPUs higher-tier names to hide much-larger-than-inflation price increases seems to be the latest fad from both manufacturers.

Not mine. I just care how much it costs and how it performs.
 
Associate
Joined
22 Nov 2020
Posts
1,479
By fine wine do you mean finally getting the much-delayed FSR 3, or maybe fixing the idle power draw with multiple displays?

On a serious note, my previous card was an AMD 5700 XT. I was very pleased with it; it was cheaper than the 2070 Super and performed just as well :)
No one cares about some silly power draw - most only have one screen lol.

Seriously, if next gen the 5070 offered 4080 performance for £700, would you buy it?
 
Last edited:
Soldato
Joined
28 Oct 2009
Posts
5,679
Location
Earth
No one cares about some silly power draw - most only have one screen lol.

Seriously, if next gen the 5070 offered 4080 performance for £700, would you buy it?

It doesn't matter if not many use multiple displays; it's been an issue since launch that needs fixing. I for one use multiple displays, so of course it would concern me if I were looking to buy.

I will assess what both have to offer next gen before deciding. I have no loyalty to either; if anything I would like AMD to stick it to Nvidia.
 
Last edited:
Soldato
Joined
26 May 2014
Posts
2,979
Power consumption gains will be minimal, just like the 7600, and it will be plagued by the same high idle power consumption as N31. As for better ray tracing, that's mostly smoke, because we saw with N31 & N33 that there's no significant gain there relative to RDNA 2; whatever gains it had came mostly from N31 just being bigger (more cores). Really, RDNA 3 = RDNA 2 + the ability to go multi-chip. Fundamentally the changes were there to facilitate that transition & to preserve their profit margins, but at an architectural level the gains were mostly pathetic in every other respect.

And yes, RDNA 3 is 10x worse than the Fury/Vega years, because at least back then there were more differentiating factors for which you could consider AMD, whereas now there are essentially none. For example, 980 Ti vs Fury X: they were close in performance and there were no real feature advantages for Nvidia like today (RT/DLSS etc.), plus pricing was sometimes more attractive for Fury & it had nice undervolting gains. Vega had a definite compute advantage vs the 1080, and OC vs OC they were very well matched, with Vega even pulling ahead at 4K, and again, feature parity more or less. Overall Nvidia still had the better cards for gaming, but it was easy for AMD to drop the price a bit to make theirs competitive, as that's all that was required to balance the scales. With RDNA 3 the prices would have to drop a LOT more to make an argument in many scenarios: in RT it's at least two generations behind Nvidia, against DLSS it's worse in terms of upscaling & it completely lacks frame generation, it only has a middling alternative to Reflex, and then we have ancillary things like CUDA (ML/DL work without it on AMD is a real PITA, ask me how I know...), the better media engine, way better efficiency etc. etc.

Honestly, the most obvious illustration of how much worse RDNA 3 is than Fury/Vega is to just look at outlier results. I can find non-broken (and not sub-30 fps) game examples where Lovelace is >=2x faster than its RDNA 3 equivalents, but I can't recall a single such example for either Fury or Vega. There's simply no way they'd be able to price themselves out of such performance deficits; they need a whole new architecture to actually compete in those scenarios (and to close the feature gap). The crazy thing is they've lost even more ground than last gen WHILE Nvidia gimped most of their own products even harder for more obscene profit margins. So in reality it's even worse than what we see.
The 7600 (which of course you cherry-picked) is inefficient because it's manufactured on an inferior process to the bigger RDNA 3 chips. We've known for a long time that Navi 32 will be on 5nm, the same as Navi 31. The 7900 XTX sees large efficiency gains over even the most efficient RDNA 2 card (the 6800 non-XT), and the 7800 cards will be the same.

I can't be bothered to address every bit of the rest of your rewriting of history to paint Fury and Vega as successes, but they were an absolute joke at the time, inferior to Nvidia's offerings in every way, and everybody knew it. AMD's current VRAM advantage is a bigger selling point in the real world than any fantasies about Vega's compute capabilities, which were labelled irrelevant at the time (and indeed were to 95% of people interested in Radeon/GeForce cards). The same goes for AMD's supposed "feature parity": Nvidia fanboys always find an excuse on that front, so it was simply never a thing. These days it's DLSS and ray tracing... and drivers. Back then it was GameWorks and G-Sync... and drivers. Oh, and power efficiency of course, which Nvidia zealots only care about when they're winning.

AMD have never had "feature parity" in the minds of the vast majority of people (i.e. Nvidia customers), regardless of the reality of the situation. The vast majority of people still don't give a crap about ray tracing either. You can sit in the hyper-enthusiast bubble of this forum and pretend otherwise, but it's not what I see elsewhere in more casual spaces. That's why RDNA 2 clearance stock has been shilled so heavily (and not just by the shills) versus the Ada mid-range, despite all Nvidia's "features". Most people just want a card that performs well and has enough VRAM that they don't need to worry about it for the next five years. The rest is just fanboy noise and purchase justification.

As for more cherry-picking... yeah, not interested. Your narrative about RDNA 3 being "10x worse" than Fury/Vega is deranged, frankly. I don't know whether you're an AMD loony in a pit of despair or an Nvidia one who's lost sight of reality, but you need to give your head a wobble either way.
 
Caporegime
Joined
17 Mar 2012
Posts
48,719
Location
ARC-L1, Stanton System
If it ends up as 60 CU then that basically guarantees it's on par with (or less than 10% ahead of) a 6800 XT. At that point RDNA 3 can go down in Radeon history as their worst GPU generation ever, and it's not even close.


OK, let's analyse this.

RX 6650 XT:
32 CU @ 2500 MHz, 128-bit, RDNA 2

RX 7600:
32 CU @ 2600 MHz, 128-bit, RDNA 3 (+3% actual performance tested)

RX 6800 XT: 72 CU @ 2250 MHz, 256-bit, RDNA 2
RX 7800 XT?: 60 CU @ ?????, 256-bit, RDNA 3

So, the 7800 XT? has 83% of the shaders of the 6800 XT; for that ^^^ to be 10% faster it would need to be clocked at at least 3100 MHz, probably 3200 MHz or 3300 MHz (rough arithmetic sketched below).

I doubt it will do that, so is this a 7800 XT, if it even exists? The chances are it's clocked at around 2500 MHz to 2600 MHz, which would make it about 10% faster than the RX 6800.

Edit... what happened to the 70 CU Pro W7800?


7800 XT my arse..... it's not difficult to dismiss this.
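
For anyone who wants to sanity-check that shader/clock arithmetic, here's a rough back-of-the-envelope sketch in Python (required_clock_mhz is purely an illustrative helper, not anything official). It assumes performance scales linearly with CU count × clock, so it lands a little below the 3100-3300 MHz quoted above, which allows for clock scaling being worse than linear:

```python
# Back-of-the-envelope scaling: assume performance ~ CU count x clock speed.
# A big simplification -- it ignores memory bandwidth, IPC changes and the
# fact that performance never scales perfectly with clocks.

def required_clock_mhz(base_cus, base_clock_mhz, target_cus, perf_multiplier):
    """Clock the target part needs to hit perf_multiplier x the base part's performance."""
    return perf_multiplier * (base_cus * base_clock_mhz) / target_cus

# RX 6800 XT: 72 CU @ ~2250 MHz. Rumoured 60 CU "7800 XT":
print(required_clock_mhz(72, 2250, 60, 1.10))  # ~2970 MHz to be 10% faster
print(required_clock_mhz(72, 2250, 60, 1.00))  # ~2700 MHz just to match it
# At a more plausible 2500-2600 MHz it lands much nearer the 60 CU RX 6800,
# which is the point being made above.
```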
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
12,048
Location
UK
OK, let's analyse this.

RX 6650 XT:
32 CU @ 2500 MHz, 128-bit, RDNA 2

RX 7600:
32 CU @ 2600 MHz, 128-bit, RDNA 3 (+3% actual performance tested)

RX 6800 XT: 72 CU @ 2250 MHz, 256-bit, RDNA 2
RX 7800 XT?: 60 CU @ ?????, 256-bit, RDNA 3

So, the 7800 XT? has 83% of the shaders of the 6800 XT; for that ^^^ to be 10% faster it would need to be clocked at at least 3100 MHz, probably 3200 MHz or 3300 MHz.

I doubt it will do that, so is this a 7800 XT, if it even exists? The chances are it's clocked at around 2500 MHz to 2600 MHz, which would make it about 10% faster than the RX 6800.

Edit... what happened to the 70 CU Pro W7800?


7800 XT my arse..... it's not difficult to dismiss this.
AMD are just lucky Nvidia didn't use the same size dies and prices as last gen, or else AMD would have had their top XTX card beaten by a £469 70-class card.

Nvidia must have known how poor the RDNA 3 arch was, which is why they felt they could get away with using tiered-down dies, as they have no real competition. Maybe this is also why AMD renamed their cards: they knew that if they came in too aggressively, Nvidia could just refresh with the real dies and unlock the 70%+ Ada performance gains across the whole line-up.
 
Soldato
Joined
6 Feb 2019
Posts
17,976
I'm not sure I buy the "Nvidia knew" story; if Nvidia knew exactly how RDNA 3 would perform, then why even release the RTX 4090? The 4090 was the first card and it was designed to maximise the architecture, and it's as if, after they designed that, they realised they didn't need the performance and cut the rest down. The 4090 looks like a major outlier on every chart. It looks like Nvidia thought RDNA 3 would be a beast and they had to push Ada as hard as they could.
 
Last edited:
Associate
Joined
28 Nov 2005
Posts
370
Location
Wrde
I'm not sure I buy the "Nvidia knew" story; if Nvidia knew exactly how RDNA 3 would perform, then why even release the RTX 4090? The 4090 was the first card and it was designed to maximise the architecture, and it's as if, after they designed that, they realised they didn't need the performance and cut the rest down. The 4090 looks like a major outlier on every chart. It looks like Nvidia thought RDNA 3 would be a beast and they had to push Ada as hard as they could.
Or Nvidia just needed to set the increased pricing-tier expectations further down the stack from day one. It doesn't matter how vocal people are regarding the price increases; they still keep buying, so it becomes normalised. Regardless, expect further rises next gen, with slowing real-world performance outside of the increasingly priced halo card used to justify it all.
 
Soldato
Joined
16 Sep 2018
Posts
12,731
It doesn't matter how vocal people are regarding the price increases; they still keep buying, so it becomes normalised.
They don't...
 
Associate
Joined
28 Nov 2005
Posts
370
Location
Wrde
They don't...
Ah, but wait till next gen, when all this will seem reasonably priced in hindsight. With the focus on AI, lower-volume, increasingly priced GPUs will be the norm, as per all of the comments about the jacket's appreciation of the Apple model. Also, many will have skipped a gen for the first time in a long time with the current gen. Do you seriously expect people to start buying AMD?
 
Soldato
Joined
16 Sep 2018
Posts
12,731
Ah, but wait till next gen, when all this will seem reasonably priced in hindsight. With the focus on AI, lower-volume, increasingly priced GPUs will be the norm, as per all of the comments about the jacket's appreciation of the Apple model. Also, many will have skipped a gen for the first time in a long time with the current gen. Do you seriously expect people to start buying AMD?
I don't expect anything, as I'm not in the habit of guessing about the future when it's nothing more than a guess. Maybe if you could give me reasons for why you think increasingly priced GPUs will be the norm and people will not 'start' buying AMD, then I could get on board; but as it is you seem to be dealing in absolutes and making some massive assumptions.
 
Soldato
Joined
15 Oct 2019
Posts
12,048
Location
UK
I'm not sure I buy the "Nvidia knew" story; if Nvidia knew exactly how RDNA 3 would perform, then why even release the RTX 4090? The 4090 was the first card and it was designed to maximise the architecture, and it's as if, after they designed that, they realised they didn't need the performance and cut the rest down. The 4090 looks like a major outlier on every chart. It looks like Nvidia thought RDNA 3 would be a beast and they had to push Ada as hard as they could.
You can be sure Nvidia know exactly where AMD are at months before the cards are announced. Take last gen, when Nvidia used the 102 die for the 3080 to combat AMD's much stronger-performing RDNA 2; this time they used a 104 renamed to a 103 and still charged £1200, as they knew AMD's best effort was no better.

Nvidia build the architecture and have multiple versions of each card tested and ready to go, then decide on the final spec once they know where the competition is at. This time they realised RDNA 3 was a failure, but they weren't going to put the 4090 on a 103 die, as the top card always uses the 102, so they just cut it down more than usual; it's far more cut down than any of the Ampere 80 Ti/90 cards from last generation, which means higher yields + more profit, and Nvidia have plenty more in the bank on that one if needed.
 
Associate
Joined
22 Nov 2020
Posts
1,479
You can be sure Nvidia know exactly where AMD are at months before the cards are announced. Take last gen, when Nvidia used the 102 die for the 3080 to combat AMD's much stronger-performing RDNA 2; this time they used a 104 renamed to a 103 and still charged £1200, as they knew AMD's best effort was no better.

Nvidia build the architecture and have multiple versions of each card tested and ready to go, then decide on the final spec once they know where the competition is at. This time they realised RDNA 3 was a failure, but they weren't going to put the 4090 on a 103 die, as the top card always uses the 102, so they just cut it down more than usual; it's far more cut down than any of the Ampere 80 Ti/90 cards from last generation, which means higher yields + more profit, and Nvidia have plenty more in the bank on that one if needed.
Sounds like you need to work for Jensen as you have it all worked out lol

Seriously, what GPU do you roll with now? And when do you see yourself upgrading?
 
Last edited:
Soldato
Joined
18 Feb 2015
Posts
6,492

Ah yes, the classic: everything you can't argue against is cherry-picking. Meanwhile, in the real world, AMD has dropped to its lowest point ever and can't even keep market share vs Intel (LOL), let alone challenge Nvidia. Truly a great success for AMD! :cry:

 
Soldato
Joined
30 Mar 2010
Posts
13,110
Location
Under The Stairs!