• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

*** The AMD RDNA 4 Rumour Mill ***

Do these Moths come with the AMD card? :D

Maybe we should just let people buy what they want and not try to tell anyone they're wrong or that they could've got something better. Maybe it's objectively better, but not better for them; different people will have different needs. Also, and I think this can play a part in this (as with other things), sometimes people have a bad experience with a brand and that puts them off. I have friends that had trouble with AMD cards in the past so now they prefer to buy Nvidia; I think people on here have probably had similar or opposite experiences which will add to their bias. I've had poor experiences with Gigabyte and MSI motherboards in the past, so I've been reluctant to try them again, even though they might be better than the Asus and ASRock boards I've bought instead.
There are so many factors that might affect this that we can't know about; it's not going to be the same for everyone.

I don't think there's an easy answer for AMD; I'm sure if there was they'd have done it. I'm sure somebody there has considered "make them cheaper", but it's not that simple, and making them better is easier said than done. I know AMD have a smaller budget than Nvidia, but that's not really a consumer concern; we just care about how good the end product is, not how good it is considering...

No, but we should care about tech journos pushing the prices of these things up.
 
I believe we'll only get 7900 XT level performance, with 4070 Ti level RT at most.

But most likely we'll end up with 7900 GRE raster and 4070 RT.

Pricing I'm expecting to be the same as the 7800 XT's launch price.

I'm not boarding the hype train.

I still firmly believe that AMD pushed all their cards a tier up from where they should have been.

Wait... so you think +10% in raster and +20% in RT vs the 7800 XT? Or +10% RT vs the 7900 GRE.

 
The next 8800 XT will be 7900 GRE level with the RT of a 4070 Ti, or maybe 7900 XT level performance with 4070 Ti level RT, but most likely the former.

I'm more than happy to be proved wrong, and I hope AMD do prove me wrong, but history suggests they'll mess it up. They have an opportunity here.

Yeah, you said that. 7900 GRE level is basically no change; I don't remember AMD ever doing that, so I don't know what history you're referring to?

Also, seriously with the Moore's Law is Dead-esque disclaimers? Commit to what you're saying, or you're really not saying anything at all. You clearly don't believe enough in what you're saying, so maybe don't say it? Do you really want to fall back on "but... but... I also said this" disclaimers? It's why many people find him obnoxious.

Which is it?
 
Each time AMD have an open goal they botch the launch one way or another: they price their cards too high, the performance uplift isn't there, they take too long to release the full range of cards, etc.

From the 6800 XT to the 7800 XT there wasn't much of a generational uplift; other than a slight RT improvement there wasn't much between the cards.

RX 6700 XT: $479.99
RX 6800: $579.99
RX 6800 XT: $649.99

RX 7800 XT: $499.99

The nearest release-price comparison to the RX 7800 XT is actually the RX 6700 XT; I think it was intended to be a reduced-price RX 6800 successor.
 
I'm guessing these were launch prices?
Do we know what those 6000 series cards were selling for when the 7800 XT was released?

Using the Wayback Machine, on an entry from a competitor's site on the 14th of October 2023, so one month after the 7800 XT launch, I found a 6800 XT for £540, which was the cheapest, and a 6950 XT on the same day for £570.

On that same day the 7800 XT was £515, for the Sapphire Pulse. I think they may have been cheaper before the 7800 XT launch.

There was no capture for the 7800 XT launch day itself, the 6th of September 2023.

The thing is, if you base your successor card's pricing on the cheaper EOL card, you end up at $0 eventually. Say the 6800 XT was $400 at the time, so you launch the 7800 XT at $400; when that goes EOL it's $300, so the 8800 XT is then $300... you see how that doesn't work?
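
A toy sketch of that race to zero (all the numbers, and the card names beyond the 8800 XT, are made up purely for illustration):

    # Toy illustration: pricing each new card at its predecessor's
    # EOL street price, assuming roughly $100 comes off between launch
    # and EOL, trends toward $0. Numbers and names are hypothetical.
    launch_price = 500  # assumed 7800 XT launch price, in $
    eol_discount = 100  # assumed markdown by the time it goes EOL, in $

    for card in ["8800 XT", "9800 XT", "10800 XT"]:
        launch_price -= eol_discount  # new card launches at the old card's EOL price
        print(f"{card} would launch at ${launch_price}")
    # 8800 XT would launch at $400
    # 9800 XT would launch at $300
    # 10800 XT would launch at $200 ... and so on towards $0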

Interesting view. I just tried doing something similar for the 7000 > 8000 series and can't find quite the right comparison. Basically it comes down to whether it's above or below the GRE launch price of $549, I think?

It all depends on the performance of the thing. If it's between the 7900 XT and 7900 XTX, as rumoured, it needs to be... yeah, around $550 to be next gen; $500 would make it pretty good.
 
I don't think Nvidia get a pass as such; people complain about their prices plenty. It's just that if you want to buy a new GPU, what choice do you have? So people end up paying Nvidia's prices.

If they think the only choice is Nvidia, then why would they care about AMD's pricing?
 
The only Nvidia software trick I'd give any value to is DLAA, and I'd like to see an AMD equivalent of it.
As for everything else, I see those features only as a way to pad developers' margins by cutting costs on optimisation; we went from using MSAA to actually rendering below native resolution and upscaling, and I'm beyond baffled by that.

 
Yeah, the Mark Cerny presentation on the PS5 Pro yesterday was very interesting.


Anyone looking at this, just watch the original video by Mark Cerny; he does a very good job of explaining it himself. Daniel Owen really isn't adding a lot to this: all he's doing is playing the Mark Cerny video at double speed so you can't really understand it, and then repeating what Mark Cerny said in slow motion with a sprinkling of stating the obvious. It's obnoxious.
 
Unfortunately for AMD's data centre GPUs, there is a bit of a difference between claimed and actual performance. But it's understandable, as there are many more variables to benchmarking these things; it's not as easy as starting Windows and loading up a game. Due to these factors, companies will choose the hardware that best meets their software needs, and that's that; trying to compare AMD and Nvidia data centre GPUs directly is problematic: https://www.forbes.com/sites/karlfr...d-is-not-the-fastest-gpu-heres-the-real-data/

That's very narrow performance testing and it reads more like Nvidia cope marketing for their investors.

The most powerful supercomputer on Earth has AMD GPUs, not Nvidia; the people who build these billion-dollar computers don't take their advice from Forbes.
 


As per this review that was just released: going by AMD's marketing and GPU specs, the MI300X should be a slam dunk against anything Nvidia has, yet as the review shows, reality is very different and it lags significantly behind Nvidia. And that's after AMD sent teams of engineers to update software to improve on-site performance, against Nvidia's out-of-the-box performance with no engineers required.

Key Findings

  1. Comparing on-paper FLOP/s and HBM bandwidth/capacity is akin to comparing cameras by merely examining megapixel count. The only way to tell the actual performance is to run benchmarks.
  2. Nvidia's out-of-the-box performance and experience is amazing, and we did not run into any bugs during our benchmarks. Nvidia tasked a single engineer to us for technical support, but we didn't run into any Nvidia software bugs, so we didn't need much support.
  3. AMD's out-of-the-box experience is very difficult to work with and can require considerable patience and elbow grease to move towards a usable state. AMD's stable releases of PyTorch are still broken and we needed workarounds.
  4. If we weren't supported by multiple teams of AMD engineers triaging and fixing bugs in AMD software that we ran into, AMD's results would have been much lower than Nvidia's.
  5. For AMD, real-world performance is nowhere close to its on-paper marketed TFLOP/s.
  6. Training performance is weaker, as demonstrated by the MI300X's matrix multiplication micro-benchmarks, and still lags that of Nvidia's H100 and H200.
  7. AMD's training performance is also held back because the MI300X does not deliver strong scale-out performance. This is due to its weaker ROCm Compute Communication Library (RCCL) and AMD's lower degree of vertical integration with networking and switching hardware, compared to Nvidia's strong integration of its Nvidia Collective Communications Library (NCCL), InfiniBand/Spectrum-X network fabric and switches.
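
For anyone wondering what the matmul micro-benchmarks in finding 6 actually involve, a minimal PyTorch sketch looks something like this (the sizes, dtype and iteration counts are my own assumptions, not the review's actual harness; the same code runs on AMD via ROCm, which reuses the "cuda" device name):

    import torch

    # Minimal GEMM micro-benchmark: measure achieved TFLOP/s on one large
    # matrix multiply and compare it against the card's on-paper number.
    n = 8192
    a = torch.randn(n, n, dtype=torch.float16, device="cuda")
    b = torch.randn(n, n, dtype=torch.float16, device="cuda")

    for _ in range(3):  # warm-up so we don't time one-off setup costs
        a @ b
    torch.cuda.synchronize()

    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    iters = 20
    start.record()
    for _ in range(iters):
        a @ b
    end.record()
    torch.cuda.synchronize()  # wait for the GPU before reading the timer

    secs = start.elapsed_time(end) / 1000 / iters  # elapsed_time is in ms
    print(f"achieved: {2 * n**3 / secs / 1e12:.1f} TFLOP/s")  # 2*n^3 FLOPs per GEMM

The gap between that printed number and the marketing TFLOP/s figure is what findings 1 and 5 are about.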

This really isn't worth doubling down on.

Nvidia own 99% of the software, so Nvidia's GPUs are going to run better and faster than AMD's even if the compute power on them is lower. The first Key Finding almost gets to that conclusion, almost, but not quite; they don't actually understand what they are writing about, like 90% of writers these days. Or they do and that's not the point: the point is reassurance to Nvidia's equally dumb investors that no matter how much AMD can capitalise on their breakthrough efforts, Nvidia still has the bigger willy.

Not everyone uses or even likes Nvidia's ecosystem. Intel used to do the same: they created an ecosystem and dictated it to their customers. It's kind of good, because you get a stable, ready-made environment for your hardware. For Intel... they get to lock you in, so you become dependent on that ecosystem and find it difficult to switch even if you wanted to.

AMD came along with a new idea, and here it is: you tell us exactly what you need and we will build it for you. AMD have been doing that long enough now that other people's ideas, which AMD then turn into reality, are proving genuinely appealing to many other people.
Beyond that, not everyone wants you to do it for them; sometimes, quite often actually, all you need is the hardware. If the hardware is not black-boxed, you can program it yourself, old school.
Nvidia's hardware and ecosystem are not the be-all and end-all; they have their own problems, and yes, AMD's hardware is more powerful in TFLOP/s. That's why you're seeing that nonsense from Nvidia's marketing arms plastered all over the Internet now: people are starting to take notice of AMD's hardware and ideas, enough that it's rattling Nvidia.
 
I'll tell you another thing Intel have figured out, something that is now starting to become an anvil around their necks, and which Nvidia are clever enough to see as a concern.

Developing and maintaining a vast ecosystem infrastructure is very, very expensive; you can only do it with a combination of two things: very high margins and market domination.

Intel are losing vast amounts of money, not because the CPUs cost more to make than they sell them for (certainly not with multi-thousand-dollar chips, no matter how big or complex they are); they cost $3K to $15K a pop because of the costs associated with maintaining that ecosystem.

Intel have been forced to compete with much faster and more efficient CPUs. Aside from them actually failing at that, it has still pushed the development and manufacturing costs of their chips to heights never before seen, while at the same time reducing the price of those chips, effectively cutting the ecosystem service fee by quite a chunk. On top of that, Intel have now lost significant market share: AMD are now at 30% product share and 40% revenue share.

All that amounts to a significant chunk of Intel's revenue transferred to AMD. Years ago, Intel made the same arguments you're seeing in these articles you're posting:
AMD don't have as many engineers and middle managers available on speed dial, they don't have as many people embedding their compilers into every bit of software, and they don't have as many marketing people bamboozling everyone with very underhanded slide presentations.

So AMD will never replace Intel? They don't have to. All they have to do, and what they are doing, is make that model unsustainable for Intel, and Intel will lose its advantage. It's working.

Nvidia see what AMD are doing and they do recognise it as a real threat. AMD don't need to take large chunks of Nvidia's market share; all AMD need to do is make Nvidia fight for it in the same way they made Intel fight for it. You only need to nibble away at your competitor's 95% market share and reduce the revenue they gain from it to make it unsustainable.

Again, these articles are the very same noises Intel was making years ago. Intel made them because they were rattled by what AMD was doing, and Intel was right to be rattled.
 
Not sure what the data centre has got to do with gaming cards, but it appears Oracle and IBM are fine with the AMD offerings:





The larger customers, with dedicated teams and their own software ecosystems, are buying whatever AMD is making. That means that, longer term, future AMD cards will integrate far more easily into their ecosystems. A bit like how Sony works with AMD on its consoles.

The smaller teams are probably relying more on Nvidia for support, and nobody is shocked that Nvidia has mature software.

Anyway, if you are a gamer, you shouldn't be cheering on any of these companies doing "well" in AI. It only means that the gaming cards will cost more, be delayed more, have less VRAM and be more cut down.

Despite all the "loyalty" marketing these companies put out, the reality is that gamers are treated as second-rate now. Mining, AI, supercomputers, commercial sales, etc. all seem to get priority over gamers. If the companies do worse in those markets, it's better for gamers. Not sure why people are trying to flex non-gaming sales. It doesn't help us!!

AMD made GPUs specifically for gamers: RDNA 1, 2 and 3. Yes, AMD made some really smoothbrain decisions with them, especially with some of RDNA 3, but not all of it was bad; some of it was good.

No one wanted even those. AMD learned some lessons they needed to learn; we didn't learn a thing, we just carry on doing the same thing over and over again expecting a different outcome.

AMD will change; the changes they are making are to insulate themselves from us.
 
Problem is, even if quality and performance were somehow identical, they are still hundreds of games behind, and let's be honest, there will be almost zero retrofitting. :(

The quality will never be equal or even good enough; try to picture Digital Foundry saying FSR 4 is just as good as the latest DLSS.
 