AMD's FSR3 possibly next month?

When Nvidia publish their results we will probably find they are down in these segments too; people just aren't buying these things as much as they used to.
Likewise, sales of Teslas, Mercedes... are also down, as are consoles, so Embedded is down.

Overall AMD's revenue is good. Client and Data Center are up quite a lot; handhelds, EPYC and the AI GPUs saved AMD's bacon, and overall they actually beat estimates slightly.

If you want to look at a bad result, look at Intel's.
 
Those are not bad margins; Nvidia's 70% margins are just stupidly high, because... well, because they are worth it.

I'm not talking about the margins themselves, but what they mean in relation to the price of the cards.

For every 7900 XTX sold, AMD made around $170 (I know it's less, with the AIB taking their part as well, but let's keep it simple), so the cost of the card was around $830.

If for every 4080 Nvidia made $840 (70%), then the cost to make the card was only... $360.

Now sure, the 4080 is only 379mm², while the 7900 XTX with its memory chiplets is 529mm², but still... the MCM design should have kept costs lower, more so considering the cheaper manufacturing processes.

So... how come the 7900 XTX was so expensive to make? The useless extra vRAM? Or is this more of an accounting "magic" to make the best of the company's profits - or to look better in other areas while hanging a lot of R&D costs on the gaming side...

Keep in mind that Sony was at some point also making a small profit on the digital version of the PS5 - still 16GB of RAM plus an APU, plus a fast (and expensive at the time) NVMe drive, plus a PSU, etc.

If these are really the costs, then no wonder AMD didn't bother to compete on price. They would have been DOA, with Nvidia dropping their prices while still making a profit and AMD unable even to recoup the cost of building the card... Something went really wrong there. Or... the numbers are wrong.

PS: it looks even worse when you look at the transistor counts of the two cards, especially since Nvidia has dedicated units for specific RT/DLSS tasks that aren't used in regular raster - yet they perform similarly.
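If you want to sanity-check that arithmetic, here's a quick sketch; the MSRPs and margin figures are the assumptions from this post, not official numbers:

```python
# Rough check of the per-card cost implied by a profit figure.
# MSRPs and margins are the assumptions from this thread, not official data.

def implied_cost(msrp: float, profit: float) -> float:
    """Cost to build the card if the maker keeps `profit` of the MSRP."""
    return msrp - profit

# 7900 XTX: ~$170 profit on a $999 MSRP -> ~17% margin, ~$829 cost
print(f"7900 XTX: ${implied_cost(999, 170):.0f} cost ({170 / 999:.0%} margin)")

# 4080: 70% of a $1199 MSRP is ~$840 profit -> ~$360 cost
profit_4080 = 1199 * 0.70
print(f"4080: ${implied_cost(1199, profit_4080):.0f} cost (70% margin)")
```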
 
PS: it looks even worse when you look at the transistor counts of the two cards, especially since Nvidia has dedicated units for specific RT/DLSS tasks that aren't used in regular raster - yet they perform similarly.

They can't afford to add dedicated RT and Tensor core equivalents. It would mean more die area and even more power draw, which would add cost to the PCB and power delivery and hurt yields. Until they have a significant architectural redesign, it's gonna be a tough road ahead.
 
They can't afford to add dedicated RT and Tensor core equivalents. It would mean more die area and even more power draw, which would add cost to the PCB and power delivery and hurt yields. Until they have a significant architectural redesign, it's gonna be a tough road ahead.
Yeah, they have almost 26% more transistors (57.7 billion vs 45.9 billion) - actually even more if you don't count Nvidia's dedicated units - and yet... in raster, where AMD should have a clear advantage... they don't.
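For reference, that percentage works out like this (counts as quoted above):

```python
# Transistor counts quoted above, in millions.
xtx = 57_700       # 7900 XTX
rtx_4080 = 45_900  # RTX 4080

print(f"7900 XTX has {xtx / rtx_4080 - 1:.1%} more transistors")  # ~25.7%
```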
 
[attached screenshot]

Radeon sales are poor.

"Released FidelityFX Super Resolution 3.1 offering significant image quality improvements for gamers"...

OK AMD... where is it? An announcement is not the same as a release.
 
I'm not talking about the margins themselves, but what they mean in relation to the price of the cards.

For every 7900 XTX sold, AMD made around $170 (I know it's less, with the AIB taking their part as well, but let's keep it simple), so the cost of the card was around $830.

If for every 4080 Nvidia made $840 (70%), then the cost to make the card was only... $360.

Now sure, the 4080 is only 379mm², while the 7900 XTX with its memory chiplets is 529mm², but still... the MCM design should have kept costs lower, more so considering the cheaper manufacturing processes.

So... how come the 7900 XTX was so expensive to make? The useless extra vRAM? Or is this more of an accounting "magic" to make the best of the company's profits - or to look better in other areas while hanging a lot of R&D costs on the gaming side...

Keep in mind that Sony was at some point also making a small profit on the digital version of the PS5 - still 16GB of RAM plus an APU, plus a fast (and expensive at the time) NVMe drive, plus a PSU, etc.

If these are really the costs, then no wonder AMD didn't bother to compete on price. They would have been DOA, with Nvidia dropping their prices while still making a profit and AMD unable even to recoup the cost of building the card... Something went really wrong there. Or... the numbers are wrong.

PS: it looks even worse when you look at the transistor counts of the two cards, especially since Nvidia has dedicated units for specific RT/DLSS tasks that aren't used in regular raster - yet they perform similarly.

The cost of the card was obviously nowhere near $830.

If AMD make a GPU that costs them $250 and they sell it to an AIB for 17% more, it's sold for $293; the AIB sells it into the supply chain for 30% more, bringing it up to $381; the supply chain sells it to OCUK for 20% more, $457, who in turn sell it for 20% more, $550.

The thing with that is the market dictates the price, because of market share, and it's that same market share that allows Nvidia to dictate their own terms.

So a $250-cost GPU + 70% Nvidia margin is $425; the AIB only takes 15%, $488; the supply chain only takes 5%, $512; OCUK only takes 10%, $564.

Now ask yourself why EVGA pulled out.
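If anyone wants to play with those chains, here's the same arithmetic as a sketch. The $250 starting cost and every percentage are the illustrative guesses above, not real contract terms, and the 70% is applied as a markup on cost, as in this post:

```python
# Sketch of a supply-chain markup ladder. The starting cost and all the
# percentages are illustrative guesses from this post, not real contract terms.

def markup_chain(cost: float, steps: list[tuple[str, float]]) -> float:
    """Apply successive percentage markups and print each stage."""
    price = cost
    for name, markup in steps:
        price *= 1 + markup
        print(f"  {name} +{markup:.0%} -> ${price:.0f}")
    return price

print("AMD route:")
markup_chain(250, [("AMD", 0.17), ("AIB", 0.30),
                   ("supply chain", 0.20), ("retailer", 0.20)])
# ends ~$548 (the post rounds intermediates and gets ~$550)

print("Nvidia route:")
markup_chain(250, [("Nvidia", 0.70), ("AIB", 0.15),
                   ("supply chain", 0.05), ("retailer", 0.10)])
# ends ~$565 (the post rounds intermediates and gets $564)
```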
 
Yeah, they don't hold developers to ransom, I guess. FSR 3.1 and frame generation are coming to PS5 and Xbox soon, which I would assume is a good thing and likely means more new games will have it by default.
Nvidia doesn't either.
The cost of the card was obviously nowhere near $830.

If AMD make a GPU that costs them $250 and they sell it to an AIB for 17% more, it's sold for $293; the AIB sells it into the supply chain for 30% more, bringing it up to $381; the supply chain sells it to OCUK for 20% more, $457, who in turn sell it for 20% more, $550.

The thing with that is the market dictates the price, because of market share, and it's that same market share that allows Nvidia to dictate their own terms.

So a $250-cost GPU + 70% Nvidia margin is $425; the AIB only takes 15%, $488; the supply chain only takes 5%, $512; OCUK only takes 10%, $564.

Now ask yourself why EVGA pulled out.
There's still $450 missing there, almost half.

But in your calculation, someone selling Nvidia is doing better since they're selling more units. And frankly, those percentages for the AIB, supply chain and OCUK are way too high, since they aren't doing the difficult part. So yeah, Nvidia seems "fair" to take more.

Monopoly means nothing, since in a duopoly there's no real competition either (such as it was in the graphics market). One could always go bust; it would make no difference.
 
But but but open source is great!!!!! :p

Normally it should be, but it doesn't mean much in the gaming market, where easy profit is king. AI accelerated on the GPU, physics too (with Bullet), TrueAudio, Mantle (which they've given away), TressFX... none of it means much without implementation. That's the cost of doing business, and AMD fails to get that.
 
Yeah, they don't hold developers to ransom, I guess. FSR 3.1 and frame generation are coming to PS5 and Xbox soon, which I would assume is a good thing and likely means more new games will have it by default.

Didn't we hear this about FSR 1 and then FSR 2 too? Yet hardly any games on Xbox and PS5 have FSR... AMD need to stop with their "over the fence" approach, as it clearly isn't working for them. This isn't just their FSR solution but any of their software features/solutions, where they fail to get consistently good quality and/or uptake.

Normally it should be, but it doesn't mean much in the gaming market, where easy profit is king. AI accelerated on the GPU, physics too (with Bullet), TrueAudio, Mantle (which they've given away), TressFX... none of it means much without implementation. That's the cost of doing business, and AMD fails to get that.

Exactly.

Back in the day it was a valid excuse, as they had an incredibly small development team, and of course when you're last to the market (by a considerable time difference of several months/years) and with an inferior product, you have little choice in how to go about these things. That's why AMD need to get out of Nvidia's shadow and beat them to the punch; until they do, I can't see things ever changing. The whole FSR situation with their "announcements" has been nothing but a knee-jerk reaction to Nvidia's. Some YouTuber called out their FSR 3 reveal in Forspoken, claiming that the footage we saw was most likely faked, given how awful FSR 3 looked in that title at launch... it's things like that which don't do AMD any favours whatsoever.
 
Nvidia doesn't either.

There's still $450 missing there, almost half.

But in your calculation, someone selling Nvidia is doing better since they're selling more units. And frankly, those percentages for the AIB, supply chain and OCUK are way too high, since they aren't doing the difficult part. So yeah, Nvidia seems "fair" to take more.

Monopoly means nothing, since in a duopoly there's no real competition either (such as it was in the graphics market). One could always go bust; it would make no difference.
Well, AMD aren't selling their chips on at only 17% gross profit for a start, so all the calculations were wrong anyway ;) not unless they run the whole of the rest of the business for nothing.
 
FSR 3 has been available open source for a few months now; the only ones stopping it from getting into games are the game devs.

I guess you haven't heard about the FSR 3.1 announcement at GDC 2024, which improves the upscaler to get rid of the fizzle and ghosting. It is apparently a big improvement over FSR 2.2.
 
I guess you haven't heard about the FSR 3.1 announcement at GDC 2024, which improves the upscaler to get rid of the fizzle and ghosting. It is apparently a big improvement over FSR 2.2.
Mate: "reduce fizzle and ghosting", four grainy GIFs, two side-by-side comparisons, one of which is a comparison of two green guns(?) at different angles. I'm still shocked they've bothered improving it tbh - if they have. :p
 
Recent reports of FSR hardly being used in the consoles (again)...

Consoles have a tendency to rely too much on FSR, according to DF's Richard :p

Feel free to list the games on Xbox and PS5 that have FSR as of right now (still waiting since the last time this was brought up...). I can think of like 5 at most.
 