*** The AMD RDNA 4 Rumour Mill ***

GSync would probably be the closest thing to vendor lock-in we've had in a while, but even that's probably not so much of an issue these days.

Great point, that was an Apple-esque masterstroke by NV! Happy to see that fail ;)

We should also not forget PhysX as a vendor lock-in.

Was PhysX ever required by any games? Or would you just see fewer fancy effects if you didn't have an NV card or PPU?

AMD brought decent frame generation and upscaling to the masses on older-gen GPUs. Intel managed to bring arguably better upscaling (than FSR) to older-gen GPUs as well with XeSS. And the first iteration of FSR was far superior to the original early DLSS, which was terrible.

100%. Very under-appreciated fact.

Why don’t AMD just take the hit to their profits and sell the 9070 XT for $450? That would be disruptive and gain them market share.

Supply constraint.

They have a limited supply of silicon wafers from TSMC, competing against Nvidia, Intel, Apple, Qualcomm etc. for enough capacity to cover CPUs and GPUs across server and consumer products. They're not going to sacrifice profits to gain share in a market where they're weak when they could make a killing with the same silicon in other products.
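To put rough numbers on that trade-off, here's a back-of-envelope sketch. Every figure below is invented purely for illustration; real wafer prices, yields and margins aren't public.

# Back-of-envelope wafer opportunity cost. Every number here is
# hypothetical -- invented for illustration, not real pricing or yields.

# Option A: spend a wafer on mid-range consumer GPU dies
gpu_dies   = 150    # hypothetical good dies per wafer (large die)
gpu_margin = 120    # hypothetical profit per GPU die, USD

# Option B: spend the same wafer on server CPU chiplets
ccd_dies   = 750    # hypothetical good chiplets per wafer (small die)
ccd_margin = 140    # hypothetical profit per chiplet, USD

gpu_profit_per_wafer = gpu_dies * gpu_margin    # 18,000 USD
ccd_profit_per_wafer = ccd_dies * ccd_margin    # 105,000 USD

print(f"GPU:    {gpu_profit_per_wafer:>9,} USD per wafer")
print(f"Server: {ccd_profit_per_wafer:>9,} USD per wafer")
# Cutting the GPU price by another $50-100 per die only widens the
# gap, which is why a $450 9070 XT is a hard sell internally.

Even with made-up numbers, the shape of the argument holds: when wafers are the scarce input, every consumer GPU die has to justify itself against the server part it displaced.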
 
I thought Azor spoke really well. I like the quote along the lines of "this isn't some big flashy tech showpiece, it's just a graphics card for gamers". They seem to be positioning themselves well alongside Nvidia.
Agreed. This is the kind of tone they need for their GPU showcase. Now that they've checked the shareholder AI boxes with their CES presentation, talk directly to us. Give us raster performance, without the obfuscation! And a good price!
 
Was PhysX ever required by any games? Or would you just see fewer fancy effects if you didn't have an NV card or PPU?
It was useless IMO. It looked good in some games, but in multiplayer it was just visual effects you could see, with no impact on gameplay whatsoever.

Half-Life 2 did it right with Havok: the physics was built into the game and was actually useful.
 
The 580 was among the top cards on the Steam hardware survey, so it was not a failure by any means.

The RX 580 was also, what, the third refresh of 290X performance? It did well, considering.

If they release something actually fast this time (I mean solid 1440p performance, not top-tier fast) at a genuinely good price, it might do them some favours, especially if FSR4 is on point.
 
Agreed. This is the kind of tone they need for their GPU showcase. Now that they've checked the shareholder AI boxes with their CES presentation, talk directly to us. Give us raster performance, without the obfuscation! And a good price!
I liked his slant on RT as well. He clearly called out the RTX 2000s and said there was no value in those cards, but now there is value in competitive RT performance.

It was disappointing to not see much in their keynote, but it's starting to make a lot of sense now. They wanted to make this release as good as they possibly could. They know they're behind Nvidia and need to react to their movements. If they can do what they seem to be doing, and provide good gaming performance (without the AI circus that gamers don't care about), with an improved upscaler at a competitive price, then they will do well. They won't even need to undercut Nvidia by 50% or whatever it is that some seem to be demanding.
 
What Frank was saying is exactly the approach AMD should be taking: concentrate on the type of cards mainstream gamers want to buy, i.e. decent performance that's affordable for the masses.
 
I liked his slant on RT as well. He clearly called out the RTX 2000s and said there was no value in those cards, but now there is value in competitive RT performance.

It was disappointing to not see much in their keynote, but it's starting to make a lot of sense now. They wanted to make this release as good as they possibly could. They know they're behind Nvidia and need to react to their movements. If they can do what they seem to be doing, and provide good gaming performance (without the AI circus that gamers don't care about), with an improved upscaler at a competitive price, then they will do well. They won't even need to undercut Nvidia by 50% or whatever it is that some seem to be demanding.
Agreed. It really is like a game of chess that they're playing. They have a strategy, but they also need to react to their (much more powerful) opponent's moves.
 
$200 isn’t enough to compensate for a lack of DLSS and CUDA. Not surprised that sales tanked.

FSR4 looks like it will compete well with DLSS, and AMD's frame gen is good enough for most people. I would definitely take a 9070 over a 5070 if the price was lower and it offered better raw performance. Not bothered about CUDA since I don't use the card for rendering or work, but if needed there's ZLUDA or ROCm.
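As a quick illustration of why the CUDA gap matters less than it used to, here's a minimal sketch assuming a ROCm build of PyTorch on a supported Radeon card:

import torch

# On a ROCm build of PyTorch, the familiar torch.cuda API is backed
# by HIP, so scripts written against CUDA run on a supported Radeon
# card without modification.
device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(1024, 1024, device=device)
y = x @ x    # matrix multiply runs on the GPU via HIP/ROCm
print(device, y.shape)

ZLUDA takes the other route, translating compiled CUDA calls to run on top of ROCm, though it's far more experimental than the ROCm PyTorch path.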
 
It should also be noted that being $200 cheaper than a 4080 was hardly a big deal. Both were vastly overpriced, and that is reflected in the price cuts both later received.
The point I was making is that the "$50 cheaper" tagline is disingenuous.
 
The point I was making is that the "$50 cheaper" tagline is disingenuous.

Well, the truth is it depends. I could point at the 7900 XT's ridiculous price being $100 more expensive than the 4070 Ti.

My point is that neither AMD nor Nvidia had realistic pricing for the current gen cards on release.
 
Well, the truth is it depends. I could point at the 7900 XT's ridiculous price being $100 more expensive than the 4070 Ti.

My point is that neither AMD nor Nvidia had realistic pricing for the current gen cards on release.
I mean I'm not sure it's up for debate whether 200 is more than 50 :cry:

People seem to have wildly different price expectations. For myself, if the 9070xt (for example) offers the same as the 5070ti and costs a fiver less, I'm going with the 9070xt.

The 7000 vs 4000 was far more nuanced, you're right. If you didn't use RT or upscaling, then the cheaper 7000 was better. If you did then it was up to the individual how much those features were worth - apparently more than $200 or $100 or $50 :cry:
 