
RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
Hence the key word here:



Do you not find it a bit odd that the only thing we know of RDNA 3 ray tracing so far is AMD's comment that "it will be more advanced than RDNA 2"? If they were going to significantly improve it, why not say something like "it will be better than Ampere", the same way Intel have? Do you not agree that's a poor choice of words and doesn't show they have much confidence in it?

Then go based on history:

- How long did AMD take to catch up on DX11?
- How long did they take to catch up on raster?
- How long did they take to catch up on OpenGL? And they still aren't there yet.

Of course, they could come out and completely smash 40xx RT, and I hope so, as right now, if you care about ray tracing, we only have the one choice.

Looking historically at AMD tells us very little though, especially at a company that was circling the drain and had no compelling products.

Looking at their recent history paints a more telling picture: going from nowhere near, to competitive at the high end on raster. They've also made huge gains across their software stack - from FSR not existing 16 months ago to the latest 2.1 release, which is really closing the gap.

Not sure if the statement on OpenGL is true or not (I literally can't find benchmarks), but TechPowerUp says the new driver reaches 3080 levels on the 6800 XT in the Unigine Valley benchmark.

Their AMF encoder has gone from neglected to competitive, but they still need to work on wider software support and some improved hardware encoders next generation for sure.

Whilst they still have plenty to do, if they continue this momentum I might be interested next gen - or at least I won't default to recommending Nvidia when people ask what they should buy :D
 
 
I understand your reasoning - in the absence of any other information, of course most people will assume the situation is the same. But let's see what they talk about when they do start talking about RDNA 3 more - if they downplay RT then I think assumptions of them not being market leaders could be correct, but I don't think their strategy of not hyping RDNA 3 just yet says anything specific about RT.

As mentioned there is the possibility they could come out and smash 40xx, not long to find out.


No doubting they have caught up, but do you think they did that overnight? They would have been working on those software/hardware improvements for a long time. RDNA 2, like Turing, was basically a tickbox for RT "support".

That's why I mentioned OpenGL along with the other points, i.e. they have been lagging behind for so long and have only just now caught up:


Reddit is a good source for discussion on this:



Their encoder/streaming capabilities are not a patch on Nvidia's NVENC/ShadowPlay either, hence why so many streamers/YouTubers use Nvidia. Personally not something I care about though.

Lots of AMD fans like to use the phrase "fine wine", but is it really that when the competition have had the head start/lead for years? You know for a fact that if Nvidia were lagging behind in those areas, we would never hear the end of how Nvidia are awful/evil for not fixing/improving them. But because it's AMD - aka the underdog/charity, white knight for the PC gaming community, or however the AMD fans view them these days - it's OK :p :D


Can you "debunk" those statements? I'm going to assume not :cry:
 
The trouble with AMD is they try to price match Nvidia while having fewer features and previous-gen RT. Had AMD dropped a 6800 XT for 500 quid, I'd have definitely picked that over a 3070, but at £600 it's just too close to a 3080. Hopefully with RDNA 3 it won't be the usual Nvidia prices minus 50 quid.
 
Ampere was brute-force Turing; it's nothing special. That's been Nvidia's approach: if in doubt, shove volts through it.

Nothing special and just brute forcing... If that's all it is, well then it's not a bad uplift for "all they did" ;) :p


The RTX 3090 sets itself apart from the RTX 2080 Ti by an impressive 119 to 171 percent.

Ampere also produces the unfamiliar picture of the RTX 3060 Ti ranking 19 to 61 percent ahead of the RTX 2080 Ti. Regardless of the fact that both models are hopelessly overwhelmed with RT Ultra or Overkill, the improvements within the Ampere GPUs are clearly evident here. What exactly delivers the significant increase in performance is not apparent from the outside. The fact is that Ampere has improved RT cores and can theoretically handle the incoming shading with doubled FP32 performance. Apparently this scenario is the moment of glory for the Ampere architecture. As a reminder: according to Nvidia, the triangle intersection rate (detection of polygons in space) of the 2nd-gen RT cores has been doubled compared to Turing.

The more complex the ray tracing in space, the better Ampere performs.

If you compare the Turing TU102 as 1st-gen RTX with the Ampere GA102, Ampere is up to a factor of 2.7 faster. If you only use (halfway) smoothly displayed settings, in this case "RT High", it is still a factor of 2.2, and thus significantly more than in all current ray-tracing games. It's definitely down to the scene.
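For what it's worth, the percent figures in that excerpt line up with the "factor of 2.2 to 2.7" claim at the end, assuming "X percent ahead" means X% faster, i.e. a speedup factor of 1 + X/100 (a quick sketch of the arithmetic, not from the article):

```python
# Convert "percent ahead" figures into speedup factors, assuming
# "X percent ahead" means new_fps / old_fps = 1 + X/100.
def percent_ahead_to_factor(percent: float) -> float:
    """A card '119 percent ahead' of another is 2.19x as fast."""
    return 1.0 + percent / 100.0

# The 3090-vs-2080 Ti range quoted above:
for pct in (119, 171):
    print(f"{pct}% ahead -> {percent_ahead_to_factor(pct):.2f}x")
# 119% ahead -> 2.19x
# 171% ahead -> 2.71x
```

So "119 to 171 percent ahead" is the same claim as "a factor of roughly 2.2 to 2.7", just stated differently.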
 

That's exactly what it is... on the one hand he says "oh we don't know yet", while at the same time going to great lengths to make sure you know exactly what he thinks. He's trying to bait people into steelmanning an argument so he can knock it down later and parade around like Mussolini with his chin out, decreeing "I told you so, everyone but me is over-hyping".
 
*Rumour*

AMD's strategy is to go super efficient: reference models will be less than 375 watts and more than 300 watts. >350 watts?

They will allow board partners to clock them to the moon and, in that way, let them deal with the heat and potential warranty problems; as far as reference cards go, they want them to be inexpensive and not hot.
The reasoning is that in a recession, with a market about to be flooded with used cards, they don't want to be pushing very high-cost, high-power-consumption cards. They also think top-end Nvidia cards will be super expensive with very high power consumption, as Nvidia's attitude is win, win, win at any cost, and AMD are quite happy to sit back and watch Nvidia "win" with 4-slot mega-money cards. AMD figure that will help them shake off the "hot cards" mind virus of past generations and offer cards with a sensible cost-to-performance ratio.

In other words, AMD want to look like the grown-ups while letting AIBs involve themselves in a performance war with Nvidia.

IMO that's very smart.

 

I like the idea, but this can backfire if they don't release AIB cards simultaneously with the reference models - not the one-week staggered launch like last time - so they are reviewed on day 1.
Let's assume the overclocking gives them a good boost to performance; then it will only make them look good on release day.
 
Ah, I see the usual has happened: point out the not-so-good bits/issues of AMD and, instead of discussing the post or trying to address said points, people resort to the good old one-liners/calling you a fanboy :cry: :D

Take note from the chaps earlier in the thread on how to address said post:



I see the resident Nvidia fanboys are already busy talking themselves into their RTX 4090 purchase. :)

Only takes a special kind of mug to commit to those GPUs, but it's OK as long as you mine as much as you can to make your money back; meanwhile Jensen is laughing all the way to the bank ;) :D

Selectivity at its finest:

It's only 10% (when it suits).

Seems your reading skills aren't too good either; you do realise that is comparing to Turing, and only in a ray tracing benchmark :cry: Pass me the popcorn @TNA :D
 
The more Nvidia charges for 4000, the more they encourage people to wait for RDNA3.

Doubt it. Far too many have the "if AMD competed then I could buy a GeForce cheaper" mentality. If more people wanted to body-blow Nvidia where it hurts, then should AMD offer equivalent performance, you need to vote with your wallet.
 
This is where I worry about the fine line between market pricing and anti-competitive pricing. Nvidia have such a market preference advantage that if AMD offered cards at a big discount to Nvidia, Nvidia could just drop their pricing to suit and AMD still wouldn't sell any more cards, and both parties would just be poorer which Nvidia can probably take better than AMD. If you're only going to get 20% of the market then you might as well get the most return you can for it.
 
Regarding the point that both parties would just be poorer, this was true during the GCN days but not so much now, IMO.

If memory serves me right, in those days AMD's dies were bigger than the equivalent Nvidia die, and the GPU section was what was keeping AMD afloat, due to their non-existent CPU division.

Now the current prediction is that, with the multi-chip design, AMD's design is more cost efficient than Nvidia's. Couple that with the fact that the CPU division is now making AMD money hand over fist, while Nvidia is mainly reliant on GPUs, and in a race to the bottom AMD is positioned much better than Nvidia.

With AMD going second on release, they have a number of pricing strategies available to them, but one option is to play their advantage and squeeze Nvidia a little. It is a double-edged sword, though: the idea would be to price in such a way as to force Nvidia to lower their prices, which means lowering their margins, which has the knock-on effect of reducing Nvidia's cash flow. Reduced cash flow usually leads to scaling back investment.
 
Roughly agree with this sentiment, but AMD couldn't keep up with demand at their pricing, so it would have made no sense for them to reduce prices.

Note that once demand fell (dramatically), AMD dropped their prices straight away. Interestingly, their prices have fallen further since; IMO it implies they're clearing limited stock out before the next-gen release.
 

It always comes down to demand and what people are willing to pay, if stock isn't moving, then prices will keep coming down until they start selling in a high volume again.
 