
RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
Unreal. But you were accurately outed as going to buy one brand anyway, which is why the absurdity of the pleas is confusing. :cry:

Outed? What planet are you on?

Last time around I wanted a 6900XT until they were released and the RT was underwhelming and we could only buy inflated AIB cards over here. That’s precisely why I ended up with the 3090FE in the first place.

Sorry, doesn’t that fit your narrative?

I’m all for friendly banter but there are a few people on here that love to twist others’ words to suit themselves, but you’re not ####ing twisting mine.
 
Outed? What planet are you on?

Last time around I wanted a 6900XT until they were released and the RT was underwhelming and we could only buy inflated AIB cards over here. That’s precisely why I ended up with the 3090FE in the first place.

Sorry, doesn’t that fit your narrative?

I’m all for friendly banter but there are a few people on here that love to twist others’ words to suit themselves, but you’re not ####ing twisting mine.

Nothing new there then :cry:





Radeon RX 7950XT, 7900XT, 7800XT and 7700XT confirmed by Enermax?


The upcoming GeForce RTX 4090 and RTX 4080 series are not a surprise here, but the RTX 4070 and RTX 4060 have not been announced by NVIDIA yet. If this data is really based on preliminary information provided by NVIDIA, then we can now guess what power supply the new RTX 40 series will require:

  • GeForce RTX 4090: 853W PSU (GPU TDP 450W)
  • GeForce RTX 4080: 720W PSU (GPU TDP 320-285W)
  • GeForce RTX 4070: 684W PSU (GPU TDP ~285W)
  • GeForce RTX 4060: 598W PSU (GPU TDP ~200W)
The 850W supply for the RTX 4090 is not a surprise; this is the official recommendation from NVIDIA. The RTX 4080 series has two SKUs, with either a 320W or 285W TDP, so we can guess that the other components (Core i9-12900KF, 32GB of RAM, Corsair AIO and one M.2 drive) consume up to 400W. Therefore, it would appear that the RTX 4070 has a TDP of up to 285W and the RTX 4060 might have a TDP of 200W.

Furthermore, there are four Radeon RX 7000 SKUs listed as well: 7950XT, 7900XT, 7800XT and 7700XT. Based on the same method, we can estimate the following TDP/TBP:

  • Radeon RX 7950XT: 822W PSU (GPU TBP ~420W)
  • Radeon RX 7900XT: 730W PSU (GPU TBP ~330W)
  • Radeon RX 7800XT: 700W PSU (GPU TBP ~300W)
  • Radeon RX 7700XT: 598W PSU (GPU TBP ~200W)
It would seem that the flagship Radeon RX 7950XT card might have a TBP of 420W, with the RX 7900XT and 7800XT being 330W and 300W SKUs. The mid-range RX 7700XT has a similar power requirement to the RTX 4060, around 200W.
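The arithmetic behind these estimates can be sketched in a few lines. This is only a rough sketch of the article's method: it assumes a flat rest-of-system budget derived from the RTX 4090's known 450W TDP, and the variable names are mine, not anything from Enermax's data:

```python
# Estimation method: Enermax's PSU recommendation minus a fixed
# "rest of system" budget gives a rough GPU TBP.
# The overhead is derived from the RTX 4090, whose 450W TDP is official.
SYSTEM_OVERHEAD_W = 853 - 450  # 403W for CPU, RAM, AIO, storage, margin

psu_recommendations_w = {
    "Radeon RX 7950XT": 822,
    "Radeon RX 7900XT": 730,
    "Radeon RX 7800XT": 700,
    "Radeon RX 7700XT": 598,
}

for sku, psu_w in psu_recommendations_w.items():
    tbp = psu_w - SYSTEM_OVERHEAD_W
    # Round to the nearest 10W, matching the article's "~" figures
    print(f"{sku}: ~{round(tbp, -1)}W TBP")
```

Running this reproduces the ~420/330/300/200W figures quoted above, which is some evidence the article's numbers were produced the same way.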

If true, I kind of wonder why on earth AMD have been banging on about efficiency, unless their performance is going to be just as good as or better than nvidia's competing cards..... but even then, it doesn't exactly fit what they were saying in their last press release about these cards being all about power efficiency.... Let's hope not, but it's starting to seem like "poor Volta" all over again.
 
Nothing new there then :cry:





Radeon RX 7950XT, 7900XT, 7800XT and 7700XT confirmed by Enermax?




If true, I kind of wonder why on earth AMD have been banging on about efficiency, unless their performance is going to be just as good as or better than nvidia's competing cards..... but even then, it doesn't exactly fit what they were saying in their last press release about these cards being all about power efficiency.... Let's hope not, but it's starting to seem like "poor Volta" all over again.

I mean, this depends on whether you're saying each tier of AMD cards stacks directly against Nvidia's product stack.

There's a good chance that the Radeon RX 7700XT will compete with the currently named 4080, and that's how you would compare it (as we are strongly under the impression the 4080 should have been a 4070-ish type card).
 
There have been rumours that the 7900 and 7800 SKUs will be on different dies, so idk if you still want to hold on to the "it's a 4070" theory.. that's how the market is being segmented, and I think we have to accept that as the new reality.
 
Outed? What planet are you on?...Sorry, doesn’t that fit your narrative?...I’m all for friendly banter but there are a few people on here that love to twist others words to suit themselves, but you’re not ####ing twisting mine.

@Chuk_Chuk posted perfectly what it was about. Seems like you need to calm down. If you can't deal with it, don't post nonsense.
 
An 822W PSU is still a lot lower than nvidia's 850W.
I mean, this depends on whether you're saying each tier of AMD cards stacks directly against Nvidia's product stack.

There's a good chance that the Radeon RX 7700XT will compete with the currently named 4080, and that's how you would compare it (as we are strongly under the impression the 4080 should have been a 4070-ish type card).

Obviously less consumption is better, but given the way AMD have been banging on about how great their efficiency is, I would be expecting better; being in the same "bracket" as nvidia's competing gpus isn't much of an improvement..... especially if they can't match the performance of the 4090 or 4080 16GB. I suspect the 7900xt will be the 4090 competitor and the 7800xt the 4080 16GB competitor.

Maybe in raster their efficiency will be better, as was the case for RDNA 2, but for RT workloads ada, like ampere, will most likely be considerably superior; of course it will depend on what use case people value more.

It will also be interesting to see how well both of these undervolt, as pretty much every ampere gpu could undervolt to reduce power consumption by 100w with little to no effect on performance; rdna 2 also undervolted very well.
 
lol I was just joking about the 822W PSU.. btw did you find one while shopping around, "an 822W PSU"?
I am telling you man, the future looks like a full tower and a 1600W PSU with 2 12VHPWR slots.. it's clear as day, the next generation will land at 600-700W for the flagship,
and people will still be complaining about "fake" DLSS frames.. people have got to be completely blind to not see where the technology is heading.
 
The problem moves down the whole range though, and it's why we now have this absurd situation.

The 6800XT is now $900 because of RT and DLSS; I blame tech journalists for this.

There is a problem down the range only if people buy.
The 6800XT was $649 MSRP.
Those inflated prices are due to AMD's own greed - same for nVIDIA's $1000+ cards. They received the "ok" from the customers when they paid for those.

Just like with Ryzen 7xxx, I have a feeling AMD will be just as greedy as nVIDIA with GPU prices. 10% less or thereabouts compared to the green team is most likely irrelevant for the majority of gamers.

Nothing new there then :cry:





Radeon RX 7950XT, 7900XT, 7800XT and 7700XT confirmed by Enermax?




If true, kind of wonder why on earth AMD have been banging on about efficiency unless their performance is going to be just as good or better than nvidia competing cards..... but even then, doesn't exactly fit what they were saying with their last press release of these cards being all about power efficiency.... Lets hope not but starting to seem like "poor volta" all over again potentially.

They can always present some cards as their best performance-per-watt SKUs - "for those of our dear gamer friends with a focus on efficiency". The other cards will be masked as "enthusiast" products "for those who want the maximum performance". :)
 
@Chuk_Chuk posted perfectly what it was about. Seems like you need to calm down. If you can't deal with it, don't post nonsense.

Hahaha. Calm down? I just don’t like people twisting what I say to try and score cheap points or to fit their personal narrative.

I’m about as calm a person as you’d ever meet so it might be better if you just put me on ignore TBH.
 
There is a problem down the range only if people buy.
The 6800XT was $649 MSRP.
Those inflated prices are due to AMD's own greed - same for nVIDIA's $1000+ cards. They received the "ok" from the customers when they paid for those.

Just like with Ryzen 7xxx, I have a feeling AMD will be just as greedy as nVIDIA with GPU prices. 10% less or thereabouts compared to the green team is most likely irrelevant for the majority of gamers.



They can always present some cards as their best performance-per-watt SKUs - "for those of our dear gamer friends with a focus on efficiency". The other cards will be masked as "enthusiast" products "for those who want the maximum performance". :)
:cry: Yeah good trollololol
 
Hahaha. Calm down? I just don’t like people twisting what I say to try and score cheap points or to fit their personal narrative.

I’m about as calm a person as you’d ever meet so it might be better if you just put me on ignore TBH.

If I upset you I apologise. There's only one guy on my ignore list (seems to be on a few others' too) and it's coming up to twenty years being on here!

Anyway, back to the question - still waiting for your answer though...

You probably don’t agree with this assessment so I will ask you the same question I posed to another poster 2 years ago when the 3000 series came out. What would AMD need to show you that would stop you buying a 4090?
 
Nothing new there then :cry:





Radeon RX 7950XT, 7900XT, 7800XT and 7700XT confirmed by Enermax?




If true, I kind of wonder why on earth AMD have been banging on about efficiency, unless their performance is going to be just as good as or better than nvidia's competing cards..... but even then, it doesn't exactly fit what they were saying in their last press release about these cards being all about power efficiency.... Let's hope not, but it's starting to seem like "poor Volta" all over again.
Eh, what does Enermax's PSU recommendation have to do with the actual power use of a board? Most of the time vendor recommendations are way overblown - just check the current 6000 series from AMD and 3000 series from NVIDIA and compare with the real use of both as measured by various testers. What the reality will be, we'll see when the GPUs arrive and are tested properly.
 
Last time around I wanted a 6900XT until they were released and the RT was underwhelming and we could only buy inflated AIB cards over here. That’s precisely why I ended up with the 3090FE in the first place.

Sorry, doesn’t that fit your narrative?
Awkward but true nonetheless; that was the situation in the UK market at least.

So what performance is on the table for the RX 7950XT: is it a 4090 killer, a market disruptor, or is it slotting in under it like a double-glazing salesman with no input on engineering, just trying to shift product?

I really fear it's the latter. Hope I'm wrong - just a few days left to find out before the narrative shifts...
 
I'm willing to go up to 1k, but unless I get something within 20% of a 4090 in raster for that, I'm not getting my wallet out. I'd also be happy with something similar to the 4070 16GB, but only for £600 max.
 
Awkward but true nonetheless; that was the situation in the UK market at least.

So what performance is on the table for the RX 7950XT: is it a 4090 killer, a market disruptor, or is it slotting in under it like a double-glazing salesman with no input on engineering, just trying to shift product?

I really fear it's the latter. Hope I'm wrong - just a few days left to find out before the narrative shifts...

For me it doesn't need to be a 4090 killer - if they have something that slots in between the 4080 and the 4090 for less, I'd be very interested.
 
Eh, what does Enermax's PSU recommendation have to do with the actual power use of a board? Most of the time vendor recommendations are way overblown - just check the current 6000 series from AMD and 3000 series from NVIDIA and compare with the real use of both as measured by various testers. What the reality will be, we'll see when the GPUs arrive and are tested properly.

Of course it will be wait and see; as they said, it's an estimate based on nvidia's power consumption figures/recommendations. But still, even if they're putting them in the same bracket for power consumption, that doesn't scream being leagues ahead on efficiency (which is what amd are shouting from the rooftops about) unless their performance is better than nvidia's competing products... Not long to go anyway.
 
If I upset you I apologise. There's only one guy on my ignore list (seems to be on a few others' too) and it's coming up to twenty years being on here!

Anyway, back to the question - still waiting for your answer though...

Thanks but there’s no need to apologise and I’m not upset at all. It’s just not very nice when someone states either something I haven’t said or meant, and then twists it into something else.

Personally, if AMD could match the top tier product for less, even with lower RT performance, I might be interested. I’ll admit it’s doubtful that RT is going to catch up with the 4090 anyway as Nvidia well and truly have a generational headstart and clearly the current gen weren’t competitive in RT, although raster performance is very good and sometimes better than the current top Nvidia products.

It all boils down to price and my understanding is that AMD’s production costs for the new cards are significantly lower than Nvidia’s. If that is the case then they really should be kicking Nvidia in the shins and undercutting them, it’s the only way they are likely to grab any of that market share back.

Of course, if AMD do release some great products but we’re limited to AIB cards again, and above MSRP, it would be yet another opportunity missed. They didn't appear to consider the UK a big enough market to worry about last time around though, here’s hoping that’s different this time.

Ultimately, Nvidia have priced the market to such a high price point, presumably to sell off all of the existing inventory as “bargains”, that there really is a wide open goal just sitting there. Whether or not AMD put it in the back of the net is unfortunately a different story. If they do then Nvidia would have egg on their face and would likely have to slash prices, IMO, or else face ridicule from the media and public, not something their shareholders would appreciate either way.

I could be a million miles away but that’s my interpretation of the way things look right now. My wife does say I’m usually wrong though so what do I know!
 
Do many make the choice based on their current list of favourite games? Cyberpunk is one of my favourite games, which I anticipate still playing over the next couple of years. I want to have the best possible experience in it with all the bells and whistles turned on for my next and future playthroughs, and a 4090 is going to do that. For games I am waiting for, mostly Stalker 2, I know the card will perform well enough and there is a strong chance the game will have either RT, DLSS or both.

Maybe others buy based on a broader performance metric, but I assume a 4090 will perform very well overall anyway.

If AMD get something out before the 12th that shows some interesting performance and tech for my most played games over the next couple of years, I'd probably hang on. After the 12th is too late though.
 
Thanks but there’s no need to apologise and I’m not upset at all. It’s just not very nice when someone states either something I haven’t said or meant, and then twists it into something else.

Personally, if AMD could match the top tier product for less, even with lower RT performance, I might be interested. I’ll admit it’s doubtful that RT is going to catch up with the 4090 anyway as Nvidia well and truly have a generational headstart and clearly the current gen weren’t competitive in RT, although raster performance is very good and sometimes better than the current top Nvidia products.

It all boils down to price and my understanding is that AMD’s production costs for the new cards are significantly lower than Nvidia’s. If that is the case then they really should be kicking Nvidia in the shins and undercutting them, it’s the only way they are likely to grab any of that market share back.

Of course, if AMD do release some great products but we’re limited to AIB cards again, and above MSRP, it would be yet another opportunity missed. They didn't appear to consider the UK a big enough market to worry about last time around though, here’s hoping that’s different this time.

Ultimately, Nvidia have priced the market to such a high price point, presumably to sell off all of the existing inventory as “bargains”, that there really is a wide open goal just sitting there. Whether or not AMD put it in the back of the net is unfortunately a different story. If they do then Nvidia would have egg on their face and would likely have to slash prices, IMO, or else face ridicule from the media and public, not something their shareholders would appreciate either way.

I could be a million miles away but that’s my interpretation of the way things look right now. My wife does say I’m usually wrong though so what do I know!

Well said @HRL especially the first sentence ;)

I think DLSS 3/frame generation will also throw a spanner in the works for amd, just as having no dlss competitor for nearly 2 years did (only counting dlss >2 and fsr 2, as v1 was ****). Believe it or not, people don't want to be waiting months, let alone years, for tech like that, and some will be quite happy to pay a small premium if it means getting it "now". FSR 2.1 has nearly caught up with dlss now, having tested it for myself in spiderman over the weekend, but it still doesn't beat dlss on temporal stability, shimmering and "overall" reconstruction, especially when looking at the lower presets i.e. balanced and performance modes.

People will say "fake frames" when it comes to frame generation/dlss 3, but at the end of the day, how do you measure performance? FPS and frame latency? Which is what DLSS 3/fg improves.... The only con of dlss 3/fg in "normal" gameplay will possibly be latency, but that will come down entirely to the type of game and k+m vs controller use. In terms of artifacts, there don't look to be any noticeable issues outside of slowing the footage down and capturing the "fake frame". If what DF/nvidia have said is right and nvidia have been working on this for the past 6-7 years, then given amd's lack of focus on ML/AI so far, how long are we going to be waiting for them to deliver this feature?

Do many make the choice based on their current list of favourite games? Cyberpunk is one of my favourite games, which I anticipate still playing over the next couple of years. I want to have the best possible experience in it with all the bells and whistles turned on for my next and future playthroughs, and a 4090 is going to do that. For games I am waiting for, mostly Stalker 2, I know the card will perform well enough and there is a strong chance the game will have either RT, DLSS or both.

Maybe others buy based on a broader performance metric, but I assume a 4090 will perform very well overall anyway.

If AMD get something out before the 12th that shows some interesting performance and tech for my most played games over the next couple of years, I'd probably hang on. After the 12th is too late though.

I do. Another area where amd are very poor now - they used to be great back in the day. As said a few times, I've played more RT games than raster-only games over the past 2 years, and pretty much every game (outside of amd sponsored titles) has had dlss too; meanwhile, fsr 2 support, even in amd's own sponsored games, is sketchy at best, let alone fsr 2.1.... whereas nvidia have already announced 35 apps/games getting dlss 3. It's things like that which don't help amd claw back market share when they're already lagging behind on features, imo.

Ultimately they need to become bang-per-buck kings again; if you can't compete on performance and/or the quantity and quality of features, then you have to price appropriately imo.
 