RDNA 3 rumours Q3/4 2022

Status
Not open for further replies.
That's BS. If you actually believe that, you're either very young, very ignorant or simply pretending to be to suit a narrative. AMD (and further back ATI) have had the superior product a bunch of times in the past. And you know what happened? People bought inferior Nvidia alternatives anyway for the most part. Consumers created Nvidia as it exists today by blindly sticking with them even when there was a better alternative. That's why I can't really feel any sympathy for Nvidia fanboys moaning about how they're being milked now. You chose to stick the pumps on your chest.

RE Bolded: Right, this is a problem of their own making; they created this situation themselves and still have the power to do something about it, but won't.
 
Last edited:
Anything AAA with RT will run great on AMD, as it's optimized for consoles, which use AMD hardware. The examples are already there with the Resident Evil games.

FSR adoption has been great and much faster than DLSS over the same timeframe. There's really no reason to avoid AMD unless you're a fanboy or RDNA3 greatly underperforms.
The only reason those games run decently on AMD hardware is the limited use of RT (usually just shadows or reflections).

The ones with full RT effects run like garbage on AMD. I know this forum is mostly an AMD bubble, but c'mon..

And while FSR is great on its own, DLSS is better, so given the choice between the two, as long as you don't particularly care about the price difference, Nvidia is the superior product, period.
 
That's BS. If you actually believe that, you're either very young, very ignorant or simply pretending to be to suit a narrative. AMD (and further back ATI) have had the superior product a bunch of times in the past. And you know what happened? People bought inferior Nvidia alternatives anyway for the most part. Consumers created Nvidia as it exists today by blindly sticking with them even when there was a better alternative. That's why I can't really feel any sympathy for Nvidia fanboys moaning about how they're being milked now. You chose to stick the pumps on your chest.

Had to quote this as it's very apt. It's been said on the forums a few times but gets drowned out by all the noise. As you say, let's see if the people complaining this time do anything other than buying the same brand time and time again. :)
 
Why soon? People are buying cards to last them two years at minimum. Also, people might want to play already-released games maxed out with RT and DLSS, of which there are many.

Plus there will be even more UE5 games coming out, even indie ones, with lots of RT effects and DLSS.
Why in the actual hell would you drop £2K now to eventually play a big AAA game that is full RT in 2 years time.
 
That's BS. If you actually believe that, you're either very young, very ignorant or simply pretending to be to suit a narrative. AMD (and further back ATI) have had the superior product a bunch of times in the past. And you know what happened? People bought inferior Nvidia alternatives anyway for the most part. Consumers created Nvidia as it exists today by blindly sticking with them even when there was a better alternative. That's why I can't really feel any sympathy for Nvidia fanboys moaning about how they're being milked now. You chose to stick the pumps on your chest.
Care to give some examples, mate? Because that's a myth that needs to die.
 
Why in the actual hell would you drop £2K now to eventually play a big AAA game that is full RT in 2 years time.
I bet GPU cycles will get longer and longer, so the 4090 could last 3+ years before it's replaced, and with the way inflation is going, I bet £2k will seem reasonable.
 
Some people got locked into a brand depending on their monitor though.
After sporting a 6800 XT for months with a Dell QD-OLED monitor (G-Sync module), I can say that even though they work together, it's not perfect. G-Sync caused quite a few issues with this AMD card (black screens, the monitor crashed a few times and only fully power cycling it revived it, image flickering, etc.). 3080 Ti: zero issues with the same monitor. Same AMD card with a different monitor: also zero issues, works perfectly.
In other words, it would seem that, in a very unplanned way, I'm now locked into Nvidia too (I love this monitor way too much to change).
 
And while FSR is great on its own, DLSS is better, so given the choice between the two, as long as you don't particularly care about the price difference, Nvidia is the superior product, period.
I've heard that so many times, then switched from a 6800 XT to a 3080 Ti (because of the G-Sync module issues) and... I am very disappointed with DLSS. For example, in Deathloop DLSS produces slightly better-looking thin lines, but it's full of artefacts in a bunch of places, e.g. railings produce very weird haloing/ghosting, the mesh on windows is just one big blurry mess of ghosting, etc. NONE of these issues show up with FSR 2 in the same game, though thin lines look worse in the distance (which seems to be what DLSS tests focus on, as if nothing else mattered). FPS: almost identical. I tried replacing the DLSS library with the newest one; it got better but still isn't ideal (and the game's devs should release a patch with it, rather than users being forced to swap the DLSS lib manually).
MSFS with the newest DLSS lib (it comes with it): flying over water, it produces really annoying artefacts too (patches of it look more like denim fabric than water), whereas TAA looks perfect in the same areas.

Hence, even though I use a 3080 Ti, I prefer FSR in Deathloop and TAA upscaling in MSFS. There are a few other titles where DLSS just looks bad compared to native 1440p UW (on the Quality setting, to be clear!). TAA sometimes looks worse (though adding sharpening often helps), but at least it doesn't produce the artefacts that DLSS does. I really don't understand how people can use DLSS and not see these glaring issues/artefacts compared to standard resolution; claims of "better than native" are even sillier (again, thin distant lines sure, everything else NO!).
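For anyone wanting to try it, the manual swap mentioned above usually just means replacing the game's nvngx_dlss.dll with a newer copy. A minimal sketch of doing that safely (the paths and filenames here are assumptions; locate the real ones for your install, and keep the backup so the swap can be undone):

```python
import shutil
from pathlib import Path

def swap_dlss(game_dir: str, new_dll: str) -> Path:
    """Back up the game's nvngx_dlss.dll and copy a newer build over it.

    game_dir and new_dll are placeholders for your actual install paths.
    Returns the backup path so the original can be restored later.
    """
    old = Path(game_dir) / "nvngx_dlss.dll"
    backup = old.with_suffix(".dll.bak")
    shutil.copy2(old, backup)      # preserve the original so the swap is reversible
    shutil.copy2(new_dll, old)     # drop the newer DLSS library into place
    return backup
```

Checking the swapped file's version (right-click → Properties → Details on Windows) before launching the game helps avoid mixing incompatible DLSS major versions.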
 
Why in the actual hell would you drop £2K now to eventually play a big AAA game that is full RT in 2 years time.

Agree. I mean, if you bought Ampere you're better off at the very least waiting to see what the Super/Ti/special editions might offer in 6+ months. I can understand it if folk have an ageing 970, for example, or recently upgraded their display, but the SKUs of current Ada below the 4080 don't seem to offer anything for their soon-to-be price. Some were anticipating the 4070 to beat the 3090...

I heard that so many times, then switched from 6800XT to 3080Ti (because of G-Sync module issues) and... I am very disappointed with DLSS.

;)
 
AMD have had better offerings in the past and people still bought the inferior Nvidia product at the time. AMD could have a card 100% more powerful and still sell less; all the goalpost-shifting excuses would be hilarious.
I'd disagree, at least for the last few generations. AMD seem content with matching Nvidia on raster while being 50 quid cheaper, but this isn't enough.

This gen, with the pricing set by Nvidia and the animosity swirling around, I think AMD have a great chance, but if they follow with silly pricing of their own then they don't deserve to win people over.
What AMD should do is release a 4080 12GB equivalent for £500 and a 4080 16GB equivalent for £700.
IF they do this then I'll buy a 7800 XT, but if it's like £1k then they can forget it and I won't buy from either company.
 
This needs to happen even if I don't buy this new gen and sit it out. The only way to remind these companies that the "prices only go up" attitude costs them is lost business. Hopefully the recession will put many people in check and they won't buy if they don't need to. That will push prices lower when volumes are low and there's no market of scalpers/miners to prop them up.
 
Agree. I mean, if you bought Ampere you're better off at the very least waiting to see what the Super/Ti/special editions might offer in 6+ months. I can understand it if folk have an ageing 970, for example, or recently upgraded their display, but the SKUs of current Ada below the 4080 don't seem to offer anything for their soon-to-be price. Some were anticipating the 4070 to beat the 3090...

;)
*Gently pats my 970 hooked up to a 4K work monitor* It's okay buddy, I know you're trying your hardest...
 
Well it does, but the issue is Nvidia decided to call it the 4080 16GB and charge 1300 quid.

I want to see what the adjusted 4070 does when it's released, though, as it's going to be a hard sell since we rumbled their shenanigans. With all the bus changes and SKU switching they can call it what they like; only the laymen are going to miss that trickery!
 
I want to see what the adjusted 4070 does when it's released, though, as it's going to be a hard sell since we rumbled their shenanigans. With all the bus changes and SKU switching they can call it what they like; only the laymen are going to miss that trickery!
Going to be interesting to see how many CUDA cores and how much memory bandwidth it gets...
 
What news about the new AMD cards are we expecting, and will we get news/benchmarks etc. before October 12th, when the 4090 goes on sale?
We are expecting "more details" before Nov 3rd, whatever that might mean, and that's it! Basically when Scott Herkelman comes back from vacation and wants to sell some AMD Radeon products instead of saving for his retirement.
 
Well, if the top RDNA3 card trumps the 4090 at raster, potentially quite a few customers, I would have thought.

Personally, I don't know anyone who exclusively buys one brand over the other; we all just want the best performance, or the best performance per buck.
Mate, you own a 3090 and are potentially looking to buy a 4090 on day 1 because of want, not need (not a criticism). You can't even wait a few weeks to see what AMD is offering. No offence, but you're not a customer to them. Nvidia already has its hooks in you.

You probably don't agree with this assessment, so I will ask you the same question I posed to another poster two years ago when the 3000 series came out: what would AMD need to show you to stop you buying a 4090?
 
I want to see what the adjusted 4070 does when it's released, though, as it's going to be a hard sell since we rumbled their shenanigans. With all the bus changes and SKU switching they can call it what they like; only the laymen are going to miss that trickery!
I'm thinking it'll have 3080 performance, 10GB of VRAM on a 160-bit bus, and cost around 700 quid.
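For context on what a narrower bus would mean: peak memory bandwidth is just the bus width (in bytes) times the memory's per-pin data rate. A quick sketch, assuming a hypothetical 160-bit card kept the 21 Gbps GDDR6X used on some Ada boards:

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 3080: 320-bit bus with 19 Gbps GDDR6X -> 760 GB/s
print(mem_bandwidth_gbs(320, 19))
# hypothetical 160-bit card with 21 Gbps GDDR6X (assumed) -> 420 GB/s
print(mem_bandwidth_gbs(160, 21))
```

So even with faster memory, a 160-bit card would have barely over half the 3080's raw bandwidth, which is why the bus width matters for the "3080 performance" claim.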
 