
AMD RDNA3 unveiling event

User error which was largely caused by poor design
Poor design can be inherent to DIY PC builds in general, so you should be careful about everything. AM4 CPUs have a massive design flaw due to the way the socket is built. If you want to remove a CPU on Intel, you just need to unscrew the cooler and it will come off with little force. On AMD, you have to run Prime95 for 10 minutes to warm the TIM, unscrew the cooler, twist it sideways to dislodge it, and lift it gently just to make sure the CPU doesn't come out along with the cooler. If you cannot turn on the PC in the first place, you are SOL unless you use a blowdryer. So many new builders have ended up breaking their CPUs by not following this method. This has now been fixed with AM5.

Not saying the 12VHPWR is not a design flaw, but when plugging in a cable it's common sense to push it in all the way until you hear a click, otherwise it's not connected. There have been no cases of melting since this was pointed out.
 
But but but, whatabout Nvidia power connectors which fail in 1% of cases due to user error, so it's OK for AMD to have an inherent design fault in the vapour chamber
Let's not blame it all on user error when the adapters were made to a **** standard. If it looks like it's in but didn't click, you might assume it's fine. My adapter that came with the FE made an audible click and slipped in easily; not all were like that, so dismissing it as user error lets the corpo off the hook and just makes it sound like the customers are idiots.
 
He has not enabled RT in all 50 games tested. Cyberpunk, Watch Dogs Legion and Dying Light were shown with RT disabled.

He also included massive outliers like the MW2 data points twice to pad the AMD score, and the major RT benchmark in Fortnite was flawed because he used software ray tracing instead of hardware ray tracing, which is much superior in quality. The RTX cards are largely unaffected by this option while AMD suffers a hit. He posted a tweet and deleted it when this was pointed out.

Personally I don't think any of the cards below the 4090 are worth it if you are on a 3000 or 6000 series card. None of those cards ever had any trouble playing rasterised games maxed out even at 4K, so the only reason to upgrade would be RT, which is most compelling on the 4090.

Good spot(s) :) Quite a few other titles where RT was left off in the final summary too...
Funny thing is, according to their poll most people would rather pay similar money and get a worse package with the 7900xtx :o
 
It isn't really Nvidia's design. Also, it sounds like it is very rare; >99% of people know how to plug stuff in properly.
Never said it was Nvidia's design, just poor, as these sorts of things need to account for that 1% unfortunately.

Poor design can be inherent to DIY PC builds in general, so you should be careful about everything. AM4 CPUs have a massive design flaw due to the way the socket is built. If you want to remove a CPU on Intel, you just need to unscrew the cooler and it will come off with little force. On AMD, you have to run Prime95 for 10 minutes to warm the TIM, unscrew the cooler, twist it sideways to dislodge it, and lift it gently just to make sure the CPU doesn't come out along with the cooler. If you cannot turn on the PC in the first place, you are SOL unless you use a blowdryer. So many new builders have ended up breaking their CPUs by not following this method. This has now been fixed with AM5.

Not saying the 12VHPWR is not a design flaw, but when plugging in a cable it's common sense to push it in all the way until you hear a click, otherwise it's not connected. There have been no cases of melting since this was pointed out.

As true as that is, it's irrelevant in this thread, but you couldn't help taking a dig at AMD though, could you.

My understanding is the issue with the 12VHPWR was that the click was almost inaudible or couldn't be felt.

It will be interesting to see what AMD's solution to this mess will be.
 
He also included massive outliers like the MW2 data points twice to pad the AMD score, and the major RT benchmark in Fortnite was flawed because he used software ray tracing instead of hardware ray tracing, which is much superior in quality. The RTX cards are largely unaffected by this option while AMD suffers a hit. He posted a tweet and deleted it when this was pointed out.

He also posted a massive outlier in Nvidia's favour, World War Z, so it's all swings and roundabouts really.
 
So it includes games with RT enabled, yes?
In a flawed way. Why test software RT and not hardware RT, which is clearly superior and more demanding and actually utilises the RT hardware? Crysis Remastered uses software RT and performs near identically on AMD and Nvidia, but the RT is so weak it may as well not be there. The same applies to F1 2022.
 
He also posted a massive outlier in Nvidia's favour, World War Z, so it's all swings and roundabouts really.
He posted the MW2 results twice, once using Basic settings and once using Ultra. The Nvidia outliers are different games. Why not include Medium and High settings at that point as well?

I don't really think it matters, because if you remove the outliers on both sides they still perform roughly the same. My main gripe here is that the review was clearly slanted towards showing the AMD card in a better light, mainly due to a) the MW2 results being posted twice, b) disabling RT in the games which actually use the feature meaningfully, and c) software-based RT instead of hardware on Fortnite, as the RT cores on the Nvidia cards are not utilised in this mode.
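
Just to illustrate the double-counting point: the ratios below are invented for the sake of the example, not the reviewer's figures, but they show how counting the same favourable outlier twice pulls a geometric-mean summary towards it.

# Minimal sketch of how counting one favourable result twice skews a
# geometric-mean performance summary. Every number here is made up for
# illustration; none of these are the reviewer's figures.
from math import prod

def geomean(ratios):
    # Geometric mean of per-game performance ratios (card A relative to card B).
    return prod(ratios) ** (1 / len(ratios))

# Hypothetical per-game ratios, 7900 XTX fps divided by 4080 fps
# (1.00 = parity, >1.00 = AMD ahead); 1.40 stands in for the MW2-style outlier.
base = [0.95, 1.02, 0.98, 1.00, 1.40]
padded = base + [1.40]  # the same outlier counted a second time

print(f"Summary without padding:    {geomean(base):.3f}")
print(f"Summary with outlier twice: {geomean(padded):.3f}")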
 
In a flawed way. Why test software RT and not hardware RT, which is clearly superior and more demanding and actually utilises the RT hardware? Crysis Remastered uses software RT and performs near identically on AMD and Nvidia, but the RT is so weak it may as well not be there. The same applies to F1 2022.

F1 22 is actually a pretty good showcase for RT; the problem is a lot of the benchmarks are run on dry courses, so they essentially miss the main RT effect, i.e. reflections, which is what hits performance a bit harder.

But agree, it's a flawed test "overall". What next, reducing other graphical settings for benchmarks? Oh wait... that was done too :D

The 4080 actually did better than I expected in raster titles too, tbf.
 
In a flawed way. Why test software RT and not hardware RT, which is clearly superior and more demanding and actually utilises the RT hardware? Crysis Remastered uses software RT and performs near identically on AMD and Nvidia, but the RT is so weak it may as well not be there. The same applies to F1 2022.
Crysis Remastered doesn't work without RT cores, because it's hardware RT.

Add to that, software RT does not run in real time; it's for 3D/2D art rendering, like Cinema 4D (Cinebench). No games have "software RT", it's all just RT, real-time RT.
 
Never said it was Nvidia's design, just poor, as these sorts of things need to account for that 1% unfortunately.



As true as that is, it's irrelevant in this thread, but you couldn't help taking a dig at AMD though, could you.

My understanding is the issue with the 12VHPWR was that the click was almost inaudible or couldn't be felt.

It will be interesting to see what AMD's solution to this mess will be.
It was not intended as a dig but to show that design flaws exist on both sides. It was AMD which took a dig at Nvidia with the burning connectors, and look where it landed them. It's better to remain silent and accept that design flaws will exist on mass-produced products; otherwise you end up with foot-in-mouth moments.
 
The ray tracing in Crysis Remastered is no different to how it is in Cyberpunk, it's all DXR; the only difference is one was sponsored by Nvidia.
 
It was not intended as a dig but to show that design flaws exist on both sides. It was AMD which took a dig at Nvidia with the burning connectors, and look where it landed them. It's better to remain silent and accept that design flaws will exist on mass-produced products; otherwise you end up with foot-in-mouth moments.

Yup, things like that are childish and always a big risk. This is largely why I have lost a lot of respect for AMD over the last 3-4 years: they prefer to point fingers when things don't go their way (which sometimes can be true/justified, i.e. Witcher 3 HairWorks and Crysis 2 tessellation, but often it is their own fault, especially when they prefer an over-the-fence approach), and it always seems like they are taking little jabs at Nvidia's approach, i.e. "closed source = bad!!!", yet they follow in Nvidia's footsteps and don't really innovate in the same way. Instead they should be focusing on their own strengths and coming out with their own things, where Nvidia end up following them and their ideas, but I guess this is what happens when Nvidia have the market by the balls...

You can't spell karma without RMA :p
 
He has not enabled RT in all 50 games tested. Cyberpunk, Watch Dogs Legion and Dying Light were shown with RT disabled.
The reason he doesn't enable RT in those games is that in many cases it will kill performance to an unacceptable level, and showing that data is not helpful as the people buying these cards are likely looking to drive a 4K display at 60Hz+.
 
The reason he doesn't enable RT in those games is that in many cases it will kill performance to an unacceptable level, and showing that data is not helpful as the people buying these cards are likely looking to drive a 4K display at 60Hz+.

I'm gonna say it.

The RT performance of the 4080 and 7900xtx is as bad, or as good, as each other's.

From what I'm seeing, either both can hardly hit 60fps or both exceed 60fps. Sure, the 4080 may get 90fps and the 7900xtx 60fps or something, but they are either both playable or both unplayable, and I don't think either one is anything to shout about. (Looking at the 4K games, as that's the monitor I have.)

It seems like such a pointless argument on these two cards.

Does it matter if the 4080 is 30% faster in ray tracing at 4K in Witcher 3 than the 7900xtx when it's only getting 35fps? Both cards are unplayable.

Does it matter if the 4080 is 15% faster in ray tracing in Riftbreakers if both are 100fps plus?

At 4K at least, both cards are garbage at ray tracing and neither should be bought on that basis.

I'm a 4K user (though I think I'll keep my 1080ti for a while longer), but neither card is suitable for ray tracing in my opinion; they are either both playable or both unplayable in all games. I didn't see a single game where one card was playable and the other unplayable. As such, looking at the reviews now, the 7900xtx seems a perfect match for the 4080, though it's disappointing it's not faster overall in raster performance (I got caught up in the hype train).

The annoying part is the 4080 is a bad-value card but a better-value card than the 7900xtx. Looking at the prices, Nvidia are sticking to their £1200 MSRP and still releasing Founders Editions, so we have multiple cards available for under £1200. The problem is the 7900xtx was a paper launch with crap supply of MBA cards, and as Gibbo said Sapphire will no longer release them going forward, meaning we will only have 7900xtxs available for £1200-£1300. The issue is that even though I think ray tracing is garbage on both of these cards, it's frankly stupid to buy a 7900xtx over a 4080 at those prices. The "value" is actually in Nvidia's favour (see the rough sums sketched after this post).

I'm super glad I didn't FOMO and buy one at launch. I'm gonna wait until summer, I think, and see what happens with the prices, so my sad little 1080ti will have to soldier on for a few months more. I simply can't jump in at those prices; I was willing to pay £1000+, but with zero generational performance increase I don't feel like the 4000 series and the 7000 series are a new generation, rather a continuation of the last.
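
For anyone who wants to sanity-check the value point above, here's the kind of back-of-the-envelope sum I mean. The prices roughly follow the figures quoted in the post (~£1200 for a 4080, ~£1250 as the midpoint of the £1200-£1300 7900xtx range); the fps numbers are placeholders I've made up, so treat this as a template to plug your own review numbers into, not a result.

# Back-of-the-envelope price-per-frame comparison. Prices roughly match the
# street prices mentioned above; the fps values are invented placeholders,
# not benchmark data - swap in numbers from whichever review you trust.
cards = {
    "RTX 4080":    {"price": 1200, "raster_fps": 100, "rt_fps": 60},
    "RX 7900 XTX": {"price": 1250, "raster_fps": 103, "rt_fps": 48},
}

for name, c in cards.items():
    print(f"{name}: £{c['price'] / c['raster_fps']:.2f} per raster fps, "
          f"£{c['price'] / c['rt_fps']:.2f} per RT fps")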
 
It isn't really Nvidia's design. Also, it sounds like it is very rare; >99% of people know how to plug stuff in properly.
Yet we have 0.00% of the other connectors on those same adapters melting.

Funny how the people who can't "plug stuff in properly" managed to plug in the **other** connectors without melting them.
 