AMD RDNA3 unveiling event

In all seriousness though, with a driver update or two I'm expecting it to be ahead of a 4080 in games, but it will need to be at least 20% ahead to justify going for it over a 4080 @ £1099, if those Time Spy scores apply to games too..... I suspect the reference cards will be a bit limp and it will be the custom AIB models which shine, but expect prices for those to be £1200+
In RT though, the ****er is gonna get slapped around like a little bitch... :D
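To put rough numbers on that 20% figure, here's a quick sketch of the value maths; the performance and price figures are hypothetical placeholders until real reviews land:

```python
# Rough value-for-money sketch for the XTX vs 4080 decision.
# All figures are hypothetical placeholders until real reviews land.
def perf_per_pound(relative_perf: float, price_gbp: float) -> float:
    """Relative performance units per pound spent."""
    return relative_perf / price_gbp

rtx_4080 = perf_per_pound(relative_perf=100, price_gbp=1099)
xtx = perf_per_pound(relative_perf=120, price_gbp=999)  # assumes 20% ahead at £999

print(f"4080: {rtx_4080:.4f} perf/£")
print(f"XTX : {xtx:.4f} perf/£ ({xtx / rtx_4080 - 1:+.1%} better value)")
```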
 
If performance is similar in raster but the 4080 is superior in RT, and they are both a similar price, i.e. £1050-1150, why wouldn't you go with the option that offers the better all-round package though?

Because AMD drivers age like fine wine; just look how close the 6000 series got to Ampere. Yes, it takes time. I can see the benefits of RT; I was running a water-cooled FE 3090 for a good while, then managed to pick up a Liquid Devil 6900 XTU and have been running that for a good while too. As drivers released, the performance gains were nuts. I game at 4K and most of my games don't use RT except Cyberpunk, but with my settings I can run a solid 60fps maxed and the game runs amazingly smooth; other games I play almost max out at 144fps.

Really looking forward to getting the next gen, and hopefully a banger of an AIB card, to see what they can really do :)
 
If performance is similar in raster but the 4080 is superior in RT, and they are both a similar price, i.e. £1050-1150, why wouldn't you go with the option that offers the better all-round package though?
What if the Nvidia card is worse in terms of overall efficiency? How would you place the cards relative to each other then?
 
An important point here too, which many overlooked with RDNA 2 vs Ampere: when it came to RT workloads, Ampere is considerably more power efficient. It will be the same with Ada and RDNA 3 too; if anything, the difference might even be larger, especially if you throw in DLSS and FG, and if more future RT games implement SER.....

But is that due to Nvidia being better at RT, or due to Nvidia using DLSS to compensate for the RT performance? I would think DLSS uses less power than rendering raw frames?
If it performs the same as a 4080 it needs to cost less, quite a bit less, if AMD want to gain any market share. I mean, if it's only £50 or even £100 less, you might as well go Nvidia and get DLSS 3, better RT and a better resale price, as there's a much bigger market to sell to.

We don't suddenly go from laughing at the 4080 and its pricing to thinking it's great because AMD offer similar performance for £100 less. To be a good card it needs to be way less. I even said the 4080 should be £700 or so to appeal to me.

I agree with that; I'd be willing to pay £100 more for better RT if the performance is the same.
 
What if the Nvidia card is worse in terms of overall efficiency? How would you place the cards relative to each other then?


I mean, if you can afford to spend £1000 on GPUs, I don't think you're too worried about power efficiency.
If you can't afford the electricity, you shouldn't be buying a new GPU in the first place.
 
Yup, gotta be vendor-agnostic IMO; the better all-rounder at the same pricing wins.

Exactly, and if the AMD tests translate into gaming performance then the Nvidia 4080 will be the all-round better performer at those prices.

Now I'm hoping the benchmarks we have seen are wrong and AMD are better than what we see; otherwise I'd think AMD are overplaying their hand.
 
Yes, just sit in your cold house waiting to die, because if you can't afford one thing you can't afford anything else.

That's not how it works.
It just doesn't make sense to think about buying the top-tier cards, which will be the most power-hungry, if you are concerned about efficiency.

You are best buying the middle of the stack.
Either way, the power limits do not translate to real use anyway; your GPU won't be sucking up max power at all times. Realistically the difference between either side is likely to be negligible in real use.
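For anyone actually weighing the efficiency angle, the running cost of a power gap is easy to ballpark. A sketch with example numbers; the wattage gap, hours and tariff are all assumptions, so swap in your own:

```python
# Ballpark the yearly electricity cost of a GPU power-draw gap.
# Every input here is an example value -- swap in your own usage/tariff.
power_gap_watts = 50        # assumed average in-game draw difference
hours_per_day = 3           # assumed daily gaming time
price_per_kwh_gbp = 0.34    # assumed UK unit rate

kwh_per_year = power_gap_watts / 1000 * hours_per_day * 365
cost_per_year_gbp = kwh_per_year * price_per_kwh_gbp
print(f"{kwh_per_year:.0f} kWh/year -> £{cost_per_year_gbp:.2f}/year")
```

On those example numbers it works out to around £19 a year, which rather supports the "negligible in real use" point.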
 
Because AMD drivers age like fine wine; just look how close the 6000 series got to Ampere. Yes, it takes time. I can see the benefits of RT; I was running a water-cooled FE 3090 for a good while, then managed to pick up a Liquid Devil 6900 XTU and have been running that for a good while too. As drivers released, the performance gains were nuts. I game at 4K and most of my games don't use RT except Cyberpunk, but with my settings I can run a solid 60fps maxed and the game runs amazingly smooth; other games I play almost max out at 144fps.

Really looking forward to getting the next gen, and hopefully a banger of an AIB card, to see what they can really do :)

RDNA 2 and Ampere performed pretty close at launch (in raster) and still do, i.e. the 3080 and 6800 XT are neck and neck depending on title (the 6800 XT will win some, lose some), and the same goes for the 6900 XT vs 3090 and the 3090 Ti vs 6950 XT. SAM/ReBAR came along, which offered a decent jump in some games compared to Ampere, e.g. ACV, Forza 5, Halo, but then Nvidia recently improved their DX12/fine-wined Ampere and ReBAR performance to close the gap significantly in those main titles where AMD had a decent lead.

I suspect there's a good chance of some fine wine with this chiplet design, but that's an unknown, and if history is anything to go by.... by the time AMD make those serious improvements, we'll be on the next lot of hardware again. Of course, if you buy to last years then it's a more valid reason, but given you got a 3090 and now a 6900 XT, this won't apply to you :p

What if the Nvidia card is worse in terms of overall efficiency? How would you place the cards relative to each other then?

Depends really on how much of a difference there is, and we need to see how well RDNA 3 will scale. But IIRC both are rated at the same wattage, and we know the 4080 is a bit of a power efficiency beast as is:

[Chart: GPU performance-per-watt comparison]

But is that due to Nvidia being better at RT, or due to Nvidia using DLSS to compensate for the RT performance? I would think DLSS uses less power than rendering raw frames?

Well, both, but ruling out DLSS/FG, Ampere is still considerably ahead in terms of performance per watt in RT, especially when/if you factor in how well Ampere undervolts, e.g. my 3080 is using 100W less than stock settings for the same/better performance.
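To make the perf-per-watt point concrete, a quick sketch; the fps figures are hypothetical, and the ~100W saving is just the undervolt figure mentioned above:

```python
# Performance-per-watt comparison, stock vs undervolted.
# fps figures are hypothetical; the ~100W saving is the figure quoted above.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

stock = perf_per_watt(fps=100, watts=320)      # e.g. a 3080 at stock power
undervolt = perf_per_watt(fps=100, watts=220)  # same fps at ~100W less

print(f"stock     : {stock:.3f} fps/W")
print(f"undervolt : {undervolt:.3f} fps/W ({undervolt / stock - 1:+.0%})")
```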
 
Ah fair, looking at that chart though and trying to figure out where my 1080 Ti would sit lol

What surprises me is the size of the coolers on the 4090/4080 when they use less power than the 3090 Ti by a hell of a margin.
 
Exactly, and if the AMD tests translate into gaming performance then the Nvidia 4080 will be the all-round better performer at those prices.

Now I'm hoping the benchmarks we have seen are wrong and AMD are better than what we see; otherwise I'd think AMD are overplaying their hand.
There are several early-release videos of the gaming performance, and from what I've seen the XTX definitely beats the 4080; that'll be with pre-release drivers too, no doubt.

I would love AMD to smash Nvidia, just because I generally don't like Nvidia as a company (life-long Nvidia owner); they remind me a lot of Intel.
 
If they are even legit; those channels are just farming views. Why would some random dude with no history be sent early releases of cards for a channel with under 100 subs?

It would be good if they do, as it would be nice to have some genuine competition.
 