AMD RX 6000 vs Nvidia Ampere performance benchmarks (from AMD's website, compiled by a Redditor)

I assume AMD have posted average frame rates, as that's standard practice; it would be very odd to post maximums. As someone else pointed out, they'd be crucified come review time.

As for the 'up to' bit, I'm sure that's to cover Rage Mode and their high-end test rig, plus some legal reasons too, I expect.

Nice to have some competition at the high end again :)

That's what I told him in the most general sense. This was on pg 3 lol. I get a feeling that I was trolloooood. Anyways, back to the subject of the thread.

That would be an unreasonable assumption.
The 'up to' stands as a standard disclaimer: if someone reproduces the experiment with different results, AMD can disclaim liability.

Are you sure about that? Gamers Nexus did a video stating AMD told them Rage Mode is just a power increase, like moving the power slider for Nvidia cards in Afterburner. It does not change clocks.

I don't know the specifics. Anyways, I was arguing that two different samples were used for the RX 6800 XT/6900 XT (which is also unnecessary legalese). You can ignore all my replies to a certain member. I should lurk more.
 
Are you sure about that? Gamers Nexus did a video stating AMD told them Rage Mode is just a power increase, like moving the power slider for Nvidia cards in Afterburner. It does not change clocks.

If the boost clock is power limited, then lifting that power limit will increase clocks in that scenario, as I'm sure you're aware, so it can indirectly change clocks.
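
To make that concrete, here's a toy sketch in Python. All figures are made up for illustration; this is not AMD's actual boost algorithm, and the linear watts-per-MHz scaling is a deliberate simplification (real silicon scales worse than linearly with frequency/voltage):

```python
# Toy model of a power-limited boost clock. NOT AMD's actual algorithm;
# all numbers below are invented purely to illustrate the mechanism.

def achieved_clock_mhz(max_boost_mhz: float, power_limit_w: float,
                       watts_per_100mhz: float = 12.0) -> float:
    """Highest clock sustainable within the power budget, capped at the boost target."""
    sustainable = (power_limit_w / watts_per_100mhz) * 100.0
    return min(max_boost_mhz, sustainable)

stock = achieved_clock_mhz(max_boost_mhz=2250, power_limit_w=255)
raged = achieved_clock_mhz(max_boost_mhz=2250, power_limit_w=270)  # hypothetical power-limit bump
print(stock, raged)  # 2125.0 2250.0 -> clocks rise without the clock target changing
```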
 
If the boost clock is power limited, then lifting that power limit will increase clocks in that scenario, as I'm sure you're aware, so it can indirectly change clocks.

Indeed, but I would not call it an overclocking profile. I only brought it up because it seems the 6XXX cards could have some overclocking headroom, and calling it a profile could be downplaying any actual headroom the card has.
 
That's what I told him in the most general sense. This was on pg 3 lol. I get a feeling that I was trolloooood. Anyways, back to the subject of the thread.

He made an assumption, which means he's not certain and is open to correction; perfectly acceptable. You made BS claims about the number of runs and changed your mind on what 'up to' means when other people started posting. Forgive me if I don't add any worth to any of your 'common sense'. Or don't, I don't care. Have you looked up those definitions yet? :D

Nablaoperator said:
I don't know the specifics. Anyways, I was arguing that two different samples were used for the RX 6800 XT/6900 XT (which is also unnecessary legalese). You can ignore all my replies to a certain member. I should lurk more.

I said I assumed the two sets were the same, so why are you trying to suggest I'm hair-splitting/nit-picking/mountain-making or whatever else you think means the same thing, when I never made that argument to begin with? Strawman, strawman, strawman. Boring.

james.miller said:
I'm not trying to figure out the difference between the two sets, I'm assuming they are the same thing - both 'up to'.

Good job, Nablaoperator. Great idea on the lurking more, too. Go do that :)
 
So DLSS is not blowing you away, then? I think this is exactly what @TNA said, and he got flamed unfairly for it IMO.

I wouldn't expect to be blown away by it; it's simply rendering the game at a lower res and upscaling it, isn't it? The performance improvement is very welcome and it looks no different to native 4K. If it looks like native 4K, that's fine; it doesn't HAVE TO look better to be good.
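
For anyone wondering what 'rendering at a lower res' actually amounts to, here's a quick sketch using the commonly cited per-axis scale factors for DLSS 2's modes (approximate figures I'm assuming for illustration, not numbers from this thread):

```python
# Approximate internal render resolutions for DLSS 2 at a 4K output.
# The per-axis scale factors are the commonly cited ones; treat as rough.

TARGET_W, TARGET_H = 3840, 2160  # native 4K output

MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for mode, scale in MODES.items():
    w, h = round(TARGET_W * scale), round(TARGET_H * scale)
    pixel_frac = (w * h) / (TARGET_W * TARGET_H)
    print(f"{mode}: renders {w}x{h} ({pixel_frac:.0%} of native pixels)")

# Quality: renders 2560x1440 (44% of native pixels)
# Balanced: renders 2227x1253 (34% of native pixels)
# Performance: renders 1920x1080 (25% of native pixels)
```

Shading only ~25-44% of the native pixels is where the performance improvement comes from.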
 
I wouldn't expect to be blown away by it; it's simply rendering the game at a lower res and upscaling it, isn't it? The performance improvement is very welcome and it looks no different to native 4K. If it looks like native 4K, that's fine; it doesn't HAVE TO look better to be good.

Yeah, this was done to death in another thread, mate. A few got really anal about it, so it's stuck in the memory. A guy said exactly your opinion and, let's say, it ruffled a few die-hards' feathers.
 
R5 3500X @ 4.3 GHz + RX 6800, SOTTR with RT on at 4K
https://twitter.com/PJ_Lab_UH/status/1322463479417040896
 

https://wccftech.com/amd-radeon-rx-...phics-cards-ray-tracing-performance-detailed/

AMD Radeon RX 6800 RDNA 2 Graphics Card Ray Tracing Performance Leaks Out, Almost As Fast As RTX 3070 With DLSS at 4K & WQHD

https://wccftech.com/amd-radeon-rx-6800-rdna-2-graphics-card-ray-tracing-dxr-benchmarks-leak-out/

AMD Radeon RX 6800 XT’s Ray-Tracing Performance Falls Significantly Short of NVIDIA RTX, According to Early Tests

GPU / DXR performance:
NVIDIA GeForce RTX 3090: 749 FPS
NVIDIA GeForce RTX 3080: 630 FPS
AMD Radeon RX 6800 XT: 471 FPS
https://www.thefpsreview.com/2020/1...short-of-nvidia-rtx-according-to-early-tests/
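
Putting those leaked numbers in relative terms (this is just arithmetic on the figures quoted above, nothing more):

```python
# Relative DXR gap, using the early-test figures quoted above.
dxr_fps = {"RTX 3090": 749, "RTX 3080": 630, "RX 6800 XT": 471}
baseline = dxr_fps["RX 6800 XT"]
for gpu, fps in dxr_fps.items():
    print(f"{gpu}: {fps} FPS ({fps / baseline:.2f}x the 6800 XT)")
# RTX 3090: 1.59x, RTX 3080: 1.34x in this one synthetic DXR sample
```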


Footnotes
4. Measured by AMD engineering labs 8/17/2020 on an AMD RDNA 2 based graphics card, using the Procedural Geometry sample application from Microsoft’s DXR SDK, the AMD RDNA 2 based graphics card gets up to 13.8x speedup (471 FPS) using HW based raytracing vs using the Software DXR fallback layer (34 FPS) at the same clocks. Performance may vary. RX-571
https://www.amd.com/en/technologies/rdna-2
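
The footnote's maths checks out, for what it's worth:

```python
# AMD's footnote: 471 FPS with the hardware ray accelerators vs 34 FPS
# on the software DXR fallback layer, at the same clocks.
print(f"{471 / 34:.2f}x")  # 13.85x, in line with the quoted 'up to 13.8x'
```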
 
WHY ARE WE ALL USING SUCH LARGE CUT AND PASTA TEXT AND SHOOTING DOWN AMD WITH ZERO ACTUAL FACTS AND ZERO PEOPLE WITH ACTUAL REAL-WORLD PROVEN DATA, RATHER THAN WAITING AND JUST BEING IMMENSELY HAPPY ABOUT NVIDIA NOT SKULL-F'IN YOUR WALLETS WHILE YOU THANK THEM FOR IT, BECAUSE WE HAVE MARKET COMPETITION????

You can thank AMD for getting the 3080 Ti in Q1/Q2 2021 rather than Q4 2022, and for not being charged £1500 minimum for the #GreenPrivilege.
 
You can thank AMD for getting the 3080 Ti in Q1/Q2 2021 rather than Q4 2022, and for not being charged £1500 minimum for the #GreenPrivilege.

I'm curious where the 3080 Ti will sit performance-wise if it's a cut-down 3090. At best it will compete with the 6900 XT at $1,000, if its performance lands between the 3080 and 3090. If it's a full-fat chip then it is essentially a 3090 with perhaps a bit less VRAM. A tough situation for Nvidia. AMD may well be readying another card (6950 XT?) with even more CUs to counter it.
 
I'm curious where the 3080 Ti will sit performance-wise if it's a cut-down 3090. At best it will compete with the 6900 XT at $1,000, if its performance lands between the 3080 and 3090. If it's a full-fat chip then it is essentially a 3090 with perhaps a bit less VRAM. A tough situation for Nvidia. AMD may well be readying another card (6950 XT?) with even more CUs to counter it.

They've left it too tight. More VRAM is not enough to tempt people who already own a 3080 and can see there's no issue, but it might tempt people upgrading from previous generations, depending on the price.

If a 3080 Ti comes along, it will probably have to outperform the 3090, which is unlikely, as they'd only want the next generation doing that.

If they want to squeeze AMD, then I think price drops are the most likely route, as opposed to new models that will only cannibalise their own sales.
 
They've left it too tight. More VRAM is not enough to tempt people who already own a 3080 and can see there's no issue, but it might tempt people upgrading from previous generations, depending on the price.

If a 3080 Ti comes along, it will probably have to outperform the 3090, which is unlikely, as they'd only want the next generation doing that.

If they want to squeeze AMD, then I think price drops are the most likely route, as opposed to new models that will only cannibalise their own sales.
There are far, far more people on the waiting list for a 3080 than actually own one. My view is that if I were on that waiting list, having seen what AMD has launched, and with strong suspicions that Nvidia are going to release a new, better-specced card sooner rather than later, I would be cancelling that pre-order to avoid a strong sense of buyer's remorse.
 
There are far, far more people on the waiting list for a 3080 than actually own one. My view is that if I were on that waiting list, having seen what AMD has launched, and with strong suspicions that Nvidia are going to release a new, better-specced card sooner rather than later, I would be cancelling that pre-order to avoid a strong sense of buyer's remorse.

At 1440p, the 3090 is not materially faster than the 3080. What's the point of paying that high a premium to get 7% more performance? (This is assuming the 3080 Ti is a slightly cut-down 3090 with more memory.) I would get buyer's remorse if, like the 2080 Ti (vs. the 2080), it were at least 20% faster.
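
Rough value maths behind that point. The prices are assumptions ($699 for the 3080's MSRP, and the hypothetical $999 3080 Ti at the $1,000 point floated earlier in the thread), and the 7% uplift is the poster's estimate, not a measured figure:

```python
# Hypothetical price/performance check for a cut-down-3090 style 3080 Ti.
# Both prices and the uplift are assumptions, not confirmed specs.
price_3080 = 699   # 3080 MSRP in USD (assumed for illustration)
price_ti = 999     # hypothetical 3080 Ti price, per the $1,000 guess above
uplift = 0.07      # assumed ~7% faster at 1440p

extra_cost = price_ti / price_3080 - 1
print(f"{extra_cost:.0%} more money for {uplift:.0%} more performance")
# -> 43% more money for 7% more performance
```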
 
There are far, far more people on the waiting list for a 3080 than actually own one. My view is that if I were on that waiting list, having seen what AMD has launched, and with strong suspicions that Nvidia are going to release a new, better-specced card sooner rather than later, I would be cancelling that pre-order to avoid a strong sense of buyer's remorse.

It's not the time to cancel pre-orders, because the AMD stock situation is an unknown; why leave one endless queue to join another? If AMD have volume on release then I can see them cleaning up, even if their benchmark results turn out to be exaggerated by favourable hardware and software that some won't buy or use.

The immediate battleground is in the supply chain, and the next battle will be price.
 
It's not the time to cancel pre-orders, because the AMD stock situation is an unknown; why leave one endless queue to join another?
I think you misunderstood my post. It's not about Nvidia or AMD stock (though AMD is sure to do better than Nvidia on launch, as it can't get any worse than that); it's about buying a card (in this case the 10GB 3080) that may soon be superseded by a better one with more VRAM as a knee-jerk response to Big Navi. I don't buy a card I don't feel is the best choice just because I'm already in a queue.

If you are on the Ampere waiting list and are happy for someone to have almost a grand of your money for weeks and maybe months, while you sit twiddling your thumbs, then by all means go ahead.
 