All I can say is: a big collection of benchmarks is listed here too https://videocardz.com/newz/amd-dis...900xt-rx-6800xt-and-rx-6800-gaming-benchmarks

Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Did you check footnotes 5/6/7?
- Testing done by AMD performance labs October 18 2020 on RX 6900 XT (20.45-201013n driver), RTX 3080 (456.71 driver), AMD Ryzen 9 5900X (3.70GHz) CPU, 16GB DDR4-3200MHz, Engineering AM4 motherboard, Win10 Pro 64. The following games were tested at 4K with each card's best API: Battlefield V DX11 Ultra, Borderlands 3 best API Badass, Call of Duty: MW DX12 Ultra, Division 2 DX12 Ultra, Doom Eternal Vulkan Ultra Nightmare, Forza DX12 Ultra, Gears 5 DX12 Ultra, Resident Evil 3 best API Ultra, Shadow of the Tomb Raider DX12 Highest, and Wolfenstein: Young Blood Vulkan Mein Leben. AMD Smart Access Memory and Rage Mode were enabled. Performance may vary. RX-567
- Testing done by AMD performance labs October 18 2020 on RX 6800 XT (20.45-201013n driver), RTX 3080 (456.71 driver), AMD Ryzen 9 5900X (3.70GHz) CPU, 16GB DDR4-3200MHz, Engineering AM4 motherboard, Win10 Pro 64. The following games were tested at 4K at max settings: Borderlands 3, best API, Ultra; Doom Eternal, Vulkan, Ultra Nightmare; Forza Horizon 4, DX12, Ultra; Gears 5, DX12, Ultra; Hitman 2, DX12, Ultra; Resident Evil 3, best API, Ultra; Wolfenstein: Young Blood, Vulkan, Mein Leben. AMD Smart Access Memory and Rage Mode were enabled. Performance may vary. RX-559
- Testing done by AMD performance labs October 18 2020 on RTX 2080 Ti (456.71 driver), RX 6800 (20.45-201013n driver) with AMD Smart Access Memory enabled, AMD Ryzen 9 5900X (3.70GHz) CPU, 16GB DDR4-3200MHz, Engineering AM4 motherboard, Win10 Pro 64. The following games were tested at 4K with each card's best API: Battlefield V Ultra, Borderlands 3 DX12 Ultra, Call of Duty: MW DX12 Ultra, Division 2 DX12 Ultra, Doom Eternal Vulkan Ultra Nightmare, Forza DX12 Ultra, Gears 5 DX12 Ultra, Resident Evil 3 DX11 Ultra, Shadow of the Tomb Raider DX12 Highest, and Wolfenstein: Young Blood Vulkan Mein Leben. Performance may vary. RX-555
...and compare them with the settings shown in the panel results.
If you're hairsplitting, at least do it correctly!
Btw, you completely ignored my advice to cross-reference the GeForce numbers shown by AMD.
As for my position...
did you?
So again, your response to me asking why nobody else was questioning the numbers is for me to go look them up myself. That's not an answer, is it?
If you're going to accuse me of hairsplitting, at least learn what it means first.
Give me some good advice and I'll listen. The 'advice' you've offered so far is nonsense, and not an answer to my question.
Not interested in your position, you've made your 'position' pretty clear with your crap advice and dodging of questions.
Footnotes 5/6: Rage Mode on.
Footnote 7: Rage Mode not on.
Panel data: no mention of Rage Mode.
The RX 6800 results should match.
Improper hairsplitting is trying to figure out which of the two results in a ±3% range is max FPS in the absence of definitions, and then blatantly ignoring the footnotes?
I can't argue with BS anymore...
What does rage mode have to do with it?
What does Rage Mode have to do with anything? Again, I ask what the numbers mean and you try to tell me to look at a footnote because it mentions Rage Mode? Not an answer. You are clearly unwilling to answer, or you don't understand the question. Which would just be hilarious.
Answer my question and stop with your BS then: what do the AMD numbers mean? What are they, Nabloperator? Not your nonsense 'common sense'; tell me, verifiably, what they mean. No more nonsense, answer my question.
mountain out of a molehill
I wonder if he did the same when Nvidia put out their claims, like X times better in ray tracing with Turing, and all their crap.
Not sure why you are trying to make a mountain out of a molehill. Most of the games tested have built-in benchmarks, so they will be using those and showing the average FPS. Why on earth would AMD show the high values for their cards and the average for the Nvidia cards? They would be crucified in the reviews if the numbers were nowhere near the ones shown.
If AMD were going to fake anything they would have leaked the results much earlier to stop the 3080 hype dead when it was announced.
I played Watch Dogs Legion with DLSS "Quality" and let me tell you, vs. native 4K there is no noticeable difference.
Ok
Rage Mode is a preset overclocking profile. Its real-world performance impact seems to be a placebo at the moment, with low single-digit gains shown that may not be statistically significant.
So the 6800 XT/6900 XT were tested twice, once with Rage Mode on and once with it off, meaning there are two different datasets for these cards, while the 6800 was tested only once, with Rage Mode off.
Anyhow, the above discussion is like lawyer speak, because the difference between the two datasets is so low that it can be ignored, rather than trying to figure out which one is the maximum?
fs123 said: Not sure why you are trying to make a mountain out of a molehill. Most of the games tested have built-in benchmarks, so they will be using those and showing the average FPS. Why on earth would AMD show the high values for their cards and the average for the Nvidia cards? They would be crucified in the reviews if the numbers were nowhere near the ones shown.
Nablaooperator said: Isn't that also called hairsplitting?
I wonder if he did the same when Nvidia put out their claims, like X times better in ray tracing with Turing, and all their crap.
Just wait for the reviews, Jesus.
me (aug 2019) said: A huge chunk of silicon indeed. Nvidia could have dropped RTX, and Turing would have been smaller (around 40%, as I recall), cheaper to manufacture and cheaper to buy... who am I kidding, this is Nvidia we are talking about; it wouldn't have been any cheaper to buy. Anyway, the point is that given the state of RTX and the cost of the silicon to run RTX, Nvidia had the option to not bundle it and not pass those costs on to the consumer. They didn't do that; instead, consumers got to pay a 'fair price' for a big lump of silicon that, in the minds of all those people who aren't interested in RTX, is a complete waste of transistors. Hmm.
'DLSS is better than native!' So DLSS is not blowing you away then? I think this is exactly what @TNA said, and he got flamed unfairly for it, IMO.
LOL, no it isn't. That's not what it means at all. Oh dear.
Mountain out of a molehill = exaggerating the importance of a trivial matter. E.g., 'your elbows are too pointy'.
Hairsplitting = making unnecessary distinctions. E.g., 'it's not duck-egg blue, it's light teal'.
Lol... you tend to interpret language out of context.
Has nobody answered his question yet?
By far the most sensible assumption so far.
I assume AMD have posted average frame rates, as that's standard practice; it would be very odd to post maximums, and as someone else pointed out, they'd be crucified come review time.
As for the 'up to' bit, I'm sure that's to cover Rage Mode and their high-end test rig, and some legal reasons too, I expect.
Nice to have some competition at the high end again!