We can't, because the games used are the most popular ones, while there are hundreds of less popular games, all of which tend to be more Nvidia-optimised than Radeon-optimised.
It's actually good science to remove outliers ...
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
No, because you don't know if the benchmarks showing performance parity are not the outliers lol
Why do you hate it so much? The RX 5700 XT is a joke of a card. It also scores 0 in all CUDA and ray-tracing benchmarks.
Adored getting called out for his ******** is good to see - I hope adoredtv gets sued to hell and back
https://mobile.twitter.com/HardwareUnboxed/status/1312385403417104384
That's not how statistics are summarised. We can't infer population-level parameters from such a small sample, and one that isn't well behaved at that.
Well, many gamers replay games: the NFS series, which numbers a dozen on its own; F1 from the 2010 version onwards; the Colin McRae Rally series; StarCraft, Diablo, Warcraft, Fortnite, all of the CS games; Crysis 1, 2 and 3; Far Cry 1 through 5; all of the Quake games; and so on, plus all the smaller and less popular titles. There are literally hundreds of games, so yes, the sample size is very small if you only use 2019 or 2020 games.
@4K8KW10 I can cherry-pick a bunch of games and make the 5700 XT look at least as good as an RTX 2080. The 5700 XT seems to do much better than Turing in newer titles, so it's pretty easy.
You're working really hard, picking your way through the internet to find things you can use to chip off every percent you can get, and then arguing about those few percent as if it's somehow a huge issue for AMD's competitiveness. It isn't; it doesn't make the blindest bit of difference. It's just deranged.
And that makes artificially limiting stock to drive up prices, and therefore actively lying about MSRP, somehow OK? Turing gave us more than Pascal, and Pascal gave us more than Maxwell. Are Turing, Pascal and Maxwell prices acceptable, then? You clearly don't get what's going on, or you don't care, and that's fine. That also means there's zero reason to continue this particular subject of conversation.
That’s them pretending not to be biased. An occasional slap on the wrist for one side while fully beating on the other doesn’t count in my book, sorry.
Either you’re fully impartial and **** on Turing’s price, initial DLSS performance, etc., while also pointing out the advantages, and likewise bash AMD’s lack of ray tracing, ML, etc., or you do neither. They always make excuses for AMD: ‘RT isn’t worth it yet; DLSS was bad... but it’s OK now, but it’s only used sparingly, so again, no problem if AMD doesn’t have it’, and so on.
Can’t wait for the inevitable ‘RDNA2 HUB’ reviews in which suddenly ray tracing is worth it, even though performance will be the same as or lower than Turing/Ampere.
Not that I've watched every one of their videos, but I find it hard to believe they've never once mentioned the lack of RT on AMD hardware.
Wow, there's utterly no bias in you, is there?