why has he based it all on the 2080ti vs. 3090, rather than 2080 vs. 3080 - or even the 2080ti vs. 3080, which is where all the value is?
his own charts show that for the same launch price, a 3080 is effectively 80-100% faster in relative performance than the 2080. that's the biggest price-performance leap we've seen in a long time.
the 3090 isn't a flagship card, it's a niche product for professional use and people who have a money no object approach to performance. the 3080 is the flagship.

Cause it fits his narrative and gets the clicks from all the feverish fanboys and haters in his audience.
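For anyone who wants to sanity-check that launch-price comparison, here's a minimal Python sketch of the arithmetic. The perf_index values are illustrative placeholders, not numbers taken from his charts; the prices are the widely reported USD launch MSRPs.

```python
# Back-of-the-envelope price/performance comparison.
# perf_index values are ILLUSTRATIVE PLACEHOLDERS (2080 = 100 baseline),
# not figures from the video's charts; prices are launch MSRPs in USD.
cards = {
    "RTX 2080":    {"price": 699,  "perf_index": 100},
    "RTX 2080 Ti": {"price": 999,  "perf_index": 130},
    "RTX 3080":    {"price": 699,  "perf_index": 185},
    "RTX 3090":    {"price": 1499, "perf_index": 210},
}

baseline = cards["RTX 2080"]["perf_index"]
for name, c in cards.items():
    uplift = (c["perf_index"] / baseline - 1) * 100  # % faster than the 2080
    value = c["perf_index"] / c["price"]             # performance per dollar
    print(f"{name}: {uplift:+5.0f}% vs 2080, {value:.3f} perf/$")
```

With those placeholder numbers the 3080 lands around +85% over the 2080 at the same $699, while the 3090 comes out well behind on perf-per-dollar - which is exactly the distinction the post above is drawing.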
The 3090 doesn't have some of the driver features that would be needed for some professional software.
Because you're comparing generational performance for the architectures as a whole, not cherry-picking examples to make Ampere shine.
I mean, his video is spot on, and I've said this since the beginning - Ampere is a turd (especially if you factor in OC), which only looks half-good compared to the absolute **** stain that was Turing, and yet... it doesn't matter. At the end of the day we still need a GPU, and if we want to enjoy our gorgeous 4K displays and drive games properly then we don't really have an alternative. It's a **** jump, but I'll still buy one because it's better than waiting for some miraculous leap that might never come. I don't need to delude myself into thinking it's the "greatest leap ever" just because I'll buy one. That's just nonsense.
Why people can't separate the two and instead become echoes of Nvidia's marketing remains baffling to me.
Must be right - not like A3D is full of it most of the time.

Yeah, it's bad according to Jim - all facts, he says.