
NVIDIA GeForce RTX 3090 Ti, the flagship reinvented

Sure -
3080 ~340W @ 47 FPS
6900 XT ~300W @ 29 FPS
3060 Ti ~200W @ 27 FPS
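Taking those figures at face value (they're rough numbers quoted in this thread, not benchmark data), the perf-per-watt comparison is easy to sketch:

```python
# Rough FPS-per-watt from the approximate figures quoted above.
# Illustrative only; real numbers vary by game, settings and sample.
cards = {
    "3080":    {"watts": 340, "fps": 47},
    "6900 XT": {"watts": 300, "fps": 29},
    "3060 Ti": {"watts": 200, "fps": 27},
}

for spec in cards.values():
    spec["fps_per_watt"] = spec["fps"] / spec["watts"]

# Rank cards from most to least efficient by these figures.
ranked = sorted(cards, key=lambda n: cards[n]["fps_per_watt"], reverse=True)
for name in ranked:
    print(f"{name}: {cards[name]['fps_per_watt']:.3f} FPS/W")
```

By these particular numbers the 3080 actually comes out most efficient, with the 3060 Ti close behind and the 6900 XT last, which is part of why a full table including the 450W+ cards would be more interesting than any three-card snapshot.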

Not interesting, perhaps you thought this was the knitting section?
Your answer appears to be missing some values. The top 5 Nvidia values. You know, the ones where they are burning 450W+?

To me, the interesting part is the point where it's not the architecture giving them improvement, but just more power being burnt for diminishing returns. Where an architecture is being pushed too damned far.

It's blatantly obvious that the 3090 Ti is past that point. 3090 as well. The others? Can't tell from this data. But you don't appear interested in actual debate.
 
To me, the interesting part is the point where it's not the architecture giving them improvement, but just more power being burnt for diminishing returns. Where an architecture is being pushed too damned far.

Think it was like that for Ampere from the off, though. Samsung's manufacturing process was also never that great - even the basic 3080 was toasty for what performance it actually offered. So the architecture and Samsung's inability to actually make it (efficiently) simply hasn't been there.

And Samsung haven't exactly redeemed themselves in the last 18 months - just look at the mess that is the Snapdragon 8 Gen 1 or their own ARM/RDNA-based Exynos chips. In that market, everyone is waiting for TSMC to rescue Snapdragon with the "+" models.

It also doesn't look as if even TSMC's help with the Nvidia 4000 series is going to improve things, but that may just be due to Nvidia taking the easier way out with massive - very expensive - single dies again.

Still, Nvidia have seen customers willing to throw money at any piece of overpriced silliness - 3080 12GB, 3080 Ti and 3090 Ti - so will carry on happily raking it in whilst that is the case.
 
:cry:
It also shows, about as clearly as possible, that a 3060 Ti is generally within 2 FPS of the top AMD card. ;)
It's a good card that's also priced well in comparison. I'm not dunking on the Nvidia cards in general. I'm annoyed by people claiming massive performance on cards that cost an insane amount and burn power faster than my coffee machine. Graphs that show these massive leads in [specific workload], for some product, but ignore the fact that it's pretty pointless. AVX-512 is another example. It's a critical workload. Real important. Then... it's not.

People who want to define their *side* as the best. "Look, look how terrible the other side is, in this one benchmark!" without context. Without understanding that everything is a tradeoff. People being gits, basically.
 
Think it was like that for Ampere from the off, though. Samsung's manufacturing process was also never that great - even the basic 3080 was toasty for what performance it actually offered. So the architecture and Samsung's inability to actually make it (efficiently) simply hasn't been there.
I would be tempted to agree, but I've not seen a full set of numbers for it - which is why I said I'd be interested in them. If it wasn't for the current price insanity, I'd have charted it out myself, but I can't bring myself to look when I know I'll just wince at the prices. (Although Gibbo has nearly got me a couple of times now. In 3 months, once I've settled into the new job... and prices have dropped further (ha!)...)

Do you think the AMD line is similarly running above architecture performance point?
 
Do you think the AMD line is similarly running above architecture performance point?

For low and mid tier, I think probably not. For the high end (6900), I'd say they are in the same boat as all the other GPU/CPU products out there right now - they are hitting that wall of rapidly diminishing price/performance based on the current architectures and available process nodes.

Just my two cents. Others may disagree.
 
Your answer appears to be missing some values. The top 5 Nvidia values. You know, the ones where they are burning 450W+?

To me, the interesting part is the point where it's not the architecture giving them improvement, but just more power being burnt for diminishing returns. Where an architecture is being pushed too damned far.

It's blatantly obvious that the 3090 Ti is past that point. 3090 as well. The others? Can't tell from this data. But you don't appear interested in actual debate.

I'm not suggesting anyone should consider the 3090 Ti for non-professional use, or any of the other cards higher than a 3080. Indeed, I find it laughable that people already spend £300+ on a CPU and £150+ on a motherboard just to play games.

All tech uses greater amounts of power to achieve greater performance. That point you claim the 3090 Ti is past doesn't exist for those who are not budget-limited. It's also worth mentioning again that you only have AMD to compare against, who are reaping the efficiency of a smaller node, while the 3 cards I listed show just how efficient Ampere is when using up-to-date tech. If you want to debate, fine, just don't regurgitate the clickbait beggars.

People who want to define their *side* as the best. "Look, look how terrible the other side is, in this one benchmark!" without context. Without understanding that everything is a tradeoff. People being gits, basically.

Rather than trying to play one side off against the other, why not learn a little about the tech involved? Today's PC tech is made up of AI, RT and legacy support. Stop living in the past.
 
Couldn't agree more tbh.
People who want to define their *side* as the best. "Look, look how terrible the other side is, in this one benchmark!" without context. Without understanding that everything is a tradeoff. People being gits, basically.
Well said, there is really only a couple of people here that like to do that frequently.
 
I know many think this is a pointless card but I'm still tempted to sell the 3090 and if it costs me a few hundred pounds to "upgrade" I'll do it.

The high end has always been about small diminishing returns anyway.
 
I know many think this is a pointless card but I'm still tempted to sell the 3090 and if it costs me a few hundred pounds to "upgrade" I'll do it.

The high end has always been about small diminishing returns anyway.

The problem is when those "small diminishing returns" become imperceptible to the human eye. I'd say that is certainly the case for the 3090 Ti versus a decent OEM 3080 Ti.

Having just looked at OCUK 3080 Ti prices versus 3090 Ti prices, the average price delta seems to be about £700. That's a lot to pay for an imperceptible improvement. :)

And for those cases where very low framerates come into play (Cyberpunk, AC:V, WD:L and MSFS2020 @4k/Ultra, for example), you'd still end up tweaking the settings anyway to get vastly improved performance on any card, so even in those cases the 3090Ti makes little sense, IMHO.
 
The problem is when those "small diminishing returns" become imperceptible to the human eye. I'd say that is certainly the case for the 3090 Ti versus a decent OEM 3080 Ti.

Having just looked at OCUK 3080 Ti prices versus 3090 Ti prices, the average price delta seems to be about £700. That's a lot to pay for an imperceptible improvement. :)

And for those cases where very low framerates come into play (Cyberpunk, AC:V, WD:L and MSFS2020 @4k/Ultra, for example), you'd still end up tweaking the settings anyway to get vastly improved performance on any card, so even in those cases the 3090 Ti makes little sense, IMHO.

Yeah, I agree that for the difference in price the jump in performance is small, I just seem to suffer from FOMO.

The other argument is whether it's worth buying a card priced this high when it's only a few months before it's overtaken, but I'm not convinced the new cards are months away like some people believe. I can't see new cards arriving until the end of the year, with the 4090/Ti equivalent not until the first quarter of 2023 anyway.

So I do think there's some value to be had buying a high end series card now and getting a years use out of it.

The games you mention are frustrating, as having a high end card and not even being able to run at a solid 60 is annoying. I'm looking forward to cards that can game at 4k 120 comfortably. Yes I can do that on certain games but a lot of recent releases I play are nowhere near.

Forza Horizon 5 and Tiny Tina's Wonderlands can't run close to 120 maxed out at 4k.
 
The games you mention are frustrating, as having a high end card and not even being able to run at a solid 60 is annoying. I'm looking forward to cards that can game at 4k 120 comfortably. Yes I can do that on certain games but a lot of recent releases I play are nowhere near.

Forza Horizon 5 and Tiny Tina's Wonderlands can't run close to 120 maxed out at 4k.

Agree. It's for these very reasons I've stuck with 1440p and my trusty 1080Ti. With sensible tweaking, all of those games can be made to run at 60-100 fps with no massive change in visual quality to my old eyes. Sure, I can't have ray-tracing, but nothing I've seen so far has left me so impressed I felt the need to upgrade.

As you say, 100+ fps @ 4k is really where we need to be. That's at least 6+ months away, IMHO - so I won't be funding a new car for Gibbo until that point. :D
 