NVIDIA 4000 Series

I know the synthetics don't translate, but I like them because they indicate potential. The Time Spy numbers tell me 80% power limit is the sweet spot for a 4090, without having to test any games. Plus you've already confirmed that's pretty close to the power limit you use anyway.
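If anyone wants to try the same thing, here's a minimal sketch of capping the card at roughly 80% of its default board power before a Time Spy run. It assumes nvidia-smi is on the PATH and that the driver lets you change the limit (usually needs admin rights); the 80% figure is just the value from the post above.

```python
# Minimal sketch: cap GPU 0 at ~80% of its default power limit via nvidia-smi.
# Assumes nvidia-smi is on PATH; applying the limit normally needs admin/root.
import subprocess

def query_watts(field: str, gpu: int = 0) -> float:
    """Read a single power field (in watts) from nvidia-smi for one GPU."""
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         f"--query-gpu={field}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return float(out.stdout.strip())

default_limit = query_watts("power.default_limit")  # stock board power, W
target = round(default_limit * 0.80)                # the "80% PL" from the post

print(f"Default limit: {default_limit:.0f} W -> applying {target} W")
# Equivalent to running: nvidia-smi -i 0 -pl <target>
subprocess.run(["nvidia-smi", "-i", "0", "-pl", str(target)], check=True)
```

Setting it back to the default value afterwards restores stock behaviour.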
 
That's still not saying it's crap! :p

It IS crap that it's only 1% faster, though.
It’s not crap; it’s a good card, but it’s just not a high-end card for me.

It’s made to look like a high-end card because Nvidia is hoarding all the high-end 102 dies and only making one card on them (last gen there were five). This is limiting people’s options to the 4080 unless they want to spend £1600+.

This is why the £1.2k, and now £1k, pricing just doesn’t feel right.
 
Depends what you can get from where you're comfortable/able to buy. For instance, right now the Jet Stream 4080 from Palit is around 1,300 euros at the store I usually buy from, and the Game Rock 4090 is about 2,050 euros; both are the cheapest models. The 4090 is 26% faster for a bit over 57% more money, and yet people go on about how great a buy the 4090 is!!! :)) There's a lot of ground to cover before you even reach the same performance per euro, never mind the 4090 actually being better value.

So yeah, the 4080 is bad since it was price-hiked compared to the 3080 MSRP, but the 4090 doesn't look that much better either if what's in the shops isn't great, or if you bought your 3080 at a premium from scalpers of some kind.
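Just to spell out the arithmetic in those figures (a quick sketch; the prices and the 26% gap are simply the numbers quoted above, not new benchmarks):

```python
# Performance-per-euro comparison using the figures quoted above (illustrative).
price_4080 = 1300.0   # Palit Jet Stream 4080, EUR
price_4090 = 2050.0   # Palit Game Rock 4090, EUR
perf_4080, perf_4090 = 1.00, 1.26   # 4090 quoted as "26% faster"

extra_cost = price_4090 / price_4080 - 1      # ~0.58, i.e. ~58% more money
value_4080 = perf_4080 / price_4080           # performance per euro
value_4090 = perf_4090 / price_4090

print(f"4090: +{extra_cost:.0%} cost for +26% performance")
print(f"4080 delivers {value_4080 / value_4090:.2f}x the performance per euro of the 4090")
```

On those shop prices the 4080 still works out to roughly 25% more performance per euro, which is the point being made, even if neither card looks great next to last gen's MSRPs.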
That really depends on resolution. 4K and higher (super ultrawide etc.) start to show more and more of a gap, eventually reaching 57% in some tests. But that's not a common use case, not even close.

More common use is in AI processing, where the 4080S is considerably faster than the 4080. But that doesn't show up in games at all, as DLSS and other Nvidia AI boosters barely use the tensor cores as it is - in every test I've seen there's essentially zero difference between the slowest and fastest RTX card (always the same % speed boost with DLSS 3, FG etc.).
 
I wonder if Nvidia is going to release a "boost" driver for 4080 Supers to bring the performance up?

They are power-locked cards, that's the problem; the same happened with the 30 series, 20 series etc. Only a firmware hack or modding the card would allow more power and, in turn, higher clocks and performance.
 
I'm not sure I've ever seen anyone say the 4080 is a crap card, just that it has a crap price.

But even this I don't get.

No one seems to rail against the 4090; instead it's a case of "if you want the best and can afford it".

Yet the 4080S offers around 75% of the performance of the 4090 for significantly less than 75% of the price and it's "crap value"?
 
But even this I don't get.

No one seems to rail against the 4090; instead it's a case of "if you want the best and can afford it".

Yet the 4080S offers around 75% of the performance of the 4090 for significantly less than 75% of the price and it's "crap value"?

It has always been diminishing returns at the very top - buying the top-spot GPU has never been about the value proposition. (And the 4090 has 24GB, which can be useful for some.)
 
But even this I don't get.

No one seems to rail against the 4090; instead it's a case of "if you want the best and can afford it".

Yet the 4080S offers around 75% of the performance of the 4090 for significantly less than 75% of the price and it's "crap value"?
Price is not supposed to scale linearly with performance.
The 3080 was half the price of the 3090 at 85% of the performance.
The 1080 was almost half the price of the Titan at 80%.

Let the halo product have its halo and not affect the rest of the lineup.
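To put numbers on "not supposed to scale linearly", here's a tiny sketch using the ratios mentioned in this thread; the price ratios for the 4080S and the 1080 are rough placeholders for "well under 75%" and "almost half", not exact figures.

```python
# Performance-per-money of the cheaper card relative to the halo card,
# using the rough ratios quoted in the thread (placeholders, not benchmarks).
pairs = {
    # name: (perf vs halo card, price vs halo card)
    "4080S vs 4090": (0.75, 0.65),  # "~75% of the perf for well under 75% of the price"
    "3080 vs 3090":  (0.85, 0.50),  # "half the price at 85% perf"
    "1080 vs Titan": (0.80, 0.55),  # "almost half the price at 80%"
}

for name, (perf, price) in pairs.items():
    # > 1.0 means the cheaper card gives more performance per unit of money
    print(f"{name}: {perf / price:.2f}x the performance per money of the halo card")
```

Which lines up with the later posts: the 3080 stands out as the anomaly, while the 4080S sits much closer to the usual halo-card premium.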
 
Do we really need a top-tier graphics card? Yes, games like Starfield, Alan Wake 2 and Hogwarts Legacy may need one, especially for decent ray tracing, but will they have player retention? Long term, is a 4080 Super or a 4070 Ti Super needed if the games we end up playing the most don't need that level of performance?

Look at games like Baldur's Gate 3 or Palworld: they have decent graphics, but not demanding graphics, and it's the gameplay that stands out and will retain players in the long term. If the best and most played games at present are the least demanding, then are we not buying graphics cards like the 4080 for a moment of eye candy, only to stick to games a 4070 could handle with no issues?

4K is a different matter - yes, a card like the 4080 or 4090 is needed there - but most players are on 1080p or 1440p.
 
Voltage locked more than power locked. The Strix has a lot of power limit headroom, but with a voltage ceiling it doesn't matter.

Yup, when I said power I meant voltage too. Basically Nvidia control how fast their cards will go - just look at the 3090 vs the 3090 Ti and the extra 100W they gave the 3090 Ti, when it only had about 2% more cores unlocked.
 
Do we really need a top-tier graphics card? Yes, games like Starfield, Alan Wake 2 and Hogwarts Legacy may need one, especially for decent ray tracing, but will they have player retention? Long term, is a 4080 Super or a 4070 Ti Super needed if the games we end up playing the most don't need that level of performance?

Look at games like Baldur's Gate 3 or Palworld: they have decent graphics, but not demanding graphics, and it's the gameplay that stands out and will retain players in the long term. If the best and most played games at present are the least demanding, then are we not buying graphics cards like the 4080 for a moment of eye candy, only to stick to games a 4070 could handle with no issues?

4K is a different matter - yes, a card like the 4080 or 4090 is needed there - but most players are on 1080p or 1440p.
Horses for courses: some people want the best for that extra grunt, because games like Alan Wake 2, Cyberpunk etc. all look incredible when you go all out. Others don't care that much and will continue to play at lower settings and lower resolutions.

It's good to have options, and with the improvements in monitor tech (higher refresh rates at higher resolutions becoming more common) I don't think it's bad that there are options at varying price points. The only thing that is a ****-boiler is the price of these things.
 
Yup, when I said power I meant voltage too. Basically Nvidia control how fast their cards will go - just look at the 3090 vs the 3090 Ti and the extra 100W they gave the 3090 Ti, when it only had about 2% more cores unlocked.
That's true, the 3090 was power starved (more than 100W going to the RAM alone...), although the 4090 isn't so much - give it a 600W limit and it's barely faster than stock. I wish we had more control over voltage.
 
But even this I don't get.

No one seems to rail against the 4090; instead it's a case of "if you want the best and can afford it".

Yet the 4080S offers around 75% of the performance of the 4090 for significantly less than 75% of the price and it's "crap value"?
Compared to the 3080 vs 3090, definitely ;) but this will probably never happen again.
 
Compared to the 3080 vs 3090, definitely ;) but this will probably never happen again.

Well, I keep saying that the 3080 was really an anomaly, especially as it was almost impossible to get at or even near MSRP, and that judging all subsequent cards by the price/performance of the 3080 tends to render everything "crap".

But no one seems to agree with me :D
 
Well, I keep saying that the 3080 was really an anomaly, especially as it was almost impossible to get at or even near MSRP, and that judging all subsequent cards by the price/performance of the 3080 tends to render everything "crap".

But no one seems to agree with me :D
Yeah, true - the last time I remember a lower tier being such great value was maybe the GTX 260 216... or the GTX 470... then AMD stopped being competitive and here we are.
 