NVIDIA 4000 Series

Highest I've seen my GPU is 65C / 55% fan speed, in Alan Wake 2 maxed out + FG

Is that blocky enough??? :p
 
Yeah it's not worth it unless you can get a good s/h deal tbh, 35% increase at best. Better just tweaking settings if you want to raise the fps :)
That depends on resolution and the desire for RT/PT. At standard 1440p a 4080S will be fine. But the higher you go above that (even just UW 3440x1440), the bigger the difference, and the 4080 quite quickly falls short. Let's not forget that table someone pasted here a few times, which clearly shows the 4080 is more of a xx70-class chip, hence one shouldn't expect miracles from it. At least the price drop has made it much more accessible than before, considering MSRP - though there seems to be a very small number of MSRP-priced ones (especially in the US, apparently) and the rest are still silly expensive.
 
It was a 4K/240Hz screen being discussed, and yes I've done my homework.
I've seen plenty of examples where at 4K the 4090 is the only GPU giving 60+ FPS 1% lows without decreasing settings. Which, don't forget, is a sin in the eyes of PCMR. ;) That said, I've really got used to just setting everything to max without thinking these days, but that's really not necessary to enjoy a good game. :)
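
For anyone wondering what 1% lows actually are, here's a rough sketch in Python of one common convention (the average FPS over the slowest 1% of frames in a capture); the frametime list is made-up example data, not a real log.

    # One common convention for "1% lows": the average FPS over the slowest
    # 1% of frames in a capture. The frametimes below are made-up example data.

    def fps_stats(frametimes_ms):
        avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
        slowest = sorted(frametimes_ms, reverse=True)[:max(1, len(frametimes_ms) // 100)]
        low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
        return avg_fps, low_1pct_fps

    frametimes = [16.7] * 990 + [33.3] * 10   # mostly ~60 FPS with a few ~30 FPS stutters
    avg, low = fps_stats(frametimes)
    print(f"average: {avg:.1f} FPS, 1% low: {low:.1f} FPS")

The point being: the average here still reads ~59 FPS while the 1% lows sit at ~30 FPS, which is why lows tell you about stutter that averages hide.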
 
I hope he does not work in science.

I deal with information at the level of tens or hundreds of millions of data points. People's ignorance of basic statistics is alarming. Do they not teach sampling and things like the central limit theorem in school anymore?

A few anecdotes are not evidence. A massive survey regularly conducted by Steam is strong evidence when you take the picture over time.

I know people want to believe AMD are battling Nvidia and succeeding to a level, but data sources which are global and deal with a large amount of data disagree.

We even have JPR analyst reports which show something very similar (for new shipments), but people would prefer to ignore them because they think they know better and think they can find a hole in the analysis, or have anecdotal evidence.
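
To illustrate the sampling point, here's a minimal sketch in Python (with a made-up population share, not real Steam or JPR numbers): a random sample of a few thousand users recovers a population proportion to within about a percentage point, run after run, which is what the central limit theorem predicts.

    # Minimal sketch of the sampling argument. The "true" share below is a
    # made-up number, not real survey data.
    import random

    POPULATION_SHARE = 0.75   # hypothetical true share of users on brand X
    SAMPLE_SIZE = 5000
    RUNS = 10

    for run in range(RUNS):
        sample = [random.random() < POPULATION_SHARE for _ in range(SAMPLE_SIZE)]
        estimate = sum(sample) / SAMPLE_SIZE
        print(f"run {run + 1}: estimated share = {estimate:.3f}")
    # The estimates cluster tightly around 0.75 - a modest random sample of a
    # huge population is informative, which is the point about sampling above.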

 
Thinking of switching over to the green side if a decent 750-ish price comes up for a 4070 Ti Super. What are the good quiet brands? Good warranty etc.?
 
I've seen plenty of examples where at 4K the 4090 is the only GPU giving 60+ FPS 1% lows without decreasing settings. Which, don't forget, is a sin in the eyes of PCMR. ;) That said, I've really got used to just setting everything to max without thinking these days, but that's really not necessary to enjoy a good game. :)

Even something like F1 23, with everything on, will only manage 70-80fps at 4K on a 4090.
 
Don't mind that, it gives me more info and I can decide for myself. It's good to see people pointing out how poor the offerings are now - or do you want no one to ever complain and to just buy and say how great they are?
One would think the people constantly whinging and shouting the loudest, being so informed, would also take some of their own advice and practice what they preach; given how poor the offerings are this generation, an increasing number of them have caved in and purchased anyway.

There's also a bias angle: because AMD's pricing is a little better, they get nowhere near the same level of scrutiny from these people, who tend to defend and make excuses. Some have bought high-end AMD cards that are perfectly fine, so one would think they would be happy and content with their purchase.

Lastly, some got their 4xxx/7xxx cards for free, with a profit on top, by mining and/or scalping cards... then have the balls to sit on here complaining about retailers/pricing and acting like some consumer champion.
Nvidia has basically just cut the price of the 4080 by $200, so I'd say it's working quite well.
Is it? I don't think that's something to really be flexing about, as time and the 4080 Super did way more to affect its pricing.
 

I don't think that is entirely the case. Plenty of people criticised the stupid RX7900XT positioning and naming and people avoided it, so prices dropped £200 and AMD had to bundle free games too. The same goes for the RX7800XT and RX7700XT. Quite a few RX7800XT cards were above RRP at launch, but now more are at RRP and a few dip below it on and off. The RX7700XT I have seen dip below £400 sometimes. You have also seemingly forgotten AMD had to bundle a copy of Starfield worth £50~£60 on top of this at launch. The RX7600XT launch has been mocked, like the RTX4060TI 16GB pricing. Even the pricing of the latter has dropped.

AMD has had to deflate pricing of its RX6000 series for well over a year, as Intel has had to also.

Nvidia also released the Super cards, which essentially give you extra performance/VRAM for a similar price. It's quite clear sales are not as great as hardware enthusiasts on tech forums think, otherwise they wouldn't have bothered. This was the same case with Turing. The "just shut up and buy" crowd defended pricing, saying Nvidia had a £200 billion professional VFX market to tap into. Yet they released the Super range because people were not buying cards. Cards such as the RTX2060 and RX5700 dropped well under £300.

Then we had the RTX3000 and RX6000 series, which were great value in 2020 until the pandemic and mining screwed over prices in 2021. If the latter hadn't happened, things would have panned out differently. It was people paying way above RRP for these cards which gave both Nvidia and AMD the idea of jacking up pricing this generation. The same thing happened with the Turing V1 launch because of mining.

Also, looking at the real world, I see most of my gaming mates and acquaintances looking at what is available and just sticking with what they have. The only people I see buying new cards now are those doing new builds or replacing very old cards. None of these people really post on tech forums. But even on gaming forums people are not happy either. It's also where I would argue DLSS/FSR has helped to some degree, so people are just staying put if they can.
 
A few anecdotes are not evidence. A massive survey regularly conducted by Steam is strong evidence when you take the picture over time.
My point exactly. There's zero public information on how these surveys are conducted, what the methodology is, etc. Without that and proper peer review, it's not science or a very useful tool; it's just a black box of magic - fun to look at, but evidence of nothing.
 

It may not be the last word in accuracy, but Steam does enough polling to be indicative; where it might miss some people it will capture others. Over time the results reasonably line up with other surveys and sources.
 
It's indicative, yes. But the problem starts when people take the exact numbers as proof of, for example, how many GPUs of various brands and models were sold, or CPUs etc. - which is totally not what this is. And the "over time" thing also seems wrong, because they do not survey everyone each month. If they did, that would build up the data over time. But it's not like that - it seems very random each month, as I described earlier. Some people get counted multiple times, some people never get surveyed. Some months it's one group of countries (and by that, different income levels and hardware availability), then another group - but we never get told which ones are in each month. Then the next month doesn't add to the previous month to fill in the holes; instead it just starts from scratch and shows the difference/change compared to the previous month, but as the data points are gathered from other locations and other people it just feels wrong.

I like to look at it - you get a general picture of what Steam gamers are using (and how little the 4000 series has really sold compared to previous generations - price matters) - but the exact numbers are total bogus IMHO.
 

Like all "surveys" you'll get an error factor and a percentage of trust. For the regular thing that's done to see people's vote intentions it can be around 2.5-3% +/- for a 95% trust level - or whatever is called in English :p . That's the general problem with new data, it will need some time to get in and get in properly as new additions will fit very well within that margin of error.

Of course, this is assuming they have some good people creating the sample size for the data.
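
For anyone wanting to plug numbers in, here's a quick sketch of that margin-of-error calculation for an estimated proportion at 95% confidence (standard normal-approximation formula; the sample sizes are just illustrative, Valve doesn't publish theirs).

    # Standard margin of error for an estimated proportion p at sample size n,
    # using the normal approximation (z ~= 1.96 for 95% confidence).
    # The sample sizes below are illustrative; Valve doesn't publish the real ones.
    import math

    def margin_of_error(p, n, z=1.96):
        return z * math.sqrt(p * (1 - p) / n)

    for n in (1000, 5000, 20000):
        moe = margin_of_error(0.5, n)   # p = 0.5 is the worst case
        print(f"n = {n:>6}: +/- {moe * 100:.2f} percentage points")
    # n = 1000 gives roughly the +/- 3% quoted for typical opinion polls.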
 
By the way, not many places test various GPUs at ultrawide and super-ultrawide resolutions, but I've found a YT channel that focuses mostly on that - it might be useful for someone (it was for me), as scaling seems wildly unpredictable across cards even just between 1440p and 3440x1440: https://www.youtube.com/@ultrawidetechchannel

I'm playing at 5760x1080, so a bit wider and higher in megapixels. I just look at 4K and make sure the numbers are good, since the card will need to have some life in it for the road ahead, not just at the current time.

BTW, in his data the 4090 is more likely limited by the CPU, and perhaps the 7900XTX somewhat as well - both suffer the least performance penalty. I did a quick test in CP77: between 1080p and 5760x1080, for the 3x increase in resolution, the performance difference goes up to around 2.5x. Surprisingly, I'm bottlenecked (by the 5800X3D with a 4080) a bit more in raster than in path tracing - PT is something around 2.4x+ while raster is 2.5x+ :))

Bottom line: if it does well at 4K, it will do well at ultrawide.
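
For reference, the pixel-count side of that comparison is just ratio maths; here's a small sketch (the ~2.5x performance figure is the one from the CP77 test above, everything else is arithmetic).

    # Pixel counts relative to 1080p for the resolutions discussed above.
    resolutions = {
        "1920x1080": 1920 * 1080,
        "3440x1440": 3440 * 1440,
        "5760x1080": 5760 * 1080,
        "3840x2160": 3840 * 2160,
    }
    base = resolutions["1920x1080"]
    for name, pixels in resolutions.items():
        print(f"{name}: {pixels / base:.2f}x the pixels of 1080p")
    # 5760x1080 is exactly 3x the pixels of 1080p, yet the observed hit in the
    # quick CP77 test above was only ~2.5x - performance doesn't scale linearly
    # with pixel count, especially once the CPU starts to limit the lower resolution.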
 
I can pull over 90fps in Cyberpunk with path tracing at 5160x2160 DLSS Performance, but you do notice the higher input latency, as the pre-frame-gen FPS is below 60, which obviously highlights the latency. At 3440x1440 it's not noticeable, as the pre-frame-gen FPS is in the mid 70s even with DLSS Quality, making the post-frame-gen FPS up to 133fps. That's with a 12700KF on a 4090, btw.

And that's not even 1% lows but averages. It's not an FPS but it still matters, though at least vrr helps a bit to soften drops.
Low FPS at 4K on a 4090 doesn't sound right, since even a 5700X with a 4080 gets over 90fps with max settings at 4K DLSS:

That's from Digital Foundry's video.
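
To make the frame-generation latency point above concrete, here's a small sketch converting base (pre-FG) FPS into frametimes; the FPS values are rough approximations of the figures quoted above ("below 60" and "mid 70s"), and the 2x displayed-FPS factor is an idealised assumption, not a measured one.

    # Rough frametime arithmetic for the frame-generation latency point above.
    # Base FPS values approximate the quoted figures; the 2x multiplier is an
    # idealised frame-generation factor, not a measurement.
    def frametime_ms(fps):
        return 1000.0 / fps

    for base_fps in (55, 75):
        displayed = base_fps * 2
        print(f"base {base_fps} FPS -> {frametime_ms(base_fps):.1f} ms per rendered frame, "
              f"displayed ~{displayed} FPS")
    # Input response still tracks the ~18 ms base frame interval at 55 FPS,
    # which is why latency is noticeable despite the high displayed FPS.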
 

I average about 110fps on my 3440x1440 UW with everything on and no DLSS.
 