Any reviews that compare real-world differences?

Every YouTube video I tend to come across is a benchmark comparison of GPUs, but are there any good ones where people sit in front of a screen and play the game on, say, a 3090 vs a 4090 and vote on which is which?

We all get the upgrade itch from time to time, and bigger numbers are always appealing, but actual unbiased experiences would be interesting.

People who invest over a grand would always say it's a big boost, but I sometimes wonder if it's just people justifying the money.
 
Wasn't GPUs, but LTT recently did a video comparing the 13900K to the 7950X in a realistic head-to-head gaming test and came away saying there's no difference; buy the cheapest.
 
It's a good question.

It primarily depends on the rest of the setup. If you run VR or a multi-monitor setup, your favorite game is MSFS 2020, or you play at 4K high Hz, then it will be easier to see/feel the difference if you sat comparing a 3090 to a 4090.

You can't see the benefits of 4K or high Hz on YouTube if you only have a 1080p 60Hz monitor yourself, hence why results are numbers-based rather than actual benefit/feeling-based: neither the viewer nor the stream may actually be seeing or capturing every frame accurately. YouTube probably can't do 4K 120fps anyway to portray the effect.

A graphics card purchase should be made around your monitor if you're a general gamer, and/or around a specific game if you play it a lot, depending on how serious you are about it. For casual gamers, not so much; they may lean towards a console.

Cockpit games in VR or on multiple monitors require beefy GPUs, especially for games that are hard to run anyway.

Higher resolution always looks nice
VRR is also very nice to have
A high-Hz screen is also a nice game changer for smoothness and comfort; then there's the CPU, and sometimes RAM, in being able to feed enough fps to complement your monitor's refresh rate. I haven't seen above 144Hz myself (e.g. 240Hz) and would have to see it in person to judge the difference. You just can't see it on YouTube, and even if 240fps/Hz were possible via online video, it would probably need to be downloaded as a lossless file - and then you'd need the equipment as well.

Make sure your CPU & RAM won't bottleneck your GPU. CPUs, especially in the 1% lows, can make a really noticeable difference to game smoothness - especially when paired with VRR, where microstutters are reduced because the frame-rate range the GPU is working across is narrower.
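
As a rough illustration of what "1% lows" actually measure, here's a minimal Python sketch (the percentile definition and the frame-time numbers are assumptions for illustration; capture tools vary):

    # Illustrative: deriving average fps and "1% lows" from captured frame times.
    def fps_stats(frame_times_ms):
        avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
        # 1% low: average fps over the slowest 1% of frames (one common definition).
        slowest = sorted(frame_times_ms, reverse=True)[:max(1, len(frame_times_ms) // 100)]
        low_1pct_fps = 1000.0 / (sum(slowest) / len(slowest))
        return avg_fps, low_1pct_fps

    avg, low = fps_stats([8.3] * 990 + [25.0] * 10)  # mostly smooth, a few stutters
    print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")  # prints: avg 118 fps, 1% low 40 fps

A big spread between the average and the 1% low is what you feel as stutter, even when the headline fps number looks healthy.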

So unless you have a high-Hz, high-resolution screen, with hardware that won't bottleneck your GPU, you won't get the full benefit of a 4090.

90fps is much better than 60fps when paired with VRR and a 120Hz panel. A 4090 can run most games at 4K ~120fps.

Then there are technologies such as raytracing on top.

The only way you can really see, say, a 4K, high-Hz, high-fps, high-res VR setup with good 1% lows is with your own eyes. Otherwise you have to rely on numerical results and bar charts. Imagine going to CES (or any expo) or the OcUK shop, where you could see it in person.

Games themselves, depending on how well optimized they are or which technologies they implement (such as VR), can hammer a GPU.

Take CP2077, FC6 or MSFS 2020: with a 4090 it will be easy to see the difference over a 3090, especially if RT is an option - and then there are various levels of RT.

What is your personal setup? Monitor in particular.
 
My setup runs Doom Eternal at 150fps at 4K. Now I've seen the 4090 do over 200fps, I'm still trying to see if I can get a 4090 at a decent price and not a scalped one...

I'm on a 4K LG TV.
 
What’s the point if your TV has a refresh rate of 120 and you are already at 150?

Nail on the head.

A friend cannot get his head around the fact that his 60Hz screen cannot physically display more than 60 FPS... EVER. I said the latency improves but it results in tearing.

"But the FPS number says 130, so that's what I'm getting".
 
What’s the point if your TV has a refresh rate of 120 and you are already at 150?
Nail on the head.

A friend cannot get his head around the fact that his 60Hz screen cannot physically display more than 60 FPS... EVER. I said the latency improves but it results in tearing.

"But the FPS number says 130, so that's what I'm getting".

No nail on no head. Regardless of what display you're using, there are often benefits to running a higher framerate - namely in the game's responsiveness. For an easy example/test: you can quite easily tell the difference on a game locked to 30, then locked to 60 -- the game *feels* less sluggish at 60. It's not just a case of visually showing more frames.
 
No nail on no head. Regardless of what display you're using, there are often benefits to running a higher framerate - namely in the game's responsiveness. For an easy example/test: you can quite easily tell the difference on a game locked to 30, then locked to 60 -- the game *feels* less sluggish at 60. It's not just a case of visually showing more frames.

I literally mentioned the reduced latency in my post; did you miss it or ignore it? But the scenario of 120 vs 150 is not going to give a significant reduction in latency, and you start to see diminishing returns.

None of which changes the fact that a monitor or TV cannot by design display more FPS than its max refresh. So nail very much on head in this case considering the already high refresh.
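
For a rough sense of those diminishing returns, here's a quick back-of-envelope sketch in Python (illustrative arithmetic only, treating render latency as simply 1000/fps):

    # Frame time falls off as 1000/fps, so each extra fps buys less latency.
    for fps in (30, 60, 120, 150, 200):
        print(f"{fps:>3} fps -> {1000.0 / fps:.2f} ms per frame")
    # 30 -> 33.33 ms, 60 -> 16.67 ms: ~16.7 ms saved
    # 120 -> 8.33 ms, 150 -> 6.67 ms: only ~1.7 ms saved

Going from 30 to 60fps saves roughly ten times as much frame time as going from 120 to 150fps.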
 
No nail on no head. Regardless of what display you're using, there are often benefits to running a higher framerate - namely in the game's responsiveness. For an easy example/test: you can quite easily tell the difference on a game locked to 30, then locked to 60 -- the game *feels* less sluggish at 60. It's not just a case of visually showing more frames.

The game is already running at 150fps. What sort of additional responsiveness are you gaining going to 200fps? It’s not a competitive esports game.
 
OP is asking about what you can 'see', maybe feel, when investing £1k.

30 to 60fps, yes, obviously, but that is console territory, not enthusiast PC forum numbers. PC master race is 60fps minimum, as panels are 60Hz standard refresh.

Linus Tech Tips did a video with pro gamers at 120Hz, 240Hz and 360Hz. There wasn't much perceptible difference, certainly not in score.

SOME people may be very sensitive and pick it up. My wife can't stand LED bulbs as they flicker. You only have to wave your hand in front of a 60Hz bulb to see the effect.


FPS won't make you a better gamer - RGB does (top comment on that vid) :cry:
 
Generally speaking, if you can get and maintain over 100fps the majority of the time, you will have a pretty seamless experience. Any setup that can do that, with a monitor also capable of actually displaying those frames, will give you a great experience.
 
I literally mentioned the reduced latency in my post; did you miss it or ignore it? But the scenario of 120 vs 150 is not going to give a significant reduction in latency, and you start to see diminishing returns.

None of which changes the fact that a monitor or TV cannot by design display more FPS than its max refresh. So nail very much on head in this case considering the already high refresh.
I didn't miss it, but you're simultaneously agreeing with Rare that there's no point to higher refresh and also agreeing the opposite by conceding that it improves responsiveness. So....
The game is already running at 150fps. What sort of additional responsiveness are you gaining going to 200fps? It’s not a competitive esports game.
Regardless of whether you think it's worth it or not, you asked what the point was (implying there wasn't one) - yet, there is a point.

Personally, I can't feel (or see) much difference once I pass the 100-120 range... but that's just me.
 
There are lots of YouTubers who do split-screen comparisons of multiple cards; they're pretty handy for judging whether an upgrade will make a meaningful difference, but they often use overkill CPUs, which makes the results dubious in the real world. I'm pretty sure some of them are fake too, especially the ones that don't have much footage. RandomGaminginHD does these kinds of comparisons in some of his videos, though they're not always split-screen.
 
I didn't miss it, but you're simultaneously agreeing with Rare that there's no point to higher refresh and also agreeing the opposite by conceding that it improves responsiveness. So....

Regardless of whether you think it's worth it or not, you asked what the point was (implying there wasn't one) - yet, there is a point.

Personally, I can't feel (or see) much difference once I pass the 100-120 range... but that's just me.

I can see there being a ‘point’ that you can technically argue in regard to potentially slim levels of perceivable responsiveness, but in the real world we are talking about upgrading to a £1600+ GPU to take a frame rate from 150 to 200 on a screen that displays 120…
 
There are lots of YouTubers who do split-screen comparisons of multiple cards; they're pretty handy for judging whether an upgrade will make a meaningful difference, but they often use overkill CPUs, which makes the results dubious in the real world. I'm pretty sure some of them are fake too, especially the ones that don't have much footage. RandomGaminginHD does these kinds of comparisons in some of his videos, though they're not always split-screen.
They use the best CPUs so there's a much lower possibility of a CPU bottleneck. It's a difficult test to do, really.
 
150fps = 6.7ms render latency
200fps = 5ms render latency

A 120Hz LG OLED has a display latency of 6.7ms at 4K.

Input latency for a really good mouse is around 4ms wired or 10ms wireless.
id Tech 7 has great engine latency, but it's still going to add at least 5ms or so, more with ray tracing.

So best case you are going from around 22ms to 21ms.

Doesn't seem worth it to me, but not my money so fair dinkum to ya.
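
If anyone wants to poke at those numbers, here's a minimal Python sketch of that latency chain (the component figures are the assumed ones quoted above, not measurements):

    # Rough end-to-end latency chain using the (assumed) figures quoted above:
    # render = 1000/fps, display = 6.7 ms, wired mouse = 4 ms, engine = ~5 ms.
    def total_latency_ms(fps, display=6.7, mouse=4.0, engine=5.0):
        return 1000.0 / fps + display + mouse + engine

    print(f"150 fps: {total_latency_ms(150):.1f} ms")  # ~22.4 ms
    print(f"200 fps: {total_latency_ms(200):.1f} ms")  # ~20.7 ms

A saving of under 2ms out of ~22ms total is why it reads as diminishing returns.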
 
I can see there being a ‘point’ that you can technically argue in regard to potentially slim levels of perceivable responsiveness, but in the real world we are talking about upgrading to a £1600+ GPU to take a frame rate from 150 to 200 on a screen that displays 120…

My comments were more to quash the notion that rendering more frames than the screen can display is without use. Some folk will take it to heart ;)
 
150fps = 6.7ms render latency
200fps = 5ms render latency

A 120Hz LG OLED has a display latency of 6.7ms at 4K.

Input latency for a really good mouse is around 4ms wired or 10ms wireless.
id Tech 7 has great engine latency, but it's still going to add at least 5ms or so, more with ray tracing.

So best case you are going from around 22ms to 21ms.

Doesn't seem worth it to me, but not my money so fair dinkum to ya.

This is the kind of thing I'm getting at. It comes down to being told there's a difference, but I wonder if a user could differentiate between the two in a blind test.
 