What frame rate are you happy with?

I can happily play most single-player games at 60fps with no issues, although, like others in this thread, I find playing at 90+ does bring a bit more responsiveness and smoothness. I played God of War (2018) on PC last year and playing that at 90fps was great.

For multiplayer games I really do like to have as many frames as possible... I swap from my 1440p monitor to my 1080p 240Hz monitor for most FPS games and just aim for 200+ fps. I don't think it makes me a better player at any of the games I play, but it feels so, so smooth to play.
 
I expected an answer like that, just not sure why you prefer it. I assume you don't watch any streaming/Blu-rays at that ratio? I made the jump to a 1440p ultrawide and it's great (I use the OLED for FPS/TV watching).
Productivity. 16:9 became the standard for TVs as it was the best compromise between 2.35:1 cinema and 4:3 TV. Not the best for working, though. Even so, I normally have the monitor on the right turned vertically, so it's running at 1200x1920.

My office has a 65" plasma on the wall, so watching anything is done on that. If I do watch the occasional thing on the PC, guess what, it shows at 1920x1080 without any stretching, with small 60-pixel black bars top and bottom if I go full screen. Gaming is fine: it either supports the resolution and runs fine, or runs in 1080p and doesn't use the extra pixels, as I have scaling turned off, because scaling is the devil.

Or, even better, with very old games I can run them in 1600x1200. Although I have a couple of 20" Dell 4:3 monitors that are better for that.

Seriously though, a ~24" 2560x1600 @ ~120Hz would do me very nicely. Hell, I'd have three of them on this desk this very moment.

Or... a nice 24" monitor with a 16:10 resolution of 2304x1440 @ 85Hz. Oh wait... we had that 22 years ago. I still have my two Sony FW900s. They're not in use at the moment as they're enormous and space is a limitation right now. But next year, with the re-work of my office, I will have a proper area for them. :D
 
I agree. I've played a few older games and at 4:3 it just works; it depends on the game, though. RTS games are great with a wider screen; FPS games get a bit like watching a tennis match, hence the TV for those types of games. I actually bought my ultrawide for my degree, as I can get more pages of Word across it, and for 3D modelling it's far better (icons around the edge tend to get in the way at small resolutions).
 
Generally 100 is a good sweet spot; much higher and I can't tell the difference (even at 240Hz), and it's high enough that fps drops still aren't too noticeable.

I play FPS games on a 1440p 155Hz monitor, and while I can't see the difference, I can "feel" the difference if it drops to 60Hz. Non-FPS games are fine all the way down to 60fps. I played Cyberpunk 2077 at 1440p at about 50fps and it was fine (3060 Ti).

I have a 49" G9 that my 3080 Ti just can't run at high FPS in most games. It's great for driving games, but getting to 100 fps is hard on max settings.
 
My thoughts are this:

[Attached image: otqcM19.jpg, a graph relating frame rate to frame time]
 
Interesting graph. When it's put like that, it's obvious there are very much diminishing returns past a certain point.

One thing that graph doesn't illustrate is the impact of uneven frame pacing, or of additional factors like display latency and pixel response. Most people would be happy with 20-30% lower frame rates if those frames were presented more consistently.
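To put rough numbers on the frame-pacing point, here's a quick Python sketch using made-up frame times: two runs with the same average frame rate, one paced evenly and one not. A "1% low" style metric (fps implied by the slowest 1% of frames) exposes the difference the average hides.

```python
# Hypothetical frame-time samples in milliseconds; both runs average 10 ms.
even = [10.0] * 100          # perfectly paced 100 fps
uneven = [5.0, 15.0] * 50    # alternating 5 ms / 15 ms, same average

def avg_fps(frame_times_ms):
    """Average frame rate implied by a list of frame times."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def one_percent_low_fps(frame_times_ms):
    """Frame rate implied by the slowest 1% of frames."""
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    return 1000.0 / (sum(worst) / len(worst))

print(avg_fps(even), one_percent_low_fps(even))      # 100.0 100.0
print(avg_fps(uneven), one_percent_low_fps(uneven))  # 100.0 average, ~66.7 on the lows
```

Both runs report "100 fps" on the counter, but the unevenly paced one spends half its time at an effective 66 fps, which is what you actually feel.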
 
75fps is fine for me, but I've only got a 75Hz monitor, so anything more is wasted.
For fast-paced games I want that as a minimum, but for more relaxed games I'll generally favour visuals, providing I can get a consistent framerate.

Some day I'll try out a faster refresh monitor to see if I can even tell the difference.
 
Yeah, once you get close to 90, the difference in frame interval between that and 240 is only about 7 milliseconds. I guess that's why it gets increasingly difficult to tell the difference when you start going that high.

From 30 to 60fps is a nearly 17-millisecond frame interval difference. That's still more than the entire frame interval difference from 60 to 240fps!

Interesting graph that.
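The frame-interval arithmetic is easy to check in a few lines of Python:

```python
def frame_interval_ms(fps):
    """Time between frames in milliseconds at a given frame rate."""
    return 1000.0 / fps

# Diminishing returns: each step up in fps buys a smaller interval reduction.
for lo, hi in [(30, 60), (60, 90), (90, 240)]:
    saved = frame_interval_ms(lo) - frame_interval_ms(hi)
    print(f"{lo} -> {hi} fps saves {saved:.1f} ms per frame")
# 30 -> 60 fps saves 16.7 ms per frame
# 60 -> 90 fps saves 5.6 ms per frame
# 90 -> 240 fps saves 6.9 ms per frame
```

So 30 to 60 alone saves more per frame (16.7 ms) than the entire 60 to 240 jump (12.5 ms).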
 
Doesn't letting your PC run at crazy high fps simply ramp up the heat and noise of the fans?

I just got a new mid-range gaming rig with an i7 1270 and a GeForce RTX 4070, and noticed when I first played a game that the fan noise was a bit noticeable. Since then I've set a 60fps cap in the Nvidia settings and it runs games on ultra, silently and cool, and everything looks fine to me.
 

TBH, I can't even hear my 4070 at 100+ fps, and the hottest I've seen it is 58C so far; it usually stays lower than that. So I happily let it run all out and aim for anything between 80-100fps, as above that I can't feel the benefit over 60, and the card is too weak anyway.
 
I think it depends on the user.

To me, 800rpm fans, even Noctuas, are "loud", whereas some find 1500rpm quiet.

My 3090, left to its own devices, clocks at 1950MHz @ 1.090V, pushing 200fps in a lot of games at 1440p, at 75C with 2500rpm fans.

I've undervolted it to 1800MHz at 800mV; it now runs at 65C and gets maybe 15fps less, at 1200rpm.

But if I lock it to 90fps it sits at 60C with 900rpm.
 
100 fps is perfectly fine for me.
I have a 144Hz G-Sync monitor and cap it to 100 through the Nvidia drivers, as I feel anything more isn't needed. Plus it puts the hardware under extra strain for no reason.
To be honest, I couldn't go back to a 60Hz monitor for gaming these days. Not a problem in the workplace, though.
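For anyone curious why capping cuts heat and noise: the GPU finishes each frame early and then idles out the rest of the frame budget. Here's a toy Python sketch of the idea, where render_one_frame is a made-up stand-in for the game's work (a driver-level cap times things far more precisely, but the principle is the same):

```python
import time

TARGET_FPS = 100
FRAME_BUDGET = 1.0 / TARGET_FPS  # 10 ms per frame at 100 fps

def run_capped(render_one_frame, frames=300):
    """Toy frame limiter: render a frame, then sleep away the rest of the budget.

    The sleep is where the savings come from: the GPU sits idle for
    whatever part of the budget the frame didn't use, instead of
    immediately starting the next frame at full load.
    """
    for _ in range(frames):
        start = time.perf_counter()
        render_one_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
```

If a frame only takes 4 ms to render, the card rests for the other 6 ms of every frame rather than pumping out 250fps you can't feel.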
 