Getting bored of chasing numbers

...
Mouse changed, but I had no choice - the old one had double-click issues, and the new one is better spec
...

Could still be contributing; better or not, it'll feel different and take getting used to. Also, better spec in what way? e.g. higher DPI can be a curse: if it's not actually a higher-res sensor then it'll just be subdividing pixels, and you'd be better off running lower DPI settings anyway (true for most mice, even very high-end ones), and if you went from optical to laser then movement speed and acceleration can be issues.
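
Rough back-of-envelope sketch (Python) of what I mean by subdividing - the native CPI figure here is made up, just to show that cranking the DPI slider past what the sensor can genuinely resolve only multiplies the same counts up rather than adding precision:

```python
# Illustrative only: assumes a hypothetical sensor with a 3200 CPI native resolution.
NATIVE_CPI = 3200
MOVE_INCHES = 0.5  # half an inch of hand movement

for dpi_setting in (800, 3200, 6400, 12000):
    genuinely_resolved = MOVE_INCHES * min(dpi_setting, NATIVE_CPI)
    reported = MOVE_INCHES * dpi_setting
    upscale = reported / genuinely_resolved
    print(f"{dpi_setting:>5} DPI -> {reported:6.0f} counts reported, "
          f"{genuinely_resolved:6.0f} genuinely resolved (x{upscale:.2f} interpolation)")
```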
 
I went 1440p FreeSync and just didn't like it at all; went back to 1080p/120Hz, and then went G-Sync 144Hz when I changed to Nvidia.

Couldn't get into 1440p for some reason; thought I was just an anomaly, since everyone raves about it.
 
Yup. That's why FreeSync/G-Sync are actually at their best in a higher band of frame rates, such as 60 to 100, because it's more difficult to notice the transitions at those frame rates. When I drop below 60 I notice it instantly, regardless of G-Sync.
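
Quick frame-time sums (Python) behind that - a 10 fps swing costs far more milliseconds at the low end than at the high end, which is why drops below 60 stand out even with VRR. These are just arithmetic, not measurements:

```python
# Frame time in milliseconds for a given frame rate, and how much it changes
# when the rate drops by 10 fps - the jump is much bigger at low frame rates.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (100, 90, 60, 50, 40, 30):
    dropped = fps - 10
    delta = frame_time_ms(dropped) - frame_time_ms(fps)
    print(f"{fps:>3} -> {dropped:>3} fps: frame time goes from "
          f"{frame_time_ms(fps):5.1f} ms to {frame_time_ms(dropped):5.1f} ms "
          f"(+{delta:.1f} ms per frame)")
```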

To be fair, it depends to a large extent on what the user finds comfortable. Some people swear they can play a game at 40 fps, for example. For me, anything under 60 and I'd rather go do something else entirely.

This is exactly why I said I aim for around 90-100 fps.
 
This is what I'm going to base the issue on, and hopefully I'll resolve it by double-checking my settings, trying different AA, etc.

Isn't the CCC FPS cap good enough? Why would I need a different one?
You don't necessarily *need* a different one, but sometimes certain ones don't work, or don't work optimally, and for some reason using a different program works better. I really couldn't tell you why, but there's definitely something to be said about the quality of frame-limiting software. The ones in consoles, for instance, are probably the best around.

RivaTuner Statistics Server is one of the most reliable ones out there, going from what I've heard from others and my own personal experience. It's hard to really 'measure' which one is better, but you may have better luck with a different program. I know I can't rely solely on Nvidia Control Panel because it doesn't always work ideally.
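
For what it's worth, here's a minimal sketch (Python, purely illustrative - not how RTSS or the driver actually do it) of a sleep-based frame cap; the accuracy of that final wait is where limiters tend to differ, which is probably part of why some feel smoother than others:

```python
import time

TARGET_FPS = 143              # e.g. capping just under a 144 Hz refresh
FRAME_TIME = 1.0 / TARGET_FPS

def frame_capped_loop(num_frames):
    deadline = time.perf_counter()
    for _ in range(num_frames):
        # ... the game would render the frame here ...
        deadline += FRAME_TIME
        # Coarse OS sleep until near the deadline, then spin for precision;
        # a limiter that only sleeps is at the mercy of ~1 ms timer granularity.
        while (remaining := deadline - time.perf_counter()) > 0:
            if remaining > 0.002:
                time.sleep(remaining - 0.002)

frame_capped_loop(300)
```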

Yup. That's why FreeSync/G-Sync are actually at their best in a higher band of frame rates, such as 60 to 100, because it's more difficult to notice the transitions at those frame rates. When I drop below 60 I notice it instantly, regardless of G-Sync.

To be fair, it depends to a large extent on what the user finds comfortable. Some people swear they can play a game at 40 fps, for example. For me, anything under 60 and I'd rather go do something else entirely.
I've run my monitor at 50 Hz before and honestly got very used to it, but 60 Hz is definitely a bit better when I switch back. That's the thing about VRR monitors though - you don't have to worry about specific guideline frame rates like 30 fps or 60 fps or whatever. It's all arbitrary as far as the monitor is concerned. It can come down purely to what the user finds tolerable, and you can either run games with variable performance or lock them down to whatever you want (within the range of the VRR capabilities).
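
A toy illustration (Python) of that "it's all arbitrary to the monitor" point - the range numbers here are just assumed, check your own panel's spec:

```python
# Assumed VRR window of 48-144 Hz; real ranges vary by monitor.
VRR_MIN, VRR_MAX = 48, 144

for fps in (35, 50, 72, 90, 120, 143, 160):
    if fps > VRR_MAX:
        note = "above range: behaves per your vsync/cap settings"
    elif fps >= VRR_MIN:
        note = "in range: refresh simply follows the frame rate"
    else:
        note = "below range: frame doubling (LFC) or judder"
    print(f"{fps:>3} fps -> {note}")
```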
 
1440p is not worthwhile for a lot of games now unless you're on a 980 Ti/1070 or above. You have to make too many sacrifices graphically and are better off using higher settings at a higher frame rate, plus more AA, at 1080p.
 
At home looking at my settings, I reset everything and started again in the Global Profile:

Vsync forced off in CCC
AMA is set to High on the monitor, as recommended by TFT Central
Blur Reduction is off as it's not compatible with FreeSync
FPS cap changed from 144 to 143
Power efficiency is off in the driver
FreeSync is on in Display
AA set to 'application decides'

Did the Windmill test; pretty hard to tell IMO, but there is a slight difference. It's much easier to tell in the line test, and FreeSync is definitely working. Not sure why the demo is locked to a max of 60 FPS.

Just need to fire up a game now, change 2xMSAA to either SMAA or FXAA, and see what happens.
 
Think you can use the Nvidia Pendulum G-Sync test too.

You should check out SweetFX; it can be very effective on jaggies, and it's relatively simple to use too if you stick to FXAA.
 
When you push the graphics, things do start to feel a bit sluggish when you really don't want them to. I made the mistake of checking the FPS when nothing was going on and it was all great; then I was getting killed a lot in fights and losing races etc. I realised I was getting drops, so I switched all the settings to minimum so it looked awful, but suddenly everything sped up and I started winning fights/races again and enjoying it a lot more, even with potato graphics.

Just try setting your competitive games to potato graphics for a while and see if you enjoy it more.
 
OK, so I just fired up BF4 and set MSAA to off and post to Medium (I think this is FXAA).

No more massive FPS fluctuations; it still stays above 60 and goes up to 100ish, but there are no more huge spikes and dips that are noticeable at critical moments.

I don't know if this is just AA being switched off, and I can't tell the difference in terms of quality, but it's definitely smoother. I also noticed my latency dropped from the usual 30 to 23 (Belgian server).

I think 1440p plus any AA was too much for my setup; either that, or resetting the Global Profile and setting my preferences again sorted something.

That's BF4; going to fire up Max Payne 3 in a bit, as I fancy playing it again, and see how it runs.
 
It's not going to help, but I'm the complete opposite... ever since I switched out 1080/60 for 1440/144 I can't go back! Even if it's a case of limiting to only an 85 Hz refresh rate to keep my gaming mostly silent, it's still leaps and bounds better than 1080/60 was, and I don't really require AA as much :/
 
Turning AA off has helped a lot. I'm used to 1080p and using AA, and I can't even tell it's off at 1440p, but the performance is there now. Max Payne 3 looks amazing and runs like a dream.
 
Turning AA off has helped a lot. I'm used to 1080p and using AA, and I can't even tell it's off at 1440p, but the performance is there now. Max Payne 3 looks amazing and runs like a dream.

Yeah, it's the first thing I do; my 290 handles 1440p like a dream without MSAA.
MSAA, I feel, is far too demanding for what you get in return. SMAA is a post-processing AA and looks just as good.
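
Rough sample-count arithmetic (Python) on why MSAA bites so hard at 1440p compared with a post-process AA like SMAA - purely illustrative, real cost depends on the engine:

```python
# Sample counts relative to plain 1080p; MSAA multiplies the per-pixel sample
# count, whereas a post-process AA is roughly a fixed extra pass on top.
def samples(width, height, msaa=1):
    return width * height * msaa

base = samples(1920, 1080)
for label, w, h, msaa in [("1080p, no AA", 1920, 1080, 1),
                          ("1080p, 4x MSAA", 1920, 1080, 4),
                          ("1440p, no AA (SMAA on top is cheap)", 2560, 1440, 1),
                          ("1440p, 4x MSAA", 2560, 1440, 4)]:
    print(f"{label:<36} {samples(w, h, msaa) / base:4.1f}x the work of 1080p/no AA")
```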
 
1440p is not worthwhile for a lot of games now unless you're on a 980 Ti/1070 or above. You have to make too many sacrifices graphically and are better off using higher settings at a higher frame rate, plus more AA, at 1080p.

Not true at all.....

Been using a single 290 and keep above 60 fps in all the games I play. All I do is first remove MSAA and see what the performance is like, then I drop down from Ultra to High.

And if you tell me you can see the difference between Ultra and High quality without putting your face against the monitor, you'd be lying.

Ultra graphics is overhyped; it's eye candy that is hard to notice, but the performance hit is very noticeable.

The added sharpness and image quality you get from 1440p alone is worth dropping down to High settings.

I would pick High @ 1440p over Ultra @ 1080p every time.
 
You mentioned you changed your mouse? A rodent with acceleration turned on by default can make things feel dramatically off in games, and dodgy mouse software can cause problems too. There's been some absolute crap out there being sold under big brand names. What are you using now?
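
Toy model (Python) of why default acceleration feels off - the curve here is invented purely for illustration, but the point stands: the same physical distance lands somewhere different depending on how fast you move it:

```python
# Made-up acceleration curve: faster movement gets scaled up more.
def accelerated(counts_this_poll, gain=0.05):
    return counts_this_poll * (1.0 + gain * counts_this_poll)

# Same 100 counts of physical movement, done slowly vs. in a quick flick.
slow_swipe = sum(accelerated(c) for c in [2] * 50)
fast_flick = sum(accelerated(c) for c in [20] * 5)
print(f"slow swipe ends up at {slow_swipe:.0f} counts, fast flick at {fast_flick:.0f}")
```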
 
Not true at all.....

Been using a single 290 and keep above 60 fps in all the games I play. All I do is first remove MSAA and see what the performance is like, then I drop down from Ultra to High.

And if you tell me you can see the difference between Ultra and High quality without putting your face against the monitor, you'd be lying.

Ultra graphics is overhyped; it's eye candy that is hard to notice, but the performance hit is very noticeable.

Tried it with a clocked 780 - it's less than ideal and you feel as though you're gimping your monitor. It's bordering on 'this is worse than console' in games like The Division cranked up (using a mouse makes the difference more noticeable compared to a gamepad, and doesn't help). Minimal AA and high settings at 60 fps just isn't worthwhile for 1440p and 144 Hz, even with G-Sync, for me. Settings-wise it's all game dependent; however, some things can be quite noticeable when it comes to textures and shadows. I'd take 1080p 144 Hz G-Sync plus the saving over 1440p 144 Hz G-Sync with most single cards still - prefer higher AA, higher settings, and the higher frame rate. 1440p at 60 Hz I almost see as pointless once you're used to a higher refresh like 120 fps capped on a 144 Hz panel.
 
Tried it with a clocked 780 - it's less than ideal and you feel as though you're gimping your monitor. It's bordering on 'this is worse than console' in games like The Division cranked up. Minimal AA and high settings at 60 fps just isn't worthwhile for 1440p and 144 Hz, even with G-Sync, for me. Settings-wise it's all game dependent; however, some things can be quite noticeable when it comes to textures and shadows. I'd take 1080p 144 Hz G-Sync plus the saving over 1440p 144 Hz G-Sync with most single cards still - prefer higher AA, higher settings, and the higher frame rate. 1440p at 60 Hz I almost see as pointless once you're used to a higher refresh.

60 fps @ 144 Hz is much better than 60 fps @ 60 Hz, I find (rough numbers in the sketch below).
Textures you can always put on Ultra anyway; as long as you don't go over your VRAM, there's no performance hit.

Shadows depend on the game; in some there's not much in it.
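
Rough numbers (Python) on that 60 fps @ 144 Hz point - a higher refresh means each frame is scanned out faster and, with VRR, shown sooner, even at the same frame rate. Simple arithmetic, not measurements:

```python
# Time for the panel to scan out one refresh, plus the worst-case wait for the
# next refresh slot on a fixed-refresh panel without VRR.
for refresh_hz in (60, 144):
    refresh_ms = 1000.0 / refresh_hz
    print(f"{refresh_hz:>3} Hz: ~{refresh_ms:4.1f} ms per scanout, "
          f"up to ~{refresh_ms:4.1f} ms extra wait for the next refresh without VRR")
```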
 
You mentioned you changed your mouse? A rodent with acceleration turned on by default can make things feel dramatically off in games, and dodgy mouse software can cause problems too. There's been some absolute crap out there being sold under big brand names. What are you using now?

Mionix Naos. I'll be looking into that next, but it seems this issue is sorted and it was down to one very simple thing, unless power efficiency had somehow switched itself on again.

I've been so used to maxing out 1080p with MSAA that I didn't realise that at 1440p MSAA isn't needed as much and is such a performance hit. It's not just the FPS numbers, as with MSAA on I was still above 60 in BF4; it's the massive dips and spikes it seemed to cause.

Turning it off completely seems to have resolved it. I'm getting higher FPS without the noticeable dips and spikes, and best of all I can't tell the difference in image quality.
 