
G-Sync got worse??

I see this quite a lot when talking about Nvidia features. The amount of effects you need to disable these days from GameWorks is kind of a waste of time tbh.

Buy an Nvidia GPU and disable everything XD all these effects just eat into important frame rates, so what really is the point?

The point is that if you have a powerful enough machine and GPU, you can leave everything on and enjoy it.

Maybe you don't, but I certainly enjoy keeping everything turned on.

Can you play without? Sure. That's not the point for some people though.
 

This isn't always the case though, is it? Even a GTX 1080 struggles with all the GameWorks effects switched on in Watch Dogs 2.

If you don't mind dropping frame rates for better IQ then that's perfectly fine, but I just find it odd that so many of the effects added by GameWorks are so demanding that they need high-end GPUs to run!
What about people using the 900 series and below? Ignore them now? Sorry, disable it all, you're not first class anymore.

My point is
GameWorks titles are known for being bad ports or badly performing titles, and the reason is that mainstream GPUs have issues running them.
 

I personally stay away from all these ports based on the feedback from other people who played them. The whole point of a game for me is to actually enjoy it, not get annoyed for so many different reasons :)

So yeah, I haven't played that game, but in every single game I play I do the same thing: crank all the settings to max and play :)
 

Kinda like me, I stopped using MSI Afterburner etc. and monitoring because I felt I was spending too much time looking at that info more than enjoying the actual game LOL
 

That's the best way to game: turn off all that stuff, the information, temps, fps, and just play the game. You'll notice pretty quickly if it's not smooth as butter :)
 
+1 to both of those comments.

I want to enjoy my games, not be concerned with numbers in the top corner of my screen.

Griffuldur you're right, you will soon notice if your card is struggling a bit. I always bang up the settings, and on the odd occasion I may have to bring them down one notch, with no noticeable IQ difference but better performance.

:D
 
If you come from a history of twitch shooters and stuff like 125 fps Quake 3 or CS with stupid framerates, and you sit down and swap between them a few times in one sitting, it is noticeable, and once you do notice it you can't really not notice it, kind of like 60Hz v 120Hz though not as dramatic. If you played on one for a while and then swapped to the other a few weeks later or something, you might not notice the difference. (I'm only talking here about the behaviour around certain low framerates, not the overall experience in normal 60 fps gaming or whatever.)
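
For context on the 60Hz v 120Hz comparison, the frame-time gap is simple arithmetic. A quick worked sketch (just my own illustrative numbers, nothing specific to the thread):

```python
# Frame time in milliseconds for some common refresh rates / framerates.
for hz in (60, 120, 125, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 120 Hz -> 8.3 ms per frame
# 125 Hz -> 8.0 ms per frame
# 144 Hz -> 6.9 ms per frame
```

An 8 ms difference per frame is small, which is probably why you mostly feel it when swapping back and forth in one sitting.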

There's really nothing between them though. I have a G-Sync and a FreeSync monitor, and an Nvidia and an AMD card. If you did a blind test I bet you wouldn't be able to tell them apart. High FPS on a high Hz screen pretty much removes the need for sync anyway. It's more useful on 60Hz screens.

As for GameWorks, I dunno, the whole thing seems poorly put together. But my RX 480 runs The Witcher 3 better than my 970 with GW stuff on, so it doesn't always favour GeForce cards :P
 

I'm talking about certain low framerate situations. Unless you are playing e.g. CS:GO, it's still relevant at higher framerates: playing stuff like BF4 at 120-ish fps is much nicer without any tearing, and still responsive without bouncing off the V-Sync multipliers when you can't hold a solid 120 fps or whatever.
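
To put rough numbers on "bouncing off the V-Sync multipliers", here's a sketch of how classic double-buffered V-Sync quantises framerate (my own simplified model for illustration, not anything measured in the thread):

```python
# Under double-buffered V-Sync, a missed refresh makes the frame wait a
# whole extra refresh interval, so the effective framerate snaps to
# refresh/1, refresh/2, refresh/3, ... instead of degrading smoothly
# the way adaptive sync does.

def vsync_effective_fps(render_fps: float, refresh_hz: float) -> float:
    """Effective framerate under a simple double-buffered V-Sync model."""
    divisor = 1
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor

for fps in (130, 120, 110, 90, 61, 59):
    print(f"{fps} fps rendered -> {vsync_effective_fps(fps, 120):.0f} fps displayed")
# 130 -> 120, 120 -> 120, 110 -> 60, 90 -> 60, 61 -> 60, 59 -> 40
```

So dipping from 120 to 110 fps on a 120Hz panel shows up as a hard drop to 60 with V-Sync, which is exactly the lurch that G-Sync/FreeSync avoid.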
 
Sadly Linus's testing methodology for G-Sync is flawed and results in extra latency on G-Sync in some situations which isn't there if you set it up properly. There was a thread on here a while back, around the time of the video, where it was proved. If you look at Battle(non)sense's videos on G-Sync, he sets it up correctly and you can see the real latency numbers for yourself.
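
For anyone curious what "set it up properly" usually means here, the common advice is to cap the framerate a few fps below the panel's maximum refresh, so frame delivery never hits the ceiling of the G-Sync window where V-Sync-style buffering latency kicks in. A minimal sketch of that rule of thumb (the 3 fps margin is a community convention I'm assuming, not an official NVIDIA figure):

```python
# Cap the framerate slightly under the max refresh so the GPU never
# outruns the variable refresh window and falls back to V-Sync waits.

def suggested_fps_cap(max_refresh_hz: int, margin: int = 3) -> int:
    """Framerate cap that keeps frame delivery inside the VRR window."""
    return max_refresh_hz - margin

print(suggested_fps_cap(144))  # 141
print(suggested_fps_cap(120))  # 117
```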



Like with some things, you'd probably have to A-B test to notice the difference, but once you do notice it you can't "un-notice" it :S

And you said earlier you have not tested the latest iteration, so by your own admission you only have outdated info as a basis, therefore my point still stands.
I have seen both and can say there is no noticeable difference between them, nothing that the human eye could perceive anyway. You are merely speculating.
 

I did not say I had not tested - I said my more extensive testing was back when FreeSync was launched - nothing in my post is speculation - you might want to take a browse in the monitors section of this forum before claiming that ;)

It isn't so much about what the human eye can perceive as the overall feel, since it relates to input latency as well as how smooth the rendering is. If you were just watching the monitors you probably wouldn't even notice a difference, but when playing through the same kind of framerate transitions it is noticeable.
 

And back when FreeSync was launched is a different experience to now.
I have been browsing the monitor section for quite a while. ;)

And as I stated, I feel no difference. If I was the only one here stating that, then you could perhaps perceive that as an error on my part, but others say the same thing. So believe what you want, but don't state it as fact when it is not.
 

There is plenty of documentation on how both handle those situations that backs up that there likely is a difference (also somewhat dependent on what the framerate floor for the FreeSync window is on any given panel). G-Sync's way of handling lower framerates simply has more granularity, and an approach that also helps it transition back to rendering higher framerates quicker. While we are talking milliseconds of difference, it will be noticeable to some people more than others. I'm stating it as fact because the behaviour can be demonstrated in scoped analyses of both systems, and I certainly can notice the difference personally.
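
For anyone wondering what "more granularity" at low framerates means in practice, here's a rough sketch of the frame-multiplication idea being referred to (my own simplified model with an assumed 30-144Hz window, not NVIDIA's or AMD's actual algorithm):

```python
# Below the panel's VRR floor, the idea is to scan out each rendered
# frame multiple times so the physical refresh rate lands back inside
# the supported window, instead of falling out of sync entirely.

def vrr_refresh_hz(frame_fps: float, floor_hz: float = 30,
                   ceiling_hz: float = 144) -> float:
    """Refresh rate chosen for a given game framerate under frame multiplication."""
    if frame_fps >= floor_hz:
        return min(frame_fps, ceiling_hz)   # in the window: one refresh per frame
    multiple = 2
    while frame_fps * multiple < floor_hz:
        multiple += 1
    return frame_fps * multiple             # each frame shown `multiple` times

for fps in (90, 40, 25, 12):
    print(f"{fps} fps -> {vrr_refresh_hz(fps):.0f} Hz refresh")
# 90 fps -> 90 Hz, 40 fps -> 40 Hz, 25 fps -> 50 Hz, 12 fps -> 36 Hz
```

How finely that multiple is chosen, and how quickly it backs off as the framerate recovers, is where the granularity difference between the two implementations comes in.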

EDIT: I'm starting to feel some déjà vu back to the old microstutter days :( I don't have a habit of pulling stuff like this out of my behind, especially not to try and score some imaginary points or whatever.
 

I don't think we will agree either way, so let's agree to disagree, as there's no point in arguing about it; certainly nothing proved or gained. Not that I would call that an argument at all :)
 
If I had the choice of the same monitor with FreeSync, or with G-Sync for a few quid more, I would always choose G-Sync. Having said that, G-Sync monitors are not just a few quid more, so I would get a FreeSync one, which is why I hope Vega delivers.
 
I tested the AOC G2460PG G-Sync 144Hz and the ASUS VG248 144Hz, both 1920x1080, and I could not tell the difference; I guess some can detect it better than others. According to the green team's engineers, NVIDIA G-Sync works with all games. However, they have found some games that do not behave well, and for those games they recommend that users take advantage of the control panel's ability to disable G-Sync per game. Games that NVIDIA discovers have trouble with G-Sync will be disabled by default in their driver.

There is no list in the control panel that tells you which games are G-Sync disabled by default, unless I just didn't see it?

Maybe someone has made a list of games that do not behave well with G-Sync enabled?
 
G-Sync doesn't work correctly in BF1 for me in fullscreen exclusive mode, only in borderless windowed mode. Fortunately, tearing is only really noticeable when the view pans up quickly across a cloudy background, and when it shifts from the menu to a loading screen.
 