
Nvidia to support Freesync?

Soldato
Joined
27 Feb 2015
Posts
12,621
Exactly
If you're gaming below 48fps you're doing it wrong.

Some of us play games coded in such a way that you could pair four golden-sample Titan RTX cards with a golden-sample 9900K at minimum settings (if the game is even tunable; a fair few games don't let you tune settings) and still drop below 48fps. It is what it is. We don't all play perfectly optimised PC shooters.

I also value image quality; I don't like short draw distances, shimmering and the like. I'm sure I'm not the only one with that preference.
 
Soldato
Joined
27 Feb 2015
Posts
12,621
On a 1440p monitor running HDR you might easily struggle to hit 48fps.

30Hz should be the minimum.

Yeah, a 48fps minimum tells me these were designed for twitch gamers who want super high frame rates, so they see 48 as some kind of super-low framerate. With that in mind, a good adaptive sync tech would let a game flip between 30 and 60 seamlessly and smoothly. Even in this era 30/60fps is still the mainstream framerate; in the console world 60 is seen as high, and most PC gamers don't game above 60.
 
Soldato
Joined
19 Dec 2010
Posts
12,030
Yeah, a 48fps minimum tells me these were designed for twitch gamers who want super high frame rates, so they see 48 as some kind of super-low framerate. With that in mind, a good adaptive sync tech would let a game flip between 30 and 60 seamlessly and smoothly. Even in this era 30/60fps is still the mainstream framerate; in the console world 60 is seen as high, and most PC gamers don't game above 60.

FreeSync/adaptive sync handles frame rates right down to 9-10fps, as long as the monitor's maximum refresh rate is at least twice its minimum.

So on a 48-144Hz screen it would still be smooth right down to 30fps and lower.
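This is the low framerate compensation (LFC) trick: when the game's framerate falls below the panel's minimum refresh, each frame is simply presented more than once so the panel stays inside its variable refresh window. A rough sketch of the idea (function names are illustrative, not any real driver API):

```python
def lfc_supported(min_hz: float, max_hz: float) -> bool:
    """LFC requires the max refresh to be at least twice the min."""
    return max_hz >= 2 * min_hz

def effective_refresh(fps: float, min_hz: float, max_hz: float) -> float:
    """Repeat each frame until the presentation rate lands inside the
    monitor's variable refresh window."""
    rate = fps
    while rate < min_hz:
        rate += fps  # show the same frame one more time
    return min(rate, max_hz)

print(lfc_supported(48, 144))          # True: 144 >= 2 * 48
print(effective_refresh(30, 48, 144))  # 30fps is presented at 60Hz (each frame twice)
```

So on a 48-144Hz panel a 30fps game is driven at 60Hz with doubled frames, which is why it still feels smooth below the nominal 48Hz floor; on a 48-60Hz panel the trick is impossible, since 60 < 2 x 48.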
 
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
On a 1440p monitor running HDR you might easily struggle to hit 48fps.

30Hz should be the minimum.

Does HDR have a performance impact? Playing games on the PS4 Pro I see no impact on frame rate with HDR on. I remember seeing something a while back about older Nvidia GPUs taking a performance hit, but I'm fairly sure that's not the case with the new cards.

Edit
It seems AMD doesn't take a performance hit; only Nvidia does. That would explain why the PS4 Pro doesn't lose frames.

https://www.computerbase.de/2018-07...2/#diagramm-assassins-creed-origins-3840-2160

 
Last edited:
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
Yeah, a 48fps minimum tells me these were designed for twitch gamers who want super high frame rates, so they see 48 as some kind of super-low framerate. With that in mind, a good adaptive sync tech would let a game flip between 30 and 60 seamlessly and smoothly. Even in this era 30/60fps is still the mainstream framerate; in the console world 60 is seen as high, and most PC gamers don't game above 60.

If a game is running at 60fps and then drops to 30fps, you would still feel it with G-Sync/FreeSync.

These technologies do NOT change the laws of frame latency. Frame rate is frame rate and latency is latency.

All they do is remove the bottlenecks PC gamers have been stuck with for years: 1. screen tearing and 2. vsync stutter.

Games look smoother because the screen tearing is removed, giving you a much cleaner image.

Damn, I remember when G-Sync came out and almost everyone on here was banging on about how 30fps with G-Sync feels like 60fps. Yeah, right!
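The frame-latency point is just arithmetic: the time a frame spends on screen is the reciprocal of the framerate, and no sync technology changes that. A quick sketch:

```python
def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 144):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
# 144 fps -> 6.9 ms per frame
```

At 30fps each frame lingers for ~33ms whether adaptive sync is on or off; what adaptive sync removes is the tearing and the stutter from missed vsync intervals, not the underlying frame time.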
 
Associate
Joined
25 Apr 2017
Posts
1,122
I would rather drop settings. G-Sync and FreeSync do not fix frame latency.

Example: 30fps with G-Sync or FreeSync is still 33ms of lag.

Assassin's Creed Odyssey is such a CPU-bound game it can max out a 9900K in the cities and towns. No matter how much you drop settings it's going to dip below 50 unless you play on medium or low, which looks very bad. Just Cause 4 needs an RTX 2080 Ti at 1440p to maintain 60fps at all times.

I want my screen to be tear-free if I'm going to play horrible console ports like these. 40fps doesn't really sound all that bad to me. Before the 2080 I was using a 970 and a 60Hz monitor, and I played The Witcher 3, Assassin's Creed Origins and GTA 5 in the 40-50fps range.
 
Soldato
Joined
17 Jul 2007
Posts
24,529
Location
Solihull-Florida
Does HDR have a performance impact? Playing games on the PS4 Pro I see no impact on frame rate with HDR on. I remember seeing something a while back about older Nvidia GPUs taking a performance hit, but I'm fairly sure that's not the case with the new cards.

Edit
It seems AMD doesn't take a performance hit; only Nvidia does. That would explain why the PS4 Pro doesn't lose frames.

https://www.computerbase.de/2018-07...2/#diagramm-assassins-creed-origins-3840-2160



I don't get any drop in frames on my UHD 10-bit/FALD/HDR TV if I turn on HDR.
 
TNA
Caporegime
Joined
13 Mar 2008
Posts
27,565
Location
Greater London
What GPU?
He has the same as me, but he water-cools his, so it runs at 2100MHz versus mine, which is on a reference cooler and typically sits just shy of 2000MHz on my profile. I hardly ever need that profile anyway; I use one with a 75% power limit, which essentially reduces the voltage used. It ends up running a lot cooler and quieter at around 1650MHz, which does the job for 95% or more of my Steam library :D
 
Soldato
Joined
27 Feb 2015
Posts
12,621
If a game is running at 60fps and then drops to 30fps, you would still feel it with G-Sync/FreeSync.

These technologies do NOT change the laws of frame latency. Frame rate is frame rate and latency is latency.

All they do is remove the bottlenecks PC gamers have been stuck with for years: 1. screen tearing and 2. vsync stutter.

Games look smoother because the screen tearing is removed, giving you a much cleaner image.

Damn, I remember when G-Sync came out and almost everyone on here was banging on about how 30fps with G-Sync feels like 60fps. Yeah, right!

I'm not talking about frame latency (something a lot of gamers don't seem to notice, yet mysteriously twitch gamers do); I couldn't care less about frame latency. I'm talking about the stutters you see when vsync is enabled and the game can't hit the refresh rate. Basically, you get the same smoothness as with vsync off, but without the tearing.

So if you play a game unlocked at 60fps but it can't sustain that, with vsync enabled it has to drop all the way from 60 to 30, and the sudden change is obviously noticeable; with vsync off it might drop to e.g. 55fps, which is a smooth transition. This is the real benefit of adaptive sync: being able to keep that fluidity without screen tearing.
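That hard drop from 60 to 30 isn't a coincidence: with double-buffered vsync, a frame that misses a refresh has to wait for the next one, so the displayed rate can only take values of refresh divided by a whole number. A rough sketch of that quantisation (the function name is illustrative, assuming a 60Hz panel and double buffering):

```python
import math

def vsync_fps(render_fps: float, refresh_hz: float = 60) -> float:
    """With double-buffered vsync a missed refresh means waiting a whole
    refresh interval, so the displayed rate snaps to refresh/1, refresh/2,
    refresh/3, ..."""
    if render_fps >= refresh_hz:
        return float(refresh_hz)
    return refresh_hz / math.ceil(refresh_hz / render_fps)

print(vsync_fps(55))  # 30.0 -- one missed refresh per frame halves the rate
print(vsync_fps(29))  # 20.0
```

So a game rendering at 55fps is shown at a flat 30fps under vsync, which is exactly the jarring 60-to-30 snap described above; adaptive sync instead displays 55fps as 55fps.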

The term "image smoothness" seems to have been redefined by twitch gamers, sadly, which has led to confusion on this subject. Traditionally it meant how smooth something looks when you watch it, with no relation to lag from the controls; now twitch gamers use it to mean responsiveness to game events and controls.

So for me, 30fps and 60fps gaming is fine, but what I don't like is a game flipping between the two framerates because it can't sustain 60. Adaptive sync makes that much more pleasant, as it removes the choice between stutters and tearing. Best of both worlds.

Ironically I'm seeing this in Vesperia right now on the PS4 Pro. In a few cutscenes it drops to 30 because the console clearly can't handle the game at a full 60 (it was a full 60 on Xbox 360 and PS3, LOL), but the devs didn't cut it to 30 in every scene where it struggles, I assume because the player base would moan; instead it stutters when it can't cope, as it's vsync-locked.
 
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
I'm not talking about frame latency (something a lot of gamers don't seem to notice, yet mysteriously twitch gamers do); I couldn't care less about frame latency. I'm talking about the stutters you see when vsync is enabled and the game can't hit the refresh rate. Basically, you get the same smoothness as with vsync off, but without the tearing.

So if you play a game unlocked at 60fps but it can't sustain that, with vsync enabled it has to drop all the way from 60 to 30, and the sudden change is obviously noticeable; with vsync off it might drop to e.g. 55fps, which is a smooth transition. This is the real benefit of adaptive sync: being able to keep that fluidity without screen tearing.

The term "image smoothness" seems to have been redefined by twitch gamers, sadly, which has led to confusion on this subject. Traditionally it meant how smooth something looks when you watch it, with no relation to lag from the controls; now twitch gamers use it to mean responsiveness to game events and controls.

So for me, 30fps and 60fps gaming is fine, but what I don't like is a game flipping between the two framerates because it can't sustain 60. Adaptive sync makes that much more pleasant, as it removes the choice between stutters and tearing. Best of both worlds.

I agree with this reply. I must have misunderstood you the first time.
 
Soldato
Joined
27 Feb 2015
Posts
12,621
Yeah, I reread what you wrote and you're basically saying what I meant, so I think you just misunderstood me; it's fine. I'm not one of those who thinks G-Sync makes 30fps feel like 60fps :)
 
Permabanned
OP
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
The reason we use FreeSync in conjunction with these monitors is that we expect Nvidia to offer the same support as AMD, no less of an experience. As I have said before, if AMD can make it work, why can't Nvidia? :)
 