
NVIDIA 4000 Series

What's so good about a G-Sync module monitor? I always just went with G-Sync Compatible so I have two options open and don't get trapped into just having one option for GPU.

There are some advantages to the G-Sync module, but most people in this context will, as above, be running monitors from before G-Sync compatible was a thing.

VESA Adaptive Sync/G-Sync Compatible is an inferior spec nonetheless; it was developed by hacking pre-existing functionality that was never intended for this use, such as panel self-refresh, into an adaptive sync feature. The G-Sync module has a better ability to boost pixel response times, better low framerate handling and recovery, and better support (when MS doesn't keep messing with the way the Windows display model works and breaking it) for non-exclusive fullscreen and windowed modes in games and applications, among several other slight advantages. But there are some people who have a vested interest in saying they are both the same, despite that not being objectively true.
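
For anyone wondering what "better low framerate handling" actually means in practice: below the panel's minimum refresh, the scaler re-sends each frame a whole number of times so the panel stays inside its VRR window (low framerate compensation). A rough Python sketch of the idea - the refresh_strategy helper and both VRR ranges are made up for illustration, and real firmware does far more than this:

    # Sketch of low framerate compensation (LFC): when the game framerate
    # drops below the panel's minimum refresh, each frame is shown a whole
    # number of times so the panel still refreshes inside its VRR window.
    # The ranges used here are illustrative, not any real monitor's spec.

    def refresh_strategy(fps: float, lo: float, hi: float):
        """Return (frame multiplier, effective panel refresh in Hz)."""
        if fps >= lo:
            return 1, min(fps, hi)     # panel tracks the framerate directly
        mult = 2
        while fps * mult < lo:         # smallest whole-number multiple
            mult += 1
        if fps * mult > hi:
            return 0, lo               # no multiple fits the window: LFC
                                       # impossible, panel pegs at minimum
        return mult, fps * mult

    for fps in (40, 30, 21, 10):
        print(f"{fps} fps, wide 48-144Hz panel  ->", refresh_strategy(fps, 48, 144))
        print(f"{fps} fps, narrow 48-60Hz panel ->", refresh_strategy(fps, 48, 60))

On the wide range every framerate finds a multiple that fits; on the narrow range there are gaps (40 fps and 21 fps have no workable multiple), which is one reason displays with narrow VRR windows behave worse at low framerates.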

Unfortunately that seems to be the way things happen in this space: one vendor or another comes out with a superior technology, then we end up adopting a lesser version of it, and for some reason a lot of people seem overjoyed about that...
 

Personally I'd prefer to have a G-Sync module, but sadly they now seem to be exclusive to super high-end monitors over the £1000 mark.

I did want to get an Alienware ultrawide OLED that had the module, but they stopped making them.
 

Probably my next main gaming monitor won't have the module, but I'd rather have one. I've a mix of G-Sync and G-Sync Compatible monitors, and though I'm mostly happy with the ones without the module (some are better than others; surprisingly, one of the better ones is the Philips 436M6 despite only having a 48-60Hz range), I do notice the difference.

There are a couple of downsides to the G-Sync module as well. On some displays, if you aren't careful, it can cause transient image retention problems, I'm guessing due to the variable overdrive slightly overdoing things, and in some cases white content can look choppy due to the way it tries to preserve clarity in motion.
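
To illustrate the overdrive guess: overdrive speeds up a pixel transition by driving the pixel toward a value past its target, and the right amount depends on how long the frame ends up on screen - which, with VRR, the scaler can't know in advance. A toy first-order model in Python (the pixel_after_frame helper, the time constant and the gain are all invented for illustration, not anyone's actual panel tuning):

    import math

    TAU_MS = 8.0  # assumed pixel response time constant (made up)

    def pixel_after_frame(start: float, target: float, frame_ms: float,
                          gain: float) -> float:
        """Pixel value at the end of one frame of length frame_ms.

        gain > 1 overdrives: the panel drives toward a set-point past the
        target so the transition completes faster within the frame.
        """
        set_point = start + gain * (target - start)
        alpha = 1.0 - math.exp(-frame_ms / TAU_MS)
        return start + alpha * (set_point - start)

    # A gain tuned to finish a 0 -> 1 transition within a short frame badly
    # overshoots when VRR stretches the frame out, leaving the pixel past
    # its target (inverse ghosting, and plausibly transient retention).
    for frame_ms in (4.2, 6.9, 16.7, 33.3):   # ~240, 144, 60, 30 fps frames
        value = pixel_after_frame(0.0, 1.0, frame_ms, gain=1.5)
        flag = "  <- overshoot" if value > 1.0 else ""
        print(f"{frame_ms:5.1f} ms frame: pixel ends at {value:.2f}{flag}")

Variable overdrive adjusts the gain per frame to avoid exactly this, but since frame times fluctuate it has to predict, and a wrong prediction overshoots - consistent with the retention issue above.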

Some older FreeSync monitors really are not great though - I can only assume the people claiming parity have not actually used a proper G-Sync module monitor for comparison.
 


I can remember those articles; the first FreeSync monitors were extremely buggy and a total mess.

This is just a small sample:

"I used LG’s wide-aspect 34-inch 34UM67 display. This is a 2560×1080 IPS panel with a maximum refresh of 75Hz. To turn on FreeSync, you go into the monitor’s OSD, drill into the general settings tab, and flip it on. Once that’s done, you’ll also need to make sure it’s enabled in the control panel for your Radeon card. I failed on my attempts, because the machine I used—a Core i7-5960X Haswell-E rig with a pair of Radeon R9 290X cards—had CrossFireX enabled when I installed the new driver. I tried turning CrossFireX off, which allows FreeSync to be switched on, but then I lost the power to the GPU"
 
I have a monitor with a G-Sync module and there is a noticeable difference.

The main one being zero brightness flicker. It also just works. I have tried using an AMD 6800 card with this monitor and it was a bit of a faff actually getting it to enable VRR properly. Using an Nvidia card it just works; you don't ever think about G-Sync when using it because it never gives you any problems, just smooth gameplay.
 
Even with something like the Samsung G7 240Hz/1440p, which is G-Sync Compatible and highly reviewed, I haven't noticed any issues using it with the 3080, and I would consider myself sensitive to flicker/smoothness etc.

I've tried both AMD and Nvidia cards on it and it just works.
 
Yeh I know :(
Also it seems it's not so great for ray reconstruction either (for now at least), whereas it's great on the 4080/4090:

[embedded video]

Edit: Oh yeah, just seen this, 5090 leaked:

[attached image: RnMGCbJ.jpg]
 

Regarding the video above:

Not to troll the AMD crowd, but I was thinking to myself the other day: why are people even buying AMD cards atm? There is nothing apart from more VRAM and a lower price that makes the AMD GPUs attractive over the Nvidia cards atm.

But why would I want a cheaper card with more VRAM that doesn't have the same level of RT and upscaling technology as the Nvidia cards?

What's the point in having more VRAM when the performance and features aren't there to make use of it?

:/
 
AMD should really focus on inventing new technologies of their own instead of literally just copying Nvidia like they've done for the last 10 to 15 years.

AMD/ATI have brought in new tech, but the fact is it usually doesn't go anywhere; TrueAudio being one example, TressFX being another. Way back in the day ATI were the first with a form of tessellation for use on 3D models, to give human characters more rounded limbs and curved surfaces instead of the overly angular look which was common back then. That was called TruForm, and that was back in 2001; again, it went nowhere.

So at some point I suppose they have to ask themselves why bother, as these technologies generally only get a limited amount of use before vanishing.


You could also class Eyefinity as a tech of sorts that AMD brought in and Nvidia copied; that one was successful for quite a few years until superwides started appearing. Multi-monitor was a thing for productivity, but it was never done for gaming the way AMD did it with Eyefinity.

 
On the question of why people are even buying AMD cards atm: because not everyone cares about RT at the moment.

There are 10,000+ games out there and the number of RT games hasn't even reached 300 yet.

AMD are more appealing to me as they offer similar or slightly better raster performance at each tier equivalent and are about £150+ cheaper. The only reason I haven't got one is that both companies are overpriced.

The question should be "why would people spend £150-200 more for a feature they don't care about or don't need?"
 