
A Week With NVIDIA's G-SYNC Monitor

Yeah, looks that way. Re-reading the article, it clearly says the G-Sync unit replaces the monitor's own scaler.

Well, that's a bum :(

Hopefully a monitor manufacturer will release a model with both its own scaler and a G-Sync one in it.
 

Replacing a scaler with one that can also do variable refresh does not automatically mean it is incompatible with standard fixed refresh rates; otherwise you would need another screen just to get into the BIOS or do anything else before the driver loads.

Is a 3D Vision screen incompatible with non-NVIDIA cards? The 3D Vision part is, but everything else is not. That said, it doesn't mean the module couldn't be made intentionally incompatible with standard fixed refresh rates when a non-NVIDIA card is used.
 
I did point this out at the time, but some people were adamant it was only coming to 120Hz+ panels and thus wrote it off as irrelevant.

The prototype / first-edition G-Sync module is an FPGA. If NVIDIA rework it as an ASIC as volumes increase then it will drop in price, though keeping it as an FPGA could also mean they can upgrade its features without replacing it, which would be a nice feature.

Oculus Rift or similar with G-Sync would also be a nice touch; it's a shame that neither OR nor CastAR looks like it's coming out much before the end of next year.

I tried to point it out too, but my posts kept getting lost in the "this is useless, I want to run a 1440/1600 IPS" comments :p

I think it was PCPer (may have been someone else, I'll try to dig it out) that put out a great article when G-Sync was announced about how it effectively makes refresh rates as we know them almost meaningless.

I suppose if you move back to AMD or a non-Kepler card, you could pop the back of the monitor off and remove the module. It voids the warranty, but at least it's a solution until monitors are available with switches to disable it.
 
Given that G-Sync only works in full-screen mode, it would seem certain that the monitor can still operate using normal fixed refresh rates.
 

Like both of you guys, I also pointed it out on numerous occasions but it got ignored or not read. I leave you to decide :D

Man I can't wait for this now :D Screw the 290/X, this is where my cash is going :D

Mantle + G-Sync =

 
Oculus Rift or similar with G-Sync would also be a nice touch; it's a shame that neither OR nor CastAR looks like it's coming out much before the end of next year.

No, Oculus Rift would be horrendous with G-Sync; read up on it, and watch the Oculus Rift talk from APU13. When you're using VR-type solutions, a delayed frame or a changing frame rate induces horrendous problems.

Their entire focus is on getting latency as low as possible and on removing motion blur, as it induces more sickness in VR than usual. They want 90fps minimum, they use as many tricks as possible to reduce frame latency, and an absolutely static frame rate is completely and utterly essential.

What they want is OLEDs + low persistence. There is a very real reason that Carmack said "yeah, G-Sync is okay" but spent the majority of every interview talking about low persistence being the ultimate goal.

With a "normal" screen, the world around that screen is not stuttering or doing anything abnormal, so you feel fine, though you could do without the blurring on the screen. When you remove the rest of the world and can only see the screen, blurring, low frame rate and input latency are the enemy. A changing frame rate, blurring and low-fps stuttering are basically completely unacceptable in VR, so G-Sync isn't going to be a VR-helping tech.

I'm not expecting a decent Oculus unit until they can get some decent 1080p (minimum) OLED screens, limited to higher-end hardware that can sustain 90fps (to stay above a 60fps minimum per screen).
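To put rough numbers on that frame budget, here's a quick back-of-the-envelope sketch in Python. The 90fps target is from the point above; the 120 degrees-per-second head-turn speed is just an assumed example figure, so treat the output as illustrative only.

```python
# Rough numbers behind the "90fps minimum, static frame rate" argument for VR.
# The head-turn speed below is an assumed example, not an official Oculus figure.

def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

def judder_error_deg(head_speed_deg_s: float, dropped_frames: int, fps: float) -> float:
    """How far (in degrees) the image lags behind the head if a frame is held
    for extra refreshes while the head keeps turning."""
    return head_speed_deg_s * dropped_frames / fps

for fps in (60, 90, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):4.1f} ms per frame")

# One dropped frame at 90fps while turning your head at 120 deg/s:
print(f"one dropped frame -> image ~{judder_error_deg(120, 1, 90):.1f} degrees behind the head")
```

A single dropped frame leaving the image over a degree behind where your head is pointing is the sort of error that's said to be very noticeable when the display fills your whole field of view, which is why a steady, high frame rate matters so much more in VR than on a desktop monitor.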
 
Yeah, that makes sense - but why remove it? I don't get that bit at all.

Surely, "as well as" would be better?

Even if they don't remove it, the parts that were plugged into the original module would be plugged into the G-Sync module, unless they make connections internally so that both can be connected at the same time, which means more cost, but it's possible.
 
What they want is OLEDs + low persistence. There is a very real reason that Carmack said "yeah, G-Sync is okay" but spent the majority of every interview talking about low persistence being the ultimate goal.

I obviously watched a different interview, because in the interview I saw Carmack said that G-Sync was an absolute game changer; he never once said it was "just OK".

He added that G-Sync will also support low-persistence modes and that he was also interested in that. It was an addition, not a put-down of what else G-Sync offers.

Fixed FPS 100% of the time is impossible: whatever you set as your lower bound, you are always going to get some dips, and when that happens, with vsync on or off, you are going to get some artefacting. In fact, testing with FPS limits, even at 59fps locked I see more artefacts than with vsync, so G-Sync would eliminate that artefacting. If you also want to aim for locked high FPS then great, but without G-Sync you are always going to get artefacts. You also want low input lag, which you won't get with double or triple buffering; G-Sync eliminates this as well.
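To put a rough number on that vsync latency, here's a toy simulation sketch in Python. All the figures are assumed for illustration (a 60Hz panel and render times that hover around 60fps with normal frame-to-frame variance), and it deliberately ignores double/triple-buffering details.

```python
# A toy simulation (not a real benchmark) of the extra wait vsync adds when
# render times don't line up with a fixed 60Hz refresh. It just measures how
# long a finished frame sits waiting for the next refresh boundary.
import random

random.seed(0)
REFRESH_MS = 1000.0 / 60          # fixed 60Hz panel
FRAMES = 10_000

# Assumed render times: roughly 60fps on average, with frame-to-frame variance.
render_ms = [random.uniform(14.0, 20.0) for _ in range(FRAMES)]

t = 0.0
waits = []
for r in render_ms:
    t += r                                            # frame finishes rendering at time t
    next_boundary = (int(t // REFRESH_MS) + 1) * REFRESH_MS
    waits.append(next_boundary - t)                   # fixed refresh: wait for the boundary

print(f"fixed 60Hz vsync : +{sum(waits)/len(waits):.1f} ms average wait before display")
print("variable refresh : +0.0 ms, the panel refreshes as soon as the frame is done")
```

It's obviously a simplification, and G-Sync still has its own minimum and maximum refresh intervals, but it shows why frames that just miss a fixed refresh boundary turn into added lag or repeated frames, and why presenting the frame the moment it's ready avoids both.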

Carmack even said in a subsequent interview that the first consumer OR would "unfortunately not" have G-Sync, but that it was definitely something they would like to add to the 4K version.
 
Even if they don't remove it, the parts that were plugged into the original module would be plugged into the G-Sync module, unless they make connections internally so that both can be connected at the same time, which means more cost, but it's possible.

Yeah, it would be possible but costly.

The monitor may well work in "fixed" mode, but the G-Sync module only supports DisplayPort, so as long as your device has DisplayPort (or an adaptor) then I guess you should be able to plug it in and have it operate in fixed-Hz mode.
 

I suppose a valid question now is: would there be any need for Mantle if you already ran G-Sync? Mantle beefs up frame rates to avoid the noticeable lows, while G-Sync smooths out those lows to be for the most part unnoticeable while giving less input lag.

Unless I'm looking at it all wrong anyhow :o
 
Hi there

Just a question, so don't shoot me guys!

But if you have a really powerful machine, like a 780 Ti or 290X, or even SLI/Crossfire, and you're constantly getting 100fps+ in games, does that not make this technology kind of redundant?

Educate me on how it works, because right now, say I have a PC with a 4770K/FX-9590 and a pair of GTX 780s or a pair of 290s attached to a 144Hz/240Hz monitor, I get no tearing issues anyway and everything is silky smooth. As such, would this technology benefit me?

To me this only seems useful for those running more mainstream gaming PCs where the minimum FPS drops quite a lot, to say sub-30fps, and as such it helps smooth things out. But if your rig is powerful and your minimum FPS is always above 60 or even 100, will it offer me any benefit?
 
I suppose a valid question now is: would there be any need for Mantle if you already ran G-Sync? Mantle beefs up frame rates to avoid the noticeable lows, while G-Sync smooths out those lows to be for the most part unnoticeable while giving less input lag.

I hope not. My brain can't handle the potential mind-boggling fallacies and stupidities of a Mantle vs G-Sync thread.
 
I suppose a valid question now is: would there be any need for Mantle if you already ran G-Sync? Mantle beefs up frame rates to avoid the noticeable lows, while G-Sync smooths out those lows to be for the most part unnoticeable while giving less input lag.

That is my take on it... NVIDIA are also saying that the ARM cores on Maxwell will more than make up for any advantage that Mantle would offer. They've also said that Maxwell will be easier to program for than Kepler, which might also mean they are coming up with their own tools or extensions that do something similar.
 