
Is DisplayPort 1.3 coming with this next gen?

Nvidia will sell the benefits of their product for as long as they can. They'll talk about G-Sync being a superior product etc. When FreeSync truly matches it, then they are stuffed though. It will be hard to justify not adopting Adaptive Sync. Besides, they could win over AMD users if those users' monitors were also compatible with Nvidia GPUs.
 
There is precisely nothing about Nvidia 3D screens that requires Nvidia hardware or exclusivity. The 3D modes supported by Nvidia are normal screens that pay not to be blacklisted in the driver; this is what Nvidia has done forever. They did it with SLI chipsets for donkey's years, they do it with SLI-branded PSUs, they locked out Ageia cards, and they lock out GPU PhysX if an AMD GPU is present. It's all a case of simply not allowing the driver to use a certain feature if Nvidia haven't been paid, OR if Nvidia just don't want you to have it (PhysX on an Nvidia GPU you have paid for, if an AMD card is present).

I've said since the start that there will be an open standard, and Nvidia will eventually use non-Nvidia chips and go with the open standard. They will still charge a licensing fee and still call it G-Sync, both increasing the cost and trying to lock customers to G-Sync screens.
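To make that gating mechanism concrete, here is a minimal sketch of the pay-to-not-be-blacklisted pattern described above. Every device ID, function name and the whitelist itself are invented purely for illustration; this is not any vendor's real driver code:

```c
/* Hypothetical sketch of driver-side feature gating: a feature is
 * enabled only if the detected hardware is on a "licensed" whitelist.
 * All IDs and names are invented for illustration. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Imaginary whitelist of paid-up device IDs baked into the driver. */
static const uint16_t licensed_ids[] = { 0x1B80, 0x1C02, 0x1D01 };

static bool is_whitelisted(uint16_t device_id)
{
    for (size_t i = 0; i < sizeof licensed_ids / sizeof licensed_ids[0]; i++)
        if (licensed_ids[i] == device_id)
            return true;
    return false;
}

/* The feature is refused outright if a competitor GPU is present,
 * mirroring the GPU-PhysX lockout described in the post. */
static bool feature_allowed(uint16_t device_id, bool competitor_gpu_present)
{
    if (competitor_gpu_present)
        return false;
    return is_whitelisted(device_id);
}

int main(void)
{
    printf("0x1B80, no AMD card: %d\n", feature_allowed(0x1B80, false)); /* 1 */
    printf("0x1B80, AMD present: %d\n", feature_allowed(0x1B80, true));  /* 0 */
    return 0;
}
```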
 
One of the big problems is that AMD have actually been rather clever, maybe not intentionally, but clever all the same. Does anyone see any Adaptive Sync monitors advertised, or just ones labelled as FreeSync monitors? So even if Nvidia decided to support Adaptive Sync, in the public's perception they would be supporting an AMD product.
Now I do feel they will support DP 1.3, but there will be a bit of a hoo-ha about it being Adaptive Sync capable and not FreeSync capable, which of course technically would be correct. In fact I'm fairly sure that, contrary to some opinions, the current Maxwell chips are already Adaptive Sync capable, as they use it for G-Sync in laptops, and one of the laptop reviewers took their review sample apart and found no extra hardware in the graphics pipeline; therefore the chips must be capable.
 
Aren't Nvidia already using adaptive sync to bring a form of G-Sync to laptops?


In mobile graphics it's a requirement to support panel self refresh and have a frame buffer if you are using embedded DisplayPort (eDP) 1.3. That's why the mobile chips from Nvidia can use sync technology without any extra hardware. It's also why all AMD GCN APUs support adaptive sync, but not all GCN discrete cards do.

And the reviewer wouldn't have found any different hardware than is in any other laptop that supports eDP 1.3, because they all have to conform to the same specification.

Just one more thing: Nvidia can support DisplayPort 1.3 without installing the hardware needed to connect to an adaptive sync monitor.
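For the curious, here is a rough sketch of how a driver can tell whether a sink is actually capable of this. The register addresses come from the DisplayPort/eDP specs (they match the defines in Linux's drm_dp_helper.h), but dpcd_read() here is just a canned stub standing in for a real AUX-channel read:

```c
/* Minimal sketch, not any vendor's real driver code: probing a sink's
 * DPCD capability registers to see whether variable refresh is possible. */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define DP_DOWN_STREAM_PORT_COUNT 0x0007
#define DP_MSA_TIMING_PAR_IGNORED (1 << 6) /* sink tolerates variable vblank */
#define DP_PSR_SUPPORT            0x0070   /* eDP panel self refresh version */

/* Stub AUX read with canned values: pretend we're talking to an eDP 1.3
 * panel that has PSR and its own frame buffer, per the spec requirement. */
static uint8_t dpcd_read(uint16_t address)
{
    switch (address) {
    case DP_DOWN_STREAM_PORT_COUNT: return DP_MSA_TIMING_PAR_IGNORED;
    case DP_PSR_SUPPORT:            return 0x02; /* PSR2-class panel */
    default:                        return 0x00;
    }
}

int main(void)
{
    /* Adaptive sync needs a sink that can ignore fixed MSA timing,
     * i.e. a timing controller happy with a stretched vertical blank. */
    bool vrr = dpcd_read(DP_DOWN_STREAM_PORT_COUNT) & DP_MSA_TIMING_PAR_IGNORED;

    /* Nonzero PSR support means the panel carries its own frame buffer,
     * which is why eDP laptops need no extra G-Sync module. */
    bool psr = dpcd_read(DP_PSR_SUPPORT) != 0;

    printf("variable refresh possible: %s\n", vrr ? "yes" : "no");
    printf("panel self refresh:        %s\n", psr ? "yes" : "no");
    return 0;
}
```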
 

You do understand that the mobile chips used are the same as the desktop ones, just running in different configurations, so if one supports the feature then so does the other.
 

You do understand that mobile chips need a timing controller and frame buffer to comply with the embedded DisplayPort standard.

These were introduced as a power-saving measure in the eDP spec for use in laptops and APUs.

They are not in desktop GPUs, as battery life is not something desktop GPUs have to worry about.

Sync tech won't work without a timing controller, a frame buffer and an appropriate scaler. With Nvidia, the G-Sync module is all of those things. The module is not needed in laptops because the controller and buffer are part of the specification for eDP.
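As a rough illustration of the timing decision that controller makes, here is a small sketch assuming a hypothetical 40-144 Hz panel; the numbers are illustrative, not from any real hardware:

```c
/* Sketch of the core VRR timing decision: hold the panel in vertical
 * blanking until the next frame arrives, clamped to the panel's
 * supported refresh window. */
#include <stdio.h>

/* Hypothetical 40-144 Hz panel: refresh period bounds in milliseconds. */
static const double MIN_PERIOD_MS = 1000.0 / 144.0; /* ~6.94 ms */
static const double MAX_PERIOD_MS = 1000.0 / 40.0;  /*  25.0 ms */

/* Given how long the GPU took to produce a frame, pick the scanout
 * period: track the frame exactly while inside the panel's window. */
static double scanout_period_ms(double frame_time_ms)
{
    if (frame_time_ms < MIN_PERIOD_MS)
        return MIN_PERIOD_MS;  /* faster than the panel max: wait */
    if (frame_time_ms > MAX_PERIOD_MS)
        return MAX_PERIOD_MS;  /* too slow: the frame repeats, or the
                                  panel self-refreshes from its buffer */
    return frame_time_ms;      /* extend vblank to match the frame */
}

int main(void)
{
    double samples[] = { 4.0, 10.0, 30.0 }; /* ms per frame */
    for (int i = 0; i < 3; i++)
        printf("frame %.1f ms -> scanout every %.2f ms (%.1f Hz)\n",
               samples[i], scanout_period_ms(samples[i]),
               1000.0 / scanout_period_ms(samples[i]));
    return 0;
}
```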
 
So if I understand what you are saying, the timing controller, frame buffer and appropriate scaler are all separate from the actual GPU? In which case none of the AMD or Nvidia GPUs are actually Adaptive Sync capable, as they all need outside parts to work with VRR.
 

That has always been the case with everything. If you don't have an HDMI port you don't support HDMI. If you don't have an HDMI 1.1 port you don't support HDMI 1.1. If you think about it, all the connectors are outside parts used to work with a monitor; they aren't part of the GPU itself.

The idea came from VRR in laptops. It's not really a GPU tech, it's a connection tech (DisplayPort/HDMI).
 

I'm sure it is. Adaptive-Sync has been in laptops for a couple of years now. Its main use was power saving; now it's used for gaming etc.
http://www.anandtech.com/show/9303/nvidia-launches-mobile-gsync
From a technical/implementation perspective, because desktop systems can be hooked to any monitor, desktop G-Sync originally required that NVIDIA implement a separate module - the G-Sync module - to be put into the display and to serve as an enhanced scaler. For a desktop monitor this is not a big deal, particularly since it was outright needed in 2013 when G-Sync was first introduced. However with laptops come new challenges and new technologies, and that means a lot of the implementation underpinnings are changing with the announcement of Mobile G-Sync today.

With embedded DisplayPort (eDP) now being a common fixture in high-end notebooks these days, NVIDIA will be able to do away with the G-Sync module entirely and rely just on the variable timing and panel self-refresh functionality built in to current versions of eDP. eDP's variable timing functionality was of course the basis of desktop DisplayPort Adaptive-Sync (along with AMD's Freesync implementation), and while the technology is a bit different in laptops, the end result is quite similar. Which is to say that NVIDIA will be able to drive variable refresh laptops entirely with standardized eDP features, and will not be relying on proprietary features or hardware as they do with desktop G-Sync.
 
I think the main advantage with DP 1.3 is 144 Hz at higher resolutions, with 10-bit panel support, or HDR as they like to call it. Although we will only get real HDR screens once emissive screens reach retail. Sony do an OLED screen, but it's expensive as hell and only supports 10-bit HDR over HD-SDI connections.
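A quick back-of-envelope check of that bandwidth claim. DP 1.3's HBR3 runs 8.1 Gbit/s per lane over 4 lanes, and 8b/10b coding leaves about 25.92 Gbit/s for pixels; the blanking figures below are rough estimates, not exact spec timings:

```c
/* Rough check of which modes fit in the DP 1.3 link budget. */
#include <stdio.h>

#define DP13_EFFECTIVE_GBPS (8.1 * 4 * 8.0 / 10.0) /* 25.92 after 8b/10b */

static double required_gbps(int w, int h, double hz, int bits_per_px)
{
    /* Crude blanking model: ~80 extra pixels/line, ~60 extra lines. */
    double pixel_clock = (double)(w + 80) * (h + 60) * hz;
    return pixel_clock * bits_per_px / 1e9;
}

int main(void)
{
    printf("link budget: %.2f Gbit/s\n\n", DP13_EFFECTIVE_GBPS);
    printf("1440p 144Hz 10-bit: %.1f Gbit/s\n",
           required_gbps(2560, 1440, 144, 30)); /* ~17.1, fits comfortably */
    printf("4K    144Hz 10-bit: %.1f Gbit/s\n",
           required_gbps(3840, 2160, 144, 30)); /* ~37.6, exceeds the link */
    printf("4K    120Hz  8-bit: %.1f Gbit/s\n",
           required_gbps(3840, 2160, 120, 24)); /* ~25.1, just squeezes in */
    return 0;
}
```

So by this rough maths, 144 Hz with 10-bit colour fits at 1440p, while 4K144 at 10-bit is beyond what DP 1.3 can carry uncompressed.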
 