** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

I think in 10-12 months' time, DirectX 11 will be used only by smaller game developers and indies.
I really don't think DX12 is going to catch on like some people think it will. Not until we get engines really built for it and improved tools that reduce much of the dirty work.

Until then, I expect lots of optional DX12-port versions that have minimal advantages except for those with really unbalanced setups (CPU vs GPU).

I think it's still something worth considering going forward when you're buying a new GPU, but unless you're looking for something to last you 3-5 years, I'd say that it probably shouldn't be top priority. As in, don't pick a card that performs worse outside of DX12 just because you think DX12 will quickly take over, because I think that will end up being a mistake. I'd say by the time that DX12/Vulkan starts to become 'the norm', any 2016 card will probably be fairly long in the tooth.
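To make the 'dirty work' point concrete, here's a rough C++ sketch (purely illustrative, not lifted from any actual engine) of the explicit setup D3D12 expects before anything is drawn; under DX11 the runtime and driver handled most of this implicitly.

```cpp
// Illustrative D3D12 setup only: the application now creates and manages
// objects (queues, allocators, fences) that DX11 handled behind the scenes.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool InitD3D12Sketch()
{
    // Explicit device creation on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    // A command queue: submitting GPU work is now the app's job.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    if (FAILED(device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue))))
        return false;

    // A command allocator: the app records its own command lists.
    ComPtr<ID3D12CommandAllocator> allocator;
    if (FAILED(device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                              IID_PPV_ARGS(&allocator))))
        return false;

    // A fence: CPU/GPU synchronisation is also the app's problem now.
    ComPtr<ID3D12Fence> fence;
    return SUCCEEDED(device->CreateFence(0, D3D12_FENCE_FLAG_NONE,
                                         IID_PPV_ARGS(&fence)));
}
```

None of that boilerplate makes a frame render faster on its own, which is why a DX11 engine with DX12 bolted on tends to gain little; the wins only show up once the engine is restructured around recording and submitting work itself.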
 
I do hope those bench results are correct and there is some overclocking headroom. It would be sweet to see 2GHz clocks on air :eek:


You do realise the architecture would be a failure if it had to have a 2GHz clock speed just to compete with a Polaris at 1GHz?

I was expecting new architectures for the next gen, not an old one clocked higher. If the 14nm process is only used to achieve higher clock speeds then even AMD could have rehashed their Fiji core and clocked it to 1500MHz to effectively give a 50% improvement over the current cards.
 
Agreed, Sean. DX12 needs a brand new engine built around it, one which takes advantage of all of DX12's goodies instead of slapping DX12 on as a wrapper.

You do realise the architecture would be a failure if it had to have a 2GHz clock speed just to compete with a Polaris at 1GHz?

I was expecting new architectures for the next gen, not an old one clocked higher. If the 14nm process is only used to achieve higher clock speeds then even AMD could have rehashed their Fiji core and clocked it to 1500MHz to effectively give a 50% improvement over the current cards.

A failure? Those could be the standard clocks for the new nodes, and Polaris could well be clocked high off the bat. Don't put too much stock into what clocks you see either, as it is a die shrink and doesn't mean anything till we see what is what. Fiji is evidence of that, with it being an overclocker's dream :p
 
Remind me again, who does Tom work for and what is his job title? Ah yes, he is the Director of Technical Marketing at Nvidia... ;)

Sure, choice is good, but I would rather they optimise G-Sync (which they do), and they know what each panel is tuned to for their GPUs. More doesn't always mean better. My ROG Swift allows G-Sync to work all the way down to 1fps (not that I would want to game at that framerate), and I'm not sure there are any Adaptive Sync panels that can do this. Some of the FreeSync monitors' ranges are quite poor as well, with a tiny, tiny window in which to allow FreeSync to operate. Frames need to be capped as well via external software to keep you within the FreeSync range. It just starts adding so many variables; I would rather they concentrate on and dictate which panels get G-Sync modules so the consumer gets the best possible experience.

Except it has been proven by various "fair" tests from knowledgeable experts, e.g. PCMonitors, TFTCentral, AnandTech etc., that there is no real noticeable difference between the two sync techs.

No. ;) They are both equally effective at eliminating tearing, juddering and stuttering from the traditional refresh rate and frame rate mismatches. I've seen some fairly good pixel overdrive implementations for both FreeSync and G-SYNC models and also some not so good ones. And if there are any latency differences, they're certainly beyond my sensitivity for that sort of thing.

The FreeSync range depends on the monitor used, and either way, according to PCM2, it really doesn't matter since AMD have LFC now. As I have said many times before, you really don't want to be dipping below 40fps anyway; G-Sync/FreeSync is not the silver bullet many make it out to be.

Low frame rates are low frame rates regardless. You really don't want to be getting under 56fps or anywhere near that on a 144Hz monitor; it feels and looks extremely sluggish. And besides, now that AMD has LFC, frame rates below the hardware floor (56Hz/56fps) are suitably compensated for to remove stuttering and tearing. It's very much a non-issue really.

Before someone links the TomsHardware article: they compared a FreeSync monitor with a completely different sync range, and back then AMD/FreeSync didn't have low framerate compensation either.

Also, no, you don't need external software to cap FPS; you can set a global FPS cap via AMD Crimson, which goes from 30 to 200.
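For anyone wondering what LFC actually does, here's a minimal sketch of the idea (made-up panel numbers, not AMD's actual driver code): when the frame rate drops below the panel's minimum refresh, the driver repeats each frame enough times that the refresh rate handed to the panel lands back inside its variable refresh window.

```cpp
#include <cmath>

// Hypothetical panel VRR window, e.g. a 48-144Hz FreeSync monitor.
struct VrrRange {
    double min_hz;
    double max_hz;
};

// Choose how many times to repeat the current frame so the effective
// refresh rate lands back inside the panel's supported window.
int LfcRepeatCount(double frame_rate_hz, const VrrRange& panel)
{
    if (frame_rate_hz >= panel.min_hz)
        return 1; // already inside the window, no compensation needed

    int repeats = static_cast<int>(std::ceil(panel.min_hz / frame_rate_hz));

    // Never push the panel past its maximum refresh rate.
    while (repeats > 1 && frame_rate_hz * repeats > panel.max_hz)
        --repeats;

    return repeats;
}

// Example: on a 48-144Hz panel with the game at 30fps, each frame is
// shown twice, so the panel refreshes at 60Hz and stays tear-free.
```

That is also why LFC needs the panel's maximum refresh to be comfortably above its minimum (roughly 2.5x in practice); on a monitor with a narrow FreeSync range there may be no repeat count that fits, which ties back to the point about some ranges being quite poor.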

Generally more is better, as it means there is more competition and thus more competitive prices. If you want G-Sync monitors, you are pretty limited in who you can choose from, and the only real players invested in G-Sync at the moment are Acer and Asus, who aren't exactly well regarded for their QC and customer service. As shown by the Acer and Asus 27" 1440p 144Hz+ IPS panels, they are plagued with bleed and dead pixel issues, so just because a monitor has G-Sync does not mean the customer is going to get a good quality unit "overall".

Either way, there is still no reason why nvidia can't do both.
 
You didn't read my response to you previously, or did you just miss it? Sure, choice is good, but I would rather they optimise G-Sync (which they do), and they know what each panel is tuned to for their GPUs. More doesn't always mean better. My ROG Swift allows G-Sync to work all the way down to 1fps (not that I would want to game at that framerate), and I'm not sure there are any Adaptive Sync panels that can do this. Some of the FreeSync monitors' ranges are quite poor as well, with a tiny, tiny window in which to allow FreeSync to operate. Frames need to be capped as well via external software to keep you within the FreeSync range. It just starts adding so many variables; I would rather they concentrate on and dictate which panels get G-Sync modules so the consumer gets the best possible experience.

I assume you're speaking from experience of using a FreeSync panel, Gregster?

Oh wait, you haven't. You're just assuming, since you don't own a FreeSync setup.

NVIDIA adopting Adaptive Sync/FreeSync is nothing but a win-win for consumers. You get to keep your G-Sync panels and enjoy the closed ecosystem that entails, while those with FreeSync panels can also use variable refresh rates on NVIDIA GPUs.

Literally no-one loses with this idea - saying you'd prefer NVIDIA didn't adopt FreeSync is an opinion that I, and I'd hope the majority, don't agree with.

If your review channel is to be a success, you need to think of the broader picture, Greg.
 
You didn't read my response to you previously, or did you just miss it? Sure, choice is good, but I would rather they optimise G-Sync (which they do), and they know what each panel is tuned to for their GPUs. More doesn't always mean better. My ROG Swift allows G-Sync to work all the way down to 1fps (not that I would want to game at that framerate), and I'm not sure there are any Adaptive Sync panels that can do this. Some of the FreeSync monitors' ranges are quite poor as well, with a tiny, tiny window in which to allow FreeSync to operate. Frames need to be capped as well via external software to keep you within the FreeSync range. It just starts adding so many variables; I would rather they concentrate on and dictate which panels get G-Sync modules so the consumer gets the best possible experience.

They are a big enough company, and most of the work for G-Sync is done, as far as I understand. There are not that many G-Sync panels for them to have to concentrate on anyway.

Is it not that they make G-Sync and sell the module to monitor makers, which then do the optimisation needed themselves? I would have assumed they don't spend much time, if any at all, optimising each line of monitors.

Either way, I am not seeing how enabling their cards to work with FreeSync would get in the way of their G-Sync work much. But that is just me. FreeSync would be the standard, and they could then advertise G-Sync as a premium.

Literally no-one loses with this idea - saying you'd prefer NVIDIA didn't adopt FreeSync is an opinion that I, and I'd hope the majority, don't agree with.

If your review channel is to be a success, you need to think of the broader picture, Greg.

Well said. What he is saying makes no sense to me, tbh.
 
Yup, and in the meantime us owners will be enjoying it to its fullest :cool:

I love using Gsync on my laptop, just as much as I love using Freesync on my desktop.

I should add though that I do slightly prefer Gsync, simply because it supports windowed mode and fullscreen windowed mode, whereas freesync only supports exclusive fullscreen mode.
 
G-Sync will go the way of Nvidia 3D Vision; it is only a matter of time.

Seeing as monitors are probably my longest-serving component, I'm happy to have 144Hz G-Sync for however many years I feel like it.

We aren't talking about GPUs here that are obsolete after 12 months; a good quality 1440p 144Hz G-Sync monitor is going to be as good in 3/4/5 years as the day I bought it, at which point, should G-Sync have gone EOL, I shall still feel like I've had sufficient value out of it.
 
I love using Gsync on my laptop, just as much as I love using Freesync on my desktop.

I should add though that I do slightly prefer Gsync, simply because it supports windowed mode and fullscreen windowed mode, whereas freesync only supports exclusive fullscreen mode.

Can somebody please explain why you would want to play a game in Fullscreen Windowed mode? I certainly don't... LOL :p

Maybe it's just me, but if by accident I hit windowed mode in a game's settings and go in to play the game, it really ****es me off that I did it. It looks crap... IMHO of course. :)
 
Can somebody please explain why you would want to play a game in Fullscreen Windowed mode? I certainly don't... LOL :p

For now it's the only option available for UWP games, isn't it? If Microsoft are against exclusive fullscreen, you don't really want to be dependent on it...
 
Can somebody please explain why you would want to play a game in Fullscreen Windowed mode? I certainly don't... LOL :p

Maybe it's just me, but if by accident I hit windowed mode in a game's settings and go in to play the game, it really ****es me off that I did it. It looks crap... IMHO of course. :)

Some games use borderless window mode and/or non-exclusive fullscreen mode without any option for fully exclusive fullscreen (even though sometimes it just shows as the fullscreen option), which FreeSync won't work with.

For some games, e.g. MMOs, some people like to use windowed/non-exclusive fullscreen modes so they can run helper programs (stats, game guides or maps, voice comms, etc., not cheats) alongside, either on screen at the same time or so they can alt-tab quickly.
 
Some games use borderless window mode and/or non-exclusive fullscreen mode without any option for fully exclusive fullscreen (even though sometimes it just shows as the fullscreen option), which FreeSync won't work with.

For some games, e.g. MMOs, some people like to use windowed/non-exclusive fullscreen modes so they can run helper programs (stats, game guides or maps, voice comms, etc., not cheats) alongside, either on screen at the same time or so they can alt-tab quickly.

Maybe this is something AMD should enable themselves.
 
Maybe this is something AMD should enable themselves.

Not sure what the status is with Windows 10, but it's a failing of Windows/WDDM really - nVidia works around it with some nasty hacks, partly assisted by their FPGA in the monitor, which FreeSync lacks (not sure if the hardware is entirely essential, but you need some extra buffers I believe).

Overly simplified: in an exclusive fullscreen mode only one thing is writing to the screen, whereas in a windowed-type mode potentially a lot of things could be writing to the screen, all on different timings, which means you have to separate out the application in focus.
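Purely as a conceptual sketch of that point (nothing like the real WDDM/DWM internals, and all the names below are made up): in exclusive fullscreen every frame the game finishes can become a refresh on the game's own timing, whereas in a windowed mode lots of windows submit frames on unrelated timings, and something has to pick out just the focused game's frames (and buffer everything else) before variable refresh can follow it.

```cpp
#include <vector>

// A stand-in for "some window finished a new frame at this time".
struct Submission {
    int window_id;
    double time_ms;
};

// Exclusive fullscreen: one producer, so every submission can be turned
// straight into a refresh on the game's own timing.
std::vector<double> RefreshTimesExclusive(const std::vector<Submission>& game_frames)
{
    std::vector<double> refreshes;
    for (const Submission& s : game_frames)
        refreshes.push_back(s.time_ms);
    return refreshes;
}

// Windowed: many producers on unrelated timings. To drive variable refresh
// from just the focused game, something has to filter its submissions out
// and buffer everyone else's - the extra work (and extra buffers) mentioned
// above.
std::vector<double> RefreshTimesFocused(const std::vector<Submission>& all_frames,
                                        int focused_window_id)
{
    std::vector<double> refreshes;
    for (const Submission& s : all_frames)
        if (s.window_id == focused_window_id)
            refreshes.push_back(s.time_ms);
    return refreshes;
}
```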
 