
** The Official Nvidia GeForce 'Pascal' Thread - for general gossip and discussions **

What's up with multi-GPU setups lately? In the past I've CrossFired everything from a 3870 to a 290 and SLI'd 970s and 980s with not much to grumble about, but I've been on a single Fury for quite a bit and all I seem to read about is unhappy multi-GPU users. Is this down to game devs not implementing it very well?

It was buying a G-Sync monitor's fault. I had always been happy with SLI, but seeing how smooth G-Sync was on a single GPU, I started to notice microstutter in SLI, which I hadn't previously. Not all games, it has to be said, but a good majority had it.
 
Sometimes I don't really understand what people expect to see from Pascal. The 1080 benchmarks that leaked yesterday show some great results. There is one guy on Reddit who actually ran the same test with his 980 Ti overclocked by 37% and his 5820K's six cores lowered to 3GHz. The result is only 800 points less than the 1080's, but don't forget that's the speed of a stock 1080. If we're able to push that card another 30%+, we have a monster, which will probably cost £499, is more powerful than the 980 Ti, generates more heat and consumes less energy.

The thing is that the upgrade would maybe be pointless for current 980 Ti owners until 2017, when the new Ti comes out. However, those who have a less powerful card will definitely feel the difference.
 
Sometimes I don't really understand what people expect to see from Pascal. The 1080 benchmarks that leaked yesterday show some great results. There is one guy on Reddit who actually ran the same test with his 980 Ti overclocked by 37% and his 5820K's six cores lowered to 3GHz. The result is only 800 points less than the 1080's, but don't forget that's the speed of a stock 1080. If we're able to push that card another 30%+, we have a monster, which will probably cost £499, is more powerful than the 980 Ti, generates more heat and consumes less energy.

The thing is that the upgrade would maybe be pointless for current 980 Ti owners until 2017, when the new Ti comes out. However, those who have a less powerful card will definitely feel the difference.

Same for a 980 vs a highly overclocked 780 ti.

Once you overclock the 1080 it should be a fair bit more than 3% faster; either way it's an 'upgrade' rather than an upgrade :D
 
I agree it likely won't. But why it should is pretty obvious, I would have thought, from a customer's point of view.

No, Andybird was spot on and my tiredness missed Dave2150's mistake (and Nvidia already do Adaptive V-Sync), but Pascal is pretty much based on Maxwell's architecture and I would pretty much bet money that it doesn't have Adaptive Sync capabilities. The G-Sync module was Nvidia's way of getting adaptive sync to work on their GPUs, and it allowed them to extend it all the way back to the 6 series GPUs. G-Sync works well and they can focus on that tech instead of having to deal with the mass of different monitors. Every G-Sync module is tuned to work with the panel it is in and this in turn stops any flickering or other potential issues.
 
Sorry but that's a 100% Nv brand defender mode you've slipped back into.:p

No reason they couldn't accommodate both techs, as outwith ULMB (which can be enabled manufacturer-side), believe it or not, they both do the exact same thing.

They should just go with G-Sync Lite, with full-fat G-Sync to keep the roasters happy.

FreeSync panel choice is absolutely wiping the floor with G-Sync choice, so it'll only be a matter of time anyway. :D
 
It won't and why should it?

Why wouldn't you want it? :confused: You have said many times that you want AMD to do well because it benefits the end consumer in the end... If Nvidia users were given the choice, would you not much rather have a far larger selection of monitors/panels to choose from, not to mention generally much cheaper ones?

Since you mentioned your interest in a 34" 1440, the FreeSync version is currently £350 cheaper than the G-Sync equivalent. IMO the G-Sync version is not worth the extra, considering the only difference will be better motion clarity (which comes at the possible cost of coil whine and scanlines, which can be somewhat fixed but requires a bit of faff). To get that benefit you will need to be pushing 75+ FPS, which is pretty much a no-go in most new AAA games on any current single GPU unless you are OK with dropping settings.

If Pascal has DP 1.3 then that is all that is needed to make use of adaptive sync, AFAIK. As I have posted before, the only ways Nvidia can avoid this are:

1. Don't include DP 1.3, but that is going to cut off a lot of future monitors and the likes of 4K with 100+Hz refresh rates.
2. They get the monitor manufacturers to disable it on the firmware side (unlikely, especially when Intel will have support for it).
3. They disable it in their own drivers, and I wouldn't be surprised if some kid re-enabled it via a hack.

Every G-Sync module is tuned to work with the panel it is in and this in turn stops any flickering or other potential issues.

I haven't really seen many reports from FreeSync users having any problems (any problems are usually down to the monitor, i.e. backlight bleed). In fact, the only issue I have read about lately is that "some" have problems with the X34 FreeSync flickering when using FreeSync + 1440 + 75Hz + 8-bit colour depth; when dropping to 6-bit the flickering is gone. But like I said, only a few have reported this, so chances are something is wrong on their end, otherwise all AMD users would be reporting it.

EDIT:

And yup inb4 the comments of gsync being "superior" arrive:

https://forums.overclockers.co.uk/showpost.php?p=29453135&postcount=618
 
Same for a 980 vs a highly overclocked 780 ti.

Once you overclock the 1080 it should be a fair bit more than 3% faster; either way it's an 'upgrade' rather than an upgrade :D

I think you misunderstood me: the 1080 is 3% faster than a 980 Ti which is already overclocked by 37%. In total that's roughly 40% better performance. Of course those Pascal tests might be fake, but there are only 12 hours left until we know for sure.
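Quick back-of-the-envelope check of where that ~40% comes from, just the percentages quoted in this thread plugged into a few lines of Python (illustrative only, not real benchmark data):

# rough illustration using the figures quoted above, nothing official
oc_980ti_vs_stock = 1.37      # 980 Ti overclocked by 37% over stock
gtx1080_vs_oc_980ti = 1.03    # stock 1080 roughly 3% ahead of that overclocked card

gtx1080_vs_stock_980ti = oc_980ti_vs_stock * gtx1080_vs_oc_980ti
print(f"Stock 1080 vs stock 980 Ti: ~{(gtx1080_vs_stock_980ti - 1) * 100:.0f}% faster")
# prints ~41%, i.e. the "40% better performance" figure being thrown around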
 
It won't and why should it?

I feel like I'm explaining the obvious here... If Nvidia decide to adopt open standards, such as adaptive sync (aka FreeSync), it's nothing but a big bonus for us consumers. More choice, more freedom.

I own both a FreeSync and a G-Sync setup, something I don't believe you can say, Gregster, so I know first-hand how good both technologies are. I do prefer the Nvidia implementation, as I prefer being able to run G-Sync in fullscreen windowed mode (something FreeSync cannot do), but both technologies are a game changer.

I've seen how well my Asus RoG Skylake/980M laptop runs games, effectively matching or beating my 390X/Skylake desktop system in many GameWorks titles, while running much cooler and quieter.

I'm quite keen to jump on the Pascal bandwagon, and will likely do so regardless of whether Nvidia adopt adaptive sync (FreeSync), but I'd prefer to keep my BenQ 2730Z (1440p FreeSync monitor), though I'm prepared to sell it and get a G-Sync alternative should Nvidia stick with G-Sync.
 
So a 40% increase in 3DMark score between the 1080 and the 980 Ti. I wonder what the real-world bump will be and how the 1080 overclocks. I'll probably still wait for the Ti, which could turn out to be a monster!

Although I do have a week out of base coming up, so that money could burn a hole in my pocket...
 
Dave2150, you don't need to convince me; maybe tell Tom Petersen this. He is the one who needs "explaining the obvious", obviously!

Huh? But you just said "why should it". Quite a few explained why it should. It would be better for us all, no? Maybe not for Nvidia's bottom line in terms of selling G-Sync modules, but I did not know you worked for Nvidia; I thought you were a consumer like us and would prefer more options. My bad :)
 
So a 40% increase in 3DMark score between the 1080 and the 980 Ti. I wonder what the real-world bump will be and how the 1080 overclocks. I'll probably still wait for the Ti, which could turn out to be a monster!

Although I do have a week out of base coming up, so that money could burn a hole in my pocket...

This looks impressive for a mid-range card in my opinion, and it may well overclock further.
 