
Intel plans to support FreeSync.

If Freesync is possible on the 9 series, I am sure some clever chap will have it working soon enough. And I do mean Freesync :D

It would never be freesync.
So, it'll never work.

If the 9XX series was capable, it's still down to Nvidia to actually create a software solution.
 
If Freesync is possible on the 9 series, I am sure some clever chap will have it working soon enough. And I do mean Freesync :D
Didn't the guy that made the crack for making PhysX to work with ATI/AMD card as the primary card and Nvidia card as PhysX card get an official warning/slap on the wrist from Nvidia?
 
With Haswell and Ivy Bridge, under some conditions a 20 quid air cooler will cool pretty much the same as several hundred quid's worth of custom watercooling. Stress tests like Prime95/IBT should never be used. Even tasks like rendering or encoding result in very high temps. The chip in my sig at 4.7GHz will easily hit the mid-to-high 70s in games, cooled by a K2 dual tower, which is comparable to some AIO units.

Can't be any worse than what i'm having to contend with now....

I had a Seidon 120V on the FX-9590 and under IBT it was pushing 65c on the cores and 65c to 70c on the socket at 4.7GHz.

Since then I have put a Kraken X31 on it, wow, IBT 48c on the cores... but still 65c to 70c on the socket.

Vishera is not difficult to cool, soldered heatspreader etc.... but one problem remains: the power consumption. It heats up the motherboard's VRMs, and it doesn't matter how capable those VRMs are. On my Sabertooth they have no trouble dealing with well over 200 watts, but the heat from that remains and it has to go somewhere; that somewhere happens to be the socket. No matter how good the chip and the cooler are at keeping things cool, the heat bleeding off the VRMs cooks everything.

I like the Vishera chip; for me even the performance is good, I don't play silly DX9 MMOs.

But trying to manage the physical heat output, frankly I have had enough of it. I'm coming round to an i5 just because its heat output is far easier to manage; even if the numbers are high, it's not actually generating as much heat.
 
If Freesync is possible on the 9 series, I am sure some clever chap will have it working soon enough.

If the 9XX series was capable, it's still down to Nvidia to actually create a software solution.

Nvidia wouldn't do it; it would go against everything they have said in the G-Sync/FreeSync PR wars, and that would be chaos in here :eek::p. If it was hacked, OcUK Armageddon!!! :eek::eek::p

Didn't the guy that made the crack for making PhysX to work with ATI/AMD card as the primary card and Nvidia card as PhysX card get an official warning/slap on the wrist from Nvidia?

He's no longer with us, Nvidia hired Agent 47.:D
 
If NV properly complied with DP1.2 and VESA then it would be fine - and NV already support it on mobile.

They do properly comply with DP1.2/1.3 and VESA; they are on the VESA Board of Directors. The Adaptive-Sync part of the specification is optional.

Nvidia can't support adaptive sync because desktop GPUs don't have the hardware needed. AMD started putting the hardware into their GPUs while developing the 290 series cards.
 
They do properly comply with DP1.2/1.3 and VESA; they are on the VESA Board of Directors. The Adaptive-Sync part of the specification is optional.

Nvidia can't support adaptive sync because desktop GPUs don't have the hardware needed. AMD started putting the hardware into their GPUs while developing the 290 series cards.

Then how do the mobile parts do it? NV could use Freesync but won't, because they want money from selling G-Sync licences.
 
Nvidia can't support adaptive sync because desktop GPUs don't have the hardware needed. AMD started putting the hardware into their GPUs while developing the 290 series cards.

I believed this was the case also, but as has been said the mobile version of G-Sync actually does variable refresh without the G-Sync module, so this must be Nvidia's own implementation of Adaptive-Sync.
 
OMG NVIDIA SHOULD ADOPT OPEN STANDARDS!!!!!1

If it means that nVidia continue to release new technology a year before the open standard - I'd rather they did not.

They can stick with their closed standard (read: business sense) for all I care.
 
Childish lol. If being childish means having business sense then sure.

Besides the majority of us here buying their products are still childish, we sit on a forum arguing about electronics and playing video games.

Does it benefit us as the consumer? No. But at least use your brain a little to see why they do it and how it's benefiting nVidia. If people really didn't like it they wouldn't buy their products.

This dumb argument again.

"IT'S A BIZNIZ YOU DON'T UNDERSTAAAAAAAAAAAAAAAND"

Bombing schools makes sense to ISIS, do I waive my right to complain about it, no.

Think like a person not a slave.

And to anyone who wants to whinge about the ISIS thing, pick your own analogy. Something involving cars no doubt.
 
Looking at sales figures over the last 10 years there does appear to be a significant number of consumers to whom that does apply. I like the way you've used rolleyes as though everyone is considering switching to AMD. :rolleyes: :D
I don't know about you, but paying more money and being happy with freedom of choice being taken away is pretty masochistic :p
 
I don't know about you, but paying more money and being happy with freedom of choice being taken away is pretty masochistic :p

I didn't say I didn't want freedom of choice, but buying a G-Sync monitor is not losing freedom of choice, because you can always sell it and buy something else, or use it as a non-G-Sync monitor... buying a Freesync monitor is the same; it doesn't work with Nvidia cards as a *sync monitor...

the way things are now AMD aren't really a choice anyway, we effectively already have a monopoly
Intel supporting adaptive sync doesn't suddenly make them a choice for 1440p@144Hz gaming, does it, and Skylake doesn't support it, so it will be at least the next gen before they do... it might force Nvidia to support it as well, but we are probably two years away from that happening, as Nvidia can keep making hay in the meantime with options that no one else can manage on adaptive sync.

If adaptive sync can only manage 75Hz and G-Sync can do 100Hz on the same panel, I can see G-Sync persisting until the scaler makers actually pull their finger out and improve.
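The scaler-range point can be sketched as a toy model: inside the panel's variable-refresh window the refresh rate tracks the frame rate, and outside it the scaler falls back to the nearest rate it supports. The function name and the window limits below are illustrative only (a 75Hz ceiling for an early Adaptive-Sync scaler versus a wider window on the same panel), not taken from any spec:

```python
def refresh_hz(frame_hz, vrr_min=48, vrr_max=75):
    """Clamp a requested frame rate into the panel's VRR window.

    Inside the window the panel refreshes in step with the GPU; outside
    it the scaler falls back to the nearest supported rate, where tearing
    or stutter can reappear unless the driver compensates.
    """
    return max(vrr_min, min(frame_hz, vrr_max))

# 90fps on a hypothetical 48-75Hz Adaptive-Sync panel: capped at 75Hz.
print(refresh_hz(90))            # 75
# The same frame rate inside a wider 30-100Hz window: matched exactly.
print(refresh_hz(90, 30, 100))   # 90
```

On this simple view, the wider the window the scaler can drive, the less often the panel drops out of sync with the GPU.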
 
For users, it only makes sense if they don't have any intent of switching to AMD ever :rolleyes:

I didn't say I didn't want freedom of choice, but buying a G-Sync monitor is not losing freedom of choice, because you can always sell it and buy something else, or use it as a non-G-Sync monitor... buying a Freesync monitor is the same; it doesn't work with Nvidia cards as a *sync monitor... Intel supporting adaptive sync doesn't suddenly make them a choice for 1440p@144Hz gaming, does it

the way things are now AMD aren't really a choice anyway, we effectively already have a monopoly

Marine isn't taking into account the fact that people are not tied down; they can sell a ROG Swift monitor and buy a Freesync monitor (as an example) with the sale funds if they wanted to switch sides.

And agreed Andy, people who only use Intel's IGPU won't seriously be considering buying an A-Sync monitor for gaming. It looks good on paper, with Intel claiming A-Sync support, and it's a good selling point, but truthfully a PC gamer will use a dedicated GPU.
 
Marine isn't taking into account the fact that people are not tied down; they can sell a ROG Swift monitor and buy a Freesync monitor (as an example) with the sale funds if they wanted to switch sides.
I don't think it's even worth taking into account due to the hassle and impracticality, plus it makes so little financial sense.

Not being put in jail, and spending money (i.e. making a loss selling off old gear) to bail oneself out of jail, are not the same thing.
 
I don't think it's even worth taking into account due to the hassle and impracticality, plus it makes so little financial sense.

Not being put in jail, and spending money (i.e. making a loss selling off old gear) to bail oneself out of jail, are not the same thing.

Well, I have sold monitors with no problem, but you are looking at it with blinkers on, so there's not much else I can say really.

As for your second paragraph, I have no idea what jail has to do with it, and I am fully aware (as anyone should be) that selling hardware will make you a loss, not a profit; I have sold lots of hardware over the years and put that money towards new hardware.
 
As it stands, both adaptive sync and G-Sync "lock you in" to a single GPU vendor, so if potentially selling to buy new hardware puts you "in jail" then you shouldn't buy either. But lots of people are buying, so I can only guess that Marine's view is not representative of the market as a whole.
 