• Competitor rules

    Please remember that any mention of competitors, hinting at competitors, or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Intel plans to support FreeSync.

Except you just did....

Uranus is another planet, you know... I could have been referring to that...



..Not really :p



:D:p:D
 
Does G-Sync currently support a bigger Hz range than FreeSync?

I cannot wait to try out FreeSync on my 3570K's HD Graphics.

How much are FreeSync monitors nowadays?
 
Does G-Sync currently support a bigger Hz range than FreeSync?

I cannot wait to try out FreeSync on my 3570K's HD Graphics.

How much are FreeSync monitors nowadays?

I suspect Minecraft will be about your limit. :D

The IGP on Intel's mainstream chips is the biggest waste of die space ever. Personally, I think they should leave it out and spend the money on better thermal paste.
 
Does G-Sync currently support a bigger Hz range than FreeSync?

I cannot wait to try out FreeSync on my 3570K's HD Graphics.

How much are FreeSync monitors nowadays?

I just paid what is a fair reflection of value for a 1440p 27" screen; I would have paid more, but not much more. Given 4K 28" screens and the cost of 144Hz 23-27" 1080p screens, I think that £400+ for a 27"/1440p/144Hz screen is completely and utterly ridiculous. Considering the price I got the Acer at, I think there is profit to be made for everyone involved at a minimum of £300, if not less.

What I'll never get is people who want a *sync screen and pay for a 144Hz screen, but then care about not being able to use it below 40Hz. 60Hz sucks but is bearable; 40Hz isn't, 30Hz isn't. Herp derp, I spent huge on a 144Hz screen but I run it at 35Hz..... :rolleyes:

The only screen I know of that has an upper limit is the Asus, which is a bit silly, but ultimately if you can sustain above 90fps I'm under the impression it locks to its full 144Hz anyway and will be smooth at that framerate and refresh rate.

If you buy a 144Hz panel and want the smoothest and best IQ experience possible, you want to be aiming for graphics settings that get you fps in the 60-90 range.
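
For what it's worth, on panels/drivers that support low framerate compensation the screen never literally runs at 35Hz anyway: the driver repeats frames to stay inside the panel's range. Rough sketch of the idea in Python (the 40-144Hz window and the logic are illustrative assumptions, not any vendor's actual driver code):

```python
def effective_refresh_hz(fps, vrr_min=40.0, vrr_max=144.0):
    """Rough model of how a VRR driver keeps a panel inside its range.

    All numbers are illustrative: a hypothetical 40-144Hz panel, not any
    vendor's actual driver logic.
    """
    if fps >= vrr_max:
        return vrr_max            # capped at the panel's maximum refresh
    if fps >= vrr_min:
        return fps                # refresh tracks framerate 1:1
    # Low framerate compensation: repeat each frame n times so that
    # n * fps lands back inside the panel's supported window.
    n = 2
    while n * fps < vrr_min:
        n += 1
    return min(n * fps, vrr_max)

# A 35fps dip on a 40-144Hz panel is scanned out at 70Hz (each frame
# shown twice), so "running at 35Hz" never literally happens with LFC.
for fps in (160, 90, 45, 35, 20):
    print(f"{fps:>3} fps -> {effective_refresh_hz(fps):.0f} Hz refresh")
```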
 
I suspect Minecraft will be about your limit. :D

The IGP on Intel's mainstream chips is the biggest waste of die space ever. Personally, I think they should leave it out and spend the money on better thermal paste.

Don't knock it, it's useful if your discrete GPU suffers a premature death.
 
Don't knock it, it's useful if your discrete GPU suffers a premature death.

Used it for a few weeks when I was waiting on my new card arriving. But the vast majority of Ivy Bridge/Haswell owners will disable it when using a discrete card. It adds more heat to what are already extremely hot-running chips. You'd be better off investing in a cheap card to keep as a spare. 8 Pack actually stated in a thread on here a few days ago that he had advised Intel to do away with the IGP and concentrate more on the thermal flaws of these CPUs.
 
Used it for a few weeks when I was waiting on my new card arriving. But the vast majority of Ivy Bridge/Haswell owners will disable it when using a discrete card. It adds more heat to what are already extremely hot-running chips. You'd be better off investing in a cheap card to keep as a spare. 8 Pack actually stated in a thread on here a few days ago that he had advised Intel to do away with the IGP and concentrate more on the thermal flaws of these CPUs.


What would a usable overclock on a 4690K be with a Kraken X31 strapped to it?
 
What would a usable overclock on a 4690K be with a Kraken X31 strapped to it?

With Haswell and Ivy, under some conditions a 20 quid air cooler will cool pretty much the same as several hundred quid's worth of custom watercooling. Stuff like Prime95/IBT should never be used. Even tasks like rendering or encoding result in very high temps. The chip in my sig at 4.7GHz will easily hit the mid-to-high 70s in games, cooled by a K2 dual tower, which is comparable to some AIO units.
 
It can depend on the lengths you're willing to go to for a cool chip. I paid for an 8 Pack binned, liquid-metal-modified 4770K that will stress (AIDA64) at 5GHz/1.32V and only hit around 65-70°C; while gaming the highest I've seen is around 50°C with custom water.

Lots of stinkers on Haswell, unfortunately.
 
The problem is that the majority of those manufacturers will just tick the box with whatever bargain-basement scaler they can purchase, giving a substandard working window for FreeSync (I'm sure I've seen some that only work 48-60Hz, kek) and nasty overshoot issues, because the overdrive is not controlled properly as it is in a G-Sync display.

This is no different to every consumer market in the world, including all things PC-related. The cheaper entry-level stuff has lower specifications than the more expensive stuff. There are, and will be, plenty of high-quality Adaptive-Sync monitors that will suit enthusiasts.

Display manufacturers making expensive gaming displays are not going to be swayed by Intel supporting the standard, as people using integrated GPUs are not going to be buying 'gaming' displays. Far more 'gamers', or those likely to drop £££ on a 'gaming' display, have G-Sync-supporting GPUs than anything else.

This is a logical fallacy. Display manufacturers are already making expensive gaming monitors with FreeSync, despite AMD having significantly less of the discrete GPU market share.
 
This is a logical fallacy. Display manufacturers are already making expensive gaming monitors with FreeSync, despite AMD having significantly less of the discrete GPU market share.

This.

People keep forgetting that Adaptive-Sync (aka FreeSync) is royalty-free and part of the DP 1.2a spec. All monitor makers could add it cheaply and easily to every monitor if they wanted. They will, however, only add it to the more expensive monitors as a selling point and an excuse to charge more.
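
The 'cheap and easy' part isn't far-fetched: a monitor already advertises its supported refresh window in its EDID, via the display range limits descriptor (tag 0xFD). Quick Python sketch of pulling that window out of a raw EDID blob (simplified: it ignores the EDID 1.4 rate-offset flags, and the actual Adaptive-Sync capability is signalled separately):

```python
def edid_vertical_range(edid: bytes):
    """Return (min_hz, max_hz) from a 128-byte EDID base block, or None.

    Looks for the display range limits descriptor (tag 0xFD) in the four
    18-byte descriptor slots. Simplified: ignores the EDID 1.4 offset
    flags that can push the rates past 255 Hz.
    """
    for off in (54, 72, 90, 108):                # the four descriptor slots
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]                    # min/max vertical rate, Hz
    return None

# e.g. on Linux (path varies per connector):
# with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
#     print(edid_vertical_range(f.read()))
```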
 
This is no different to every consumer market in the world, including all things PC-related. The cheaper entry-level stuff has lower specifications than the more expensive stuff. There are, and will be, plenty of high-quality Adaptive-Sync monitors that will suit enthusiasts.



This is a logical fallacy. Display manufacturers are already making expensive gaming monitors with FreeSync, despite AMD having significantly less of the discrete GPU market share.

I never said they would not continue to make displays. I doubt Intel coming on board will change the fact that premium features seem to turn up on G-Sync displays first and then trickle down in cut-down form to lesser Adaptive-Sync models. It's been proven that you can cheaply add Adaptive-Sync functionality to a display if you have little care for the user experience.

Premium displays are always going to chase the bigger market, and for top-end gaming displays that isn't AMD or Intel.
 
This.

People keep forgetting that Adaptive-Sync (aka FreeSync) is royalty-free and part of the DP 1.2a spec. All monitor makers could add it cheaply and easily to every monitor if they wanted. They will, however, only add it to the more expensive monitors as a selling point and an excuse to charge more.

Just as certain people keep forgetting that nobody can just add FreeSync to anything; they can, however, add Adaptive-Sync to their monitors and ask AMD if they can be branded FreeSync-compatible. ;)
 
So does AMD charge for that?

Whether they charge for it or not is not relevant; the fact is that AMD must be laughing at all the free publicity they get from people using the term FreeSync instead of the correct term, Adaptive-Sync, when talking about the open standard. Even the article in the opening post is doing it. The sooner Intel starts to use Adaptive-Sync and calls it something else the better, from that standpoint. Maybe Inteli-Sync or something.
 
Whether they charge for it or not is not relevant; the fact is that AMD must be laughing at all the free publicity they get from people using the term FreeSync instead of the correct term, Adaptive-Sync, when talking about the open standard. Even the article in the opening post is doing it. The sooner Intel starts to use Adaptive-Sync and calls it something else the better, from that standpoint. Maybe Inteli-Sync or something.

Nice sidestep of my question. So you can't admit that AMD will not charge for it if someone wants to use the FreeSync name. For your information, Adaptive-Sync was developed by AMD, who got VESA to include it in DP 1.2a. So your objection to whatever it's called is pointless. Free publicity for a free technology, big deal.

And before you say Nvidia developed it first, may I ask why they didn't submit the idea to VESA?
 
Nice sidestep of my question. So you can't admit that AMD will not charge for it if someone wants to use the FreeSync name. For your information, Adaptive-Sync was developed by AMD, who got VESA to include it in DP 1.2a. So your objection to whatever it's called is pointless. Free publicity for a free technology, big deal.

And before you say Nvidia developed it first, may I ask why they didn't submit the idea to VESA?

Because they did not have the technology on the GPU, hence the need to create a module within the monitor. Was it not derived from the variable VBLANK technology used on laptops?

Plus they like to do their own thing so they can charge for it :)
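
That's the gist of it mechanically: variable refresh is basically the panel holding its vertical blanking interval open until the GPU delivers the next frame, within the limits the panel supports. Toy timing model in Python (the 40-144Hz window is an assumption, and real drivers are obviously far more involved):

```python
def scanout_interval_ms(frame_time_ms,
                        min_interval_ms=1000 / 144,   # panel max: 144 Hz
                        max_interval_ms=1000 / 40):   # panel min: 40 Hz
    """Model of VBLANK-stretching on a hypothetical 40-144Hz panel.

    - Frame arrives early  -> wait out the minimum interval (max Hz cap).
    - Frame arrives in the window -> scan out the moment it lands.
    - Frame is late        -> the panel must refresh at its maximum
                              interval anyway (where frame repetition
                              schemes like LFC take over).
    """
    return min(max(frame_time_ms, min_interval_ms), max_interval_ms)

for ft in (5.0, 12.0, 20.0, 40.0):   # frame times for 200, ~83, 50, 25 fps
    iv = scanout_interval_ms(ft)
    print(f"frame ready after {ft:5.1f} ms -> scanout every {iv:5.2f} ms "
          f"({1000 / iv:5.1f} Hz)")
```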
 