AMD readies three new GPUs: Greenland, Baffin and Ellesmere

Intel can produce their own proprietary software to use sync tech on the current Adaptive-Sync displays, regardless of the "Freesync" branding on those monitors. The branding is just a name and changes little: Freesync is still exclusive, proprietary AMD tech, because Freesync ISN'T Adaptive-Sync.

Those are the actual facts.

Intel can't just decide to use the Freesync name though. They may very well create a proprietary software solution, and AMD may very well allow them to call it Freesync, but it won't be the same as AMD's (Not that it matters).

You've also got it the wrong way round. They're NOT Freesync monitors. They're monitors with Adaptive-Sync (the optional part of the VESA DP 1.2a spec), which can work with Freesync.
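
To make that layering concrete, here's a minimal sketch in Python (all names are made up for illustration; this is nothing like a real driver API). The monitor contributes a standard Adaptive-Sync refresh range, and any vendor's sync software can detect and use it:

```python
# Purely illustrative sketch -- these names are invented and are not any
# real driver API. It only shows the layering described above.
from dataclasses import dataclass

@dataclass
class Monitor:
    """What the panel itself provides: a VESA Adaptive-Sync refresh range."""
    name: str
    min_hz: int  # lowest refresh rate the scaler accepts
    max_hz: int  # highest refresh rate the scaler accepts

def supports_adaptive_sync(monitor: Monitor) -> bool:
    # Adaptive-Sync is optional in DP 1.2a, so a panel may not expose a range.
    return monitor.min_hz < monitor.max_hz

# A "Freesync monitor" is, at the hardware level, just an Adaptive-Sync monitor...
panel = Monitor("FreeSync-branded 144 Hz panel", min_hz=40, max_hz=144)

# ...so any vendor's software (AMD's Freesync, a hypothetical Intel "I-Sync")
# can do the same capability check before taking over frame pacing.
if supports_adaptive_sync(panel):
    print(f"{panel.name}: usable by any vendor's sync software")
```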

Isn't all that just semantics? Surely the point is Intel can use the Free-Sync screens.

At least that was the point some pages back before all this went completely off topic.
 
Yeah, it is odd. A Fury X would have been great to see, especially considering the old 290X is taking on, and sometimes beating, the Nvidia 980 Ti.

It's almost as if DX11 has been artificially hobbling AMD cards?

Anyone noticed how similar the spec of 980 Ti is to the old 290X?

Same GFLOPS...

Or maybe it's that the benchmark is flawed.
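
For what it's worth, the numbers behind that observation do line up on paper: both cards carry 2816 shaders, so at roughly 1 GHz their theoretical FP32 throughput is near-identical. A quick back-of-the-envelope check (reference clocks, 2 FLOPs per shader per cycle from fused multiply-add):

```python
# Theoretical peak FP32 throughput: shaders * clock * 2 (FMA = 2 FLOPs/cycle).
def peak_gflops(shaders: int, clock_mhz: int) -> float:
    return shaders * clock_mhz * 2 / 1000.0

print(peak_gflops(2816, 1000))  # R9 290X @ 1000 MHz         -> 5632.0 GFLOPS
print(peak_gflops(2816, 1000))  # GTX 980 Ti @ 1000 MHz base  -> 5632.0 GFLOPS
print(peak_gflops(2816, 1075))  # GTX 980 Ti @ 1075 MHz boost -> 6054.4 GFLOPS
```

Identical paper GFLOPS says nothing about how efficiently each driver feeds the shaders in practice, which is exactly the DX11 point above.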
 
Isn't all that just semantics? Surely the point is Intel can use the Free-Sync screens.

At least that was the point some pages back before all this went completely off topic.

No it's not semantics at all.
If Intel call their sync technology I-Sync, and they're using I-Sync when using these screens, would you say they're using Freesync technology? Of course not. Would you call it a Freesync screen still?

Them being called Freesync screens is a marketing miscommunication, but one that works out extremely well for AMD.
 
They will obviously need permission from AMD, but from AMD's press releases I don't think AMD will charge for the privilege.

Even still, the code will have to be written specifically for each Intel iGPU to work with A-Sync, so I don't see why Intel would even need to chat with AMD about it. Freesync, A-Sync and G-Sync are essentially the same idea, just different ways of keeping the monitor's refresh rate synced with the game's frame rate.
 
Even still, the code will have to be written specifically for each Intel iGPU to work with A-Sync, so I don't see why Intel would even need to chat with AMD about it. Freesync, A-Sync and G-Sync are essentially the same idea, just different ways of keeping the monitor's refresh rate synced with the game's frame rate.

A-Sync is just an ingredient in the Freesync solution. Without Freesync there's no sync technology to utilise A-Sync, and without A-Sync there's no monitor tech for Freesync to use.

Intel could, by themselves, just create a sync tech for their IGPs to work with the existing A-Sync monitors.
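
As a rough picture of that split (a toy simulation only; real drivers pace frames by stretching the vertical blanking interval, there is no API like this), the monitor contributes nothing but a refresh window, and everything per-frame is the vendor software's job, i.e. the part Intel would have to write themselves:

```python
# Toy simulation only -- it just shows the division of work between the
# standard monitor side and the proprietary vendor side.

def clamp(value: float, low: float, high: float) -> float:
    return max(low, min(high, value))

def vendor_sync_step(frame_time_ms: float, min_hz: int, max_hz: int) -> float:
    """The vendor 'ingredient': map a rendered frame time onto the monitor's
    advertised Adaptive-Sync window and return the refresh rate to drive."""
    requested_hz = 1000.0 / frame_time_ms
    # A real solution would have to handle frame rates below min_hz
    # (e.g. by repeating frames); clamping keeps the toy simple.
    return clamp(requested_hz, min_hz, max_hz)

# The monitor's contribution is the same whoever makes the GPU:
MIN_HZ, MAX_HZ = 40, 144

for frame_time in (6.9, 16.7, 33.3, 50.0):  # ms per rendered frame
    hz = vendor_sync_step(frame_time, MIN_HZ, MAX_HZ)
    print(f"{frame_time:5.1f} ms frame -> drive the panel at {hz:6.1f} Hz")
```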
 
No it's not semantics at all.
If Intel call their sync technology I-Sync, and they're using I-Sync when using these screens, would you say they're using Freesync technology? Of course not. Would you call it a Freesync screen still?

As you keep saying, Free-Sync is AMD's branding and it's their software. Intel can use I-Sync on a Free-Sync-branded screen because the only thing in the screen is a standard Adaptive-Sync scaler.

Surely this is the point being made here?
 
As you keep saying, Free-Sync is AMD's branding and it's their software. Intel can use I-Sync on a Free-Sync-branded screen because the only thing in the screen is a standard Adaptive-Sync scaler.

Surely this is the point being made here?

It's a marketing miscommunication, and it's not the point being made. Some people think, and are trying to make the point, that Freesync and A-Sync are the same thing, when they're not.

The freesync branding on the monitors is completely irrelevant to anything Intel want to do.

You also avoided my question.
 
If Greenland can give me twice the performance of a 980 Ti at the same price point, with good temps, power usage and noise levels of course, then I'll say goodbye to Nvidia for a while (unless they do better). My 970 is waiting to be retired by a card at least three times as powerful next year.
 
Even still, the code will have to be written specifically for each Intel iGPU to work with A-Sync, so I don't see why Intel would even need to chat with AMD about it. Freesync, A-Sync and G-Sync are essentially the same idea, just different ways of keeping the monitor's refresh rate synced with the game's frame rate.

I don't want to continue off-topic in this thread, but I don't get why people don't understand that the VESA Adaptive-Sync standard is built into all the new Freesync-enabled monitors (they are advertised as Freesync).

Intel will be using the same VESA standard, so their GPUs will access the feature exactly the same way AMD GPUs do. They could call their version Kitchen-Sync for all I care.

edit: pointless going on. Martini will not understand. 16 yr olds should stick to playing games.
 
I don't want to continue off-topic in this thread, but I don't get why people don't understand that the VESA Adaptive-Sync standard is built into all the new Freesync-enabled monitors (they are advertised as Freesync).

Intel will be using the same VESA standard, so their GPUs will access the feature exactly the same way AMD GPUs do. They could call their version Kitchen-Sync for all I care.

Everyone understands what the monitors are, and nobody is saying any differently :confused:
A-sync is only part of the Freesync solution. AMD still have proprietary tech to utilise it!
Same thing for Intel.

You're being ridiculously immature, implying that I don't understand, yet you can't seem to keep the actual technical details straight.
 
It's a marketing miscommunication, and it's not the point being made. Some people think, and are trying to make the point, that Freesync and A-Sync are the same thing, when they're not.

The freesync branding on the monitors is completely irrelevant to anything Intel want to do.

You also avoided my question.


I didn't avoid your question. It's not relevant, as fs123 is not saying Intel are using Free-Sync; he's saying exactly the same thing you and I are.

I keep trying to make the point: Freesync is just the brand name for AMD's implementation of Adaptive-Sync.
At the end of the day a Freesync monitor is essentially an Adaptive-Sync monitor (thanks to the VESA standard), so Intel will be able to use it with their software/hardware without paying AMD. They can also use the Freesync name without licensing or royalty fees if they decide to.

Do you agree on that point?

This is why I asked; you are in agreement with fs123.
 
I think what fs123 is saying is that monitors carrying the FreeSync label will work with N-Sync by default; it's the same tech at the monitor end. The difference between FreeSync and N-Sync will be at the GPU/software side.

That's what you're saying, right :?
 
No I'm not. He keeps saying Freesync and A-Sync are the same thing. He doesn't understand that they're each an ingredient of the solution. You just jumped in as you normally do.

Freesync is a proprietary software solution used to utilise Adaptive-Sync on the monitors; it's not just AMD's brand name for Adaptive-Sync. It's AMD's brand name for their sync solution, which utilises Adaptive-Sync. There's a distinction to be made.
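
One way to picture that distinction (illustrative names only; "ISync" is just the made-up name from earlier in the thread): the Adaptive-Sync monitor is the common interface, and each vendor's sync solution is a separate piece of software sitting on top of it.

```python
# Illustrative layering only; none of these names are real products or APIs
# beyond what's discussed in the thread.
class AdaptiveSyncMonitor:
    """The standard part: any Adaptive-Sync scaler, whatever badge is on the box."""
    def set_refresh(self, hz: float) -> None:
        print(f"scaler refreshing at {hz:.0f} Hz")

class FreeSync:
    """AMD's proprietary software that drives an Adaptive-Sync monitor."""
    def present(self, monitor: AdaptiveSyncMonitor, fps: float) -> None:
        monitor.set_refresh(fps)

class ISync:
    """A hypothetical Intel equivalent: same monitor, different software."""
    def present(self, monitor: AdaptiveSyncMonitor, fps: float) -> None:
        monitor.set_refresh(fps)

panel = AdaptiveSyncMonitor()  # the "Freesync-branded" screen
FreeSync().present(panel, 90)  # AMD's solution drives it...
ISync().present(panel, 90)     # ...and so could Intel's, without AMD involved.
```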
 
I think what fs123 is saying is that monitors carrying the FreeSync label will work with N-Sync by default; it's the same tech at the monitor end. The difference between FreeSync and N-Sync will be at the GPU/software side.

That's what you're saying, right :?

That's how I read it.
 
I'm not really a fanboy, just someone who likes to support the underdog. If Nvidia were in AMD's position I'd support them too, as long as they supported open standards. I'm also a cheapskate who likes to promote competition so I can get cheaper GPUs :p

Funnily enough, that is why I like Nvidia.
When I first got into computers, gaming and graphics, they were the new kids on the block in a world of giants like ATI and 3DFX. They were the underdogs, with a far smaller budget. They succeeded by simply making the best cards AND supporting the industry standards of OpenGL and DX, while the other companies were largely offering their own proprietary low-level APIs such as 3DFX GLIDE and S3 Metal.

Moreover, I used open-source Linux heavily at the time, and still do. I had an ATI Rage Fury, and boy did it make me rage furiously. The Linux drivers were shocking, and the OpenGL support in Linux and Windows was dire. Even as an amateur OpenGL programmer making a clone of Quake 3 (everyone did back then), I was finding driver bugs for ATI that some ATI engineers on the OpenGL forums would acknowledge and pass on to the driver team. At first it felt kind of cool that a newbie could be helping ATI make better drivers, but the fixes never really came, the drivers continued to suck, and in general it was a nightmare in Linux or Windows.

I swapped it out for an Nvidia TNT2 and the difference was night and day. Although the Linux drivers were closed-source they worked perfectly, and I really didn't care whether a company made its drivers open or closed source as long as they worked. Massive performance boosts in OpenGL. Over the next 15 years it has pretty much always been the exact same experience: get a laptop with an ATI/AMD card and the OpenGL drivers sucked, the Linux drivers sucked, and the Windows DX drivers ranged from poor to good. Every Nvidia card had awesome drivers and was incredibly fast in Linux, with solid support.

Then there is the hardware: I've had three graphics cards fail, all ATI/AMD, the most recent being in my 27" iMac.

Nvidia is still by far the best choice for gaming and development with the open-standard OpenGL API on open-source operating systems like Linux. The fact that Nvidia's Linux drivers were (until recently) closed source made absolutely no difference to me: they worked and let me use the open-source OS and the open, cross-platform API I wanted to use. That is why I prefer Nvidia whenever possible. But I'm not a fanboy: my main computer is an iMac with an AMD 6970M; the other runs Kubuntu with an Nvidia GTS 250.
 
I can't take this anymore.

It's the constant moving of the goalposts when caught out, and so on.
And when the personal insults come out, you already know that you've lost.
 
Funnily enough, that is why I like Nvidia.
When I first got into computers, gaming and graphics, they were the new kids on the block in a world of giants like ATI and 3DFX. They were the underdogs, with a far smaller budget. They succeeded by simply making the best cards AND supporting the industry standards of OpenGL and DX, while the other companies were largely offering their own proprietary low-level APIs such as 3DFX GLIDE and S3 Metal.

Moreover, I used open-source Linux heavily at the time, and still do. I had an ATI Rage Fury, and boy did it make me rage furiously. The Linux drivers were shocking, and the OpenGL support in Linux and Windows was dire. Even as an amateur OpenGL programmer making a clone of Quake 3 (everyone did back then), I was finding driver bugs for ATI that some ATI engineers on the OpenGL forums would acknowledge and pass on to the driver team. At first it felt kind of cool that a newbie could be helping ATI make better drivers, but the fixes never really came, the drivers continued to suck, and in general it was a nightmare in Linux or Windows.

I swapped it out for an Nvidia TNT2 and the difference was night and day. Although the Linux drivers were closed-source they worked perfectly, and I really didn't care whether a company made its drivers open or closed source as long as they worked. Massive performance boosts in OpenGL. Over the next 15 years it has pretty much always been the exact same experience: get a laptop with an ATI/AMD card and the OpenGL drivers sucked, the Linux drivers sucked, and the Windows DX drivers ranged from poor to good. Every Nvidia card had awesome drivers and was incredibly fast in Linux, with solid support.

Then there is the hardware: I've had three graphics cards fail, all ATI/AMD, the most recent being in my 27" iMac.

Nvidia is still by far the best choice for gaming and development with the open-standard OpenGL API on open-source operating systems like Linux. The fact that Nvidia's Linux drivers were (until recently) closed source made absolutely no difference to me: they worked and let me use the open-source OS and the open, cross-platform API I wanted to use. That is why I prefer Nvidia whenever possible. But I'm not a fanboy: my main computer is an iMac with an AMD 6970M; the other runs Kubuntu with an Nvidia GTS 250.

OpenGL is the past, Vulkan is the future.

That's what I'm saying.

Yeah, thought so :)
 