• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

AMD freesync coming soon, no extra costs.... shocker

Well, if, for example, screen vendors choose to pass on a cost for using more up-to-date DisplayPorts, then it is they who are imposing the extra cost.

While it may be perfectly reasonable for screen vendors to do that, it's hardly AMD's responsibility or fault.

Wow, talk about denial.

AMD have lobbied for certain things to be added to the DP1.3 specification so that their 'Freesync' can work; by their own admission these things add $10-20 to the manufacturing cost, which means any monitor that is DP1.3 compliant will cost $10-20 extra to manufacture just because of Freesync. This cost is likely to be passed on to consumers whether they have an AMD card or not. GO AMD.
 
Oh dear......there seems to be a lot of confusion about all this as usual.

Free-Sync is not the power saving feature.
The VESA standard 1.2a is not Free-Sync.

They are two completely separate things.

The VESA standard is for Adaptive-Sync. It takes the variable V-Blank mechanism that was previously used for power saving and uses it to adapt the refresh rate, letting the display dynamically match a GPU's rendering rate, on a frame-by-frame basis, to produce a smoother, lower-latency gaming experience.

Free-Sync is AMD's proprietary hardware and software that will allow their GPUs to communicate with Adaptive-Sync monitors.

So yes, anyone will be able to utilise Adaptive-Sync monitors, but they will need to use their own hardware and software to do so. I cannot see AMD just giving away their GPU scaler technology to the likes of Intel/Nvidia, and of course that is quite within their rights, just as I do not expect Nvidia to give away their G-Sync technology to anyone else either.
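To make the frame-by-frame matching described above concrete, here's a minimal Python sketch of the idea. The panel limits, function name and numbers are all made up for illustration; real panels negotiate their supported V-Blank range over DisplayPort, this just shows the clamping logic:

```python
# Illustrative sketch of frame-by-frame adaptive refresh:
# the display holds V-Blank until the GPU finishes the next frame,
# clamped to the window the panel's timing controller can sustain.

PANEL_MIN_HZ = 40.0   # assumed panel limits, purely illustrative
PANEL_MAX_HZ = 144.0

def refresh_interval_ms(render_time_ms):
    """Pick the scanout interval for one frame.

    A fixed-refresh display repaints every 1000/60 ms regardless of
    when the frame was finished; with Adaptive-Sync the interval
    simply follows the render time, clamped to the panel's range.
    """
    shortest = 1000.0 / PANEL_MAX_HZ   # can't refresh faster than max Hz
    longest = 1000.0 / PANEL_MIN_HZ    # can't hold V-Blank forever
    return min(max(render_time_ms, shortest), longest)

# A GPU rendering at a variable rate (ms per frame) gets a matching
# scanout interval per frame instead of tearing or stuttering:
frame_times = [10.0, 16.7, 22.0, 30.0, 5.0]
intervals = [refresh_interval_ms(t) for t in frame_times]
```

Note how the 30 ms frame is clamped to 25 ms (the 40 Hz floor) and the 5 ms frame to ~6.9 ms (the 144 Hz ceiling); within the window, scanout tracks the GPU exactly.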

Thank you. :)

@ Frosty. Nvidia's GPU communication scaler is on the G-Sync module; it has to be there because without a variable-V-Blank-compatible DP the scaler has no way of communicating with the screen's Adaptive-Sync scaler.

AMD's GPU communication scaler is on the GPU because, with the (now variable-V-Blank-compatible) VESA DP, it can communicate with the screen.

And guess what, so can Nvidia. (Now)
 
Thank you. :)

@ Frosty. Nvidia's GPU communication scaler is on the G-Sync module; it has to be there because without a variable-V-Blank-compatible DP the scaler has no way of communicating with the screen's Adaptive-Sync scaler.

AMD's GPU communication scaler is on the GPU because, with the (now variable-V-Blank-compatible) VESA DP, it can communicate with the screen.

And guess what, so can Nvidia. (Now)

Nvidia's display scaler is just as able to do this in conjunction with drivers, so why do you think they opted for a scaler with a 768MB buffer? As I was saying, the ASIC is probably more intricate than anything the panel manufacturers are planning on implementing.
 
Nvidia's display scaler is just as able to do this in conjunction with drivers, so why do you think they opted for a scaler with a 768MB buffer? As I was saying, the ASIC is probably more intricate than anything the panel manufacturers are planning on implementing.

The G-Sync scaler/module can and does do everything Free-Sync does.

What I'm saying is that the reason for its location is down to the original DP not being compatible with variable V-Blank, so it has to sit behind the DP input. Round peg, square hole.

Now, with the compatible VESA DP, it can be on the GPU, where AMD's is, because it can communicate through the new DP. *Round peg, round hole.

Its all good :)
 
@ Frosty. Nvidia's GPU communication scaler is on the G-Sync module; it has to be there because without a variable-V-Blank-compatible DP the scaler has no way of communicating with the screen's Adaptive-Sync scaler.

AMD's GPU communication scaler is on the GPU because, with the (now variable-V-Blank-compatible) VESA DP, it can communicate with the screen.

And guess what, so can Nvidia. (Now)

Sorry humbug, can you provide a link to your source on that please, or is it just your guess as to what the G-Sync module is and the lack of a scaler on Nvidia GPUs?
 
Sorry humbug, can you provide a link to your source on that please, or is it just your guess as to what the G-Sync module is and the lack of a scaler on Nvidia GPUs?

If I can find it... I think it's in Nvidia's G-Sync technical data. No time right now; can I do it later?

Nvidia's GPUs do have a scaler like any other; I would think they do, or they wouldn't give you much of a picture.
 
Great reference :D :cool:

*Bebop
Lol I did wonder but didn't fancy checking :p
The G-Sync scaler can and does do everything Free-Sync does.

What I'm saying is that the reason for its location is down to the original DP not being compatible with variable V-Blank, so it has to sit behind the DP input.

Now, with the compatible VESA DP, it can be on the GPU, where AMD's is.

Its all good :)

Got any proof of this? What's stopping them doing it on the GPU? It's obviously not an issue using it on the onboard ASIC. Which is better? Variable refresh rate is something that has to be supported by both the scaler and the GPU. Surely if the panel is having to resend frames to the LCD it's quicker from... well, the LCD...

Lol :D
 
I like how this whole FreeSync thing has gone from 'it won't cost anything extra (but you will need a new monitor, obviously)' to something like "it's-free-between-2:31-and-2:32pm-every-3rd-saturday-of-each-month-without-an-R-in-it-if-there-is-a-rainbow-visible-from-the-west-face-of-big-ben" as to what the 'free' part refers to. :)

From my point of view I'd quite expect it to be a bit like Mantle and DX12. Mantle will be better because it was designed around specific hardware. I expect GSync to be much the same. That's not to say DX12 and 'FreeSync' won't do a perfectly good job, but it wouldn't surprise me if the custom designed option worked a little bit better than a generic 'standard'.
 
Ahhh I'm with you now!! :)

:D

Closer comparison now?

I understand that wasn't the 'ultimate' comparison, but it's the best there is yet of added cost for a feature.

G-Sync is here already. Of course AMD knew about it. I know I have to take the bins out tomorrow. Maybe I'll leave it 6 years though. **** it.

Yeah, know how you feel, I left the bins out for 2 years waiting on surround gaming...
 
Uneducated silence would have sufficed :p


You chose a couple of crap examples there, considering Nvidia's tessellation performance had been better until very recently; even after AMD added multiple geometry engines they were still behind. Surround support took less time than the gap between Titan and the 290X. Nice derailment though, guys.
 
Uneducated silence would have sufficed :p


You chose a couple of crap examples there, considering Nvidia's tessellation performance had been better until very recently; even after AMD added multiple geometry engines they were still behind. Surround support took less time than the gap between Titan and the 290X. Nice derailment though, guys.

TruForm was an early tessellation implementation created by ATI and employed in DirectX 8 and OpenGL graphics cards, on both Mac and Windows. The technology was first employed on the Radeon 8500 in 2001. It was never accepted into the DirectX or OpenGL specifications.

Before the adoption of pixel shader-enhanced bump mapping methods such as normal and parallax mapping that simulate higher mesh detail, curved 3D shapes in games were created with large numbers of triangles. The more triangles are used, the more detailed and thus less polygonal the surface appears. TruForm creates a curved surface using the existing triangles, and tessellates this surface to make a new, more detailed polygonal model. It is designed to increase visual quality, without significantly impacting frame rates, by utilizing hardware processing of geometry.

TruForm was not significantly accepted by game developers because it ideally required the models to be designed with TruForm in mind. To enable the feature without causing visual problems, such as ballooned-up weapons, the models had to have flags identifying which areas were to be tessellated. The lack of industry-wide support of the technique from the competition caused developers to ignore the technology.

In later versions of the Catalyst drivers, the TruForm feature was removed.

Beginning with the Radeon X1000 series, TruForm was no longer advertised as a hardware feature. However, the Radeon 9500 and higher (as well as hardware supporting Shader Model 3.0) include a Render to Vertex Buffer feature, which can be used for tessellation applications.[1] The Radeon X1000 series supports binding up to 5 R2VB buffers simultaneously. Tessellation as dedicated hardware returned in the Xenos and Radeon R600 GPUs.

Notice this part

The lack of industry-wide support of the technique from the competition caused developers to ignore the technology.
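As an aside, the subdivision step that TruForm automated can be sketched in a few lines of Python. This is only the splitting half of the technique (real TruForm first fitted a curved N-Patch surface from the vertex normals, then tessellated it); function names and the example mesh are mine, purely for illustration:

```python
# Illustrative sketch of tessellation by midpoint subdivision:
# each triangle is split into four by inserting edge midpoints,
# multiplying mesh detail without the artist authoring extra polygons.

def midpoint(a, b):
    """Average two vertices, component by component."""
    return tuple((pa + pb) / 2.0 for pa, pb in zip(a, b))

def subdivide(tri):
    """Split one triangle (a tuple of 3 vertex tuples) into four."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    # Three corner triangles plus the centre triangle.
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    """Apply `levels` rounds of subdivision to a list of triangles."""
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris

mesh = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
detailed = tessellate(mesh, 2)  # 1 triangle becomes 16
```

Each level quadruples the triangle count, which is exactly why models not authored with tessellation in mind could balloon (the "ballooned-up weapons" mentioned above) once the new vertices were displaced onto a curved surface.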

Anyhow, I was only having a little fun, as I think you are.
 
Yeah, and when it had matured, all people did was accuse developers of deliberately using too much of it, as if that was in their interests lol.


No pleasing some people :p. I don't think you really were having fun though lol. I was making a perfectly valid point regarding G-Sync until someone crashed in a page or 2 late :D.

I think we all know none of it's really important. Some people are just easier to please than others that's all..

:D
 
Yeah, and when it had matured, all people did was accuse developers of deliberately using too much of it, as if that was in their interests lol.


No pleasing some people :p. I don't think you really were having fun though lol. I was making a perfectly valid point regarding G-Sync until someone crashed in a page or 2 late :D.

I think we all know none of it's really important. Some people are just easier to please than others that's all..

:D

I was having fun and still am but it can be tough with you :D.

The only point I was referring to was your bins and waiting for years. It happens in the world of tech all the time. My point is just as valid as yours. Let's end the fun here as it won't go anywhere.
 
It's pretty pointless arguing semantics when we are all waiting to see if Sam gives us more info or not. Given the lack of concrete info regarding the monitors (not technical specs on a white paper, I'm on about something I can actually buy), it's a bit silly trying to do a comparison.

I for one look forward to the first Project FreeSync supported screens hitting the market, as they're bound to drive G-Sync prices down with them. Obviously I'm an early adopter of G-Sync and have used it for months, but I'd still like to see this tech rolled out as a standard for all gamers to enjoy, as it's the best tech I've used in an extremely long time.
 
It's pretty pointless arguing semantics when we are all waiting to see if Sam gives us more info or not. Given the lack of concrete info regarding the monitors (not technical specs on a white paper, I'm on about something I can actually buy), it's a bit silly trying to do a comparison.

I for one look forward to the first Project FreeSync supported screens hitting the market, as they're bound to drive G-Sync prices down with them. Obviously I'm an early adopter of G-Sync and have used it for months, but I'd still like to see this tech rolled out as a standard for all gamers to enjoy, as it's the best tech I've used in an extremely long time.

+1, some of the posts in this thread would have been better served in Sam's Free-Sync FAQ thread.
 
I for one look forward to the first Project FreeSync supported screens hitting the market, as they're bound to drive G-Sync prices down with them. Obviously I'm an early adopter of G-Sync and have used it for months, but I'd still like to see this tech rolled out as a standard for all gamers to enjoy, as it's the best tech I've used in an extremely long time.

+2.

I get so tempted to grab a cheap monitor from the typical spam emails from popular UK outlets; it's only because this tech is so near that I'm holding out.
 