I know that comments on a tech article are a pretty poor source (especially since they're translated from another language), but if what they're saying is true it would make FreeSync virtually useless for gaming, because it supposedly doesn't cope well with a constantly varying refresh rate (which would explain why AMD went with a demo that had a perfectly constant framerate):
http://be.hardware.info/nieuws/38467/ces-amd-toont-gratis-alternatief-voor-g-sync-freesync
I don't pretend to understand the technical details of how v-sync and g-sync work, and I freely admit the source I'm getting this from is a bit crap, but if it is actually the case it would certainly explain why Nvidia went the g-sync route when they must have known about the VESA standard in development that FreeSync is using.
If this guy is talking from his behind then do please carry on as you were
He isn't, but he doesn't realise this is what g-sync must do.
If you watch the pendulum demo, the key to smoothness isn't the variable refresh rate but the smooth change in frame rate.
Going from 60 to 30 to 60fps means frame times of 16.67ms, to 33.33ms, to 16.67ms.
Now think v-sync: it drops below 60fps, hits 30fps... then goes back to 60fps. This is precisely the stutter g-sync attempts to eliminate. It CAN'T do this just by changing the refresh rate to follow the frame rate; it has to smooth the frame rate. The pendulum demo goes from 60 to 59 to 58fps and so on, which means roughly 16.67ms, to 16.9ms, to 17.2ms between each frame. With a smooth change, that tiny (sub-1ms) difference between consecutive frame times is what keeps it looking smooth.
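To put concrete numbers on that, here's a quick sketch of the frame-time arithmetic (plain Python, just the 1000/fps conversion; it's only illustrating my point, not anything from either vendor):

```python
# Frame time in milliseconds is simply 1000 / fps.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (60, 59, 58, 30):
    print(f"{fps} fps -> {frame_time_ms(fps):.2f} ms per frame")

# 60 fps -> 16.67 ms, 59 fps -> 16.95 ms, 58 fps -> 17.24 ms, 30 fps -> 33.33 ms
# Stepping 60 -> 59 -> 58 changes the frame time by well under 1 ms per step,
# while jumping straight from 60 to 30 changes it by ~16.7 ms in one go.
```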
You literally can't explain this as possible unless g-sync is tracking and predicting the frame rate. If 10 frames in a row arrive 16.67ms apart but the next one drops to around 30fps and is going to come 32ms later, how do you make that smooth?
Well, you have hardware keeping track. It knows the previous frames came 16.67ms apart, so the only way to maintain smoothness is to decide the biggest acceptable jump from 16.67ms, say 2ms at most, and refresh with the same frame again. So 18ms later it repeats the old frame. The next frame, let's say, arrives 32ms after the original one, which is (32 - 18 =) 14ms after that repeat. The hardware now has the new frame, knows the last displayed interval was 18ms, and knows the frame itself took 32ms to produce. It wants smoothness, but it knows frames are currently arriving 32ms apart, so it holds the new frame until, say, 20ms after the repeat and then shows it: close enough to the previous 18ms interval to stay smooth, while working towards the real 32ms cadence one small step at a time. When the frame rate increases again it's just as important to smooth the change in the other direction, though I suspect it can take bigger steps there, because the higher frame rate itself compensates for larger frame time differences.
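Nobody outside Nvidia knows the real algorithm, but here's a rough Python sketch of the kind of pacing logic I'm describing: cap how much the displayed interval can change per refresh, repeating or holding frames as needed. The names (`MAX_STEP_MS`, `next_refresh_interval`) and the numbers (the 2ms cap, the 32ms cadence) are just my made-up example figures from above, not anything official.

```python
# Hypothetical frame-pacing smoother: only a guess at the sort of logic
# g-sync (or an AMD equivalent) would need, not an actual implementation.

MAX_STEP_MS = 2.0  # biggest change in displayed interval allowed per refresh

def next_refresh_interval(last_interval_ms, target_interval_ms):
    """Move the displayed interval towards the real frame cadence,
    but never by more than MAX_STEP_MS at once."""
    delta = target_interval_ms - last_interval_ms
    if abs(delta) <= MAX_STEP_MS:
        return target_interval_ms
    return last_interval_ms + MAX_STEP_MS * (1 if delta > 0 else -1)

# Example: frames were arriving every 16.67 ms, then the game drops to a
# 32 ms cadence. Walk the displayed interval towards 32 ms in small steps.
displayed = 16.67
for _ in range(10):
    displayed = next_refresh_interval(displayed, 32.0)
    print(f"refresh after {displayed:.2f} ms")
# 18.67, 20.67, 22.67, ... 32.00 -- each refresh either repeats the last
# frame or holds the new one a little, so no single jump exceeds 2 ms.
```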
Frame smoothing is the absolute main feature of g-sync, not the variable refresh rate itself. Frame prediction/tracking will be a monumental part of this, for both companies. Without frame smoothing, g-sync is only as smooth as the frame rate change, which could be perfect or absolutely awful. I'm literally 100% certain g-sync has to do frame smoothing and will be doing loads of its own prediction to tell the GPU when it's best to send the next frame.
There will consistently be lots of marketing bullcrap from both companies about it, because a simple explanation is what 99.9999% of users want; most won't read what I posted, let alone anything Nvidia/AMD tried to explain to most people. It's significantly more complex than what I've stated, but it's essentially impossible to produce g-sync or an AMD equivalent without a huge amount of calculating and tracking. I think/guess Nvidia has done this in hardware, and maybe only since Kepler, likely as an addition to the frame pacing features. Ultimately this is like 80% frame pacing tech (monitoring frame rates and keeping them smooth) with 20% more work matching those changes to refresh rates in the smoothest way possible.
If it needs the hardware that is present in AMD GPUs then it won't work on non-AMD GPUs
This is the original post you were responding to and your response
Quote:
Originally Posted by weldon855
it's a VESA standard, please tell me how you think AMD plan to lock this?
If it needs the hardware that is present in AMD GPUs then it won't work on non-AMD GPUs.
Didn't the original article say it was a proposed VESA standard? Or have I got confused while reading this thread?
It was specifically in reference to how AMD would lock this in to themselves, and your insistence that if AMD use hardware to make this work then it can't work on non-AMD GPUs. This is wrong, as I pointed out.
It's fairly obvious as well: there are thousands of standards that GPUs, CPUs, APUs, memory and HDDs all adhere to, which let them communicate with the same messages and enable the same things, yet the hardware that generates those signals is completely different.
I can plug any of a hundred HDDs into my computer; many of them will use different algorithms, memory types, controllers and buses, yet they all send the same messages to my mobo and it can read them all despite the hardware in those devices being different. Likewise, one HDD can send and receive data the same way with an Intel or an AMD motherboard, which have entirely different hardware for doing the same task.
If the monitor is using an industry standard, it does not matter how AMD generates the signal or how they determine what frame rate they want; Intel will be able to generate the same signal from their own hardware any way they please and use the same monitor with the same modes.
That's the entire reason for industry standards: to do precisely what you are saying they won't do. So no, you aren't right; AMD wouldn't be able to lock it in based on any info we've heard, and no, a monitor with this option wouldn't need an AMD GPU.
Back to HDDs: how can AMD read and send data to an industry standard SATA 3 HDD, and Intel do the same, using different hardware?
No wonder this industry is in such a mess when a standards body (i.e. VESA) is basically run by no one else but those MAD guys (AMD). Sometimes someone has to take charge and make things happen for REAL.
Yup, the more things change the more they stay the same.
They're all insane; greed plays a huge part in it.
HDMI usage costs a company $10k a year plus $0.04 per device. Sell 100 million monitors with HDMI ports and that is $4 million in income, in one year. Add up every laptop, monitor, console, Blu-ray player and TV that uses HDMI, multiply by the number of ports on those devices, and think about what the HDMI group collects (I forget who else is in it besides Sony, there are others), and you realise there are hundreds of millions of dollars a year at stake over which format becomes the de facto industry standard. That's why DisplayPort, which is royalty free (yes, there's a fee to join, but it's not insane for big businesses and is many millions less, effectively free compared with using something like HDMI), has encountered such fierce opposition and slow adoption.
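To put rough numbers on that, using only the figures quoted above (real HDMI royalty terms vary, so treat this purely as back-of-the-envelope arithmetic):

```python
# Back-of-the-envelope HDMI royalty maths using the figures from the post.
annual_fee = 10_000        # $10k a year to use HDMI
per_device_royalty = 0.04  # $0.04 per device

devices_sold = 100_000_000  # e.g. 100 million monitors in a year
total = annual_fee + per_device_royalty * devices_sold
print(f"${total:,.0f}")  # $4,010,000 -- roughly the $4 million figure above
```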
AFAIK certification of DisplayPort devices has increased something like 63% in the past year. It's finally getting there: Intel finally threw its support behind it, and Apple is one of its biggest pushers outside AMD. It's pretty much the only viable option for 4K (AFAIK still the only current single-cable connection that can push 4K above 30Hz).