AMD: FreeSync Monitors Shipping in December – Will Cost $100 Less Than Nvidia G-Sync

"Nvidia's Tom Petersons says..."

Oh for goodness sake.

More intriguing is another possibility Nalasco mentioned: a "smarter" version of vsync that presumably controls frame flips with an eye toward ensuring a user perception of fluid motion.

Petersen was also mentioned in the article, but only regarding frame metering and existing stutter problems (very important, and it has delivered great improvements for us users, but it's not about adaptive sync).
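
To be fair, the "smarter" vsync Nalasco hints at is easy enough to sketch. Here's a toy version in Python; completely hypothetical, with made-up render/swap stand-ins, since the real logic would live in the driver below the application:

```python
import time

REFRESH_INTERVAL = 1.0 / 60  # fixed 60Hz panel: seconds per refresh

def present_paced(render_frame, swap_buffers, last_flip_time):
    """Flip on the refresh boundary that keeps frame-to-frame intervals
    even, instead of flipping as soon as a frame is ready.

    render_frame and swap_buffers are hypothetical stand-ins for an
    engine's render and buffer-flip calls.
    """
    render_frame()
    # Pick the refresh boundary closest to a whole number of
    # intervals since the last flip.
    elapsed = time.perf_counter() - last_flip_time
    intervals = max(1, round(elapsed / REFRESH_INTERVAL))
    target = last_flip_time + intervals * REFRESH_INTERVAL
    delay = target - time.perf_counter()
    if delay > 0:
        time.sleep(delay)  # hold the flip to smooth perceived motion
    swap_buffers()
    return time.perf_counter()
```

The point isn't the exact heuristic; it's that the flip time gets chosen for even pacing rather than minimum latency, which is the trade-off any "smarter" vsync would be making.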
 
Er, wut? Nothing beyond 1080p?
Then how come DP1.2 supports 2560x1440 and 4K via SST, not MST as you are trying to claim? Wow, bizarro world; let's just make stuff up to try and look clever, and claim AMD invented everything.
The first few 4K monitors used MST because there were no SCALERS that supported anything greater than 2560x1600, so they had to cobble two SCALERS together to get 4K working; not two cables... The newer 4K monitors use a SINGLE SCALER, so they use a SINGLE CABLE and a SINGLE TILE. They are DP1.2, NOT DP1.3.
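
For anyone who wants to check the numbers, here's a rough back-of-envelope in Python. Link rates are the published DP1.2 (HBR2) figures; blanking is approximated at ~8%, so treat the results as ballpark, not exact:

```python
# Back-of-envelope: does DP1.2 SST have the bandwidth for these modes?

def dp_usable_gbps(lanes, gbps_per_lane, efficiency=0.8):
    """Usable payload after 8b/10b encoding (DP1.1/1.2)."""
    return lanes * gbps_per_lane * efficiency

def mode_gbps(h, v, hz, bpp=24, blanking=1.08):
    """Approximate bandwidth a mode needs, ~8% blanking overhead."""
    return h * v * hz * bpp * blanking / 1e9

dp12 = dp_usable_gbps(4, 5.4)        # HBR2: 17.28 Gbps usable
uhd60 = mode_gbps(3840, 2160, 60)    # ~12.9 Gbps
qhd144 = mode_gbps(2560, 1440, 144)  # ~13.8 Gbps

print(f"DP1.2 usable: {dp12:.2f} Gbps")
print(f"4K@60Hz needs ~{uhd60:.1f} Gbps -> fits: {uhd60 < dp12}")
print(f"1440p@144Hz needs ~{qhd144:.1f} Gbps -> fits: {qhd144 < dp12}")
```

Both modes fit in a single DP1.2 stream with headroom; the early MST monitors were a scaler limitation, not a cable one.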

It would help so, so much if you could read. See the comma after the word 'standard'? Is it meaningless? No, it damn well isn't.

What was the major problem with 4K gaming up until pretty much the beginning of this year? HDMI was stuck at 30Hz. Half the industry is built around TVs/Blu-ray players and the like, so half of everything 4K had to account for HDMI. Until this year there was no standard cable for all devices to simply hit 60Hz over a single connection.

The lack of HDMI 2.0 (even being announced, let alone finished) meant almost every 4K panel was designed to be driven as two tiles or as a single 30Hz tile. I.e. everything either came with a single HDMI port and was stuck at 30Hz (in which case almost every gaming review would say it sucks, wait for 60Hz), or with two HDMI ports OR a DisplayPort (or, in exceptional circumstances, both), meaning the industry made chips that in either case would take two tiles and make one image. This created problems all around.
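
The 30Hz wall is just link-bandwidth arithmetic. A rough sketch (published TMDS clocks; ~8% blanking approximation, so ballpark only):

```python
# Why HDMI 1.4-era 4K was stuck at 30Hz: rough TMDS link math.
# TMDS carries 3 data channels at 10 bits per pixel clock, 8b/10b coded.

def hdmi_usable_gbps(tmds_mhz):
    """Usable payload: 3 channels x 10 bits/clock x 0.8 (8b/10b)."""
    return 3 * tmds_mhz * 10 * 0.8 / 1000

def mode_gbps(h, v, hz, bpp=24, blanking=1.08):
    """Approximate bandwidth for a mode, ~8% blanking overhead."""
    return h * v * hz * bpp * blanking / 1e9

hdmi14 = hdmi_usable_gbps(340)  # HDMI 1.4: 340 MHz -> ~8.2 Gbps
hdmi20 = hdmi_usable_gbps(600)  # HDMI 2.0: 600 MHz -> ~14.4 Gbps

for hz in (30, 60):
    need = mode_gbps(3840, 2160, hz)
    ok14 = "yes" if need < hdmi14 else "no"
    ok20 = "yes" if need < hdmi20 else "no"
    print(f"4K@{hz}Hz needs ~{need:.1f} Gbps: HDMI 1.4 {ok14}, HDMI 2.0 {ok20}")
```

4K@30 squeezes under HDMI 1.4's ~8 Gbps of payload; 4K@60 needs roughly 13 Gbps, which only HDMI 2.0's 600 MHz clock can carry.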

There was a proper standard all around for 1080p: it was thought up ahead of time, everyone got ready, and everyone did what was required.

With 4K... nope. They didn't get together and say: hey HDMI, make a 60Hz 4K connection for two years from now; hey panel makers, make 4K screens for two years from now; hey scaler chip makers, make 4K 60Hz-capable scalers for HDMI/DP for two years from now; hey DP... oh, I see you're ready for 4K, as you were then.

Screen makers made screens; HDMI, which dominates half the industry, did nothing; scaler makers had to cater to HDMI; screen makers had to cater to HDMI and to whatever the scaler chips were doing... DP was there but wasn't being used effectively. A standard that everyone adheres to is required; one cable being good enough while the other isn't doesn't cut it. Everyone could make 120Hz screens with tiled DP connections, but they won't, because a standard is coming out to support 120Hz and it will be easier to stick to that.

Do you think it's some surprise that this year prices dropped, production increased, more companies are making more models, and HDMI 2.0 finally reaching a 60Hz 4K minimum standard is all a complete and utter coincidence? Yes, things got better before screens even had it, but when the goalposts moved (i.e. screen/scaler makers knew it was coming) they prepared for it.

This works for software as well. If everyone had put out a specific "we'll all be doing 4K in two years" message and told devs to support 4K UIs from day one, because it makes sense, you might have had every game from then on be fine with 4K. When the message is "we might be doing 4K, at some point, but we're not going to properly announce how or when"... well, software devs paid by companies that care about bottom lines will often not support things they aren't pretty much required to. So many games in 4K are rubbish, with unreadable text or awful tiny UIs.
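
The fix isn't even hard; it's just never prioritised. A toy sketch (made-up element sizes, hypothetical helper names) of scaling a UI against a reference resolution instead of drawing in raw pixels:

```python
# Toy sketch: resolution-independent UI sizing against a 1080p reference.
# Element sizes are made-up illustration values.

REFERENCE_HEIGHT = 1080  # resolution the UI was originally authored for

def ui_scale(screen_height: int) -> float:
    """Scale factor relative to the reference resolution."""
    return screen_height / REFERENCE_HEIGHT

def scaled(size_at_1080p: int, screen_height: int) -> int:
    """Size in real pixels for a UI element authored at 1080p."""
    return round(size_at_1080p * ui_scale(screen_height))

for height in (1080, 1440, 2160):
    print(f"{height}p: 16px font -> {scaled(16, height)}px, "
          f"48px icon -> {scaled(48, height)}px")
```

Author everything once against the reference, multiply by the scale factor at draw time, and 4K text stops being microscopic.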

Everyone being on the same page, working to the same goal and being ready means better things for everyone.
Unfortunately HDMI is here to stay on the TV/console/cable-box side of things, so monitor makers will cater to the standards of the lowest common denominator. It's genuinely ridiculous how much HDMI taking an age to produce something useful for 4K held everything back, but that is how the industry works in many areas. Standards are the ONLY reason things move on significantly; if everyone did their own thing, there would be so much competing, incompatible rubbish that no progress would ever get made.

There are alternatives: if Nvidia or AMD could design and build their own screens and cables, they could pretty much do what they wanted. But in an industry where hundreds or thousands of companies work together on devices that depend on each other, standards are crucial...
 
Yes, you said "because there was no official standard, cable, nothing beyond 1080p".

Both the standard and the cable have existed since DP1.2.

There were 2560x1600 screens and the like but these used tiling methods, there was no official standard to push the industry towards. 4k was the answer to that yet no one actually came up with a proper cabling standard for it... seemingly because everyone involved is stupid.

2560x1600 monitors never used tiling. DP1.2 was an official standard, and it supports all of the things you claim it doesn't: 2560x1440 at up to 144Hz, and 4K@60Hz.

In previous posts you stated that 4K@60Hz used two cables and that AMD invented the capability to do 4K@60Hz via a single cable. Having been proven wrong, you've now written a massive, totally off-topic ramble to try to deflect from the fact that you keep making stuff up.
 
What he was trying to say was: People who can't read more than one line of text often miss the point. I think. I only read the first line. :p j/k

Edit: ohh, 1k posts! Bit of a lame post for the milestone :(
 
Well if all else fails.... Hundreds of words will fix it :D

Lmao he just spent the last hour or so writing that out and checking his sources from Google. Words fail me and him both

I like the last bit, where he mentions them building their own monitors. But when Nvidia distribute their own module because the "industry" is taking so long, they're not allowed to reap any profit and should hand it to their direct competition.

Not that it's relevant to the topic anyway, but seriously, wake up.
 
I've spoken to Shane several times, and his name is clearly on those white papers. They will have had very little say in what goes and what doesn't in terms of the specification; they're totally reliant on getting panel manufacturers on board.
Obviously adoption by everyone is the way forward, but as far as R&D investment being the "driving" force is concerned, I think most would agree that's a little bit of an exaggeration, lol. Think it might be time to move on; the Sammy panels driven by FreeSync will speak for themselves, logistics aside. Who knows, it may even work better than G-Sync.
 
Laugh a minute, this thread.. :)

Just going back to the whitepaper date thing: I see that nobody mentioned that AMD and VESA were talking about this tech in a white paper in May 2013, yet didn't announce until Jan 2014, eight months later.

Then Nvidia announced in Oct 2013, so by that same reckoning they would have been working on it for at least eight months beforehand as well, putting them earlier than the AMD/VESA whitepaper.

Just thought I would put that completely irrelevant thought out there.


End of the day, does it really matter who was working on what before whom? These people change companies so often that it might even have been the same bloke who originally thought of it, for all we know.

We just need to see the FreeSync monitors hit the shelves and see some reviews, to find out whether it does the job the same as G-Sync.
 
End of the day, does it really matter who was working on what before whom? These people change companies so often that it might even have been the same bloke who originally thought of it, for all we know.


lol, the number of people who jump ship to either company is pretty overwhelming. All I know is that the man whose name is first on the G-Sync patent is not to be sniffed at. He's got more patents to his name than Gabe Newell has chins.
 
There were 2560x1600 screens and the like but these used tiling methods, there was no official standard to push the industry towards.

u wot m8?

I had a Dell 3008WFP, one of the very first DisplayPort-capable monitors released. It never used tiling: it was a single-scaler, single-stream-only device from the off (it also supported 1600p over DVI, which does not support multiple display streams at all). DisplayPort 1.0 fully supported 1600p displays.
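
Quick numbers to back that up; published DP 1.0 link rates, blanking approximated at ~8%, so ballpark only:

```python
# Sanity check: DP 1.0 (HBR) driving 2560x1600@60Hz over a single stream.

hbr_usable = 4 * 2.7 * 0.8  # 4 lanes x 2.7 Gbps, 8b/10b -> 8.64 Gbps payload
wqxga_need = 2560 * 1600 * 60 * 24 * 1.08 / 1e9  # 24bpp, ~8% blanking

print(f"DP1.0 HBR usable: {hbr_usable:.2f} Gbps")
print(f"2560x1600@60 needs ~{wqxga_need:.2f} Gbps -> fits: {wqxga_need < hbr_usable}")
```

Roughly 6.4 Gbps needed against 8.6 Gbps available: no tiling required, even on the very first DisplayPort revision.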


You really need to learn about things before you start splurging words onto the screen; it would save you the effort, and save us the time required to trawl through yet another word-heavy, content-light post.
 