
AMD VEGA confirmed for 2017 H1

After such a long wait, if FreeSync 2 monitors are not ready when Vega launches.... :o

Yeah. They are taking way too long to announce them. At this point I may just get a normal FreeSync monitor with LFC if Vega is any good. I'm slowly starting to lose interest now, as what I really want is OLED with proper HDR, not this HDR10 crap. I just fancy something new to keep me happy for a couple of years until proper OLED monitors hit the market at a decent price that does not require KY.

I got my MG278Q last year, and it's been a stellar monitor. It's exactly the same as the ROG Swift hardware-wise, just missing the G-Sync module.

Having used both, there's no noticeable difference between them performance-wise. The only differences for me were cosmetic (I do love the thinner bezels on the Swift) and the price. :)

LFC is very important though, and I can recommend normal FreeSync panels if you buy a quality one.
 
I am badly in need of a new monitor... more so than a GPU, to be honest. My monitor tears a lot and even causes massive hitching in BF1 because my fps is too high. I've helped matters by setting the fps cap to 60 in the graphics settings, but I still get a lot of tearing.

If Vega launches minus FreeSync 2 monitors... I'll be so disappointed.
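
For what it's worth, the reason the cap helps: tearing happens when the GPU presents frames out of step with the monitor's fixed refresh, so holding each frame to at least the refresh interval reduces it (only adaptive sync actually eliminates it). A minimal sleep-based sketch of the idea in Python, with the cap value and function names purely illustrative:

Code:
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame at the cap

def run_capped(render_frame):
    """Render in a loop, sleeping off spare time so fps never exceeds the cap."""
    while True:
        start = time.perf_counter()
        render_frame()
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)  # wait out the rest of this frame's budget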
 
I am badly in need of a new monitor... more so than a GPU, to be honest. My monitor tears a lot and even causes massive hitching in BF1 because my fps is too high. I've helped matters by setting the fps cap to 60 in the graphics settings, but I still get a lot of tearing.

Since you're in the market for both, it's really best to wait until Computex.

Monitors usually get showcased there, and we'll have Vega as well. I'm sure AMD won't miss the opportunity to show both running together.

Worst case, you end up with a current "top" spec FreeSync monitor, and it'll still do you really well.
 
I got my MG278Q last year, and it's been a stellar monitor. It's exactly the same as the ROG Swift hardware-wise, just missing the G-Sync module.

Having used both, there's no noticeable difference between them performance-wise. The only differences for me were cosmetic (I do love the thinner bezels on the Swift) and the price. :)

LFC is very important though, and I can recommend normal FreeSync panels if you buy a quality one.

The trouble I had was that I got the IPS version of that monitor (the MG279Q, I believe) and found it such a drop in image quality from my 4K Dell that it had to go back. To be honest it would have gone back for a replacement anyway even if I'd decided to keep it, as it had crazy backlight bleed and a yellow tint in the bottom right, as I recall.

My options now are either trying another ultrawide and seeing how I get on with it, or another 4K panel with FreeSync. Ideally I should wait for FreeSync 2, obviously, but when one gets the upgrade itch, it is hard. I don't even know why I have the upgrade itch to be honest, as I am very happy with what I've got; I just fancy playing with something new, I suppose.

The only reason I am even considering ultrawide is that there are hardly any games coming out this year that I want to play, so I might just play some flight/space sims, and ultrawide is obviously very nice for that, not to mention much easier to drive than 4K. But the drop in image quality is a real concern and is off-putting.
 

I am badly in need of a new monitor... more so than a GPU, to be honest. My monitor tears a lot and even causes massive hitching in BF1 because my fps is too high. I've helped matters by setting the fps cap to 60 in the graphics settings, but I still get a lot of tearing.

If Vega launches minus FreeSync 2 monitors... I'll be so disappointed.

I'm not sure what FreeSync 2 monitors will require that isn't already on good FreeSync monitors. As long as you buy one with LFC and a wide working range, like 30-144Hz or 30Hz to whatever the monitor's maximum refresh rate is, you've got a good FreeSync monitor. That's the main thing missing from a lot of early FreeSync monitors.

The point of FreeSync 2 should be to make sure monitor makers do not short-change us on the minimum specs, not to create a high-end line of FreeSync monitors and still leave those not wanting to spend hundreds of additional pounds on features like HDR and quantum dot at the mercy of the monitor makers.

I hope a lot of the features mentioned, such as HDR, are just additional techs, not actual requirements. It'll be a shame if they are requirements, because the market could still get flooded with FreeSync monitors that don't meet the most important ones. We'd still be in the same boat.

I'm looking forward to getting some clarification from AMD on what is or isn't part of the spec.
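
For context on why the range matters so much: LFC works by repeating frames once fps drops below the panel's minimum, which is only possible when the maximum refresh is comfortably more than double the minimum (about 2.5x is the commonly quoted rule of thumb). A rough Python sketch of the idea; this is my illustration, not AMD's actual algorithm:

Code:
def lfc_refresh_rate(fps, range_min, range_max):
    """Effective refresh rate a FreeSync panel runs at for a given game fps.

    Below range_min, LFC shows each frame multiple times so the panel still
    refreshes inside its supported range. Returns None when no multiple fits,
    which is why narrow ranges can't do LFC at all.
    """
    if fps >= range_min:
        return min(fps, range_max)   # plain adaptive sync, no LFC needed
    multiplier = 2
    while fps * multiplier < range_min:
        multiplier += 1              # show each frame 2x, 3x, ... times
    rate = fps * multiplier
    return rate if rate <= range_max else None

# A 30-144Hz panel shows a 20 fps game at 40Hz (each frame twice), while a
# 48-75Hz panel has no workable multiple for 40 fps (80Hz is out of range).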
 

From what I remember, FS2-certified monitors will have LFC and HDR, and support a good adaptive sync range.

Basically, they will need to be certified by AMD to carry the FS2 badge.
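
If that's right, certification boils down to a checklist. A toy sketch of what such a check could look like; the field names and the 2.5x LFC rule of thumb here are my own guesses, not AMD's published criteria:

Code:
from dataclasses import dataclass

@dataclass
class Monitor:
    range_min_hz: int   # bottom of the adaptive sync range
    range_max_hz: int   # top of the adaptive sync range
    supports_hdr: bool

def meets_fs2_checklist(m: Monitor) -> bool:
    """Hypothetical FS2-style check: an LFC-capable range plus HDR support."""
    lfc_capable = m.range_max_hz >= 2.5 * m.range_min_hz
    return lfc_capable and m.supports_hdr

print(meets_fs2_checklist(Monitor(30, 144, True)))  # True
print(meets_fs2_checklist(Monitor(48, 75, True)))   # False: range too narrow for LFC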
 
From what I remember, FS2-certified monitors will have LFC and HDR, and support a good adaptive sync range.

Basically, they will need to be certified by AMD to carry the FS2 badge.
It will then be interesting to see how prices compare to plain FreeSync; I expect the difference to shrink considerably over time.
 
From what I remember, FS2-certified monitors will have LFC and HDR, and support a good adaptive sync range.

Basically, they will need to be certified by AMD to carry the FS2 badge.

Is HDR going to be an expensive addition to a monitor, though? And if it is, does that mean there will still be loads of non-HDR monitors entering the market under the FreeSync name that are not vetted or controlled in any way?
 
Is HDR going to be an expensive addition to a monitor, though? And if it is, does that mean there will still be loads of non-HDR monitors entering the market under the FreeSync name that are not vetted or controlled in any way?

Yes. But I imagine that to carry the FS2 logo a monitor will have to have been validated; otherwise it will just say FreeSync.

A quick Google seems to suggest LFC and HDR.

It's a good thing. It gives you confidence: if you buy an FS2 monitor it will support all the basic techniques, and everything is more black and white.
 
That's true, and on mine the temperature rarely exceeded 55°C even running flat out, which makes you wonder if they over-engineered it. The rad/fan was huge.

The only time I have got one above 55°C is running Doom at 4K on high settings... I think 65°C is as high as it went, and that's with a push/pull setup (cool intake) rather than the exhaust setup the cards came packed with. I'm guessing that with the low overheads of Vulkan the card can really stretch its legs. It makes me feel like the only reason the Fury X was under water is that they thought Vulkan and DX12 would go mainstream sooner.
 
A rumour I read was that they might be a bit more expensive because AMD has to certify the monitor.

AMD only has to test and certify one of each model, not every monitor that comes off the production line, so the extra cost should be negligible across the life of a model. The only reason to charge more would be monitor manufacturers gouging because they can market under the FreeSync 2 badge, or manufacturers actually having to do more work to get a fully compliant FreeSync 2 monitor rather than half-assing it as so many of them do today.
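
To put made-up numbers on the amortisation point: even a generous one-off certification fee disappears once it's spread across a model's production run.

Code:
# All figures hypothetical, purely to show the scale of the amortisation.
certification_cost = 50_000   # one-off fee per model, in pounds (made up)
units_sold = 100_000          # lifetime sales of that model (made up)

print(certification_cost / units_sold)  # 0.5 - about 50p added per monitor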
 
Yes. But I imagine that to carry the FS2 logo a monitor will have to have been validated; otherwise it will just say FreeSync.

A quick Google seems to suggest LFC and HDR.

It's a good thing. It gives you confidence: if you buy an FS2 monitor it will support all the basic techniques, and everything is more black and white.

It'll be a shame if HDR adds a premium that not everyone will want or can afford to pay, as it will mean there's still an unregulated range of FreeSync monitors. It'd be better if they made it so that any monitor made after a certain date must meet the requirements if it uses either the FreeSync or FreeSync 2 name.

or manufacturers actually having to do more work to get a fully compliant FreeSync 2 monitor rather than half-assing it as so many of them do today.

This is the problem they need to address for all FreeSync monitors, as it gives FreeSync an undeserved bad name.
 
the first batch was.

The pump had a high-pitched whine, not the fan, and all of them were replaced if you went through the RMA process.


That's true, and on mine the temperature rarely exceeded 55°C even running flat out, which makes you wonder if they over-engineered it. The rad/fan was huge.

Yes, it is over-engineered. It's the same AIO (with a different card block) used on the 295X2, a far higher power-consuming and heat-generating card.

I had one, and it only got loud (100% fan speed) when the 295X2 was pushed to the limits of its power delivery by a heavy overclock, running benchmarks on a hot summer day. And even then, replacing the fan with two Akasa Apaches or Gentle Typhoons in push/pull dropped the temps through the floor.

And it seems the same AIO goes on Vega as well.
 
From what I remember, FS2-certified monitors will have LFC and HDR, and support a good adaptive sync range.

Basically, they will need to be certified by AMD to carry the FS2 badge.
I seriously doubt FreeSync 2 will only be used for HDR monitors. That would be very limiting.

I think FreeSync 2 is simply a tech that has HDR-related benefits when applicable.
 
It'll be a shame if HDR adds a premium that not everyone will want or can afford to pay, as it will mean there's still an unregulated range of FreeSync monitors. It'd be better if they made it so that any monitor made after a certain date must meet the requirements if it uses either the FreeSync or FreeSync 2 name.

This is the problem they need to address for all FreeSync monitors, as it gives FreeSync an undeserved bad name.

As I understand it, FS and FS2 will co-exist. FS2 monitors will likely not cost as much as G-Sync HDR ones, given there is no dedicated hardware chip. It's unclear at the moment whether AMD's certification process will cost anything. Of course FS2 monitors will cost more than FS1 ones, but like a lot of things, you get what you pay for.
 
As I understand it, FS and FS2 will co-exist. FS2 monitors will likely not cost as much as G-Sync HDR ones, given there is no dedicated hardware chip. It's unclear at the moment whether AMD's certification process will cost anything. Of course FS2 monitors will cost more than FS1 ones, but like a lot of things, you get what you pay for.

This leads me on to a question: what advantage does G-Sync get from having a dedicated hardware chip?
 