
Official RX580, RX570, RX560 and RX550 series review thread

Soldato
Joined
22 Nov 2006
Posts
23,360
I don't think Nvidia makes enough on the G-Sync modules to care about those sales alone from a profit perspective. It's all about keeping people locked into the green ecosystem, and you see similar behaviour from other companies, ranging from tech to cars. It's common practice these days, which is why I believe Nvidia would have to be somehow forced to support adaptive sync before that actually happens. I think they would rather take a loss on the G-Sync modules.

G-Sync will die off eventually; it's inevitable. When you have two standards doing the same thing, the cheapest always wins, and FreeSync/Adaptive-Sync is free. The only reason it's still alive is because Nvidia is forcing it on GeForce users, but they already lose customers because of it.
 
Soldato
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
G-Sync will die off eventually; it's inevitable. When you have two standards doing the same thing, the cheapest always wins, and FreeSync/Adaptive-Sync is free. The only reason it's still alive is because Nvidia is forcing it on GeForce users, but they already lose customers because of it.

As it stands right now, AMD has one major problem with FreeSync monitors, and that is flickering at the lower end of the range. It depends on the monitor, of course, but the current implementation of LFC and FreeSync behaviour could use some tweaking, i.e. give some options for what should happen when you hit the minimum of the supported FreeSync range. As it is now, on a good deal of monitors, the brightness difference between high and low refresh rates turns it into a flicker fest. However, I'm also sure something could be done from the monitor manufacturers' side.
 
Last edited:
Soldato
Joined
13 Jun 2009
Posts
6,847
As it stands right now, AMD has one major problem with FreeSync monitors, and that is flickering at the lower end of the range. It depends on the monitor, of course, but the current implementation of LFC and FreeSync behaviour could use some tweaking, i.e. give some options for what should happen when you hit the minimum of the supported FreeSync range. As it is now, on a good deal of monitors, the brightness difference between high and low refresh rates turns it into a flicker fest. However, I'm also sure something could be done from the monitor manufacturers' side.
What do you mean? I haven't heard of this flicker problem. Surely the whole point of LFC is to avoid the monitor dropping its refresh rate too far - for example, viewing 30 fps content at 30 Hz would probably introduce more flicker compared to viewing the same content at 60 Hz. With mine, I know it's not going to drop below 57 Hz. If a game drops to 50 FPS, the monitor will be at 100 Hz. I haven't noticed any brightness change due to the refresh rate changing dynamically.
 
Soldato
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
What do you mean? I haven't heard of this flicker problem. Surely the whole point of LFC is to avoid the monitor dropping its refresh rate too far - for example, viewing 30 fps content at 30 Hz would probably introduce more flicker compared to viewing the same content at 60 Hz. With mine, I know it's not going to drop below 57 Hz. If a game drops to 50 FPS, the monitor will be at 100 Hz. I haven't noticed any brightness change due to the refresh rate changing dynamically.

The flicker happens when the framerate drops below the minimum of the monitor's range and LFC kicks in, causing the refresh rate to double up. An example would be a monitor with a range of 45-100 Hz. You are gaming at around 50 fps, but due to more things happening on screen the framerate takes a hit and drops to 43, which is then doubled to 86 Hz. Due to brightness differences between 50 Hz and 86 Hz, you will see a flicker. If the fps keeps bouncing back and forth between, say, 44, 47 and 42 all the time, the flicker becomes a big problem.

As I also said, it depends on the monitor, because some are much better calibrated in this department and won't have the brightness differences, which removes the flicker issue. It's not a fault of FreeSync itself, but because Adaptive-Sync is an open standard there isn't much control over how it's implemented, which leads to some inferior solutions from time to time. That in turn makes for a worse experience when using FreeSync and gives it a bad rep, through no fault of FreeSync itself.
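To make the mechanic above concrete, here is a minimal sketch of how LFC (Low Framerate Compensation) might pick a refresh rate. The function name and the simple "multiply frames until the rate lands back in the VRR window" rule are illustrative assumptions, not AMD's actual driver logic:

```python
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Return the refresh rate the panel would run at for a given fps
    (hypothetical sketch of LFC frame multiplication)."""
    if fps >= vrr_min:
        # Inside the VRR window: the panel refreshes 1:1 with the game,
        # capped at the panel's maximum.
        return min(fps, vrr_max)
    # Below the window: repeat each frame until the effective refresh
    # rate lands back inside the supported range.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

# The 45-100 Hz example from the post:
# 50 fps -> panel runs at 50 Hz,
# a dip to 43 fps -> frames are doubled, panel jumps to 86 Hz.
```

If the panel's brightness differs between ~50 Hz and ~86 Hz, every crossing of the 45 fps boundary produces that visible jump, which is exactly the flicker being described.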
 
Soldato
Joined
25 Sep 2009
Posts
9,627
Location
Billericay, UK

The results in those titles aren't that bad, but no CrossFire support for Doom yet? I find that a bit surprising given how it's held up as the poster child for AMD hardware.

I know reviewers can't benchmark every title, but a lot of places all benchmark the same titles, which I find annoying for SLI and CrossFire reviews, as Nvidia and AMD can just spend their time making sure multi-GPU support is functional in that handful of titles. This might give readers the perception that the technology actually works pretty well, when in fact the opposite is true.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland

This has an extra 2GB of memory on the 6GB GTX 1060 as well, which in theory should help with higher resolutions and tip-top texture quality in games like Shadow of War. In practice, though, these two cards perform more or less as well as each other (except in VR, for which the GTX 1060’s Pascal architecture is a little better optimised than Polaris). Both cost about the same as well, so unless you’ve already bought into the whole Vive/Oculus thing, it’s probably wiser to go with the RX 580.

Any truth in this regarding VR? I thought VR apps would have to explicitly make use of any vendor-specific tech, which none seem to do so far.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Any specific tech? Like...? I do not understand your question.
VRWorks - multi-res rendering, multi-projection rendering and single-pass stereo. They're Nvidia features introduced partly with Maxwell gen 2 and fully with Pascal. I don't think any of it has been used in VR gaming so far, so we don't even know how the tech performs.
 
Permabanned
Joined
2 Sep 2017
Posts
10,490
VRWorks - multi-res rendering, multi-projection rendering and single-pass stereo. They're Nvidia features introduced partly with Maxwell gen 2 and fully with Pascal. I don't think any of it has been used in VR gaming so far, so we don't even know how the tech performs.

Ahh, I see now. Well, your point just reiterates their statement - going for an RX 580 8GB is the wiser choice ;)
 
Soldato
Joined
6 Aug 2009
Posts
7,071
If I were you, I'd wait another week or two, see if there is anything to these rumours of a Polaris 30 being released.

Yes, sounds like there could be some 12nm versions of the 5XX cards. Could be interesting. I'd like to swap my 1070 for an AMD card, but I think I'll have to wait for 7nm.
 