
The AMD Driver Thread

Permabanned
Joined
2 Sep 2017
Posts
10,490
How can I check whether FreeSync is available on a monitor? If the FreeSync option can be turned off and on in Radeon Settings, does it mean the monitor definitely supports it?

Edit: I know the simple answer is to check the monitor specs but there is conflicting info on this monitor.

My LG 24UD58-B supports FreeSync, but in Radeon Settings it is highlighted as Not supported with an HDMI cable.
Maybe it requires a DisplayPort connection.
 
Man of Honour
Joined
19 Oct 2002
Posts
29,524
Location
Surrey
My LG 24UD58-B supports FreeSync, but in Radeon Settings it is highlighted as Not supported with an HDMI cable.
Maybe it requires a DisplayPort connection.
Thanks. Yes, I believe it needs DisplayPort for FreeSync.

The reason for asking is that I have a thread going in the monitor section about the Alienware AW3420DW. I bought it not expecting it to support FreeSync, as it is only advertised with G-Sync. But someone else checked with Dell, who said it does support FreeSync too. We all assumed it was a mistake by the person on the helpdesk, but it is actually showing as enabled in my Radeon Settings. So I am trying to confirm whether that proves it actually has FreeSync. It's currently looking like it does.
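As an aside, if you want to sanity-check what a monitor advertises beyond what Radeon Settings shows, one rough way on Linux is to read the panel's EDID and look at the display range limits descriptor, which carries the vertical refresh range that adaptive sync operates within. A minimal sketch, assuming a DRM connector exposed under /sys/class/drm; this only shows the advertised refresh range, it does not prove FreeSync certification:

```python
# Illustrative sketch: read a monitor's EDID via Linux DRM sysfs and print the
# vertical refresh range from the Display Range Limits descriptor (tag 0xFD).
# A wide range (e.g. 48-144 Hz) is what adaptive sync operates over; this does
# not prove FreeSync certification, it only shows what the panel advertises.
import glob

def refresh_range(edid: bytes):
    # The base EDID block holds four 18-byte descriptors at offsets 54..125.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Display descriptors start with 0x00 0x00; byte 3 is the descriptor tag.
        if len(d) == 18 and d[0] == 0 and d[1] == 0 and d[3] == 0xFD:
            return d[5], d[6]  # min / max vertical rate in Hz
    return None

for path in glob.glob("/sys/class/drm/card*-*/edid"):
    with open(path, "rb") as f:
        edid = f.read()
    if len(edid) >= 128:
        rng = refresh_range(edid)
        if rng:
            print(f"{path}: vertical refresh range {rng[0]}-{rng[1]} Hz")
```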
 
Caporegime
Joined
8 Jul 2003
Posts
30,062
Location
In a house
AMD Adrenalin 2020 Edition teased with “Radeon Boost” technology


https://videocardz.com/newz/amd-adrenalin-2020-edition-teased-with-radeon-boost-technology

HiAlgo Boost:

intercepts and modifies, on the fly, the commands sent from the game to the graphics card, optimizing performance frame by frame. When you are in action and the camera moves, BOOST decreases the resolution to boost the framerate. When the camera stops, it restores full resolution. This provides optimal responsiveness and smoothness for the game, even with a weak graphics card.

:D
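To make the quoted description concrete: the idea boils down to dropping the render scale while the camera is moving quickly and restoring it once things settle, since fast motion hides the lower resolution. A toy sketch of that heuristic, with made-up thresholds and step sizes (not AMD's actual implementation):

```python
# Toy sketch of a HiAlgo/Boost-style heuristic: lower the render resolution
# while the camera is moving quickly, restore it when the camera is still.
# All numbers and names here are illustrative assumptions, not AMD's code.

MIN_SCALE, MAX_SCALE = 0.5, 1.0   # fraction of native resolution per axis
MOTION_THRESHOLD = 0.2            # camera speed above which we scale down

def pick_render_scale(camera_speed: float, current_scale: float) -> float:
    if camera_speed > MOTION_THRESHOLD:
        # Fast motion: step the resolution down, the blur is hard to notice.
        return max(MIN_SCALE, current_scale - 0.1)
    # Camera (nearly) still: step back up towards full native resolution.
    return min(MAX_SCALE, current_scale + 0.1)

# Per-frame usage: scale the render target, then upscale to the display.
scale = MAX_SCALE
for camera_speed in [0.0, 0.05, 0.4, 0.6, 0.5, 0.1, 0.0]:  # fake per-frame samples
    scale = pick_render_scale(camera_speed, scale)
    print(f"camera speed {camera_speed:.2f} -> render at {scale:.0%} of native")
```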
 
Associate
Joined
26 Jun 2015
Posts
669
Not funny, TBH. Someone might attack it as being cheating. I don't like it. Hope there will be an option to turn it off.

Sure it will be; options are never a bad thing. What would be a good scenario, though, is setting the in-game resolution higher than your monitor's native one, i.e. having a native 1440p display but running the game at 4K, and letting it drop the resolution from that point downwards, ideally while maintaining the desired frame rate.

This would be a big boon for games that lack proper AA but are nonetheless demanding at high resolution.

That way you get the best of both worlds, and I think Gears 5 actually runs in a similar way.
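The scenario described above is essentially a frame-time-driven resolution controller with a supersampled ceiling: render somewhere between native 1440p and 4K, stepping the scale down when frames run long and back up when there is headroom. A rough illustration, with the target frame time, step size and resolutions all assumed for the example:

```python
# Toy dynamic-resolution controller for the scenario above: a 1440p display,
# a 4K render ceiling, and a target frame time. Numbers are illustrative only.

NATIVE = (2560, 1440)     # what the monitor actually displays
CEILING = (3840, 2160)    # supersampled upper bound we downsample from
TARGET_MS = 1000 / 60     # aim for 60 fps

def adjust(scale: float, frame_ms: float) -> float:
    """Nudge the render scale (1.0 = native, ~1.5 = 4K ceiling) after each frame."""
    if frame_ms > TARGET_MS * 1.05:       # running slow: drop resolution a little
        scale -= 0.05
    elif frame_ms < TARGET_MS * 0.90:     # plenty of headroom: push towards the ceiling
        scale += 0.05
    return max(1.0, min(scale, CEILING[0] / NATIVE[0]))

scale = 1.0
for frame_ms in [14.0, 15.5, 18.2, 17.0, 16.0, 15.0, 13.5]:  # fake measured frame times
    scale = adjust(scale, frame_ms)
    w, h = int(NATIVE[0] * scale), int(NATIVE[1] * scale)
    print(f"frame {frame_ms:4.1f} ms -> render {w}x{h}, downsample to {NATIVE[0]}x{NATIVE[1]}")
```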
 
Soldato
Joined
1 Jun 2013
Posts
9,315
Not funny, TBH. Someone might attack it as being cheating. I don't like it. Hope there will be an option to turn it off.

How is it cheating? Competitive gamers already play low res, low poly, sometimes even no texture to get maximum framerate and responsiveness. It's dynamic res, just like dynamic chill, or cutting mad tessellation down to normal levels.

It's just another clever way to gain more performance without throwing more hardware grunt at it, just like RIS. If it works, people won't be able to tell the difference, and I'm sure it will be a toggle in the drivers if you don't want it.
 
Caporegime
Joined
18 Sep 2009
Posts
30,112
Location
Dormanstown.
How is it cheating? Competitive gamers already play low res, low poly, sometimes even no texture to get maximum framerate and responsiveness. It's dynamic res, just like dynamic chill, or cutting mad tessellation down to normal levels.

It's just another clever way to gain more performance without throwing more hardware grunt at it, just like RIS. If it works, people won't be able to tell the difference, and I'm sure it will be a toggle in the drivers if you don't want it.

I imagine cheating in terms of benchmarking.
If it's the default option then it could allow AMD to gain performance improvements at the cost of IQ, in a way that may not come across in an out-and-out benchmark.
 
Soldato
Joined
1 Jun 2013
Posts
9,315
I imagine cheating in terms of benchmarking.
If it's the default option then it could allow AMD to gain performance improvements at the cost of IQ, in a way that may not come across in an out-and-out benchmark.

It depends, I guess, on whether the res drop is noticeable. It may just be a case of new tech coming along and making current benchmarking paradigms redundant. Tech like this is designed to increase performance, and as long as everyone knows what is being benched, it's not a cheat IMO, as long as it's not pretending to be something it isn't and it's clear that it's a performance or IQ trade-off you're happy to make because you can't see the difference.

Otherwise you would discount the use of tech like RIS or AA. The former sacrifices res for performance and then filters to bring back higher-res-like details, and the latter sacrifices performance for improved IQ. Where do you stop? Is polygon culling a cheat because it improves performance and you can't tell the difference, even though it's not rendering the polygons you can't see? Is AMD's tessellation limiter a cheat, when it was a response to Nvidia doing the exact opposite and choking all AMD and lesser Nvidia cards with vast amounts of unnecessary tessellation?

We shouldn't test cards with all their improvements turned off, because that's not a real world test; it's just easier for the testers. You then end up with unrepresentative performance results and ever more synthetic tests that are irrelevant to real world performance.
 
Caporegime
Joined
18 Sep 2009
Posts
30,112
Location
Dormanstown.
It depends, I guess, on whether the res drop is noticeable. It may just be a case of new tech coming along and making current benchmarking paradigms redundant. Tech like this is designed to increase performance, and as long as everyone knows what is being benched, it's not a cheat IMO, as long as it's not pretending to be something it isn't and it's clear that it's a performance or IQ trade-off you're happy to make because you can't see the difference.

Otherwise you would discount the use of tech like RIS or AA. The former sacrifices res for performance and then filters to bring back higher-res-like details, and the latter sacrifices performance for improved IQ. Where do you stop? Is polygon culling a cheat because it improves performance and you can't tell the difference, even though it's not rendering the polygons you can't see? Is AMD's tessellation limiter a cheat, when it was a response to Nvidia doing the exact opposite and choking all AMD and lesser Nvidia cards with vast amounts of unnecessary tessellation?

We shouldn't test cards with all their improvements turned off, because that's not a real world test; it's just easier for the testers. You then end up with unrepresentative performance results and ever more synthetic tests that are irrelevant to real world performance.

It all depends on how noticeable it is.
If there's a perceivable IQ difference caused by default driver optimizations, then I have a problem with it from a benchmark point of view.

I mean, the Heaven thread here years ago had people needing to provide a screenshot of their run to validate the IQ, because there were noticeable tessellation differences that could be forced via the driver.

If you test at 1440p with RIS but call it 4K results because it's on a 4K monitor, then that's a problem, for example, but the use of RIS isn't inherently going to negatively affect the IQ, as you can benchmark at native IQ with RIS (though I don't see the point from a benchmark POV).
If something is dynamically lowering the resolution, it pretty much voids the point of benchmarking at X or Y resolution.
 
Soldato
Joined
1 Jun 2013
Posts
9,315
If something is dynamically lowering the resolution, it pretty much voids the point of benchmarking at X or Y resolution.

Yes, but if it works, then there's an argument that current tests have been made outdated. Did G-Sync or FreeSync void benchmarks, because the frame rate achieved is now decoupled from whether it's a good game-playing experience? I.e. high framerates become less relevant to whether the experience is "good" or "bad", where previously high framerates were considered desirable.

Maybe this will do the same thing to the tipping point between the framerate and resolution trade off? It's an interesting idea that is certainly worth checking out.
 
Soldato
Joined
24 Oct 2005
Posts
16,279
Location
North East
So games are gonna look like streams on Twitch, where it goes a bit blurry when someone moves left or right quickly, or looks blocky when the bitrate is low? Is that what we would see with that dynamic res switching thing? Sounds like that's what it might be like to me.
 
OcUK Staff
Joined
17 Oct 2002
Posts
38,229
Location
OcUK HQ
Remember it is December, and in usual fashion AMD generally release a driver every December that gives a decent performance boost on all their GPUs, not just the new stuff. :)

NAVI about to get even faster! ;)
 
Soldato
Joined
1 Jun 2013
Posts
9,315
So games are gonna look like streams on Twitch, where it goes a bit blurry when someone moves left or right quickly, or looks blocky when the bitrate is low? Is that what we would see with that dynamic res switching thing? Sounds like that's what it might be like to me.

Nobody outside of AMD knows yet. It might be like RIS, which sounds like it would be a low-res mess of upscaling, but actually works really well with almost no framerate impact and provides better visuals.
 
Caporegime
Joined
8 Jul 2003
Posts
30,062
Location
In a house
So games are gonna look like streams on Twitch, where it goes a bit blurry when someone moves left or right quickly, or looks blocky when the bitrate is low? Is that what we would see with that dynamic res switching thing? Sounds like that's what it might be like to me.

Remember it is December, and in usual fashion AMD generally release a driver every December that gives a decent performance boost on all their GPUs, not just the new stuff. :)

NAVI about to get even faster! ;)

Yup :D
 
Soldato
Joined
18 Feb 2015
Posts
6,484
https://videocardz.com/newz/amd-announces-radeon-adrenalin-2020-edition-drivers

[Attached image: Adrenalin-2020-3.jpg]
 