
The Radeon RX 5700 XT Owners Thread.

Morning folks.
I am a new owner of the Red Devil 5700 XT but I seem to be struggling with it. I very rarely see the card hit anything near the default clock speed, let alone the advertised boost speed. It is running stock on a fresh installation of Windows 10 with the following hardware:

MSI B450 Tomahawk
Ryzen 2600
16GB TUF DDR4 PC4-24000C16 3000MHz
Superflower Gold 650W PSU

The BIOS, GPU drivers and chipset are all up to date.

Any pointers would be greatly appreciated.

What clock speed are you seeing in game?

What is the default clock speed and voltage as shown in Radeon Software?
For a start, running the +50% power limit should help increase clock speed.
 
1905MHz (Game) / up to 1770MHz (Base) / up to 2010MHz (Boost)

It all depends on the game and your system.

The card doesn't hold the same frequency constantly. The stock fan curve is pretty low, so if your case is starved of airflow the card will clock down to stay at its temperature target.

Run Superposition on a loop and keep an eye on the power draw and clock speeds. Make sure you are running the OC BIOS, not the quiet one.

Thanks for the response Ross.
I ran Superposition several times at the 1080p High preset; the card reached a maximum of 2024MHz at 221W draw and 99% load according to GPU-Z, so I suppose it is actually functioning correctly. I also set a much more aggressive fan curve, despite what I thought were acceptable ambient temperatures inside the case. Is there any particular way I can force the card into performing more aggressively in certain games? It seems the ones I do play don't take full advantage of what is available.


The Radeon software shows the idle clock speed to be 786MHz and 725mV.
 
I meant what are the values shown in Radeon Software that you can adjust yourself?
 

Forgive my apparent ignorance here, as I am not entirely sure where you mean.
If I open the fine-tuning controls under GPU tuning, I can see the following (frequency/voltage) points: 800MHz/750mV, 1450MHz/800mV, 2100MHz/1190mV.
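Those three points define the card's voltage-frequency curve. As a rough sketch, here's how the voltage at an intermediate clock (say the 1905MHz game clock) could be estimated, assuming the driver interpolates approximately linearly between the points; the linear model is an assumption for illustration, not AMD's actual internal algorithm.

```python
# Sketch: estimate the voltage the driver might request at a given clock,
# assuming roughly linear interpolation between the three V/F points.
# The (MHz, mV) points below are the values quoted in the post above;
# the linear model itself is an assumption, not AMD's real curve fit.

POINTS = [(800, 750), (1450, 800), (2100, 1190)]

def estimate_voltage(freq_mhz, points=POINTS):
    """Linearly interpolate voltage (mV) for a clock within the curve."""
    pts = sorted(points)
    if freq_mhz <= pts[0][0]:
        return pts[0][1]
    if freq_mhz >= pts[-1][0]:
        return pts[-1][1]
    for (f0, v0), (f1, v1) in zip(pts, pts[1:]):
        if f0 <= freq_mhz <= f1:
            t = (freq_mhz - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)

print(estimate_voltage(1905))  # the rated game clock
```

Under this assumption the 1905MHz game clock would land at roughly 1073mV, which is why undervolting the top point can save a fair amount of power without touching mid-range clocks much.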
 


I'm going to assume that because you are seeing up to 2024MHz in Superposition you are expecting to see that clock speed in all your games, and that's not what is happening?

Is there a particular game you are worried about? What game is it, what resolution, and what GPU clocks are you seeing in it? Can you give me an example?
 
Okay, a decent high starting clock at least. What clock speed do you see in Superposition 1080p Extreme at stock with the +50% power limit set?
 


The card is performing to spec then. If you increase the power limit you may get more frequency, but it's not really worth it for the increase in power.

You can't force the card to do more.
 
You should be able to remove a fair bit of voltage, which will lower temps and fan noise.

I actually did just that. FPS remained basically the same; I used the Far Cry 5/New Dawn internal benchmark.

Stock 5700xt (2004mhz, 1.2v), 4.5ghz CPU
Fps: 68, 87, 123
Temp: 67, 83c
182W

5700xt (1876mhz, 1080mv), 4.5ghz CPU
Fps: 67, 86, 125
Temp: 62, 78c
167W

5700xt (1876mhz, 990mv), 4.5ghz CPU
Fps: 66, 86,121
Temp: 60, 73c
150w
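Those numbers line up with the first-order dynamic-power model, where power scales roughly with frequency times voltage squared. A quick sketch applying that model to the figures above; note that board power also contains static leakage, memory and fan power, so the core-only model overstates the savings compared with the measured 150W.

```python
# Sketch: first-order dynamic-power model P ~ f * V^2 applied to the
# undervolt results above. Board power also includes static leakage,
# memory and fans, so this core-only model overstates the savings.

def scaled_power(p0, f0, v0, f1, v1):
    """Scale baseline power p0 (W) from (f0 MHz, v0 V) to (f1 MHz, v1 V)."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

stock = scaled_power(182, 2004, 1.200, 2004, 1.200)  # sanity check: 182 W
uv    = scaled_power(182, 2004, 1.200, 1876, 0.990)  # predicted undervolt

print(round(uv))  # core-only prediction; the post measured 150 W at the board
```

Even with the fixed board components, a 6% clock reduction plus a 210mV undervolt buying ~30W and ~10°C for one or two FPS is exactly the trade-off the numbers above show.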
 
I recently installed my new MSI 5700 XT Gaming X, and may have won the silicon lottery. It happily runs stock settings 2100/1750 @ 1050mV, and maxes the sliders to 2150/1900 @ just 1100mV. I have only had the card two days, but so far it takes everything I throw at it @ 2150/1900, holding 2070-2098MHz in games and Superposition loops.

The cooler on this thing is a beast - it's by far the heaviest and most solid-feeling card I have owned.

edit: It also runs 1900/1900 @ 925mV. I may end up using this setting 24/7, as the card consumes just 130W under load, runs VERY cool on a hot day, and returns near-100% of stock performance.
 
Just wondering what everyone's GPU clock at idle is? Mine used to be 0-200MHz when doing nothing; now I see 800MHz at idle, even after putting everything back to stock defaults with no OC. I have the Sapphire 5700 XT Nitro+ SE.
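For what it's worth, on Linux with the amdgpu driver you can see which clock state is active by reading `/sys/class/drm/card0/device/pp_dpm_sclk` (the `card0` index is an assumption and may differ on your system); the active state is marked with an asterisk. A minimal sketch of parsing that format; on Windows, GPU-Z or the Radeon overlay show the same reading.

```python
# Sketch: parse the amdgpu pp_dpm_sclk sysfs format to find the active
# core clock state on Linux. The "N: <freq>Mhz *" line format is the
# amdgpu driver's; the card0 index is an assumption for illustration.

import re

def current_sclk(text):
    """Return the MHz value of the line marked active with '*', or None."""
    for line in text.splitlines():
        m = re.match(r"\s*\d+:\s*(\d+)\s*M[Hh]z\s*\*", line)
        if m:
            return int(m.group(1))
    return None

sample = "0: 300Mhz\n1: 800Mhz *\n2: 2100Mhz\n"  # illustrative contents
print(current_sclk(sample))

# To read the real file on a Linux box (not run here):
# with open("/sys/class/drm/card0/device/pp_dpm_sclk") as f:
#     print(current_sclk(f.read()))
```

If the card is pinned in a middle state like 800MHz at the desktop, it's often something keeping the card awake (a background app, a second monitor, or a high refresh rate) rather than a fault.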
 
Yes, definitely one of the best samples I've seen, congratulations.
 

That's lucky. Are you using AMD's software or MSI Afterburner to OC? I have trouble getting anything from my Sapphire 5700 XT Nitro+ SE, so I just left it on stock. I don't think it has much headroom; it comes pretty well clocked with a 2035MHz boost, and in game it stays around 2050-2059MHz. Even if I slightly lower the voltage, the in-game clock drops: at 1185mV it went to 2015-2020MHz. Bumping up the boost while lowering the voltage doesn't do much either, as it can't match the stock in-game clock. I guess 2050-2059MHz in game isn't too shabby.

The one thing I love about Sapphire is the Trixx software. With it set to 85% I can't see any noticeable difference in quality, and it really boosts the FPS.

 
 
I am just using AMD's latest driver software - Wattman or whatever they call it now. The default voltage on my GPU is 1142mV @ 2100MHz as indicated in Wattman, but I can drop that by 100mV before framerates and boost begin to suffer. Stock boost values are around 2040MHz in game and are rock solid @ 1050mV; 1025mV shows occasional dips.

I am currently running a bunch of benchmarks to find the exact sweet spot for my card. I suffered one glitch @ 1900 memory when looping F1 2019, so I may need to back down 25MHz or so. From my initial testing, there is little point pushing these cards past 2000 core/1800 memory. You gain more heat and noise than noticeable FPS, and without an FPS counter running I really couldn't tell the difference between 2000/1750 and 2150/1900.

I'll stick some stats up once I'm finished benching.
 

@humbug
Sorry, I took a few days away for work and also spent some time playing a couple of games that I know run well. I had been playing Post Scriptum quite a bit with friends at 1680x1050(!); GPU clocks seemed to hit a peak of around 2020MHz. Unfortunately, I don't think PS is terribly well made, and its performance is wildly variable. It definitely wasn't the right game to put a new GPU through its paces with.


@LtMatt
At 1080p Extreme, I reach a maximum of 2021MHz and an average of 39 frames per second.
At 1080p High, the clock speed hits the limit again and averages 92 frames per second.


@Ross Thomson
Agreed, I don't really want to draw any more power right now; I didn't see enough of an improvement in Superposition to warrant the increase. I am also a little suspicious of my aged PSU. I've had a couple of odd BSODs where only a hard restart would work: power to the peripherals seems to drop away while the LEDs inside the case (mobo and GPU) are still lit. It only really happens if I, for example, have my phone charging through a USB port.

Many thanks for taking the time to respond, gents, it is appreciated. I had a little bit of buyer's remorse on the back of this, and I suppose I chose the wrong games to gauge performance. Post Scriptum and War of Rights, both indie titles, probably aren't the best choices. I had a much more enjoyable time in Total War: Warhammer II. Aside from those two titles, everything else seems to clip along at a lovely pace. The card seems stable, it's hitting its specifications, and I can't really complain too much. I also found that adding an aggressive custom fan curve seemed to help a little; the default seems too mellow.
 
@Jimothy, to get right to the point: sometimes games are poorly optimised and the CPU can bottleneck the GPU. If that happens, the GPU will downclock itself to save power. It's not that the GPU has a problem; it's working exactly as intended. The CPU in these games can't feed the GPU fast enough, so the GPU effectively says "well, I might as well take it easy, as the CPU can't give me work fast enough."

With that, you may see the GPU fall to a lower clock while performance is low. That's not because the GPU is only running at 1600MHz or whatever; it's because the GPU is only working as hard as the CPU, in that game or that particular part of the game, allows it to.

If you use super resolution in that game and crank it right up to 1440p or even higher, you will offload more of the work from the CPU to the GPU, and the GPU will work harder and clock higher. It won't improve performance, because the game is still badly optimised, but as a test to confirm it's not a problem with the GPU it's worth trying.
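The diagnosis above can be boiled down to a rough rule of thumb using overlay readings (GPU busy % from GPU-Z or the Radeon overlay, plus per-core CPU usage). The 90% thresholds below are illustrative assumptions, not a hard rule.

```python
# Sketch: a rough rule of thumb for telling a CPU bottleneck from a GPU
# one, using overlay readings. The 90% thresholds are illustrative
# assumptions, not a hard rule.

def likely_bottleneck(gpu_busy_pct, busiest_cpu_core_pct):
    if gpu_busy_pct >= 90:
        return "GPU-bound: the card is the limit, so clocks should sit high"
    if busiest_cpu_core_pct >= 90:
        return "CPU-bound: the GPU downclocks because it waits on the CPU"
    return "neither pegged: frame cap, engine limit, or simply a light load"

# e.g. a poorly threaded indie title hammering one CPU core
print(likely_bottleneck(65, 98))
```

In the Post Scriptum case described above, you'd expect the second branch: one CPU core pegged, GPU busy well under 90%, and correspondingly low GPU clocks.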
 

@humbug Understood. It's been so long since I've had a card that is capable beyond the rest of my system that I was fairly ignorant of the relationship between CPU and GPU until very recently. I'm still running a Zen+ 2600 (stock) and 16GB of DDR4 3000MHz (XMP#2), which I don't think is really helping in these scenarios where the CPU is being leaned on. It's a really poor time to be upgrading with the pending Zen 3 launch, but I am seriously looking at the 3600 to tide me over for a couple of years until AMD roll over to the next chipset.

Thanks again.
 

That's true, you aren't really going to notice a few FPS fewer while playing, but it still doesn't stop me trying to get that bit extra out of the card. Currently I have it set to 2100MHz at 1185mV with 1850MHz memory, and in game it's around 2060-2069MHz. At stock it's a 2035MHz boost at 1.2V, around 2044-2055MHz in game, so there's not much difference, and if I drop the voltage slightly it falls to 2020-2030MHz in game. So 2100MHz at 1185mV with 1850MHz memory, giving around 2060-2069MHz in game, comes out slightly ahead: slightly lower voltage and a higher in-game clock. I found Warzone is good for testing stability; settings that worked fine in other games and some GPU stress software would still fail in Warzone.

I bought this Sapphire 5700 XT Nitro+ SE used for £280, which I think was a great deal. In Warzone maxed out at 1440p with Trixx 85% resolution scaling it's mostly 125-158fps.
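The FPS boost from 85% scaling makes sense from the pixel arithmetic alone, assuming the slider scales each axis by 85% (which is how such sliders usually work; the exact behaviour of Trixx is not confirmed here).

```python
# Sketch: pixel-count arithmetic behind an 85% resolution scale at 1440p,
# assuming the slider scales each axis (how such sliders usually work;
# Trixx's exact behaviour is an assumption here).

def scaled_res(width, height, scale):
    """Internal render resolution for a given per-axis scale factor."""
    return round(width * scale), round(height * scale)

w, h = scaled_res(2560, 1440, 0.85)
saving = 1 - 0.85 ** 2  # fraction of pixels no longer shaded

print(w, h)                            # internal render resolution
print(f"{saving:.0%} fewer pixels")    # pixel-count reduction
```

Rendering roughly 2176x1224 instead of 2560x1440 cuts the shaded pixel count by about 28%, which is why the FPS jump is so noticeable while the upscaled image still looks close to native.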
 