
The Radeon RX9070XT / RX9070 Owners Thread

I used DDU as always. How do you delete the shader cache? I've had Nvidia for 13 years and just changed to AMD, so it's a bit different. Got to get used to it.
Steam has a shader cache, so that's a good place to start. For micro stutters, update your BIOS and chipset drivers, and make sure your PCIe slot is running at x16. I would also update peripherals that use USB, as this caught me out lovely when my nephew bought a USB 3 extender that caused horrendous stutter.
 
I used DDU as always. How do you delete the shader cache? I've had Nvidia for 13 years and just changed to AMD, so it's a bit different. Got to get used to it.
Navigate to the graphics cache folder at %localappdata%\Temp\Turn10Temp.scratch\GraphicsCache and delete all the files in that folder.
Then delete the DXCache files located at %localappdata%\NVIDIA\DXCache.
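If you don't fancy clicking through folders every time, here's a rough Python sketch that clears those cache folders in one go. It's just a sketch, not tested on every setup: the first two paths are the ones from the post above, and the Steam shadercache entry is an assumption (Steam keeps per-game shader caches under steamapps\shadercache in your Steam library), so adjust the list to match your install.

import os
import shutil

# Cache folders to clear. The first two are the paths mentioned above;
# the Steam entry is an assumption - point it at your own Steam library.
cache_dirs = [
    os.path.expandvars(r"%localappdata%\Temp\Turn10Temp.scratch\GraphicsCache"),
    os.path.expandvars(r"%localappdata%\NVIDIA\DXCache"),
    r"C:\Program Files (x86)\Steam\steamapps\shadercache",  # assumed default Steam path
]

for cache_dir in cache_dirs:
    if not os.path.isdir(cache_dir):
        print(f"Not found, skipping: {cache_dir}")
        continue
    for name in os.listdir(cache_dir):
        path = os.path.join(cache_dir, name)
        try:
            if os.path.isdir(path):
                shutil.rmtree(path)  # remove sub-folders (per-game caches)
            else:
                os.remove(path)      # remove loose cache files
        except OSError as err:
            print(f"Could not delete {path}: {err}")
    print(f"Cleared: {cache_dir}")

The caches just get rebuilt the first time you launch a game afterwards, so expect a few stutters on the first run after clearing them.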
 
Hi, does anyone have an XFX Mercury AMD Radeon RX 9070XT?

I'm looking at getting one and would be interested in any feedback on how you've found it. ;)
 
Hi, does anyone have an XFX Mercury AMD Radeon RX 9070XT?

I'm looking at getting one and would be interested in any feedback on how you've found it. ;)
The word I would use is mahoosive. Take measurements to make sure this chunky boy will fit. I really like it and the temps are good. The fan can be a little noisy at full tilt, but you can tune it. I only wish I'd bought the white one.
 
The word I would use is mahoosive. Take measurements to make sure this chunky boy will fit. I really like it and the temps are good. The fan can be a little noisy at full tilt, but you can tune it. I only wish I'd bought the white one.
B) Great stuff, thanks. I'll check the dimensions.
 
So I have to say I'm very wrong...
I only play Squad, so not a demanding game at all, but at 1440p 165Hz this GPU is idling in the 40s, and I'm even seeing drops into the high 30s.

It turns out the Taichi runs super cool; even the Nitro would run in the mid to high 50s.
 
So I have to say I'm very wrong...
I only play Squad, so not a demanding game at all, but at 1440p 165Hz this GPU is idling in the 40s, and I'm even seeing drops into the high 30s.

It turns out the Taichi runs super cool; even the Nitro would run in the mid to high 50s.
In Cyberpunk and the BF6 open beta I hit the low 50s at 4K with the fan maxing out at only 20%. That's with a Nitro+. In less demanding games the fan doesn't even kick in and it just gets by with passive cooling. It's in a Corsair 5000D Airflow case in the current high humidity and high ambient temps. If I ran a non-zero-fan custom fan profile it would obviously be a lot cooler, but I prefer the silence. It's incredible how cool these things run. My 7900 XTX ran nowhere near this cool.

Are your temp figures with the default fan curve and zero fan enabled? The temps of your Nitro seem high for a non-demanding game at 1440p.
 
I always have fan off disabled in game, but I have found the Taichi to be cooler in my case than the 5 Nitros I have owned.
There may have been a driver update, because it is a lot cooler than the other GPUs I have tested.
 
I don't think drivers have made any difference. I think in my case, no pun intended, I have extremely good airflow over the card, which helps the GPU at 0 RPM. The Nitro+ seems to have a cooler that works well with passive cooling, despite the solid rear end. I don't know how this card can use over 360W and stay so cool. I run at +10% power and it's seriously impressive how well this GPU dissipates heat.
Not great for the ambient temps in my room though, haha.
 
I don't think drivers have made any difference. I think in my case, no pun intended, I have extremely good airflow over the card, which helps the GPU at 0 RPM. The Nitro+ seems to have a cooler that works well with passive cooling, despite the solid rear end. I don't know how this card can use over 360W and stay so cool. I run at +10% power and it's seriously impressive how well this GPU dissipates heat.
Not great for the ambient temps in my room though, haha.

Yer, I have a very good test bench, and every GPU has been in the same bench.
 
I am genuinely stunned at how cool this GPU runs. I've been playing Squad for about 2 hours, and the Nitro+ would be at about 50-55°C in this game with the same settings and undervolt; this GPU is sitting at about 39-41°C.

Power limit: +10%
GPU: -70
RAM: 2700 / fast timings
 
I am genuinely stunned at how cool this GPU runs. I've been playing Squad for about 2 hours, and the Nitro+ would be at about 50-55°C in this game with the same settings and undervolt; this GPU is sitting at about 39-41°C.

Power limit: +10%
GPU: -70
RAM: 2700 / fast timings
That is a lovely build!

I am joining the 9070XT club this week as well.

I game at 1440p 165Hz, so this is a "perfect" upgrade from my 6700, which I will pop into my lad's machine (currently running my other older card, the 480 8GB, which tbh has been a solid card over the years!).
 
I am genuinely stunned at how cool this GPU runs. I've been playing Squad for about 2 hours, and the Nitro+ would be at about 50-55°C in this game with the same settings and undervolt; this GPU is sitting at about 39-41°C.

Power limit: +10%
GPU: -70
RAM: 2700 / fast timings
You should in theory be able to smash out a very high Time Spy GPU score if it runs at low temps. Time Spy allows crazy undervolts, way lower than gaming stable. For example, I run -100mV for gaming but can hit -180mV in Time Spy with no problems.
With this I hit just under 34200, but I'm at 44°C on my Nitro+ in the recent heatwave and that's relatively high. I'm 70th in the world for GPU score and 9th for overall score, but the people above me have way cooler temps to achieve their scores.
Low temps mean higher boost clocks and higher scores, so I'd expect a very high Time Spy score from your GPU.
 