
The Radeon RX9070XT / RX9070 Owners Thread

I will attempt to find out later I guess lol
I still half think this could be a good excuse to get a 9060 XT Sapphire Pure; it might look really cool if we can get the cable management looking decent.
I'm guessing there are going to be other issues doing that, like Adrenalin maybe not liking me trying to set different settings for each of the two cards? I'd obviously be setting a [very small] undervolt on both, and I'm guessing the stable amount might not be the same for each, although I guess I could find a number that's stable on both and accept that as a limitation.

On the other hand, for the last few days since I noticed it I've been forcing myself to alter my workflow to use just the 2 screens, and I can sort of get away with it, but it's definitely something I'd prefer not to be doing a lot lol.

I still love this card though and wouldn't want to change it, just a shame it didn't come with 3 DisplayPort connections lol.
Pretty sure Adrenalin will be fine with two GPUs. My iGPU shows up fine as a second one in it.
 
It's whether it would allow me to do an undervolt separately on both GPUs, although saying that, I only undervolt very slightly, purely to save a bit of energy and give my cooling a bit less work to deal with [which is almost pointless, my thermals are more than fine, I'd go as far as saying fine is an understatement lol].
I do think maybe getting a 9060 XT Pure to bung that odd-one-out monitor onto, then I can run them all at the same 165Hz refresh, all on DisplayPort, and tbh assuming I can do it in a way that doesn't result in one GPU basically heating up the other one's cooling intake air, it might also look pretty nice.
I'll have a proper think. Half of me is saying if you're gonna run a second GPU, maybe get one with different features rather than just a cut-down version of my primary, but yeah, I think it would look nice with either a pair of 9070s or a second 9060 XT in there lol.
 
Pretty sure if it's only running desktop stuff the fans won't even come on on the 9060. Also can't really see the need for an undervolt on the card as its clocks will never go very high. Only guessing here, but that would be my take.
 
I'm also trying to get into the whole audio-over-GPU thing, where the GPU is used to offload some VSTs/plugins instead of running them on the CPU [music production, or should I say attempted music production in my case lol].
So there is actually a use case to work both cards fairly hard. Not sure if I actually can do this with 2 cards though; I'll probably find out the whole audio-over-GPU thing only works on one card or something, knowing my luck lol.
 
Just realised I'd only have 4 lanes of PCIe to feed the second GPU with, so perhaps trying to use it to process some of my plugins might be a bit fruitless lol.
Still think it would look cool though LOL!
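For a rough sanity check before spending anything, here's a minimal back-of-the-envelope sketch of what an x4 link actually gives you. It assumes the second slot runs at PCIe 4.0 (many boards wire the lower slot as x4 from the chipset, but the generation and lane count here are assumptions, so check the board manual):

```python
# Back-of-the-envelope PCIe bandwidth check for a second GPU in an x4 slot.
# Assumes PCIe 4.0 signalling: 16 GT/s per lane with 128b/130b encoding.
# Real-world throughput is lower still once protocol overhead is counted.

TRANSFER_RATE = 16e9      # 16 GT/s per PCIe 4.0 lane
ENCODING = 128 / 130      # 128b/130b line encoding

def pcie4_gbps(lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe 4.0 link."""
    return lanes * TRANSFER_RATE * ENCODING / 8 / 1e9

for lanes in (16, 8, 4):
    print(f"PCIe 4.0 x{lanes}: ~{pcie4_gbps(lanes):.1f} GB/s each way")

# Prints roughly:
#   PCIe 4.0 x16: ~31.5 GB/s each way
#   PCIe 4.0 x8:  ~15.8 GB/s each way
#   PCIe 4.0 x4:  ~7.9 GB/s each way
```

For just pushing a desktop to one extra monitor, ~8 GB/s is plenty; for shuttling audio buffers back and forth, it's probably the round-trip latency over the link, rather than raw bandwidth, that decides whether offloading plugins is worthwhile.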
 
I'm probably going crazy, in which case you guys will at least get a laugh out of my thought process lol.
The whole issue with the XT and multiple screens pinning the VRAM at max clock, using 50-60W at idle and getting rather warm...
Has anyone tried using a display adapter/pixel pusher/really low-end GPU just to run a secondary screen off? It wouldn't be for gaming purposes; I only use one screen on the really rare occasion that I play any kind of games.
Probably a bad idea, but I was thinking there must be some kind of really, really low-end GPU that could handle the job: literally half the time it's a Discord/web browser, and the rest of the time is music production, where I really find having the extra portrait-position monitor pretty much indispensable [I use 3 monitors for music production lol. The 9070 XT seems fine with the 2 matching ones both in landscape, but as soon as I add in the third, not entirely matching but the same make and series, it really doesn't like it, and I'm sure having my VRAM pinned at max for 15+ hours a day is probably not good for the longevity of the system lol].
Hopefully there's a workaround for this, but if not, a pixel pusher to do the job of running the extra monitor?
I had this issue and absolutely no-one helped haha. Mine was related to the 144Hz refresh rate and being 4K. At anything above 60Hz the VRAM will jump to some value, I can't remember off the top of my head, which uses around 40W.
I fixed this by setting up VRR in Windows (dynamic refresh rate in the Windows display options) so it could dip to 60 when not gaming. This allowed the GPU to drop to around 16W.

Are any of your monitor refresh rates above 60Hz? If so, humour me and set them to 60Hz and see if that helps. Then if fixing them at 60 helps, turn on dynamic refresh rate in Windows if available. Make sure you have the correct drivers for your monitors installed.
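If it helps anyone check this quickly, here's a minimal sketch (Python on Windows, calling the Win32 EnumDisplayDevicesW / EnumDisplaySettingsW functions through ctypes) that lists the mode and refresh rate each attached display is actually running, so you can see whether anything is still sitting above 60Hz after changing settings. The DEVMODEW layout is written out by hand, so treat it as a starting point and double-check it against wingdi.h if anything looks off:

```python
# List each display attached to the desktop and its current mode/refresh rate
# on Windows, via EnumDisplayDevicesW / EnumDisplaySettingsW from user32.

import ctypes
from ctypes import wintypes

ENUM_CURRENT_SETTINGS = -1                  # ask for the mode in use right now
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1    # adapter is part of the desktop

class POINTL(ctypes.Structure):
    _fields_ = [("x", ctypes.c_long), ("y", ctypes.c_long)]

class DEVMODEW(ctypes.Structure):
    # Display-variant layout of DEVMODEW (printer union arm omitted; both
    # arms are the same size, so the field offsets are unchanged).
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPosition", POINTL),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", ctypes.c_wchar * 32),
        ("DeviceString", ctypes.c_wchar * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", ctypes.c_wchar * 128),
        ("DeviceKey", ctypes.c_wchar * 128),
    ]

user32 = ctypes.windll.user32

i = 0
while True:
    dd = DISPLAY_DEVICEW()
    dd.cb = ctypes.sizeof(dd)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
        break                               # no more adapters
    if dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(dm)
        if user32.EnumDisplaySettingsW(dd.DeviceName, ENUM_CURRENT_SETTINGS,
                                       ctypes.byref(dm)):
            print(f"{dd.DeviceName} ({dd.DeviceString}): "
                  f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz")
    i += 1
```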
 
This is what happened to me after installing new drivers a few weeks ago: VRR in the driver got disabled on one of my displays, making the memory clock go higher, I think it was around 900-1500MHz. As soon as I enabled VRR on the fourth display I literally watched the memory clock in GPU-Z drop to double-digit figures.

This is with 4 displays at 144Hz at 1080p; I guess higher resolutions have a lower refresh-rate threshold where memory clocks start to ramp up.

Edit: OK, just did a test. With Dynamic Refresh Rate set to 'On' for all displays, clocks are low double digits, but as soon as I move my mouse around the memory ramps up to 2505MHz. With Dynamic Refresh Rate set to 'Off', memory only hits up to 150MHz when moving the mouse around.

What a weird setting, as it states 'To help save power...' :cry:
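If anyone wants numbers rather than eyeballing the sensors tab, GPU-Z can log its sensors to a file (Sensors tab -> 'Log to file'), and a sketch like the one below will summarise memory clock and board power over a run with Dynamic Refresh Rate on vs off. The file name and column names here are assumptions (GPU-Z names its columns per card/driver), so check the header row of your own log and adjust:

```python
# Summarise memory clock and board power from a GPU-Z sensor log
# (Sensors tab -> "Log to file"). The log is a comma-separated text file
# with one header row; column names vary, so adjust MEM_COL / PWR_COL.

import csv
import statistics

LOG_PATH = "GPU-Z Sensor Log.txt"      # assumed default log file name
MEM_COL = "Memory Clock [MHz]"         # assumed column name, check your header
PWR_COL = "Board Power Draw [W]"       # assumed column name, check your header

mem, pwr = [], []
with open(LOG_PATH, newline="", encoding="utf-8", errors="replace") as f:
    for row in csv.DictReader(f):
        # GPU-Z pads header names and values with spaces, so strip everything.
        row = {k.strip(): (v or "").strip() for k, v in row.items() if k}
        try:
            m = float(row[MEM_COL])
            p = float(row[PWR_COL])
        except (KeyError, ValueError):
            continue                   # skip rows where a sensor didn't report
        mem.append(m)
        pwr.append(p)

if mem:
    print(f"samples: {len(mem)}")
    print(f"memory clock min/median/max: "
          f"{min(mem):.0f} / {statistics.median(mem):.0f} / {max(mem):.0f} MHz")
    print(f"board power  min/median/max: "
          f"{min(pwr):.0f} / {statistics.median(pwr):.0f} / {max(pwr):.0f} W")
else:
    print("No matching columns found; check MEM_COL / PWR_COL against the log header.")
```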
 
Yes, all of my refresh rates are above 60.
They are 165Hz on DisplayPort and 144Hz on HDMI (mine being the Sapphire has 2 of each), and it won't let me set the same refresh rate on all the screens for some reason, I only get different options; I am working on this though.
And similar power usage. I'll see if I can find out how to turn the dynamic/variable refresh off lol, might work.
I'm still thinking a second GPU might be the only way out of it lol, and yes, this issue seems to have started happening for me since a driver update lol.
 
I "downgraded" my display from a 4K mini LED 43" Samsung QN90C to the MSI 34" OLED 341CQP, mainly to gain some FPS as the 9070 XT struggles at 4K, but another retailer had it for £389 with a Blue Light Card. £80 Steam voucher too.
It's a Samsung second-gen QD-OLED ultrawide panel, so I've no idea why they were so cheap. It's absolutely fantastic, basically identical to the 341CQPx but with a lower refresh rate.
Enjoying the higher FPS of the lower resolution.
 
And you don't miss the 4K resolution? I had a 34", then I went back to 32" 1440p. Now I have a 32" 4K OLED and it's amazing.

I am curious about the 39" Asus OLED though.
 
No, I sit exactly the same distance away from the 34" UW as the 43" 4K, so I've basically just lost height, which wasn't important to me. It actually works better for me; 43" was probably too big for sitting 80cm away, to be honest haha. The OLED is a big step up from the mini LED, so it feels like an overall upgrade. Alan Wake 2 looks absolutely amazing on it, and I can actually run it native, unlike at 4K. I only game on my PC and I find ultrawides perfect for gaming. I had one before and missed it.

The 43" QN90C is still in the household. It's just now fitted to my lad's Series X.
 
I guess I haven't played everything that's released recently, but I'm still managing at 4K with my 9070 XT, FSR 4 on Performance for the most demanding games. I used to use custom ultrawide resolutions on the 3080 I had before to claw back some frames.
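As a rough illustration of why a custom ultrawide resolution claws back frames, here's a quick pixel-count comparison; the resolutions listed are just common examples, not necessarily the ones that were used:

```python
# Pixel-count comparison: how much less work the GPU does per frame when
# rendering a custom ultrawide "crop" instead of full native 4K.

resolutions = {
    "native 4K (3840x2160)": (3840, 2160),
    "UW crop (3840x1600)":   (3840, 1600),
    "UWQHD (3440x1440)":     (3440, 1440),
    "QHD (2560x1440)":       (2560, 1440),
}

base = 3840 * 2160
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} Mpx ({px / base:.0%} of native 4K)")
```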
 