
The Radeon RX 7900 XT(X) Owners Thread.

Increasing the min frequency seems to increase reported frequencies but decreases performance.
1075mV seems to be stable for my MBA 7900 XTX; 1070mV is crashing. But for some reason it's not stable while playing Warzone even at 1085mV (all other games are OK), though I don't know if this is a game issue.
Also, the reported voltage never goes above 950mV.
Max hotspot is 92°C (only with PL +15%).
What you will find is that if you are power limited in the game you are running, you may get away with a lower voltage setting, as the core clock is not boosting as high. However, if you then play a game where the GPU is not bouncing off the maximum power limit (e.g. Warzone), you will need more voltage to attain stability, as the GPU has the power headroom available to boost the core clock higher. That higher core clock needs more voltage when you're not in a power-limited scenario.
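If you're on Linux and want to watch this happening, here's a minimal monitoring sketch. It assumes the amdgpu driver and that the card is card0; the exact hwmon file names (e.g. power1_average vs power1_input) vary by kernel version. It polls core clock, board power and hotspot once a second and flags when you're sat against the power cap:

```python
# Minimal monitoring sketch for amdgpu on Linux.
# Assumptions: card0 is the 7900 XT(X); hwmon exposes power1_average/power1_cap,
# freq1_input (gfx clock) and temp2_input (junction/hotspot). Names vary by kernel.
import glob
import time

hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

def read(name: str) -> int:
    with open(f"{hwmon}/{name}") as f:
        return int(f.read().strip())

cap_w = read("power1_cap") / 1_000_000           # microwatts -> watts

while True:
    clock_mhz = read("freq1_input") / 1_000_000  # Hz -> MHz
    power_w = read("power1_average") / 1_000_000 # microwatts -> watts
    hotspot_c = read("temp2_input") / 1000       # millidegrees -> °C
    state = "power limited" if power_w >= 0.97 * cap_w else "headroom"
    print(f"{clock_mhz:7.0f} MHz  {power_w:6.1f}/{cap_w:.0f} W  "
          f"hotspot {hotspot_c:.0f}°C  ({state})")
    time.sleep(1)
```

If the power column sits at the cap while the clock stays well under your maximum slider, that game is power limited; if there's headroom and the clock runs up towards the slider, that's when the undervolt actually gets tested.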
 
Should I then lower the maximum frequency? I currently have it set at 3050, but no game pushes my frequency above 2850 anyway, even with PL set to +15% (Horizon, A Plague Tale: Requiem, Warzone).
 
I've always set my min frequency 100 below the max - not sure if that's still the right thing to do this gen or not.
 
Increasing the min frequency decreases performance by a lot on my card. Even setting it to just 2800, and even though reported frequencies are actually higher, the framerate is worse.
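For anyone doing the same experiments on Linux rather than through Adrenalin, this is roughly what those sliders map to - a hedged sketch only. It assumes card0, overdrive enabled via amdgpu.ppfeaturemask and root access, and that the "s 0 / s 1 / vo / c" command strings match your kernel's pp_od_clk_voltage documentation, so check that before writing anything:

```python
# Rough Linux analogue of the Adrenalin min/max frequency and voltage sliders.
# Assumptions: amdgpu with overdrive enabled (amdgpu.ppfeaturemask), card0,
# run as root. Verify the pp_od_clk_voltage syntax against your kernel docs first.
DEVICE = "/sys/class/drm/card0/device"

def write(path: str, value: str) -> None:
    with open(path, "w") as f:
        f.write(value + "\n")

# Overdrive commands are only accepted in manual performance mode.
write(f"{DEVICE}/power_dpm_force_performance_level", "manual")

write(f"{DEVICE}/pp_od_clk_voltage", "s 0 500")   # minimum core clock, MHz
write(f"{DEVICE}/pp_od_clk_voltage", "s 1 3050")  # maximum core clock, MHz
write(f"{DEVICE}/pp_od_clk_voltage", "vo -25")    # voltage offset in mV, if the card supports it
write(f"{DEVICE}/pp_od_clk_voltage", "c")         # commit ("r" resets to defaults)
```

Either way, the observation above holds: pinning the minimum clock high forces the card to hold clocks it may not have the power budget for, which would be consistent with reported clocks going up while the framerate goes down.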
 
3342 :eek:

What kinda gpu clocks are you getting in games?

Depends on the game, but I've just benched Callisto with it and it boosts a bit lower.

Huge upgrade from my old 2080 :D
I should get around to buying 3DMark at some point... apparently I'm the only person who ever uses Superposition :p
 
Whoa! I just ran the Cyberpunk Ray Tracing: Ultra preset benchmark. I'm averaging 68 FPS, with a min of 52 FPS and a max of 92 FPS. This is at 3440x1440 @ 175Hz. It looks stunning. Super impressed.

Currently waiting on Parcel Force to deliver my VR base station today... well, soon actually.

Can you believe it? I ordered it on the 29th of November... somehow it got lost, so I had to order another one, and it's apparently getting delivered today. So hopefully I will have some VR benchmarks posted sometime soon, if anyone is interested?
 
I'm having a really bizarre display issue since installing my 7900 xt and I'm fresh out of ideas!

Basically, sending my display to my monitor over DisplayPort is absolutely fine. When sending to my second display (LG 4K TV), the picture is squashed to fill just half the screen. This is from the HDMI port on the card. I've bought a new HDMI 2.1 cable as I thought it might be because my old cable was a 1.4, but it changed nothing. It fills the screen if I use the USB-C port with an adapter, but that introduces a bit of a flicker every so often.

Maybe oddly, when I play around with Eyefinity I can turn the two displays into one, but that's not what I want. It does then fill the second screen fully though. I've even just had a DisplayPort to HDMI adapter delivered to see if that magically fixed things, but it just fills half the screen again.

So, in short: the primary display (monitor) works fine using either of the card's DisplayPort outputs. The second display (the 4K TV) gets an image over HDMI, but it's squashed horizontally to fill about half of the screen. Using the USB-C port on the card via an adapter fills the screen but has a flicker. My old GTX 1080, connected identically, was fine. I have a feeling it's a driver/software thing, as Eyefinity does seem to fill the second screen, but that's not what I want.

Sorry about the long post, but I'm stuck! Any ideas? I've heard noises that the drivers are a bit broken with multi-monitor/display setups?

Thanks!
 
This might sound like a dumb question, but say when you are just using the PC to browse on the web (minimal stuff), does the GPU know just to use the bare minimum wattage automatically without having to do anything to it?
 
Yes, it should do, but it doesn't work well with these early RDNA3 drivers.

Purely at desktop mine pulls around 16-18W, when browsing around 50W, and when watching a video on YouTube it pulls anywhere from 80-100W.
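If you want a number you can actually compare between driver versions (or between desktop, browsing and video playback), a quick sketch like this does the job - same assumptions as the earlier one: Linux with amdgpu, card0, and power1_average possibly being power1_input on your kernel:

```python
# Quick idle-power check: sample board power once a second for a minute
# and print the average. Assumes Linux/amdgpu and card0; the hwmon file may
# be power1_input rather than power1_average on some kernels.
import glob
import statistics
import time

hwmon = glob.glob("/sys/class/drm/card0/device/hwmon/hwmon*")[0]

samples = []
for _ in range(60):
    with open(f"{hwmon}/power1_average") as f:
        samples.append(int(f.read()) / 1_000_000)   # microwatts -> watts
    time.sleep(1)

print(f"avg {statistics.mean(samples):.1f} W   "
      f"min {min(samples):.1f} W   max {max(samples):.1f} W")
```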
 
Thanks for that. There is nothing on the TV enabled though. Plus, it was working fine with the older card. Plus plus, it fills the screen in Eyefinity mode, which makes me think it's a driver/software thing.

It also fills the screen via the USB-C connector.

Open to other suggestions so keep em coming!
 
Ok. Cool. I'm glad I asked then. :)

So hopefully we should see lower idle power consumption in future driver revisions?
 
Mine tends to vary from 35W-57W. Around 35W just staring at the desktop and mid 50s when browsing the web, using Discord etc.
 
Hopefully. AMD have said they're aware, so fingers crossed it's just rushed drivers and not some hardware-level issue.

106W currently watching a video on YouTube at 1080p60 with hardware acceleration enabled in Microsoft Edge.

If you disable hardware acceleration, it drops to around 40-50W.
 