
The Radeon RX 6900/6950 XT Owners Thread.

So my Sapphire Nitro+ 6900XT has a hot-spot delta of 30 degrees with an OC and the power limit set to +15, at the 303W max power it draws under load. Kinda disappointed. Temps on the GPU will be 70-75 under heavy load while the hot spot is 98-110 degrees depending on when I'm looking.

My son has the skills to re-paste with our Kryonaut, but I'm worried about voiding the warranty. He is not worried, and he has an EK waterblock on his 6 month old MSI Gaming X Trio 3090...
You can improve things with a re-paste, but it may void warranty.
 

I was going to ask if you think this occasionally hitting as high as 108-111° at the hotspot, with the GPU at 75°, will cause damage, when Radeon power is set to +15 and I'm seeing it draw only 303 watts? This was 1-2 weeks ago. But as I type this my peak temps seem a bit lower tonight, maybe due to 21.8.2?

The hot spot was more often 100-105 degrees under heavy load with a 28-30 degree delta, but it occasionally had been peaking at 108-111 with a 30-35 degree delta (when watching the overlay during time spy).

EDIT - so as I type this I decided to run a Radeon stress test, and my GPU clock was at 2306 MHz, the ram was at 2090, temperature was 71° with a Junction temperature of 98°. That’s cooler than I’ve been seeing just a week ago.

I ran the stress test a couple more times - I got 2345 MHz GPU with the vram at 2090, 303W each time (over 60 seconds). On the first run the GPU temperature was 69° with a junction temperature of 98°. The highest it got tonight, on the 3rd try, was 73° GPU temperature and 101° junction with the same GPU clock and vram speeds.

Then I ran Time Spy with the Radeon overlay to watch the temps, and saw it was drawing 303W with the GPU clock hitting a peak of 2415 MHz, and the highest junction temperature I saw was 105° with a delta of 32°. But most of the time the junction temperature was closer to 101° to 103°. During this test the GPU temperature was 69 to 72°. This is cooler than it was running last week.

The only reason I can think of why it’s not going over 105° anymore at the hotspot is the newest Radeon software 21.8.2. The delta is still 28° to 32°.

EDIT - My GPU Time Spy score is 21612 after shutting down and re-running, so temps are down and performance is not.
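If anyone wants to sanity-check a delta from their own overlay or HWiNFO readings, it's just junction minus edge per sample. A throwaway Python sketch (the numbers here are just the figures quoted above, not a real sensor log):

```python
# Hotspot delta check: junction (hotspot) minus edge (GPU) temperature
# per sample. Sample values are the figures quoted in this post.
edge = [69, 71, 72, 73, 75]         # GPU (edge) temperature, degC
junction = [98, 98, 101, 101, 105]  # junction (hotspot) temperature, degC

deltas = [j - e for j, e in zip(junction, edge)]
print("deltas:", deltas)
print("min/max delta:", min(deltas), max(deltas))
print("avg delta: %.1f" % (sum(deltas) / len(deltas)))
```

Anything consistently under ~35° is in line with what Sapphire cards seem to report in this thread.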
 
No, it won't cause any damage; the GPU will just lower clock speed and voltage to keep the junction temperature at or below 110°C.
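For anyone curious what that protection looks like in principle, here's a toy Python model - emphatically not AMD's actual control loop, just an illustration of clocks backing off once the junction exceeds a limit (the 1%-per-degree ratio is made up):

```python
# Toy model of junction-temperature throttling. NOT AMD's real
# algorithm: just shows clocks easing off above a fixed limit.
T_LIMIT = 110.0  # degC, junction limit the card protects

def throttled_clock(base_clock_mhz: float, junction_c: float) -> float:
    """Reduce clock ~1% per degree over the limit (illustrative ratio)."""
    if junction_c <= T_LIMIT:
        return base_clock_mhz          # under the limit: full clock
    over = junction_c - T_LIMIT
    return base_clock_mhz * max(0.0, 1.0 - 0.01 * over)

print(throttled_clock(2400, 105))  # below the limit
print(throttled_clock(2400, 112))  # 2 degrees over: slightly reduced
```

The real firmware adjusts voltage as well as clocks and reacts much faster, but the net effect is the same: the card protects itself before any damage occurs.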
 
I have noticed computers with a 5950X CPU do a lot better with a 6900 than a 5800X, or should I say I've noticed more pairings of 5950X and 6900 than 5800X and 6900 in the top tables.
Pretty sure if I unleashed my 6900 with MPT and OC'd my 5800X (or messed about with it) I could get my score higher, but I just can't be bothered. Since I got the 6900 my Time Spy score has shot up around 1000 points I think,
so via driver updates that's pretty good.
 
What tables are you referring to? If it's the OcUK 3DMark threads that would be expected, as although the CPU scaling is far from ideal, the extra cores do result in higher CPU and Combined scores.
 

I occasionally swap out my 3080Ti and 6900XT between my 5950X rig and my 5800X rig, and my graphics or GPU scores don't seem to suffer with the 5800X, at least as far as I know, although I've had them on different drivers over time as they get moved around.

On July 29th my 6900XT was paired with my 5950X rig and my GPU score was 21388. Last night, August 29th, my GPU score while in the 5800X rig was higher at 21612, and it has been consistently in the 216xx range on both Radeon 21.8.1 and 21.8.2 in the 5800X rig over the past week. Granted, the total score was 19821 in the 5950X rig and only 19036 in the 5800X rig, despite the higher GPU score with the 5800X, due to the 8-core CPU scoring 2500 points lower on that portion.

I believe that if I put the 6900XT back with the 5950X CPU I would score even higher with the latest Radeon software, but I have them paired as they are for a reason. I think 4K gaming on my HDTV with the 6900XT is great as long as I don't turn on ray tracing, so when gaming in the family room with my 4K 60Hz HDTV I pair my 3080Ti with the family room rig, because I can use DLSS when using ray tracing at 4K. My 6900XT is an absolute beast in my bedroom rig with my Asus VG27AQ 1440p 165Hz monitor, and ray tracing isn't a total bust at 1440p (or 1080p) on the smaller monitor.

I know I can run my RX 6900XT at 1440p @ 120Hz with my 4K TV in the family room, but that isn't as crisp as running native 4K on the 4K TV or native 1440p on the 165Hz 27" monitor. I also know I can swap my 5950X rig into the bedroom with the 6900XT on the 27" 165Hz monitor, but it heats up my bedroom more. Also, I use the family room 4K HDTV rig much more often, and would prefer to have the faster CPU out there. They both have equivalent amounts of RGB, but I have 3 RGB fans on the front of the 5950X rig, and you can't see them with how the system has to be oriented in the bedroom (their marvelous RGB would be hidden).
 
Don't use 21.9.1: in my case the reported power fluctuates up to 540-630W on my RX 6900XT even with no overclock applied. This happens in Deathloop and RE Village, and even in 3DMark.
 
Looks like it's a misread of the ASIC power, as the PPT is correct. Give HWiNFO a run. I noticed the power all over the place yesterday. I'll check my son's 6700XT when I get 5 mins too.
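If you log to CSV from HWiNFO, a few lines of Python will show whether the spikes persist across a run. Note the column name below is hypothetical - match it to whatever your export actually calls the GPU power sensor:

```python
# Rough check of logged board power from a CSV sensor export.
# The column header "GPU Power [W]" is an assumption; rename to
# match your own log file.
import csv, io

# Inline sample standing in for an actual log file on disk.
log = io.StringIO(
    "Time,GPU Power [W]\n"
    "12:00:01,303\n"
    "12:00:02,305\n"
    "12:00:03,540\n"  # a suspicious spike like the one reported
)
watts = [float(row["GPU Power [W]"]) for row in csv.DictReader(log)]
print("avg: %.1f W, max: %.0f W" % (sum(watts) / len(watts), max(watts)))
```

If the average sits near the expected ~300W and only isolated samples jump, it points to a sensor/driver reporting glitch rather than the card actually pulling 600W.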
 
If you go into the Overlay area, you can do loads of tweaking. Location X/Y Position, Size %, Columns, Transparency, Text colour.
Size is either 50% or 100%; it's either too small or too big, and no amount of tinkering makes it suitable, so I just stick to the HWiNFO OSD :)

Checked Time Spy and my score was 100 higher, so no detrimental effect on the benchmarks there.

Also HWInfo reports the correct Watts for me.
 