ASROCK 6900 XT NOW ONLY £689.99 (£649.99 FOR LOYAL FORUM MEMBERS)

Had a bit of a nightmare installing the 6900 XT, a new power supply and an AIO, as I'm a bit of a novice. Had that heart-attack moment when nothing worked, but eventually worked out it was because I hadn't swapped some of the old power supply cables out and had just plugged them into the new power supply. When I replaced the rest of the cables it was all good, and it's running smoothly so far.

You should never use old power supply cables with a new PSU; if they're from a different brand it can cause serious damage, as they are not all wired the same at the PSU end of the cable. Different brands use different pinouts at the PSU end.
 
Yep, found that out when I was troubleshooting my issues last night, but it's not something I was aware of beforehand.
 

You're lucky nothing went pop. As said above, never use cables from a different PSU, even from the same brand, as it depends on how each one is wired; even Corsair has differently wired PSUs and states the cable type needed on their site for extensions etc.

Someone was on your side; luckily it may have tripped short-circuit detection before powering up the rest of the system. Make sure to keep the old cables with the old PSU and not mix them up.
 
Yeah, unplugging all my original PSU cables was a nightmare, but I'd heard a horror story about some guy who left his old PSU cables in and it was not good. Fortune favours you! The heatsink is this huge for a reason; the temperatures are pretty cool even when running at practically 90%+ utilization. Quite happy with the card so far.
 
I find I can hardly hear the fans in normal use (I haven't touched the fan profile, but 35% is about where I can hear them, so they must be below that most of the time).
 
Was that the hotspot temp?
OK, so I gave it a go; here is the peak temperature from my test playing Call of Duty. I haven't adjusted anything on the card, and it all looks pretty decent to a layperson!

Date: 29/11/2022 20:53
GPU Clock: 2496 MHz
Memory Clock: 1988 MHz
GPU Temperature: 65 °C
GPU Temperature (Hot Spot): 85 °C
Fan Speed: 50% (1661 RPM)
GPU Load: 99%
Memory Controller Load: 53%
Memory Used (Dedicated): 7811 MB
Memory Used (Dynamic): 724 MB
GPU Chip Power Draw: 273 W
GPU Voltage: 1.175 V
CPU Temperature: 48.9 °C
System Memory Used: 9418 MB
 
Probably not the best gauge; I was just referring to the temperature in the Modern Warfare 2 in-game reading, so I didn't see the hotspot temperature. I haven't measured using any monitoring software per se. MSI Afterburner for this?
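If you want to check this properly, a monitoring tool that logs to a file (GPU-Z's sensor log, for instance, which the readings above came from) makes it easy to pull the peak value out afterwards. A minimal sketch, assuming a comma-separated log whose header row uses sensor names like those above; the exact header strings and the filename are assumptions, so check your own log:

```python
import csv

LOG_FILE = "gpuz_log.txt"  # hypothetical filename: point this at your own log
HOTSPOT = "GPU Temperature (Hot Spot) [°C]"  # assumed header string

peak = None
with open(LOG_FILE, newline="", encoding="utf-8") as f:
    for raw in csv.DictReader(f, skipinitialspace=True):
        # sensor logs often pad columns with spaces, so strip keys and values
        row = {k.strip(): (v or "").strip() for k, v in raw.items() if k}
        try:
            temp = float(row[HOTSPOT])
        except (KeyError, ValueError):
            continue  # skip malformed or incomplete lines
        if peak is None or temp > peak[0]:
            peak = (temp, row.get("Date", "?"))

if peak:
    print(f"Peak hot spot: {peak[0]:.0f} °C at {peak[1]}")
```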
The first thing I installed after setting up my new PC was Warzone 2.0, and I panicked at the heat it was showing... however, both AMD Adrenalin and the NZXT fan software were reporting temps much, much lower (and they matched each other).
So I've come to the conclusion that the temperature shown in MW2/Warzone 2 is seriously wrong somewhere.
 
Unless it's showing the hot spot temp?
 
Quite possibly. My hotspot temperature from GPU-Z was showing 85 °C, with MW2 somewhere in between the GPU temperature and the hotspot. I'm guessing MW2/WZ2 isn't overly accurate.

I'm not concerned, as my fan looked to be only running at 50% (and quiet); I was just curious as to what everybody else's card was running at. Really can't fault the card at all! Honestly can't get over the difference between this at 1440p and my PS4 running 1080p at 60fps! Worlds apart.
 
Just out of interest, what is the power consumption going up to for other people? I'm just worried that mine is not drawing enough, as I saw in some reviews that a 6900 XT typically draws 330 W, but mine never draws more than 280 W from what I have seen. Is anyone else seeing this?
 
Really impressed with this card for gaming. I had a faulty 3090 Ti that I picked up when prices dropped; it went back, and this basically replaced it. In gaming this thing seems to match it almost like for like, probably because I'm only running at 1440p, but even in things like MSFS I honestly don't see any difference (I never liked DLSS in MSFS; it makes cockpits and controls blurry).

I'm sure the Nvidia card is better for video editing etc., but that's not something I do at all. All in, though, £650 seems like a bargain compared to the other options.
 
Mine was drawing 273 W at 99% GPU load by the looks of it. I don't know whether that was the peak, though; my table a few posts above was sorted by highest hotspot temperature.
 
Yeah, that is pretty similar to mine; when it goes up to 99% it can sometimes hit 282 W, but it mostly runs lower than that. Can different 6900 XTs draw that much more power than this card? If that is a thing then fair enough, I just did not think there would be this much of a difference in typical gaming.
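For comparing like for like rather than eyeballing an overlay, the same sort of sensor log can give an average and a peak over a session. A sketch under the same assumptions as the log-parsing example above (the header strings and filename are assumptions, and the 95% load cutoff is an arbitrary choice to filter out idle samples):

```python
import csv

LOG_FILE = "gpuz_log.txt"          # hypothetical filename, as above
POWER = "GPU Chip Power Draw [W]"  # assumed header strings, matching the
LOAD = "GPU Load [%]"              # sensor names in the log entry above

samples = []
with open(LOG_FILE, newline="", encoding="utf-8") as f:
    for raw in csv.DictReader(f, skipinitialspace=True):
        row = {k.strip(): (v or "").strip() for k, v in raw.items() if k}
        try:
            # only count samples where the card is actually being worked hard
            if float(row[LOAD]) >= 95:
                samples.append(float(row[POWER]))
        except (KeyError, ValueError):
            continue

if samples:
    avg = sum(samples) / len(samples)
    print(f"{len(samples)} samples at >=95% load: "
          f"avg {avg:.0f} W, peak {max(samples):.0f} W")
```

One thing worth bearing in mind: the sensor logged above is labelled "GPU Chip Power Draw", which, as I understand it, covers the GPU chip itself rather than the whole board, so some gap against the ~330 W whole-card figures reviewers measure is to be expected.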
 
I don't really get this FSR stuff. The game has options for Balanced, Quality or Performance.

If I set it to Quality, for example, am I getting a worse visual experience than if I just left FSR off entirely?

The only option I have enabled in the Adrenalin software is the VRR one.
 
Upscaling tech is usually visually inferior to native.

The tradeoff is visual fidelity for frame rate. One improves, the other gets worse.
 

So if I'm happy with the frame rate there is no incentive to turn it on?

If I just wanted a little boost, would I start from Quality and then work backwards if I wanted more FPS, with Performance being the biggest hit to visual quality but the biggest frame rate boost?
 
That is correct.
 
I think part of the point of FSR is that it upscales from a lower render resolution in a way that gives you more FPS whilst making it hard to tell whether it is upscaled or not. I suggest you give it a try and see if you can even tell the difference.
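To put numbers on that: FSR renders the game at a reduced internal resolution and upscales the result to your output resolution, and AMD's published per-axis scale factors are 1.5x for Quality, 1.7x for Balanced and 2.0x for Performance. A minimal sketch of the arithmetic (the factors come from AMD's FidelityFX Super Resolution documentation; the helper function itself is just for illustration):

```python
# Per-axis scale factors from AMD's FidelityFX Super Resolution docs:
# the game renders at (output / factor) per axis, then upscales to the output.
FSR_FACTORS = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution FSR uses for a given output size and preset."""
    factor = FSR_FACTORS[mode]
    return round(out_w / factor), round(out_h / factor)

for mode in FSR_FACTORS:
    w, h = render_resolution(2560, 1440, mode)  # 1440p output, as in the thread
    print(f"{mode:>11}: renders at {w}x{h}")
```

Which is why Quality is the gentlest preset: at 1440p it upscales from roughly 1707x960, whereas Performance starts from 1280x720.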
 