
The Radeon RX9070XT / RX9070 Owners Thread

It's between 7 and 10 pts slower than FSR 3.1 on average, but it looks MILES better.

Another factor is that even FSR4 Performance mode (let alone Balanced) looks better than the older FSR's Quality mode. So it's not a regression in performance, it's a jump.

So comparing performance at quality vs quality is not remotely apples to apples.

Even balanced in FSR4 looks as good or better than DLSS3 quality. All this talk about DLSS4 transformer still being better is based on the caveat that it’s “marginal” differences.
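
For anyone wondering what those preset names actually mean in raw pixels: the standard FSR presets render internally at roughly 1.5x (Quality), 1.7x (Balanced) and 2.0x (Performance) below the output resolution per axis. A quick back-of-the-envelope sketch (illustrative only) of what that works out to at 1440p:

```python
# Internal render resolutions for the standard FSR quality presets,
# using the usual per-axis scale factors (Quality 1.5x, Balanced 1.7x,
# Performance 2.0x). Illustrative sketch only.

PRESETS = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def render_res(out_w, out_h, scale):
    return round(out_w / scale), round(out_h / scale)

out_w, out_h = 2560, 1440  # 1440p output
for name, scale in PRESETS.items():
    w, h = render_res(out_w, out_h, scale)
    share = (w * h) / (out_w * out_h)
    print(f"{name:12s} {w}x{h}   ({share:.0%} of the output pixels)")

# Quality      1707x960   (44% of the output pixels)
# Balanced     1506x847   (35% of the output pixels)
# Performance  1280x720   (25% of the output pixels)
```

So if FSR4 Balanced genuinely matches or beats the old FSR Quality on image quality, you're getting the same look from roughly a third of the pixels instead of nearly half, which is where the "free" performance comes from.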
 
What's the general justification for the price differential between the 5070 Ti and the 9070 XT? Is RT performance driving a big wedge between them?

The 15% difference is marginal and mainly due to outliers. In the majority of games my 9070 XT competes well against my 4080 in raster; in RT it's only the big outliers like PT in CP2077 or Wukong that make the gap look artificially worse than it is.

Go look at TPU RT individual scores to see what I mean.
 
The 15% difference is marginal and mainly due to outliers. In the majority of games my 9070 XT competes well against my 4080 in raster; in RT it's only the big outliers like PT in CP2077 or Wukong that make the gap look artificially worse than it is.

Go look at TPU RT individual scores to see what I mean.
Indeed, Wukong and Indiana Jones are massive outliers; the 9070 XT gets about 50% of the performance with full PT, which skews things somewhat. Neither game hits 60 fps on the 5070 Ti either, so it's not much of a comparison.
 
Another factor is that even FSR4 Performance mode (let alone Balanced) looks better than the older FSR's Quality mode. So it's not a regression in performance, it's a jump.

So comparing performance at quality vs quality is not remotely apples to apples.

Even balanced in FSR4 looks as good or better than DLSS3 quality. All this talk about DLSS4 transformer still being better is based on the caveat that it’s “marginal” differences.
My issue is that in Black Ops 6, when I keep the same settings but turn on FSR3 and set it to Quality, it performs worse than running at native with no scaling tech at all. I can't see why people would use FSR or DLSS if it makes performance worse.
 
My issue is that in Black Ops 6, when I keep the same settings but turn on FSR3 and set it to Quality, it performs worse than running at native with no scaling tech at all. I can't see why people would use FSR or DLSS if it makes performance worse.
I probably wouldn't use any upscaling on an FPS game at all regardless. Are there any videos that show FSR4 working as it should in BO6?
 
Indeed, Wukong and Indiana Jones are massive outliers; the 9070 XT gets about 50% of the performance with full PT, which skews things somewhat. Neither game hits 60 fps on the 5070 Ti either, so it's not much of a comparison.

Yeah, TPU for example includes Elden Ring RT in its benchmark suite. I mean, seriously!

I have also done the numbers on TPU's individual RT scores at 1440p: the 5070 Ti averaged 92.56 and the 9070 XT Pulse 86.3, and that includes some heavy RT games.
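
To show what I mean about outliers (with made-up per-game numbers here, not TPU's actual figures), a quick sketch of how a couple of heavy path-tracing results drag a plain average while the median barely moves:

```python
# Made-up per-game fps numbers, purely to illustrate how two path-tracing
# outliers skew a mean-based comparison between two cards.
from statistics import mean, median

fps_5070ti = {"Game A": 110, "Game B": 95, "Game C": 88, "Game D": 102,
              "CP2077 PT": 42, "Wukong PT": 38}
fps_9070xt = {"Game A": 106, "Game B": 92, "Game C": 86, "Game D": 99,
              "CP2077 PT": 22, "Wukong PT": 19}

def lead_pct(a, b):
    return (a / b - 1) * 100  # % lead of a over b

mean_gap = lead_pct(mean(fps_5070ti.values()), mean(fps_9070xt.values()))
median_gap = lead_pct(median(fps_5070ti.values()), median(fps_9070xt.values()))
print(f"mean gap:   {mean_gap:.1f}%")    # ~12%, dragged up by the two PT games
print(f"median gap: {median_gap:.1f}%")  # ~3%, closer to what the typical game shows
```

Same data, very different headline number depending on whether you let the two path-traced titles dominate the average.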
 
My issue is that in Black Ops 6, when I keep the same settings but turn on FSR3 and set it to Quality, it performs worse than running at native with no scaling tech at all. I can't see why people would use FSR or DLSS if it makes performance worse.
FSR3 is awful in BO6 and Warzone.

FSR4 is excellent.
 
I probably wouldn't use any upscaling on an FPS game at all regardless. Are there any videos that show FSR4 working as it should in BO6?
Been running FSR4 since the new season started and I prefer it to native in Warzone and BO6, crisp and clear.

Unlike FSR3, which is mushy.
 
My issue is that in Black Ops 6, when I keep the same settings but turn on FSR3 and set it to Quality, it performs worse than running at native with no scaling tech at all. I can't see why people would use FSR or DLSS if it makes performance worse.
What do you mean? Turning on FSR lowers your framerates?
 
Quick update on the issues I had. I've swapped my Sapphire Nitro+ OC 9070 XT for the Aorus Gigabyte OC 9070 XT and the bracket fits perfectly! I'm sending the other back to OCUK today for a refund. The booting and BIOS issues turned out to be down to a rogue USB device sitting in the EZ Flash port, so that was a red herring. Good to have a card that fits properly and isn't bodged in; it feels amazing to have a solid, reliable system again.

One last note: I did have to DDU again to get proper performance. I was having microstutters and weirdness despite going from one 9070 XT to another. I guess that may be down to the difference in clock speeds and maybe some other stuff cached in the background. As ever, thanks to everyone who helped me out, appreciate it.
 
Well, this has to be the weirdest thing I'll ever say I've done:
I just did a full-on goal celebration after discovering my TV is broken.
Normally that discovery would be unwelcome, but needing a new telly is far less of a headache than returning a GPU.
So thanks again everyone, turns out my telly just randomly decided to choose now to die lol.
 
Power virus? lol. It just stresses the GPU unrealistically, but will often find instability if it is there. It’s pretty much the Prime95 of GPU testing.
Prime95 stresses a CPU that has a chunky heatsink/AIO cooling just the CPU, and a CPU doesn't pull remotely near the same power as a discrete graphics card; it will also downclock/throttle and reduce its power draw if it has to.

Versus:

FurMark running the GPU/VRAM/VRMs combined at 100% sustained maximum board power draw, full throttle (360 W+ on the 9070s), all under a single heatsink. Add that new 12V connector to the mix, and it just loves heat...

And people can run FurMark for hours...

At a bare minimum, FurMark IS cooking the oil out of the VRAM/VRM pads/putty on the heatsink and severely degrading the TIM under the GPU, reducing thermal performance and, ironically, making your VRAM warmer.
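
If anyone is going to run a sustained load like that anyway, at least keep an eye on the numbers while it runs. A rough sketch for an AMD card on Linux via the amdgpu hwmon interface; the exact sensor file names and labels vary by kernel and driver version, so treat the paths below as assumptions and check them on your own system first (cat /sys/class/hwmon/hwmon*/name):

```python
# Rough monitoring sketch: poll the amdgpu hwmon sensors every couple of
# seconds while a stress test runs. Sensor availability differs between
# cards and kernels; anything missing just prints as "nan". Stop with Ctrl+C.
import glob
import time

def find_amdgpu_hwmon():
    for path in glob.glob("/sys/class/hwmon/hwmon*"):
        try:
            if open(f"{path}/name").read().strip() == "amdgpu":
                return path
        except OSError:
            pass
    return None

def read_scaled(hwmon, fname, divisor):
    try:
        return int(open(f"{hwmon}/{fname}").read()) / divisor
    except (OSError, ValueError):
        return float("nan")

hwmon = find_amdgpu_hwmon()
if hwmon is None:
    raise SystemExit("no amdgpu hwmon device found")

while True:
    edge = read_scaled(hwmon, "temp1_input", 1000)      # edge temp, degC
    hotspot = read_scaled(hwmon, "temp2_input", 1000)   # junction/hotspot, degC (if exposed)
    vram = read_scaled(hwmon, "temp3_input", 1000)      # memory temp, degC (if exposed)
    power = read_scaled(hwmon, "power1_average", 1e6)   # board power, W (if exposed)
    print(f"edge {edge:5.1f}C  hotspot {hotspot:5.1f}C  vram {vram:5.1f}C  power {power:6.1f}W")
    time.sleep(2)
```

If the VRAM number keeps creeping up run after run at the same ambient, that's the sort of pad/TIM degradation described above showing itself.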
 
My issue is that in Black Ops 6, when I keep the same settings but turn on FSR3 and set it to Quality, it performs worse than running at native with no scaling tech at all. I can't see why people would use FSR or DLSS if it makes performance worse.
Are you CPU bottlenecked? You shouldn't have lower performance with FSR 3 / 4 upscaling than Native unless you are CPU bottlenecked.
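
To put some (made-up) numbers on that: delivered fps is roughly capped by whichever side is slower per frame, and upscaling only shrinks the GPU side, so past the CPU limit it does nothing, and its own fixed cost can even nudge you slightly below native:

```python
# Toy model with invented frame times: delivered fps is limited by the
# slower of the CPU and GPU per frame. Upscaling reduces GPU render time
# but adds a small fixed upscale pass and does nothing for the CPU side.

def delivered_fps(cpu_ms, gpu_render_ms, upscale_ms=0.0):
    frame_ms = max(cpu_ms, gpu_render_ms + upscale_ms)
    return 1000.0 / frame_ms

cpu_ms = 7.0  # hypothetical ~143 fps CPU-side limit

native = delivered_fps(cpu_ms, gpu_render_ms=9.0)                  # GPU-bound at native
fsr_q = delivered_fps(cpu_ms, gpu_render_ms=5.5, upscale_ms=1.0)   # now CPU-bound

print(f"native:      {native:.0f} fps")  # ~111 fps
print(f"FSR Quality: {fsr_q:.0f} fps")   # ~143 fps, capped by the CPU

# If the game is already CPU-bound at native (say cpu_ms = 12.0), upscaling
# changes nothing in this model; in practice the upscale pass also adds a
# little CPU/driver work, which is how it can end up marginally slower
# than native.
```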
 