
*** The AMD RDNA 4 Rumour Mill ***

When I was looking at the computerbase.de comparisons yesterday, it seemed to me like FSR4 sharpened slightly more on objects close to the camera and slightly less on things far away compared with DLSS 4. I'm not sure which I preferred, possibly FSR4 (it seemed less like a sharpening filter), though I think DLSS4 is probably still better in motion. I want to see FSR4 displaying those fan blades and things they used to show how much better it was than DLSS3.

EDIT: actually I may have been thinking of the DLSS3 comparisons for that sharpening difference. I can't remember, so ignore me.
 
It's a mid-range GPU; it's not meant for that.

I also don't get why people would want to upgrade gen to gen anyway. Even more so when it's mid range to mid range.

I looked at my OCUK account last night as I was saving my address to try and get an order through later. The last card I bought was in 2018... if we reckon two gens' time will be 2029/2030, I could literally be buying one GPU a decade. Upgrading every gen is madness to me, especially in the current climate.
 
I also don't get why people would want to upgrade gen to gen anyway. Even more so when it's mid range to mid range.

I looked at my OCUK account last night as I was saving my address to try and get an order through later. The last card I bought was in 2018... if we reckon two gens' time will be 2029/2030, I could literally be buying one GPU a decade. Upgrading every gen is madness to me, especially in the current climate.
I've been rocking my 2080 Ti for the last 3+ years; the 9070 XT looks like a viable upgrade to this without sending the wife crazy!
 
I also don't get why people would want to upgrade gen to gen anyway. Even more so when it's mid range to mid range.

I looked at my OCUK account last night as I was saving my address to try and get an order through later. The last card I bought was in 2018... if we reckon two gens' time will be 2029/2030, I could literally be buying one GPU a decade. Upgrading every gen is madness to me, especially in the current climate.
For some people, this is their main hobby. Not necessarily gaming, but just being involved/invested in the latest tech and enjoying that part of it.

The yearly amount I spend on PC parts is nothing compared to what my best mate spends on golf...!
 
For some people, this is their main hobby. Not necessarily gaming, but just being involved/invested in the latest tech and enjoying that part of it.

The yearly amount I spend on PC parts is nothing compared to what my best mate spends on golf...!

Golf bats are expensive.....
 
 
Yes, to that point, Tim@HWUB commented that he thought the FSR4 demo he saw was running in Quality mode, then found out it was running in Performance mode (!)... so it's a decent step up for sure, especially in motion.
 
I'm thinking about switching from Nvidia and getting a 9070 XT, but I have a question. On Nvidia GPUs, if you set Low Latency Mode to Ultra, turn V-Sync on and enable G-Sync, the driver automatically caps the frame rate slightly below your monitor's refresh rate and eliminates tearing without increasing latency. This is better than using an in-game frame rate limiter, which can still show tearing unless you set the limit much lower. Does AMD have an equivalent feature?
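For reference on how far below the refresh rate that automatic cap sits: latency testers have reported it follows roughly refresh − refresh²/3600 (about 138 fps on a 144 Hz panel, 224 fps on a 240 Hz one). That is a community observation rather than anything Nvidia documents, so treat the sketch below as a rough approximation only.

# Rough sketch of the community-reported approximation for the automatic FPS cap
# applied with Low Latency Mode = Ultra + V-Sync + G-Sync. The exact behaviour is
# driver-internal; refresh - refresh^2/3600 comes from latency testing and is not
# an official Nvidia formula.

def auto_fps_cap(refresh_hz: float) -> float:
    """Approximate frame-rate cap a little below the display refresh rate."""
    return refresh_hz - (refresh_hz * refresh_hz) / 3600.0

for hz in (60, 120, 144, 165, 240):
    print(f"{hz} Hz display -> cap around {auto_fps_cap(hz):.0f} fps")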
 
Yes, to that point, Tim@HWUB commented that he thought the FSR4 demo he saw was running in Quality mode, then found out it was running in Performance mode (!)... so it's a decent step up for sure, especially in motion.

What was the input resolution?
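For context on that question: each upscaler preset corresponds to a fixed per-axis scale factor, so the input resolution follows directly from the output resolution. A minimal sketch, assuming FSR4 keeps the published FSR 2/3 ratios (which hasn't been confirmed):

# Per-axis scale factors published for FSR 2/3; whether FSR4 uses the same
# values is an assumption here, for illustration only.
FSR_SCALE = {
    "Quality": 1.5,            # 4K output rendered at 2560x1440
    "Balanced": 1.7,
    "Performance": 2.0,        # 4K output rendered at 1920x1080
    "Ultra Performance": 3.0,
}

def input_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and preset."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

for mode in FSR_SCALE:
    print(mode, input_resolution(3840, 2160, mode))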
 
I'm thinking about switching from Nvidia and getting a 9070 XT, but I have a question. On Nvidia GPUs, if you set Low Latency Mode to Ultra, turn V-Sync on and enable G-Sync, the driver automatically caps the frame rate slightly below your monitor's refresh rate and eliminates tearing without increasing latency. This is better than using an in-game frame rate limiter, which can still show tearing unless you set the limit much lower. Does AMD have an equivalent feature?

I don't know about all of that, but you can set V-Sync and cap the frame rate on a per-game basis or globally in the drivers. I don't know what "Ultra low latency mode" is.
 
AMD have Anti-Lag and Anti-Lag 2; the latter requires in-game support.

Ah I see, in that case yes, you can set custom settings in the driver for pretty much anything you can think of, either per game as a game profile or globally, even per-game custom overclocks. AMD's driver functionality goes very deep and it's very extensive.

I'm not at my PC right now or I would show you.
 
I'm thinking about switching from Nvidia and getting a 9070 XT, but I have a question. On Nvidia GPUs, if you set Low Latency Mode to Ultra, turn V-Sync on and enable G-Sync, the driver automatically caps the frame rate slightly below your monitor's refresh rate and eliminates tearing without increasing latency. This is better than using an in-game frame rate limiter, which can still show tearing unless you set the limit much lower. Does AMD have an equivalent feature?
Cool and Quiet. It's been there for years. Mapped to a hotkey and applied to all those games that want to draw 700W at over 1000 fps in the damn menus.
 