
Many UE5 games require GPUs like the RX 9070 or RX 7900 XT to get a consistent 60+ FPS (1080p Ultra)

Some of the latest Techspot graphics card reviews are including UE5 games, like:
  • Oblivion: Remastered
  • Clair Obscur: Expedition 33
  • Stalker 2: Heart of Chornobyl
Example: [Techspot 1080p benchmark chart for The Elder Scrolls IV: Oblivion Remastered (Elder-1080-p.webp)]


Review link:
https://www.techspot.com/review/2992-nvidia-geforce-rtx-5060/#The_Elder_Scrolls_IV

Recent TechPowerUp reviews also include Avowed, which uses UE5.

Each of these games ideally needs a GPU like an RX 9070 or RX 7900 XT to achieve a consistent 60 FPS (1% lows above 60 FPS).
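
For reference, here is a minimal sketch of one common way a '1% low' figure is derived from a frame-time capture (review sites differ slightly in how they calculate it, so treat this as illustrative rather than any particular site's exact method):

    # One common way to derive a "1% low" FPS figure from per-frame render
    # times (in milliseconds): average the slowest 1% of frames and convert
    # that back to frames per second.
    def one_percent_low_fps(frame_times_ms):
        slowest = sorted(frame_times_ms, reverse=True)
        worst = slowest[:max(1, len(slowest) // 100)]
        return 1000.0 / (sum(worst) / len(worst))

    # Example: a 1,000-frame capture that is mostly ~16.7 ms (60 FPS) frames
    # with ten 25 ms spikes still averages ~60 FPS, but the 1% low is ~40 FPS.
    capture = [16.7] * 990 + [25.0] * 10
    print(round(one_percent_low_fps(capture)))  # -> 40

In other words, a card can post a healthy average and still dip well below 60 in its worst frames, which is why the 1% low is the more useful target here.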

That is before taking into consideration the loading/traversal stutter present in some titles (a different topic).

So I think that's worth considering when building or upgrading any serious gaming PC.

So a minimum of ~£560 for the GPU is what you'll be looking at. Particularly if you want to future-proof your system a bit, this is probably what you will need for games like The Witcher 4 when it comes out.

Potentially less, if you use technologies like frame generation to boost your framerate in these titles.

Full list of UE5 games:
https://en.wikipedia.org/wiki/Category:Unreal_Engine_5_games
 
I was typing that out on my defective iPad earlier, so some of the text was in bold due to text formatting problems.
 
For those UE games, it's possible to play a lot of them in VR using something called UEVR. A 4090 is needed for a good experience with many of them, and a 5090 is preferable. The games usually need to run at a minimum of 90 FPS, and they render two views, so it takes more than double the graphical processing.
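
As a rough illustration of the 'more than double' point, assuming a Quest 3-class headset at about 2064x2208 per eye and a 90 Hz target (the actual per-eye render resolution varies by headset and render scale):

    # Back-of-the-envelope pixel throughput: flat 1080p60 vs. stereo VR at 90 Hz.
    flat = 1920 * 1080 * 60      # one view at 60 FPS
    vr = 2 * 2064 * 2208 * 90    # two eyes at 90 FPS (assumed per-eye resolution)

    print(f"Flat 1080p60: {flat / 1e6:.0f} Mpx/s")  # ~124 Mpx/s
    print(f"VR at 90 Hz:  {vr / 1e6:.0f} Mpx/s")    # ~820 Mpx/s
    print(f"Ratio: {vr / flat:.1f}x")               # ~6.6x

It's crude, since it ignores CPU cost and fixed per-frame overheads, but it shows why a 4090/5090-class card is where UEVR starts to feel comfortable.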
 
UE5 has the option to run games with hardware-based Lumen, which is very demanding on GPUs. If you turn it down or off, performance improves significantly (or use software Lumen).
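
For PC titles that respect ini overrides, something along these lines in the game's Engine.ini is the usual way to force the software Lumen path; treat the exact console variables as an assumption, since not every game exposes or honours them:

    [SystemSettings]
    ; Assumed UE5 console variables - availability and effect vary per game.
    r.Lumen.HardwareRayTracing=0          ; prefer the software Lumen tracing path
    r.DynamicGlobalIlluminationMethod=1   ; keep Lumen as the GI method rather than disabling GI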

If you want games with better lighting and visuals, that's the price you have to pay. Also, UE5 is going to be with us for a while; as developers get more familiar with how the engine works, games will start to run better.
 
That's true. HW Lumen does significantly change the look of games like TES IV: Oblivion Remastered, though. I wouldn't particularly want to turn that option off.
 
You're not obliged to play every game on maximum settings with every single slider cranked. I'm not sure why that's seemingly become an expectation for people buying mid-range GPUs. If you're getting an x60-class card then you should expect to have to make settings compromises in the most demanding games. That's simply not anything new. I just went and had a look at a GTX 660 review for funsies and that didn't get anywhere near 60fps (on average even) at 1080p in the more demanding games at the time of its release, such as Battlefield 3 and Crysis 2. Of course, then just as now, you could generally simply tweak a few settings and get a very playable experience.

If anything, things have gotten much better on that front with the upscaling crutches we now have, which didn't exist back then. Whether people like it or not, these games are pretty much made with the idea that you'll be playing them with DLSS/FSR/XeSS enabled. People who own a 5090 may not like having to rely on such things, but they're great tools for boosting the performance of lesser cards, and certainly better than just playing at a lower resolution and dealing with monitor scaling like back in the day.

I'd also suggest that it's extremely foolish to be trying to "futureproof" your machine for a game like The Witcher 4 at this point, several years away from its release (at least). Doubly so if you're in the market for mid-range cards.

I'm not trying to defend the current state of the GPU market, but I'm also tired of this idea that benchmarks performed on maximum settings are a realistic real world scenario for mid-range cards. You can usually get a massive boost in framerate for a tiny loss of visual fidelity by tweaking things. Digital Foundry show that all the time. If you want to crank every single slider and never have to compromise, buy a 5090.
 
It is worth considering when building a system, but even assuming ultra quality is actually a decent quality level, I personally wouldn't expect to be able to run it at decent frame rates on just any card. For me the benchmark for reasonableness would be high settings, but again that depends on how good the graphics actually are, which is always going to be a bit subjective.
 
If you want to crank every single slider and never have to compromise, buy a 5090.
It's not necessary, and you will still have to deal with issues sometimes. There are several reasons to buy higher-end hardware, like enabling ray tracing or playing at higher resolutions or framerates.

The general problem with always buying the very best is that you will find yourself wondering when the next product will come along and replace the previous offering, and it might only be a year.

Like with Nvidia's Super cards.

There's also value to consider. Nvidia's top products (like the RTX 5090) are typically priced ~£2000.

If you pay double the price of the RX 9070 or RX 9070 XT, you might reasonably expect to get double the performance. But you don't.

For the sake of argument, let's say you can buy an RTX 5090 for £1,300 (instead of £2,000). Arguably not a bad price, but still not a 100% improvement in 3D/raster performance.
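
To put rough numbers on that (purely illustrative figures, not benchmark data; the point is the ratio, not the absolute FPS):

    # Hypothetical frames-per-pound comparison. The FPS figures are made up
    # purely for illustration; only the ratios matter.
    cards = {
        "RX 9070 XT, ~£570, ~100 fps": (570, 100),
        "RTX 5090, £2,000, ~170 fps": (2000, 170),
        "RTX 5090, £1,300, ~170 fps": (1300, 170),
    }
    for name, (price, fps) in cards.items():
        print(f"{name}: {fps / price:.3f} fps per £")
    # Even at £1,300 the flagship delivers fewer frames per pound than the
    # mid-range card, and nowhere near double the performance.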

That won't be possible until Nvidia's next generation, and again, probably for £2,000 or more.

Additionally, you will need a pretty beefy power supply to handle a card like this, maybe 1000 W or more.
 
You know what I find hilarious about all this?

These games run at 60 FPS at 1440p on a PS5, whose GPU is roughly equivalent to an RX 6600 XT.
 
Correct, you don't need to play at the Ultra preset. But I would typically only mess with the presets or individual settings when nothing else has worked. For one thing, you can spend a long time playing around with settings rather than actually enjoying the game.

Ideally, if a PC meets the recommended spec, I would expect 1080p ultra at 60 FPS. Developers now sometimes specify resolutions in system requirements as well.

The 'High' preset is often a good compromise, especially on a GPU that is a few years old.

That's the nice thing about PCs, you can choose a preset like High, Medium or Low, depending on how out of date the hardware is.
 
certainly better than just playing at a lower resolution and dealing with monitor scaling like back in the day.
To be fair, CRT handled motion much better. 30 fps on OLED or TN is not a good experience, and is better on CRT. I was playing Alan Wake 2 a couple of months back on a CRT and posted that even at such a low resolution it still looked great. Think it was 480p I was playing at. Commented that it was free, like DLSS.

I'm not implying that we should go back to CRT, but it does show that the display technology can make a big difference to the GPU power you need. I still hold out for a new tech that can combine the pros of OLED with the pros of CRT (input lag, motion clarity and resolution scaling). That would reduce the GPU power needed for an equal experience. 100 fps on a CRT is like 200+ on an OLED, and resolution scaling would also cut out the need for DLSS, or if combined with DLSS would give another big jump. I have seen that there are people experimenting with new types of BFI to simulate CRT, which could potentially become a hardware feature in new monitors and TVs :)

EDIT* Maybe I have got ahead of myself. You may not have had CRT in mind and I went on a spiel about the good old CRT days. I just saw the games you mentioned, automatically remembered playing on CRT, and was commenting from that perspective :D
 
You know what I find hilarious about all this?

These games run at 60 FPS at 1440p on a PS5, whose GPU is roughly equivalent to an RX 6600 XT.
That's partly because the Playstation isn't being crippled by bloody Windows.

I don't see this as a problem or a big deal. Crysis/CryEngine 1 crippled everything when it came out back in the day. Eventually graphics cards became more powerful and Crysis became easy to run. Fast forward to 2025, and I don't see any difference between today with UE5 and its fancy ray tracing techniques and where we were with CryEngine 1.
 
If you want everything on Ultra just go for a walk. Having said that, many poor folk still only get medium or low settings, though in some cases at least they can enable spectacle-based upscaling.
 
Ark Survival Ascended just put out a big update the other day that took it from UE5.2 to 5.5, and it's definitely improved minimum framerates a bit for me at Epic settings at 1440p (granted, I don't have the grass and water variables maxed out), but then I am on a 7900 XT anyway.
5.6 has also just been released, which should enable devs to optimise more for lower-end hardware.

But you definitely don't need to play everything on max to have a good experience; I was running it on a mix of medium and high earlier in the year (pre-patch) when I was playing on a competitive server.
This was just as true years ago; I can think of loads of games that I couldn't play at max on release, often not until several generations of GPUs later (Crysis obviously, but also Metro 2033, GTA 4, etc.).
 
medium or low settings, though in some cases at least they can enable spectacle-based upscaling.
These seem like good options if you're using older hardware. The goal here is to make sure that you can at least play the latest games with some fluidity.

One thing I hope never happens is that ray tracing becomes part of the default 'Ultra' preset, which has become the de facto standard in (PC) games at basically all resolutions.

RT should remain in its own preset above Ultra; otherwise I think it just becomes a game of tweaking settings to find the right RT options...

If RT becomes the norm for game presets, all that will do is push up the prices of graphics cards and push people towards more RT-capable graphics cards, e.g. Nvidia's.
 