NVIDIA 4000 Series

Well, my Hogwarts key has just arrived; I'll be putting up some XTX and 4090 videos to see how well or badly the game runs.

Guess we know it runs badly already, but some vids showing how it runs now, for comparison over the coming months to see if they can/will optimise performance, would be good. Or we'll all need to accept an upgrade to a 5090 to play it.

Nvidia conditioning titles for FOMOs like me.

Already had Nexus yesterday with talk like 'Next Gen' games on UE5/5.1 - upgrade FOMO conditioning language.

Off on holibobs - reining in the wife's spending so I can save for a FOMO 5090.
 
I'm more interested in 1080p RT results to be fair, because I'm pretty certain Steve ***cked up. There is no way a 7700X suffers that badly from the Nvidia overhead.

ComputerBase and PCGamesHardware will hopefully have their results soon so we can see what is what; I suspect they are probably waiting for the day 1 patch.

So far Jansn and TechPowerUp have shown similar performance figures where everything ***** the bed, and a 3080 actually does all right compared to RDNA 2 and even RDNA 3, especially when DLSS/FSR is used...

Guess we know it runs badly already, but some vids showing how it runs now, for comparison over the coming months to see if they can/will optimise performance, would be good. Or we'll all need to accept an upgrade to a 5090 to play it.

Nvidia conditioning titles for FOMOs like me.

Already had Nexus yesterday with talk like 'Next Gen' games on UE5/5.1 - upgrade FOMO conditioning language.

Off on holibobs - reining in the wife's spending so I can save for a FOMO 5090.

Don't worry, Matt will have his fresh install of Windows on, the best components there are on the market, all highly tuned and running a custom BIOS etc., with the air conditioning unit right beside the PC, to show how lovely it performs for him and how all the review sites' performance figures are wrong :D :cry:
 
Don't worry, Matt will have his fresh install of Windows on, the best components there are on the market, all highly tuned and running a custom BIOS etc., with the air conditioning unit right beside the PC, to show how lovely it performs for him and how all the review sites' performance figures are wrong :D :cry:
You're such a troll on these forums lol. If that's what LtMatt gets, great - or do you call out other users for getting the best out of their systems and posting it on OVERCLOCKERS UK? :D

Strange, given all the benchmark threads that have existed for more than a decade with users doing just that for the top spots.
 
Why is this game even on Unreal Engine 4 and not 5? From my googling it states Unreal Engine 4 - why couldn't they have just used the latest engine before release?

You can't just trivially swap engine versions like that. It takes a ton of work, and masses more to get the best out of the new version, especially on bigger games. You need to decide early in the dev cycle which engine version to use.
 
You're such a troll on these forums lol. If that's what LtMatt gets, great - or do you call out other users for getting the best out of their systems and posting it on OVERCLOCKERS UK? :D

Strange, given all the benchmark threads that have existed for more than a decade with users doing just that for the top spots.

Thing is, I'm not trolling - Matt has said that himself and has even posted pics of an air con unit right beside his PC :cry: :D

But in all seriousness, I've got no problem with that, it's just funny when these review sites are the source of truth when it comes to Nvidia ******** the bed, but when it comes to AMD GPUs, they're wrong because Matt can get an extra 25+% from his setup :p :D
 
Or it'll get tweaked so that VRAM/texture/asset streaming is properly handled/switched/managed, i.e. what's called optimisation, which is clearly lacking if people are having issues at 720p too...

Unless you think that VRAM usage at 1080p should be the same as 4K VRAM usage (which, just to highlight, shouldn't be happening if you know anything about how VRAM usage scales with resolution).

[vram usage pic]
I mean, looking at these VRAM amounts it seems clear that RT isn't an option for about 90% of cards.

Lowering the resolution makes little difference; we're still using over 13GB of VRAM.

We've seen this over and over with RT enabled, it uses a lot more VRAM in many titles.

I agree that they could probably improve the texture LOD or streaming system... But I don't expect much improvement.
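To put some rough numbers on the resolution point above, here's a minimal back-of-envelope sketch. All of these figures are my own assumptions, not measurements from the game: render targets scale with pixel count, a UE-style texture streamer requests higher mips at higher output resolutions, and things like geometry and RT acceleration structures stay roughly fixed regardless of resolution.

```python
# Hand-wavy model of how VRAM use can scale with output resolution.
# All sizes below are assumed/illustrative numbers, not taken from Hogwarts Legacy.

def render_targets_gb(width, height, bytes_per_pixel=8, num_targets=10):
    # Colour, depth, G-buffer and post-process buffers: proportional to pixel count.
    return width * height * bytes_per_pixel * num_targets / (1024 ** 3)

def streamed_textures_gb(width, pool_at_4k_gb=6.0, reference_width=3840):
    # Crude model: the streamer wants roughly one mip less (a quarter of the memory)
    # per halving of output width, but never drops below a floor.
    scale = (width / reference_width) ** 2
    return max(pool_at_4k_gb * scale, pool_at_4k_gb * 0.25)

STATIC_GB = 4.0  # assumed: meshes, BVH/RT acceleration structures etc., resolution independent

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    total = STATIC_GB + render_targets_gb(w, h) + streamed_textures_gb(w)
    print(f"{name}: ~{total:.1f} GB estimated")
```

On numbers like these, dropping from 4K to 1080p only saves a few GB, which would at least be consistent with people still seeing 13GB+ at lower resolutions when RT is on.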
 
I mean, looking at these VRAM amounts it seems clear that RT isn't an option for about 90% of cards.

We've seen this over and over with RT enabled, it uses a lot more VRAM in many titles.

I agree that they could probably improve the texture LOD or streaming system...

Even regardless of VRAM, it's not an option for 99% of GPUs, even £1300+ ones, unless you game at 1080p/1440p :cry:

On a more serious note, hopefully ComputerBase and/or PCGamesHardware will get performance figures with DLSS/FSR, as "everyone" will be using these in Hogwarts and they'll be a better representation for the masses :)

EDIT:

Also, when you have people who have set everything to low and are playing at 720p getting drops to 10-20 fps, don't tell me that this is a case of just needing more VRAM... :o

I don't suspect there will be "much" improvement at 4K, but there is certainly room for improvement at resolutions below 4K.
 
This is what ComputerBase.de said about enabling RT:
"The GeForce RTX 4070 Ti does not have any performance problems in Ultra HD with DLSS Quality and otherwise full graphics quality, but textures are not loaded for some objects, and others disappear completely for a short time, as the engine is probably trying to salvage the situation somehow."

So the game doesn't seem to punish you much, in performance terms, for going over the VRAM budget.
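That would fit the usual texture-streaming trade-off: when the requested mips exceed the pool, the streamer keeps lower-resolution mips (or just skips the upload) rather than stalling the frame, so the frame rate holds but objects look blurry or pop in late. Here's a toy sketch of that idea, purely illustrative and not the game's or UE4's actual streamer:

```python
def fit_to_budget(requests, budget_mb):
    """requests: list of (texture_name, [mip chain sizes in MB, lowest detail first])."""
    resident = {}   # texture name -> index of the mip level currently kept resident
    used = 0.0
    # Start everything at its lowest mip, then promote textures while the budget allows.
    for name, mips in requests:
        resident[name] = 0
        used += mips[0]
    promoted = True
    while promoted:
        promoted = False
        for name, mips in requests:
            level = resident[name]
            if level + 1 < len(mips):
                extra = mips[level + 1] - mips[level]
                if used + extra <= budget_mb:
                    resident[name] = level + 1
                    used += extra
                    promoted = True
    return resident, used

# Made-up example scene: with a 200MB budget some textures simply stay at a lower mip,
# and rendering carries on regardless - no frame-time hit, just worse-looking textures.
scene = [("castle_wall", [4, 16, 64, 256]),
         ("character",   [2, 8, 32, 128]),
         ("foliage",     [1, 4, 16, 64])]
print(fit_to_budget(scene, budget_mb=200))
```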
 
You can't just trivially swap engine versions like that. It takes a ton of work, and masses more to get the best out of the new version, especially on bigger games. You need to decide early in the dev cycle which engine version to use.
UE4 is easily performant enough. The game's use of this engine is not the reason GPUs are struggling in this title.
 
I wonder if a lot of the people testing this game are using 16GB of RAM, rather than 32GB.

Maybe this hits the game particularly hard if you exceed your VRAM budget, e.g. by enabling RT.
 
From the other thread.

Finally someone addressing the VRAM to price/power mismatch from Nvidia.

Timestamped link.


I think unless you're OK with low-quality textures and playing games on medium to low settings, I would aim for at least 16GB of VRAM now, and this includes RT fans, as RT is proving to be a VRAM hog alongside textures.
 
After reading the last few pages about Hogwarts Legacy, and then all the claims that consoles are a sub-optimal way of playing games and that you need a 4090 to experience them in all their glory... yeah, good luck with that.

If consoles became an open system - so basically cheat engine, unencrypted saves and mods - I think I would switch in a heartbeat. PC gaming is more broken than usual right now. DLSS/FSR has done what I heard some predict a few years back, which was to encourage devs to do less optimisation.
 
Or it'll get tweaked so that VRAM/texture/asset streaming is properly handled/switched/managed, i.e. what's called optimisation, which is clearly lacking if people are having issues at 720p too...

Unless you think that VRAM usage at 1080p should be the same as 4K VRAM usage (which, just to highlight, shouldn't be happening if you know anything about how VRAM usage scales with resolution).

[snip vram pic]

The only thing here, if that VRAM pic is telling the full story, is: why don't 8GB cards like the 6600 XT and 3070 fall off the edge of a cliff in their 4K results?

At 4K the 6600 XT still outperforms the 12GB 3060 (the 6600 XT often falls behind the 3060 at 4K in many games), and why does the 3070 deliver exactly the same performance as a 2080 Ti at 4K?

Do they have more stuttering problems? So far the couple of articles on the stuttering just say "it happens even on top spec machines". Do they automatically hit higher levels of compression?* Your linked picture's article doesn't go anywhere near that sort of speculation, bar just showing image quality screens for a single card - a 3080.

*This happens with the traditional path tracing programs I use. You hit your memory cap, then it does another pass and compresses the textures used in your scene to the next level, until it either fits or crashes at maximum compression.
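For anyone curious, that footnote behaviour is simple to sketch. This is just my own illustration of the idea (not code from any specific renderer): keep downscaling/compressing every texture one level per pass until the scene fits under the memory cap, and bail out if it never does.

```python
def fit_scene(texture_sizes_mb, mem_cap_mb, max_levels=4):
    """Return (compression_level, total_mb) at which the scene first fits under the cap."""
    sizes = list(texture_sizes_mb)
    for level in range(max_levels + 1):
        total = sum(sizes)
        if total <= mem_cap_mb:
            return level, total              # fits at this compression level
        sizes = [s / 4 for s in sizes]       # next level: half resolution in each axis
    raise MemoryError("scene still doesn't fit at maximum compression")

# Made-up texture sizes: the scene doesn't fit at level 0, so everything gets knocked
# down one level and the renderer carries on with lower-resolution textures.
level, total = fit_scene([4096, 2048, 1024, 512], mem_cap_mb=3000)
print(f"fits at compression level {level}, ~{total:.0f} MB")
```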
 
Thing is, I'm not trolling - Matt has said that himself and has even posted pics of an air con unit right beside his PC :cry: :D

But in all seriousness, I've got no problem with that, it's just funny when these review sites are the source of truth when it comes to Nvidia ******** the bed, but when it comes to AMD GPUs, they're wrong because Matt can get an extra 25+% from his setup :p :D
Something is possibly bugged with 'Arry Potter, as the numbers seem to be all over the place atm. I'm more content with user results around here until review sites are posting in the same realm for games that are displaying strange behaviour in their results. The game has only just been released; it's just a matter of time and everything will become clear.

But in all seriousness, I've got no problem with that, it's just funny when these review sites are the source of truth when it comes to Nvidia ******** the bed, but when it comes to AMD GPUs, they're wrong because Matt can get an extra 25+% from his setup :p :D
If both top-end Nvidia and AMD users in this forum (with similar CPUs) are posting results in the realm of what is being shown on review sites, then that is a better indication of performance. Thinking that someone is constantly leaning towards bias is a bit strange, especially with LtMatt, as he has both top-end GPUs (4090/7900 XTX), so I don't quite understand your problem with his results.
 
I wonder if a lot of the people testing this game are using 16GB of RAM, rather than 32GB.

Maybe this hits the game particularly hard if you exceed your VRAM budget, e.g. by enabling RT.

So far, of the complete benchmarks we have (Jansn, TPU and HUB), all of them used 32GB of RAM, but the game does suffer worse performance with 16GB of RAM on GPUs that have less than 16GB of VRAM.

The only thing here, if that VRAM pic is telling the full story, is: why don't 8GB cards like the 6600 XT and 3070 fall off the edge of a cliff in their 4K results?

At 4K the 6600 XT still outperforms the 12GB 3060 (the 6600 XT often falls behind the 3060 at 4K in many games), and why does the 3070 deliver exactly the same performance as a 2080 Ti at 4K?

Do they have more stuttering problems? So far the couple of articles on the stuttering just say "it happens even on top spec machines". Do they automatically hit higher levels of compression?* Your linked picture's article doesn't go anywhere near that sort of speculation, bar just showing image quality screens for a single card - a 3080.

*This happens with the traditional path tracing programs I use. You hit your memory cap, then it does another pass and compresses the textures used in your scene to the next level, until it either fits or crashes at maximum compression.

Essentially the game is a mess regardless of your spec :D

But yeah, we'll need to wait for DF/Alex to do a proper analysis of what is going on; however, there's not much point until the day 1 patch arrives, to see if any issues do get fixed :)

Something is possibly bugged with 'Arry Potter, as the numbers seem to be all over the place atm. I'm more content with user results around here until review sites are posting in the same realm for games that are displaying strange behaviour in their results. The game has only just been released; it's just a matter of time and everything will become clear.


If both top-end Nvidia and AMD users in this forum (with similar CPUs) are posting results in the realm of what is being shown on review sites, then that is a better indication of performance. Thinking that someone is constantly leaning towards bias is a bit strange, especially with LtMatt, as he has both top-end GPUs (4090/7900 XTX), so I don't quite understand your problem with his results.

Did you miss the bit I posted:

I've got no problem with that, it's just funny when these review sites are the source of truth when it comes to Nvidia ******** the bed, but when it comes to AMD GPUs, they're wrong because Matt can get an extra 25+% from his setup

Again, no issue at all, just highlighting the above ;) Also, Matt goes to the extreme of using liquid metal among many other tweaks, which most people even on this forum wouldn't do; after all, he doesn't get the number 1 spot in various benchmarks (worldwide, not just on here) by keeping a stock system with only a mild overclock/undervolt :D
 
Yeah, same here - until now there hasn't really been anything to justify the cost. With DLSS and some settings sacrificed, the 3080 is still doing pretty well at 4K and 3440x1440. If more and more games end up performing like Deliver Us Mars, Hogwarts and Forspoken, then there isn't much choice but to buy something that can power its way through the ****** optimisation. That, and having FG is a nice pro when so many games appear to have rubbish CPU utilisation too :(




[benchmark screenshots]
Those figures are an embarrassment to the dev team of that game, and also, interestingly, very different from HUB's results.
 