NVIDIA 4000 Series

Maybe, but the game still needs DLSS on both the 3090 and the 4070 Ti, because regardless of VRAM it performs like crap anyway. I mean, the 3090 has 24GB; are you going to play the game at 33 fps?

That's why I always thought the VRAM complaints were childish. VRAM never was and never will be the reason you upgrade a card, because something else will always tank your performance before that (at settings you would actually play the game at).
In this case it looks like the TI would probably have the grunt to run it (4K with RT) but can't due to lack of video memory. The 3090 can run it better, but I agree 33 FPS is not ideal, I guess you'd have to enable DLSS. Hopefully he tests all these scenarios. If the 3090 is getting 33 FPS, what is the 3080 10GB getting? Would it be 10% or so lower, or much lower due to lack of memory? That's what we'd all like to know. :cry:
 
In this case it looks like the TI would probably have the grunt to run it (4K with RT) but can't due to lack of video memory. The 3090 can run it better, but I agree 33 FPS is not ideal, I guess you'd have to enable DLSS. Hopefully he tests all these scenarios. If the 3090 is getting 33 FPS, what is the 3080 10GB getting? Would it be 10% or so lower, or much lower due to lack of memory? That's what we'd all like to know. :cry:
How would the 4070 Ti be able to run it even with two terabytes of RAM when the 3090 is getting 33 fps? Is the 4070 Ti nearly twice as fast? Because that's roughly what it needs to lock 60.
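(A quick back-of-the-envelope sketch of that maths, assuming pure average-fps scaling and ignoring VRAM stalls and frame-time spikes:)

```python
# Rough sketch: speed-up needed to move from one average frame rate to a target.
# Pure fps scaling only; ignores VRAM stalls, 1% lows and frame-time spikes.
def required_speedup(current_fps: float, target_fps: float) -> float:
    return target_fps / current_fps

# 3090 averaging ~33 fps, aiming for a locked 60 fps:
print(f"{required_speedup(33, 60):.2f}x faster needed")  # ~1.82x, i.e. roughly 80% faster
```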
 
I'll get a copy of this and test it on the 7900 XTX and the 4090 just to see how performance is. I've no interest in playing it outside of that though, bloody Harry Potter indeed. :cry:
 
How would the 4070 Ti be able to run it even with two terabytes of RAM when the 3090 is getting 33 fps? Is the 4070 Ti nearly twice as fast? Because that's roughly what it needs to lock 60.
It supports DLSS3, but frame generation probably can't overcome video memory limitations.
 
Let's, for the sake of argument, agree that it's totally a VRAM issue, that it has nothing to do with the game, and that it can't be fixed by patches etc.

You drop textures from ultra to high and... voila. Now you are going to argue that you don't want to spend 900 euros on a card and have to drop settings. Well, tough luck: even with a 4090, undoubtedly by far the fastest card on the planet, you HAVE to drop settings in some games. Cyberpunk is literally unplayable without DLSS, for example. So what's the issue? You think the 7900 XT with its 20GB is going to play the game maxed out just fine?

I mean, regardless of the VRAM issue, the game is running at 33 fps on a 3090 and 40 fps on a 4070 Ti. You will have to activate DLSS and/or drop settings to make it playable, so VRAM usage will also drop.

Issues with memory leaks, poor texture/asset loading and VRAM management getting patched... don't be so silly!



Like I said in another thread (or maybe it was here), with all the recent games it seems most titles now run like crap unless you have a 4080 or 4090, and maybe a 7900 XTX at a push, especially where CPU bottlenecks are concerned. Tinfoil hat time: Nvidia paying developers to gimp CPU utilisation to push FG/DLSS 3 ;) :D
 
It supports DLSS3, but frame generation probably can't overcome video memory limitations.
FG at 30 or 40 fps is dog though. Try your 4090 in Cyberpunk at native 4K with FG on; the input latency is insane. You need to hit at least 50 fps to make it playable with FG.
 
As per HUB, "the game is enjoyable on the 3090, unlike the 4070 Ti"
This is just trolling. If 33 fps is enjoyable to you then OK, but I could just as well argue the constant stuttering of the 4070 Ti is enjoyable to me because it's 30% faster on average. Nonsense.
 
In this case it looks like the TI would probably have the grunt to run it (4K with RT) but can't due to lack of video memory. The 3090 can run it better, but I agree 33 FPS is not ideal, I guess you'd have to enable DLSS. Hopefully he tests all these scenarios. If the 3090 is getting 33 FPS, what is the 3080 10GB getting? Would it be 10% or so lower, or much lower due to lack of memory? That's what we'd all like to know. :cry:

Good point, but the guys like ronseal changed their % like the wind: 5%, 10%, no, back to 5%, then it ended up at 15% after migrating to the 4090! It didn't matter back then though... but some of us have not had to shell out £1,200+ on another card yet, which was the point we were making: the stingy-VRAM Ampere cards were always going to be watched to see how well they age.
 
[Benchmark results screenshot]


The results show several things at once. First, Hogwarts Legacy - typical of Unreal Engine 4 - is very demanding unless upscaling is used. This is especially true when the ray tracing options are turned on. In our overnight test with the fresh game version, the Radeon RX 7900 XTX (and thus also the 7900 XT) has problems beyond that: regardless of whether ray tracing is switched on or off, the vegetation is rendered incorrectly. Depending on the level of detail, bushes are initially conspicuously white and only displayed in the correct color when you get closer. On top of that, ray tracing performance is very weak, which is apparently (also) due to insufficient utilization of the GPU. We therefore emphasize again that comparative benchmarks will only be worthwhile once the game and drivers are officially available to everyone, and we take issue with early-access stunts like the one for Hogwarts Legacy. Those who can wait will get a more polished experience. There is a good chance that a fixed AMD Software 23.2.1 driver will be released tonight.

We generally assume that the developers, and possibly the driver teams, can still reduce the memory requirement a little; with full ray tracing splendor, Hogwarts Legacy would otherwise be one of the most demanding games around in terms of graphics memory. The performance requirements with ultra ray tracing effects enabled are also very high - even an RTX 4090 struggles to hold a smooth 60 fps at native resolution without upsampling.

Forspoken and now this, 2023 is shaping up to be a great year for PC gaming...

 
Good point, but the guys like ronseal changed their % like the wind: 5%, 10%, no, back to 5%, then it ended up at 15% after migrating to the 4090! It didn't matter back then though... but some of us have not had to shell out £1,200+ on another card yet, which was the point we were making: the stingy-VRAM Ampere cards were always going to be watched to see how well they age.

If you're referring to the 3080 vs 3090 performance difference, that's because it varies: in some games there is <5% difference, on average it is 8-10%, and in very rare cases it's 15%, maybe even 20%. See Daniel Owen's video for a great comparison showing this; it was called "don't get ripped off".

Looking at Hogwarts and other titles, it seems like a 3090 is also having to sacrifice its settings despite all that VRAM, the same way a 3080 is having to sacrifice those settings. @Woodsta888 made the right move: he saved all that money by getting a 3080 over a 3090 and then used the money saved to get himself a 4090, which provided a huge boost to his gaming experience as he isn't having to make sacrifices because of a lack of grunt any more.
 
@LtMatt the only testing I fully trust is OCUK forum testing, but if true, one of the biggest games of the year just kicked Jensen in the nadgies.

On a serious note though, if anyone has pre-ordered a 3k laptop this week with a 4080 12GB inside, you might want to hold off until the OCUK forum has put this game through the wringer, if you intend to use said laptop a lot with this game at max settings.

@Boomstick777 do you have a 4070 Ti (memory is hazy), and if so, do you want to be a guinea pig by trying Hogwarts at 4K?
 
Forspoken and now this, 2023 is shaping up to be a great year for PC gaming...

Wut!?

Don't tell me the usual crew are still holding up every new game release as the performance expectation for graphics cards going forward.

I really don't understand why so many supposed forum experts think a brand-new release is a good representation of the game, and then immediately blame the hardware rather than the state of the game. It doesn't play well on anything. Using that thinking, the same people moaning about prices will need another £5k for a PC build to finally play Harry Potter when the hardware allows, since the game was, of course, programmed, written and tested to perfection before release. Must be these new texture packs that look little better than before texture packs (re-badged game hotfixes) were a thing.

The continual trolling of using new game releases (betas) just to put down others on a forum is tiring. Many claim that RT is worthless, makes no difference, rubbish tech, but then use it when it suits, benchmarks and all. Laughable.

If a 4090 can't play an Unreal Engine 4 game at decent FPS at 4K then the game release is poor - very poor. Unless the graphics really are stand-out 'something else'.

Let's revisit this game in six months for a realistic performance analysis on graphics cards, rather than keep using poor beta releases, which are typical these days thanks to people stupid enough to pay MSRP for games on release.

Does have a better ring to it though,

Can it play Crysis Harry Potter? :cool:
 
Question for those with a Gigabyte OC 4090.

How are the fans on it? Are they silent at say 30-40% like the Suprim ones? What's the lowest RPM after fan stop at idle?
My PNY 4090's fan noise is great. I have three NZXT 140mm fans running and I can just about hear them... this is at default settings, the card isn't overclocked.
 
If a 4090 can't play an Unreal Engine 4 game at decent FPS at 4K then the game release is poor - very poor. Unless the graphics really are stand-out 'something else'.
Basically this. Unless the graphics for a new release are in another stratosphere, a generation or two ahead, then if it can't run smoothly on a 4090 it's a poorly optimized game.
 
If a 4090 can't play an Unreal Engine 4 game at decent FPS at 4K then the game release is poor - very poor. Unless the graphics really are stand-out 'something else'.
Something we agree on, and this one is Nvidia-sponsored too.

Seems like performance is pretty good until you enable ray tracing, and that goes for all GPUs.
 
I'm surprised at the 7900 XTX getting hit so hard with RT: 19 fps with lows of 11... It will be interesting to see how the 3090 does in comparison, since both have similar RT performance and the same VRAM, but obviously the 7900 XTX's raster is way ahead of the 3090, which might give it the lead over a 3090???
 