
NVIDIA 4000 Series

If RT hardware were good enough, Nvidia would never have needed to invent new software technologies to make up for the significant loss in 3D performance.

Wasn't this the original reason why they developed DLSS?

I am quite impressed with frame generation, but not everyone seems to agree.

Console games tend to be very well optimised, but RT still limits console GPUs to around 30 FPS. That's not because the Series X has a puny GPU; it's because the RT hardware can't cope.
 


1080P :cry:

59XxeTq.gif

yKpjzd2.jpg

Rw4OESm.jpg

:cry:

IZpzXI6.gif


Why is this game even on Unreal Engine 4 and not 5? From my googling it's on Unreal Engine 4, so why couldn't they have just used the latest engine before release? Unreal Engine 4 has had stutter issues on PC since release, from what I remember. Have they actually fixed that on UE4 yet?

These sorts of titles, which a large part of the gaming community wants, are going to seriously damage PC gaming at this rate if this is the performance on high-end GPUs. A 3070 is not old and should be running these titles fine at 1440p at least, not at 15fps at 1080p. Geez, the whole industry is sabotaging itself now, from software to hardware. Just wait for streaming services to push the line that it's cheaper to stream than to buy hardware, since streaming will always allow 60fps/120fps, not these low frame rates on two-year-old hardware. :rolleyes:


The greed is real... I thought it couldn't get worse. Not sure how many patches are going to help this game and other recent releases with poor performance, or worse, poor games in general, from so-called AAA titles...
 
Considering how much unified RAM consoles have to play with, it was inevitable that 8GB/10GB GPUs were going to age extremely quickly. Hell, even on my Steam Deck running Hogwarts, the Deck reports it's using 6.4GB VRAM and 12.1GB RAM, and that's at 1280x800, all low, with textures on medium. Scale that resolution and those graphics features up and 8GB to 10GB GPUs have no chance.
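As a rough back-of-envelope sketch of that scaling argument: split VRAM usage into a resolution-independent part (textures, geometry) and a part that grows with pixel count (render targets, history buffers). The 5.5 GB / 0.9 GB split of the Deck's reported 6.4 GB is an invented assumption for illustration, not a measurement, and it ignores the jump from low settings to high:

```python
# Illustrative only: model VRAM as a fixed share (textures, geometry, driver
# overhead) plus a share that scales linearly with pixel count (render
# targets, history buffers). The 5.5 / 0.9 GB split at 1280x800 is a made-up
# assumption for this sketch, not a measured figure.
FIXED_GB = 5.5          # assumed resolution-independent share of the 6.4 GB
PER_PIXEL_GB = 0.9      # assumed per-pixel share at 1280x800 (1.024 MP)
BASE_PX = 1280 * 800

def estimate_vram_gb(width: int, height: int) -> float:
    return FIXED_GB + PER_PIXEL_GB * (width * height) / BASE_PX

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{estimate_vram_gb(w, h):.1f} GB")
```

Under those assumptions 4K lands around 12-13 GB before any settings increase, which is at least the right ballpark for why 8GB/10GB cards struggle.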
 
I find this all hilarious. PC peasant race struggling to justify the insanely low-memory graphics cards when consecutive releases of big games show them to be pathetic.
 
Haven't seen much coverage of frame generation in this game. The video I saw showed decent performance at 4K.

It would be interesting to see frame gen tested on an RTX 4070 Ti in this title, at 1440p and 4K.
 
Yet TPU results:

QcdDcnT.png

ZYTLtxp.png

Boi oh boi, I sure do wish I'd spent that extra £750 for more VRAM :D

vdwgRSj.gif

:p

But in all seriousness now... like a few of us said all along, grunt was always going to become the problem first, hence needing DLSS/FSR (probably higher presets) and/or reduced settings, which also reduces VRAM usage:

gZAEh7r.png

766xN5o.gif

:D
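On the "upscaling also reduces VRAM usage" point: the presets drop the internal render resolution, so per-pixel buffers shrink with it. A quick sketch using the standard DLSS 2 per-axis scale factors (FSR 2 uses the same ratios):

```python
# DLSS/FSR upscaling presets render at a lower internal resolution and
# upscale to the output resolution. The per-axis scale factors below are
# the standard DLSS 2 values; FSR 2 uses the same ratios.
PRESETS = {
    "Quality": 1 / 1.5,
    "Balanced": 1 / 1.724,
    "Performance": 1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

def render_resolution(out_w: int, out_h: int, preset: str) -> tuple:
    s = PRESETS[preset]
    return round(out_w * s), round(out_h * s)

for preset in PRESETS:
    w, h = render_resolution(3840, 2160, preset)
    print(f"4K {preset}: renders internally at {w}x{h}")
```

So 4K with the performance preset renders internally at 1080p, which is why the heavy presets cut both the GPU load and the per-pixel memory footprint.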
 
I find this all hilarious. PC master race struggling to justify the insanely priced graphics cards when consecutive releases of big games perform like crap.
I'd imagine they perform like that deliberately, as part of a collaboration with Nvidia to make tech like FG a necessity, which then forces people to upgrade or else have a poor gaming experience even on cards that were halo products a couple of years ago. Also, it's easier for Nvidia to gimp older cards going forward by locking out the software.
 
I'm a bit surprised by some people who say RT performance isn't an issue on the new cards. In a lot of games like this one, it cuts the minimum framerate in half, from a comfortable 60+ on some cards at 1440p.

It roughly halves again at 4K, sometimes a bit less.
 
I'd imagine they perform like that deliberately, as part of a collaboration with Nvidia to make tech like FG a necessity, which then forces people to upgrade or else have a poor gaming experience even on cards that were halo products a couple of years ago. Also, it's easier for Nvidia to gimp older cards going forward by locking out the software.
I don't think that's it. The hardware really isn't up to it.

It's really not much better than the RTX 3000 series RT hardware.

Seems that Nvidia's marketing convinced (some) people otherwise, though. The thing is, the problematic RT-supporting games will in most cases support frame gen, so that does somewhat mitigate the problem.
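A minimal sketch of why frame gen only somewhat mitigates it: interpolation roughly doubles the presented frame rate, but input is still sampled at the rendered rate, so responsiveness doesn't improve. Numbers here are illustrative, not measurements:

```python
# Sketch of frame generation's effect: one interpolated frame is inserted
# between each pair of rendered frames, so presented fps roughly doubles
# while the render rate (which gates input response) stays the same.
def with_frame_gen(rendered_fps: float) -> dict:
    return {
        "rendered_fps": rendered_fps,
        "presented_fps": rendered_fps * 2,     # one generated frame per real frame
        "frame_time_ms": 1000 / rendered_fps,  # input still tied to real frames
    }

rt_heavy = with_frame_gen(40.0)  # e.g. an RT-heavy scene rendering at 40 fps
print(rt_heavy)
```

That's why a 40 fps RT-limited scene can look like 80 fps with frame gen while still feeling like 40.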
 


There's something weird going on with this title; HUB are saying something is wrong with TPU's results on Radeon. This is just whetting my appetite to get the game, even though I have no interest in playing it for the actual gameplay. :cry:
zLIaP4S.png
Cg3k0YI.png
 
Computerbase got some results up now too:

JjzA0KD.png

dQzFCOQ.png

a0ZlSn0.png

GLKyd9z.gif

At least DLSS is saving the day; performance mode seems great in this :cool:

Image quality compared with upsampling


At a standstill, DLSS 2 and FSR 2 deliver absolutely stunning results in Hogwarts Legacy. Even with the performance setting, i.e. a rendering resolution of Full HD, the game looks fully equivalent to native Ultra HD and in places even better. Native Full HD is visibly inferior in comparison. DLSS 2 and FSR 2 show slight differences in a direct comparison, but neither is clearly better or worse.


In motion it is different. Here DLSS 2.3 ultimately turns out to be the winner in Hogwarts Legacy, even if FSR 2 keeps up well. There are even places where DLSS controls image stability less well than FSR, but most of the time DLSS is ahead, especially in the performance preset. However, even in the best case (Ultra HD with DLSS Quality) the image is not completely stable, because Hogwarts Legacy has some apparently nasty surfaces that flicker throughout; here FSR 2 can no longer keep up.

The native TAA anti-aliasing is surprisingly strong in terms of image stability. It doesn't manage a flicker-free image either, but even at low resolutions like Full HD, stability is still surprisingly good for what it is. Only FSR 2 in performance mode at Ultra HD, i.e. at the same render resolution, is slightly better; DLSS 2 performance is then (again with the exceptions) the clear winner.




Where DLSS 2 and FSR 2 clearly set themselves apart from native anti-aliasing is detail sharpness. The upsampling technologies lose virtually no image sharpness at reduced rendering resolution, but TAA does. In pure image reconstruction, though, upsampling cannot pull further ahead; TAA already does a good job here.


What upsampling generally cannot do, on the other hand, is restore the detail of the ray-traced reflections. The RT reflections, which have very little detail anyway, therefore lose detail with both DLSS 2 and FSR 2. Nothing can be done about that; you have to decide where the focus lies: the best-looking RT reflections, or more performance through upsampling.

3Z5tVaj.png
 
I don't think that's it. The hardware really isn't up to it.

It's really not much better than the RTX 3000 RT hardware.
The point I was making is that this game and a few others recently have been designed so the hardware isn't up to it, letting Nvidia push features like frame gen as must-haves, which, incidentally, are only available on the newer, expensive cards.

RT can be implemented well and hit highly playable frame rates even on low-end cards if done right, but that's not in Nvidia's interest: they want people to buy the new cards, and the devs save on development costs while also getting a bung from Nvidia.

 
There's something weird going on with this title; HUB are saying something is wrong with TPU's results on Radeon. This is just whetting my appetite to get the game, even though I have no interest in playing it for the actual gameplay. :cry:
zLIaP4S.png
Cg3k0YI.png

Possibly something is wrong with TPU, but what about Jansons' benchmarks?

9Sl6bqm.jpg

4fRifI7.gif


Seems that ultra performance mode makes it somewhat more playable. Thank god Nvidia have drastically improved the IQ of DLSS performance/ultra performance recently!

gZAEh7r.png
 
Doom Eternal already has very good performance, with or without RT.

The game cuts off the RT effects at a distance (maybe 50 meters?).

Optimisations are possible, but they kind of seem like workarounds.
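The distance cut-off described above can be sketched generically. To be clear, this is not id Software's actual implementation, and the 50 m figure is only the guess from the post:

```python
# Generic sketch of distance-based RT culling: apply ray-traced effects only
# within a cut-off radius and fall back to a cheaper technique beyond it.
# The 50 m threshold is the poster's guess, not a confirmed engine value.
RT_CUTOFF_M = 50.0

def pick_reflection_technique(distance_m: float) -> str:
    if distance_m <= RT_CUTOFF_M:
        return "ray_traced"
    return "screen_space"  # cheap fallback beyond the cut-off

print(pick_reflection_technique(12.0))
print(pick_reflection_technique(120.0))
```

This is the sense in which it's a workaround: nearby surfaces get the expensive effect, and everything further away quietly doesn't.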
 
All of the Computerbase benchmark tests are showing 12GB of VRAM in use now; 4K is starting to get out of reach of 10GB cards, it would seem.

From Computerbase: "Hogwarts Legacy requires a lot of graphics card memory for maximum texture detail. Specifically, it is the first game in the editorial team's tests that only works without problems in all situations from 16 GB of VRAM. Even with 12 GB you have to accept limitations. For example, the GeForce RTX 4070 Ti has no performance problems in Ultra HD with DLSS Quality and otherwise full graphics quality, but the textures are not loaded for some objects, and others disappear completely for a short time, as the engine apparently tries to somehow rescue the situation. Without ray tracing, 12 GB also works. 10 GB is no longer enough at a sufficient frame rate in Ultra HD with DLSS, even without ray tracing. If the graphics card only has 8 GB, there are already clear compromises in WQHD including DLSS, even without ray tracing, and even in Full HD the odd texture goes missing while playing. Ultimately, 8 GB and maximum textures are not possible."
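Restating the quoted Computerbase findings as a rough lookup (tiers paraphrased from the quote; the exact cut-offs between them are approximate):

```python
# Rough restatement of the Computerbase VRAM findings for Hogwarts Legacy
# at maximum texture detail. Tiers are paraphrased from the quoted text;
# the boundaries are approximate, not exhaustive test results.
def max_textures_ok(vram_gb: int, resolution: str, ray_tracing: bool) -> bool:
    """True if max texture detail reportedly works without visible issues."""
    if vram_gb >= 16:
        return True                      # fine in all tested situations
    if vram_gb >= 12:
        # RT at Ultra HD drops/swaps textures; without RT it works
        return not (ray_tracing and resolution == "2160p")
    if vram_gb >= 10:
        return resolution != "2160p"     # UHD no longer enough even sans RT
    return False                         # 8 GB: issues from WQHD down

print(max_textures_ok(12, "2160p", ray_tracing=True))
```

Not gospel, just the quoted tiers in a checkable form.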

Hopefully patches can help sort this out, but we'll see. My answer is not to upgrade the GPU but to downgrade the monitor. Simples.
 