
NVIDIA 4000 Series

I wonder if even faster NVMe drives (PCIe 5.0) might do a better job with games like this.

Especially if they ever get around to supporting DirectStorage.

PCIe 5.0 x4 slots max out at 15.7 GB/s throughput, which does seem like a hell of a lot.

Presumably, the console version streams in lower-quality textures, so the already impressive console NVMe drives handle this game comfortably?

I'm just wondering when it might be worth getting an NVMe drive for games...

The finalised PCIe 6.0 specification was released in January 2022, so I suppose we'll see even faster NVMe drives in 2-3 years' time, potentially up to 30 GB/s.
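For anyone who wants to sanity-check those figures, here's a rough back-of-the-envelope sketch in Python. The per-lane rates and the 128b/130b encoding are the published PCIe numbers; real drives land a bit below these ceilings because of protocol overhead.

```python
# Rough bandwidth ceiling for a 4-lane NVMe link per PCIe generation.
# Real-world drive throughput sits a bit below these raw figures.

def x4_link_gbs(gigatransfers_per_s: float, encoding_efficiency: float) -> float:
    """One-direction bandwidth of a 4-lane link in GB/s."""
    return gigatransfers_per_s * encoding_efficiency / 8 * 4

# PCIe 4.0 and 5.0 use 128b/130b line encoding.
print(f"PCIe 4.0 x4: {x4_link_gbs(16, 128 / 130):.2f} GB/s")   # ~7.88
print(f"PCIe 5.0 x4: {x4_link_gbs(32, 128 / 130):.2f} GB/s")   # ~15.75
# PCIe 6.0 doubles the per-lane rate to 64 GT/s (PAM4 + FLIT packets);
# FLIT/FEC overhead pulls the usable figure from 32 GB/s raw towards ~30 GB/s.
print(f"PCIe 6.0 x4: {x4_link_gbs(64, 1.0):.2f} GB/s raw")     # 32.00
```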

Maybe someone could check task manager to see if there's a lot of drive activity when playing Hogwarts Legacy?
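If anyone would rather log it than eyeball Task Manager, here's a quick sketch using the third-party psutil package; the one-second sampling interval and MB/s units are just what I'd pick, nothing official.

```python
# Print system-wide drive throughput once a second while the game runs.
# Requires: pip install psutil
import time
import psutil

prev = psutil.disk_io_counters()
while True:
    time.sleep(1)
    cur = psutil.disk_io_counters()
    read_mb = (cur.read_bytes - prev.read_bytes) / 1e6
    write_mb = (cur.write_bytes - prev.write_bytes) / 1e6
    print(f"read {read_mb:7.1f} MB/s   write {write_mb:7.1f} MB/s")
    prev = cur
```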


Unless something drastically changes, Gen 6 drives will be using as much power as an entire notebook.
 
The platforms are quite different now, so it isn't surprising. Even though both are x64, the consoles have DirectStorage-type I/O and a large unified memory pool, versus the PC's small VRAM plus lots of system RAM. (If you think about it, it feels like the console designers thought hard about how to get the most efficiency out of the hardware, hence DirectStorage and unified memory, whilst the PC hasn't modernised and still follows the old "throw hardware at it" mantra. Even the Steam Deck, which runs PC games, is unified memory now.)
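Purely to illustrate that point about the I/O path (every function below is an invented stand-in, not a real engine, Windows or console API): on the traditional PC route the asset takes a CPU decompress step in system RAM and then a second copy across PCIe into VRAM, whereas a unified-memory, DirectStorage-style setup streams it straight into the one shared pool.

```python
# Conceptual sketch only: every function here is an invented stand-in,
# not a real engine, Windows or console API.

def read_from_nvme(path):
    return b"compressed-texture"      # pretend NVMe read into system RAM

def cpu_decompress(blob):
    return blob * 4                   # pretend CPU-side decompression

def copy_to_vram(pixels):
    pass                              # pretend second copy across PCIe

def stream_into_unified_pool(path):
    pass                              # pretend direct drive -> shared-memory stream

def load_texture_traditional_pc(path):
    blob = read_from_nvme(path)       # hop 1: drive -> system RAM
    pixels = cpu_decompress(blob)     # CPU time, still in system RAM
    copy_to_vram(pixels)              # hop 2: system RAM -> VRAM over PCIe

def load_texture_unified_console(path):
    # One hop: the request lands in the single shared pool, decompressed
    # by dedicated hardware rather than the CPU.
    stream_into_unified_pool(path)

load_texture_traditional_pc("castle_wall.tex")   # hypothetical asset name
load_texture_unified_console("castle_wall.tex")
```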

The HDD, whilst it fixed stutters, of course had long loading times and texture pop-in galore, with many textures not appearing until I had already passed them. The best experience was either a SATA SSD or an NVMe SSD with asset loading throttled manually via UE4 console commands.
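For anyone wondering what I mean by throttling via console commands: these are standard UE4 texture-streaming and async-loading cvars dropped into an Engine.ini override. In the little sketch below, the ini path and the numeric values are purely illustrative guesses, not recommendations, so adjust them for your own VRAM and drive.

```python
# Sketch: append common UE4 streaming cvars to a game's Engine.ini override.
# The path and the values are illustrative placeholders -- adjust per game/system.
import os

ENGINE_INI = os.path.expandvars(
    r"%LOCALAPPDATA%\SomeUE4Game\Saved\Config\WindowsNoEditor\Engine.ini"  # hypothetical path
)

TWEAKS = """
[SystemSettings]
; Size (MB) of the texture streaming pool -- example value only
r.Streaming.PoolSize=3072
; Never let the streaming pool exceed physical VRAM
r.Streaming.LimitPoolSizeToVRAM=1
; Milliseconds per frame allowed for async asset loading
s.AsyncLoadingTimeLimit=3.0
s.LevelStreamingActorsUpdateTimeLimit=3.0
"""

with open(ENGINE_INI, "a", encoding="utf-8") as handle:
    handle.write(TWEAKS)
print("Appended streaming tweaks to", ENGINE_INI)
```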

Also, in FF7 Remake, when you enter the Shinra building there are some lighting reflections on the floor. On PS4 they looked amazing, but it was updated on PS5/PC using some newer method, which in my opinion was a regression, so the newer lighting technique seems inferior to me.
GDDR has higher latency because GPUs need bandwidth, while system RAM has lower latency and less throughput, as it's optimized for the tasks a CPU does. I think the first Ryzens had a performance problem due to memory latency, which obviously translated into games as well.

They're probably happy with the performance loss since they gain something in other areas, but a console is designed to do one very specific task (gaming), while a PC does a lot more.

Moreover, that unified memory has everything in an SoC, so it lacks the modularity needed and welcomed in the PC world. Btw, we already have faster storage drives thanks to that modularity...

There is nothing wrong with the concept of PC hardware; the problem is the price. Xbox and PlayStation make money through games, so their interest is to sell the hardware at the lowest price possible, while AMD and Nvidia go for stupidly high margins and prices and make nothing from the games themselves (their online streaming stuff not included).

Nothing is stopping game devs from properly optimizing their software other than "high margins" (as in, invest as few resources as possible while making the best profit possible, never mind quality!). Look at Just Cause, Arma or even Star Citizen, which have no problems streaming stuff without running into VRAM limits.

Personally I'm not gonna buy and support these practices. At best I'll throw a fiver at it in a sale.
TBF, if you just want to "game" and don't care too much about FPS or the absolute best visuals, then consoles are a much better experience. But at the same time, I like the 21:9 format and high fps, and also K+M for when a game just works better with it :)
They still have huge patch downloads, poor performance and image quality issues that you're stuck with (dropping to 720p without DLSS is probably quite bad), and so on. They were also scarce and scalped, so I wouldn't call that a clear win. It works OK for a niche, while some aspects are marketed as better rather than actually being significantly better. :)
 
Only played the first 20 mins of Hogwarts, but some performance numbers: 4K Ultra, RT off = 90 fps; 4K Ultra, all RT effects on = 55 fps; and what I suspect will be my permanent settings for the game: 4K Ultra, DLSS on Quality mode, RT reflections on, RT shadows off = 120 fps.
 

Pays around 2K for a GPU and has to turn off/lower settings, but has the cheek to laugh at people needing to do the same due to a lack of VRAM on a GPU that is around three years old and cost a third of the price back then. Lol.
 
To be fair, some settings I always turn off, as the increased fps is more desirable. Going from Ultra to High on some settings has no visual impact but gives a nice bump in frames.
 
Then you had people saying the 3070/Ti is better than the 2080 Ti... The proof is now there. VRAM is very important to keep performance levels up, and the same goes for system RAM, as some of us on here have known for decades.

The real kicker is they have turned cards that are 4K-capable in terms of grunt into 1080p cards because they don't have enough VRAM. Nvidia.........

 
$500 for a 3070 vs $1000 for a 2080 Ti. The VRAM wasn't a problem at the time, so for the same performance it was a decent deal. VRAM probably wouldn't be a problem if devs actually did what they're supposed to do.
 
It's not good for the industry as a whole, or for general adoption going forward, if features like RT are gated behind cards that start at £1200+.
 

At that time, though, people were dumping their 2080 Tis for £500 as they were sweating over the inbound 3080.
 

Changed my mind: it seems RT shadows actually have a significant impact on lighting quality in indoor areas, so I've enabled those as well.
 
This is why PC gaming wins for me: the ultrawide resolutions and monitors.
If this view were correct, more than 1.5% of PC gamers would be running ultrawide.

Ultrawide has been around for years but it's never really caught on. It looks good at trade fairs running on a 5K driving-game rig, but most people just want a decent 27-inch 1440p OLED monitor. Hogwarts on the Xbox SX running on a 65-inch 4K OLED telly is good enough imho. The game itself just does not look good enough on PC to engage FOMO.

I would not sweat it though, there's plenty of potential PC gaming FOMO coming for Star Wars, Avatar, Starfield, Jedi Survivor, Redfall and Homeworld 3.

System Shock Redux, which is also forthcoming, caused quite a stir on its original release, as you needed to junk your entire 486 rig and buy an eye-wateringly expensive (at the time) bleeding-edge Intel Pentium machine to run the game at 320x200 resolution. I hope they try to cause a similar stir this time, but it's unlikely they will risk it.

But the above kinda illustrates the problem: the 4090 is about 10 times faster than the Xbox SX and yet Hogwarts does not look 10 times better on a 4090 versus consoles, or even three times better. It's a bit of a conundrum. The 4090 Ti is going to be an era-defining piece of engineering, but I do worry about the lack of anything that really needs it.
 
Lowering the RT level on AMD cards can give a nice boost to FPS for very little difference in quality, I've found, so if you need a few extra frames, drop the RT level.

Some footage from the 4090, same settings as the previous video.
 

Over here, when I tested earlier I found no difference in framerate between low RT and ultra RT, which was disappointing, so it's interesting to see it does change on AMD. I wonder why.
 
At Hogwarts, prepare to engage FG (frame generation). It solves a lot of the CPU-bound issues going on in there.
 
Yeah, I have no issue with settings being lowered, and as you say, at times it is even hard to tell the difference between Ultra and High in gameplay anyway.

The point is you've got people like him celebrating 10/12GB not being enough on the one hand, while using a much more expensive card and still needing to turn off or lower settings on the other.

What gets conveniently lost in this debate is that no one is arguing that less VRAM is better. The argument is that more VRAM ends up costing a lot more, not because of the actual physical VRAM cost, but because of how the product segmentation works to maximise profits.

At the end of the day, if money/value for money is not a problem, just get a 4090, turn down a few settings and enjoy the latest games. Lol.
 
Second-hand prices for the 2080 Ti were lower than a new 3070. The only thing I would have been careful about when buying a second-hand 2080 Ti is the "space invaders" issue they had with VRAM, but from what I remember any decent AIB card didn't have the issue, and it was the FE models that suffered the failures back then. A second-hand 2080 Ti was the better buy, especially if it came with a warranty, and then you had people who were silly enough to let them go for peanuts.
 
VRAM is absolutely enough; the game has enough options to make it playable regardless of VRAM, with great actual visuals. I would be WAY more worried about my CPU than my GPU if I wanted to play with RT in this game. It's a CPU hog.

 