Bill... no more thread locks, please!
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Come on guys, isn't it only one game?!
All of the ComputerBase benchmark tests are showing 12GB of VRAM in use now; 4K is starting to get out of reach of 10GB cards, it would seem.
From ComputerBase - "Hogwarts Legacy requires a lot of graphics card memory for maximum texture detail. Specifically, it is the first game in the editorial team's testing that only works without problems in all situations with 16 GB of VRAM. Even with 12 GB you have to accept limitations. For example, the GeForce RTX 4070 Ti has no performance problems in Ultra HD with DLSS Quality and otherwise full graphics quality, but the textures for some objects are not loaded, and others briefly disappear completely as the engine apparently tries to rescue the situation somehow. Without ray tracing, it also works with 12 GB. Even without ray tracing, 10 GB is no longer enough in Ultra HD with DLSS at a sufficient frame rate. If the graphics card only has 8 GB, there are already clear failure symptoms in WQHD with DLSS, even without ray tracing, and even in Full HD the odd texture is missing during play. Ultimately, 8 GB and maximum textures are not possible."
Hopefully patches can help sort this out, but we'll see. My answer is not to upgrade the GPU but to downgrade the monitor. Simples.
Thanks. Mad that a 4080 / 7900 XTX can just about manage 60fps at 4K. Wonder if there are some ultra settings to dial back to high to get over 100fps while retaining image quality.
Check the videos in my link and the TPU post above.
Stumping up 3k for a 4070 / 8GB VRAM / 16GB RAM laptop looks like a move many dads will be making.
My main point is that, going by this, not even upscaling is solving the issues, even if you turn RT off, which actually surprises me. It's forcing you to go to frame generation, which, funnily enough, is available at a crazy price.
Could be the game is just really poorly optimised. There are plenty of games that look really good and run really well.
I definitely hope so. It is new, so I would expect patches to help with some of the issues (by the way, expecting day-one patches SHOULD NOT be a thing). If the price of entry to full performance in triple-A games stays as it is, I'm out.
The game being poorly optimised? Don't be so silly!!!! It's clearly Nvidia and AMD who are to blame with their shoddy "planned obsolescence" tactics.....
But in all seriousness, this whole pointing of fingers at AMD and/or Nvidia over lack of grunt, VRAM, RT performance, FG features, or whatever straw people like to clutch at to suit their narrative is just silly, tbh. People still aren't learning that the ones at fault are the game developers; it is they who are choosing not to optimise their games properly and are instead relying on upscaling and frame generation to do the "optimisation" for them. The sooner technical review sites and the customer base start calling them out and boycotting such games, the better. So far, whether people like it or not, the only one doing this is Alex from Digital Foundry.
Not in this reality am I buying my son a 3k laptop, lol. You're in a small minority there.
I'd go along with this; it's just surprising that it's having such an impact quite so soon.
Agreed. The developers would have known the performance is **** and decided to ship it now and patch it going forward. Classic move. Cue articles six months from now: "Hogwarts is getting a big patch to improve performance".
How did you guys manage to jump to the conclusion that the RT hardware in RTX 3000 or 4000 series cards is up to the job, though?
How did you determine that the hardware is up to the demands of the latest games?
It's a cheaper reality than paying 3k in 1996, and then again every 18 months, for a desktop that needed throwing in the skip, like many dads had to do for us when we were kids.
It's about priorities; many dads spend 3k a year on cigarettes and another 3k on booze.