Quite common at least in threads like this.
This is basically on point, condensing pages of waffle. What's comical is that some of the highly active defenders have moved on from a 3080 to 3080 Tis and 4090s. This is exactly what we predicted, as it avoids the problem.
Or maybe, as they have all said themselves, it's because they ran out of grunt and weren't hitting the fps/settings they wanted? Like I've said many times, when/if I upgrade it will primarily be for grunt and/or a new feature such as FG/DLSS 3; more VRAM is just the cherry on top.
Why did 3090, 6800xt, 6900xt and 6950xt owners move to a 7900xtx or 4090? Surely they had no need to, since they already had plenty of VRAM, right?
The grunt depends on the review or video you look at. In some, at 1080p and 1440p it does seem to have enough grunt but not enough VRAM, but in others the numbers seem to be way different. In the HUB review the numbers looked decent until the VRAM ran out, so the grunt was there. I really don't know what's what with this game. If I remember correctly it was the Hogsmeade area where the 3080 had VRAM problems but it was OK in the Hogwarts area, so I guess where sites are testing the game will make a big difference as well.
So far it is only HUB who have shown the issues where fps plummets, and a 7900xtx getting higher fps at 1080p than in every other reviewer's results (by everyone I mean Jansn, TPU, computerbase and pcgameshardware; it will be interesting to see their results for Ampere and RDNA 2 when they're done testing).
As for the most demanding areas, it is definitely Hogsmeade followed by the Hogwarts castle/grounds, especially when you go from one room/area to a new one, but as computerbase pointed out, this is happening even on the best gaming system:
However, despite the decent frame times, Hogwarts Legacy has a pretty big problem with frame pacing. Regardless of the hardware, i.e. even on a Core i9-12900K and a GeForce RTX 4090, the game keeps hitching. This is particularly extreme in Hogwarts when ray tracing is activated, where there can be severe stuttering. This is probably not due to shader compilation stutter; rather, the game seems to be reloading data at these points. The phenomenon occurs reproducibly, for example, when moving to a different part of the castle or changing floors. The slower the hardware, the longer the stutters become and the more frequently they occur.
As said though, I've got the game running wonderfully now:
Wish there was a way to have screenshots taken in SDR format.
The game might need optimisation, but that won't change the fact that 10GB is not enough to run the game maxed out with RT. The question is: who cares? Having to drop settings on a card that's two and a half years old at this point, in a great-looking RT game, is just freaking NORMAL. I'm sitting here with my brand new, just-released 2000€ GPU and have to drop settings as well, so I really, really can't fathom what the flying duck people are complaining about. Especially considering the actual competitors of the 3080 are doing much worse, why are we even talking about it? Why is it an issue? I also have to drop settings on my laptop with a 1650, so what? What am I missing? It seems to me people have an agenda; there is no way they legitimately consider it a problem that a 2.5-year-old 699€ MSRP card needs to drop some settings in the latest game. That's freaking nuts. They can't actually mean what they say; I think they have an axe to grind.
Don't agree with all that, but the reason it's a problem is that a couple of people paid an extra £750 for the 3090 over the 3080 and they haven't gotten that "value" back yet. They thought there would be loads of games by now where a 3080 would have to run everything on low whilst they could use max settings on their 3090s, but alas that hasn't happened, and when a 3080 generally ***** the bed, so does a 3090.
Clue...
A card that was designed for 4K can't do 1080p now because of the VRAM amount... The rest of what you state is obvious and not the topic. We all know settings can be dropped etc., but when a 3060 12GB beats a 3080 10GB at the same settings, there is something very wrong.
Carry on with your delusions that it's not a VRAM issue...
Wow really...
Watch the video (that has been posted many times on these very forums)...
Come on man, you seriously can't say that there aren't also issues with the game itself...

Are you also ignoring every other reviewer's results?
I did some testing with my 3080 back in Sept/Oct 2020 when I was silly enough to listen to "da internet". At the time the Afterburner beta had been released; it shows actual VRAM usage compared to allocated. As I remember, actual was more than 20% less than allocated, and using any upscaling dropped...
forums.overclockers.co.uk
I just changed from a 3080 10GB to a 4080 16GB. The Witcher 3 on Ultra with RT maxed out (DLSS 3 on) is using just over 10GB at 3440x1440. I guess you could say that isn't really relevant to the 3080 because, as I found out myself, RT was nigh unplayable on a 3080 in TW3, unless you...
Yes, there are VRAM issues, that's obvious, but to say that this is 100% a case of not enough VRAM and that nothing else is at play, despite the mountains of evidence showing there is plenty wrong with the game, is, as you said... "delusional".
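On the allocated-versus-actual point from the quoted post above: most tools report what the driver has allocated, not what the game is actively using, which is why Afterburner's per-process counter can read well below the headline number. If anyone wants to log the device-level figure themselves during a play session, here's a minimal sketch, assuming `nvidia-smi` is installed and on the PATH (the helper names are my own, not from any tool):

```python
import subprocess

def parse_memory_line(line: str) -> tuple[int, int]:
    """Parse one CSV line from nvidia-smi, e.g. '9213, 10240' -> (9213, 10240) MiB."""
    used, total = (int(field.strip()) for field in line.split(","))
    return used, total

def query_vram(gpu: int = 0) -> tuple[int, int]:
    """Return (used, total) VRAM in MiB for one GPU.

    Note: this is the driver-level allocation across all processes,
    not any single game's in-use working set.
    """
    out = subprocess.check_output(
        ["nvidia-smi", f"--id={gpu}",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_memory_line(out.strip().splitlines()[0])

# Example: used, total = query_vram(); print(f"{used}/{total} MiB allocated")
```

Poll that in a loop while alt-tabbed out of the game and you get a rough picture of allocation over time, but it still can't separate "reserved" from "actually needed"; that distinction is exactly why the allocated number alone doesn't settle the argument either way.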
The PC gaming scene really is doomed if people aren't willing to acknowledge when games are poorly optimised/developed and instead just think throwing more CPU, RAM and VRAM at the "issues" is the way forward.

I mean, the fact that some random people have been able to fix/improve the performance by adding a few config lines to a file says it all...
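For context, those "few config lines" are Unreal Engine console-variable overrides dropped into the game's Engine.ini. As a rough illustration only: the path and the specific variables/values below are the sort of thing circulating in community posts, not an official or guaranteed fix, and sensible values depend on the card.

```ini
; Typical location (an assumption; varies by install):
; %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini
[SystemSettings]
; Cap the texture streaming pool (MiB) so it stays inside the card's budget
r.Streaming.PoolSize=3072
; Keep the streaming pool limited to available VRAM
r.Streaming.LimitPoolSizeToVRAM=1
; Don't force a garbage-collection pass every time a level streams out
s.ForceGCAfterLevelStreamedOut=0
```

That a handful of engine-level streaming knobs can change the picture at all is itself evidence that the shipped defaults, not just the VRAM pool, are part of the story.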