You are right, I haven't, because nobody explains it to me. They just claim it's a problem. Is it a problem that I also have to drop settings on a brand-new €1,300 7900 XTX? Or is that okay, but dropping settings on a 2.5-year-old €699 card is a sin because it's Nvidia-branded?
The game might need optimizations, but that won't change the fact that 10GB is not enough to run the game maxed out with RT. The question is - who cares? You have to drop settings on a two-and-a-half-year-old card in the latest great-looking RT game; that's just freaking NORMAL. I'm sitting here with my brand-new, just-released €2,000 GPU and have to drop settings as well, so I really, really can't fathom what the flying duck people are complaining about. Especially considering the actual competitors of the 3080 are doing much worse, why are we even talking about it? Why is it an issue? I also have to drop settings on my laptop with a 1650, so what? What am I missing? It seems to me people have an agenda; there is no way they legitimately consider it a problem that a 2.5-year-old €699 MSRP card needs to drop some settings in the latest game. That's freaking nuts. They can't actually mean what they say; I think they have an axe to grind.
A card that was designed for 4K can't do 1080p now because of the VRAM amount... The rest of what you state is obvious and not the topic. We all know settings can be dropped, etc., but when a 3060 12GB beats a 3080 10GB at the same settings, there is something very wrong.
Carry on with your delusion that it's not a VRAM issue...
Wow really...
Watch the video (that has been posted many times on these very forums)...
NO freaking card can do 4K in this game, WTF are you talking about??? The 3090 has 24GB of VRAM and can't do 4K. The 6800 XT has 16GB and can't do 4K. The 6950 XT has 16GB and can't do 4K. The 7900 XTX has 24GB and can't do 4K. The 7900 XT has 20GB and can't do 4K. The 4090 has 24GB and can't do 4K. The 3080 Ti has 12GB and can't do 4K.
Most of the aforementioned cards were MORE expensive than the 3080, so how is it an issue just for the 3080? What the heck are you talking about, man? Are you saying that a 3060 12GB can play the game with better IQ and framerate than a 3080? That's just completely delusional.
3060 beating a 3080... click and it will play at the point you need to see. So yes, the 3060 12GB is playing the game at the same settings as the 3080 and getting better frame rates. Look at the lows too.
I did. It proves my point. At 4K DLSS Quality (which is basically his 1440p results) it is faster than a 6950 XT, which was much more expensive and has 16GB. The 6800 XT, which was its actual competitor, gets blasted. None of these can do 4K natively anyway, whether they have 1GB or 1TB of VRAM, because they run out of grunt. Unless you are going to play the game at 21 fps with 14 fps minimums at 4K native with your 6800 XT, I don't see what your point is. I'd much, much rather have the 3080 than the 6800 XT for Hogwarts. Not even a contest.
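The "4K DLSS Quality is basically 1440p" point can be sanity-checked with quick arithmetic: DLSS scales the internal render resolution per axis before upscaling to the output resolution. A minimal sketch, using the commonly documented per-mode scale factors (individual games can override these):

```python
# Internal render resolution for DLSS modes. The per-axis scale
# factors below are the commonly documented defaults; games may
# ship with different values.
DLSS_SCALE = {
    "quality": 2 / 3,          # ~66.7% per axis
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def render_resolution(out_w, out_h, mode):
    """Return the (width, height) the GPU actually renders before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Quality renders internally at 2560x1440,
# i.e. essentially the native-1440p workload.
print(render_resolution(3840, 2160, "quality"))      # (2560, 1440)
print(render_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

So a "4K DLSS Quality" benchmark run is, in terms of shading work, close to a native 1440p run, which is why the two sets of numbers line up.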
You insist on the nonsense that a 3060 plays the game better than a 3080, as if you were going to play with any of these cards at native 4K. That's delusional. Not a single card can play at 4K natively - even the 4090 has issues. So WHAT are you talking about? The 3080 only has VRAM issues at settings where the rest of the cards are getting 20 fps anyway, so who cares. It's unplayable on ALL of them.
I'm going to play Hogwarts at native 4K + RT ULTRA on my 16GB 6800 XT. I'll get 20 fps, but at least I won't run out of VRAM, so somehow that will make the experience more enjoyable. I give up. You win.
This is basically on point, condensing pages of waffle. What's comical is that some of the highly active defenders have moved on from a 3080 to a 3080 Ti or a 4090. This is exactly what we predicted, as it avoids the problem.
Or maybe, as they have all said themselves, it's because they ran out of grunt and weren't hitting the fps/settings they wanted? Like I've said many times, when/if I upgrade it is going to be primarily for grunt and/or a new feature such as FG/DLSS 3; more VRAM is just the cherry on top.
Why did 3090, 6800xt, 6900xt, 6950xt owners move to 7900xtx, 4090? Surely they had no need to since they already had plenty of vram, right?
The grunt depends on the review or video you look at. In some, at 1080p and 1440p, it does seem to have enough grunt but not enough VRAM, but in others the numbers seem to be way different. In the HUB review the numbers looked decent until VRAM ran out, so the grunt was there. I really don't know what's what with this game. If I remember correctly it was the Hogsmeade area where the 3080 had VRAM problems, but it was OK in the Hogwarts area, so I guess where sites are testing the game will make a big difference as well.
So far it is only HUB who have shown the issues where fps plummets, AND a 7900 XTX getting higher fps at 1080p than every other reviewer's results - and by everyone I mean Jansn, TPU, ComputerBase, PCGamesHardware (it will be interesting to see their results for Ampere and RDNA 2 when they're done with testing).
As for the most demanding areas, it is definitely Hogsmeade, followed by the Hogwarts castle/grounds, especially when you go from one room/area to a new area/room, but as ComputerBase pointed out, this is happening on even the best gaming system:
However, despite the decent frame times, Hogwarts Legacy has a pretty big problem with frame pacing. Regardless of the hardware, i.e. even on a Core i9-12900K and a GeForce RTX 4090, the game keeps hitching. This is particularly extreme in Hogwarts when ray tracing is activated, where there can be severe stuttering. This is probably not due to shader compilation stutter; rather, the game seems to be reloading data at these points. The phenomenon occurs reproducibly, for example when you move to a different part of the castle or change floors. The slower the hardware, the longer the stutters become and the more frequently they occur.
Don't agree with all that, but the reason it's a problem is that a couple of people paid an extra £750 for the 3090 over the 3080 and haven't gotten that "value" back yet. They thought there would be loads of games by now where a 3080 would have to run everything on low whilst they could use max settings on their 3090s, but alas that hasn't happened, and when a 3080 generally ***** the bed, so does a 3090.
I did some testing with my 3080 back in Sept/Oct 2020 when I was silly enough to listen to "da internet". At the time the Afterburner beta had been released - this shows actual VRAM usage compared to allocated. As I remember, actual was more than 20% less than allocated, and using any upscaling dropped...
I just changed from a 3080 10GB to a 4080 16GB. The Witcher 3 on Ultra with RT maxed out (DLSS 3 on) is using just over 10GB at 3440x1440. I guess you could say that isn't really relevant to the 3080 because, as I found out myself, RT was nigh unplayable on a 3080 with TW3, unless you...
forums.overclockers.co.uk
Yes, there are VRAM issues, that's obvious, but to say that this is 100% not enough VRAM and nothing else is at play with this game, despite the mountains of evidence showing there is plenty wrong with the game, is, as you said... "delusional".
The PC gaming scene really is doomed if people aren't willing to acknowledge when games are poorly optimised/developed and instead just think throwing more "CPU, RAM, VRAM" at the "issues" is the way forward. I mean, the fact that some random people have been able to fix/improve the performance by adding a few config lines to a file says it all...
Nah it's fine, you convinced me, knowing that my 16GB 6800xt doesn't run out of vram while spitting out 20 fps makes the game more enjoyable for me. Thanks for opening my eyes.
- GPUs with plenty of VRAM are also ******** the bed at 1080p, e.g. the 7900 XTX
- in cut scenes fps plummets, but in gameplay it goes back up
- casting a spell causes the fps to shoot straight back up, even though nothing else has changed/happened in the frame
- VRAM usage isn't any different between 1080p and 4K (also, even with DLSS performance mode enabled, VRAM usage didn't drop, another thing which goes against what you would expect to happen)
- people even at 720p are getting drops to 10-20 fps with everything on low
- adding some config lines improves performance drastically, as confirmed by loads of people now
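For reference on that last point, the community fixes being passed around were of this general shape: extra texture-streaming lines dropped into the game's Engine.ini. The cvars below are standard Unreal Engine settings, but the specific values are purely illustrative, not tested recommendations; what (if anything) helps will depend on the card and the game build:

```ini
; Illustrative Engine.ini additions of the kind circulated for this game.
; Standard Unreal Engine texture-streaming cvars; values are examples only.
[SystemSettings]
r.Streaming.PoolSize=3072           ; texture streaming pool size in MB
r.Streaming.LimitPoolSizeToVRAM=1   ; cap the pool to the VRAM actually available
```

The fact that tweaking the engine's streaming-pool behaviour changes performance at all is what people point to as evidence that the stock configuration, not just raw VRAM capacity, is part of the problem.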
Or explain what is happening here even though textures are set to low.....
Or at least agree with this:
Yes, there are VRAM issues, that's obvious, but to say that this is 100% not enough VRAM and nothing else is at play with this game, despite the mountains of evidence showing there is plenty wrong with the game
I never said there wasn't, and there is no real proof this is one of the issues, if it has any. The reality is that in their current form, this game and other games that have been coming out recently are showing 10GB is dead at 4K, and even 12GB has shown issues on the 4070 Ti and other 12GB cards.
The issue is system resources, and VRAM is a system resource for graphics; we are now even having issues at 1440p with 10GB. Yes, maybe the game needs patching, or there's some weird memory leak that only affects the 3080 10GB... but anyway, I don't care; I don't have a 10GB card, and even my laptop has a 16GB 3080 Ti. I'm a tech enthusiast and like to see where technology is taking us and the system resources we require to get to new levels of performance or increased image quality.
My replying here is based on tech, not on what anyone owns, and childish comments like this prove my point.
Also, the reason threads are getting locked is this childish behaviour...
You might not have said it, but you are definitely implying that the lack of VRAM is 100% the reason for the performance issues here. Again, maybe we should create a thread "is 24GB enough?", since a 7900 XTX is ******** the bed at 1080p too? Must be VRAM, right? What other recent games are showing issues which are 100% down to VRAM and nothing else? Please don't say Forspoken...
So far, the only 100% genuine VRAM issues I would say are still:
- games with mods
- high-res gaming at 4K or higher AND if you refuse to use DLSS/FSR
- FC 6 (if you refuse to use FSR, force ReBAR on, and only run the benchmark mode, in my case)
Whilst you, I and everyone here are "tech enthusiasts", the problem is you're advocating that the solution to these issues is buying new tech to overcome shoddy optimisation by developers.
Given all the games with awful CPU utilisation now, and that using frame generation is frowned upon, I guess all CPUs are gimped too, except for the top-dog chips like the 13700/13900 and 7900X/7950X.
You're not factoring in what happened before you posted on here, like when the OP created these threads - just go read them from the early pages. What the people I am referring to, and some of the other posters, are trying to remind you of is that, firstly, it's a discussion, not a cult, and those who said the 3080 "was fine" were ignoring the simple criterion we set out: playing at 4K high settings, it will end up choking (some of these deniers bought cards with more VRAM). Someone earlier listed the bullet points, and again, you should brush up on them.
AMD did not have game-ready drivers for this game when it was tested... so wait for them, and then we can see. Nvidia and Intel had game-ready drivers in all tests/benchmarks.
Yeah, remember when it started off with "but, but, 4K max settings"? Now 1440p is saying hello - not an issue though.