You're not factoring in what was posted on here before you arrived, like when the OP created these threads - just go and read the early pages.
Wow that is a cruel and unusual punishment!
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
the Game Ready stuff is usually bollocks anyhow tbf.

Still waiting for an answer to this:
I'm not sure if Nvidia's driver update was actually a "Game Ready" one; it created a profile but that was all, apparently.
Yeah, remember when it started off with "but, but, 4K max settings". Now 1440p is saying hello - not an issue though.
the Game Ready stuff is usually bollocks anyhow tbf.
Tbf AMD driver guy is doing his best, can't possibly do any more overtime!!!

Could be worse, could be an RDNA 2 owner who hasn't received any support/drivers since the RDNA 3 release, planned obsolescence..... or at least that is what some would be saying if this were Nvidia.
Of course playing at 4K would at some point end up with a choke. That would absolutely be the case regardless of the actual VRAM of the 3080. Actually, every single card in existence chokes in Hogwarts at 4K.

And those that said the 3080 "was fine" were ignoring the simple criteria we laid out: playing at 4K high settings it will end up with a choke (some of these deniers bought cards with more VRAM). Someone earlier listed the bullet points and again you should brush up on them.
I did. Proves my point. At 4K DLSS Quality (which is basically his 1440p results) it is faster than a 6950 XT, which was much more expensive and has 16GB. The 6800 XT, which was its actual competitor, gets blasted. None of these can do 4K natively anyway, whether they have 1GB or 1TB of VRAM, because they run out of grunt. Unless you are going to play the game at 21 fps with 14 fps minimums at 4K native with your 6800 XT, I don't see wtf your point is. I'd much, much rather have the 3080 than the 6800 XT for Hogwarts. Not even a contest.
Yes, the problem is the VRAM. I don't get this discussion; is this to get us all riled up at tech companies or to prove posters in a 2-year-old thread wrong? The 3080 destroyed the 3060 in basically every other game. If I had a £700 3080 and couldn't play Hogwarts after owning it for 2 years I'd say oh well, refund and go play the other 99% of games out there.

Clue...
A card that was designed for 4K can't do 1080p now because of the VRAM amount... The rest of what you state is obvious and not the topic... We all know settings can be dropped etc., but when a 3060 12GB beats a 3080 10GB at the same settings there is something very wrong.
Yes the problem is the vram... snip
Scenario 2 means you lower textures from ultra uber high to just high and get 35 fps average with minimums of 30. Big whoop.

When I bought the 3080 10GB I did so knowing the VRAM was going to be more of an issue before GPU grunt was.
Let me give you two scenarios, one due to a lack of GPU grunt and the other due to a lack of VRAM. Let's assume the rest of the PC consists of a good CPU, RAM etc.

- Scenario 1: you are getting 35 FPS average with minimums of 30 and highs of 40.
- Scenario 2: you are getting 35 FPS average with minimums of 5 and highs of 80.

In order to fix both scenarios you need to lower in-game settings, but scenario 1 means your GPU is running out of grunt, while scenario 2 means your GPU clearly has enough power to reach much better FPS but is held back by VRAM limits (a rough sketch of how you'd spot the difference in an FPS log follows below). Scenario 2 is down to poor GPU design balance; it's like designing a sports car with a great engine and handling, and putting an old 4-speed gearbox from a 1980s Ford Fiesta in it. You have made a design choice that cripples performance in a way that could easily have been avoided.
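To put rough numbers on that, here's a quick Python sketch - purely illustrative, with made-up FPS samples and an arbitrary spike threshold, not output from any real benchmarking tool - of how you could tell the two patterns apart from a frame-rate log:

```python
# Rough heuristic: consistently low FPS points at raw GPU grunt,
# while big swings between dips and highs point at something like
# VRAM choking an otherwise capable GPU. Thresholds are assumptions.

def summarise(fps_samples):
    avg = sum(fps_samples) / len(fps_samples)
    return avg, min(fps_samples), max(fps_samples)

def diagnose(fps_samples, spike_ratio=3.0):
    avg, lo, hi = summarise(fps_samples)
    # Dips several times lower than the average, with highs well above it,
    # suggest the GPU has grunt to spare but something else is choking it.
    if lo > 0 and avg / lo >= spike_ratio and hi > avg * 1.5:
        return "Scenario 2: spiky minimums - likely a VRAM/streaming choke"
    return "Scenario 1: consistently low - the GPU is simply out of grunt"

# Scenario 1 shape: ~35 fps average, minimums around 30, highs around 40.
grunt_limited = [30, 33, 35, 36, 38, 40, 34, 35]
# Scenario 2 shape: decent average, minimums of 5, highs of 80.
vram_limited = [5, 60, 80, 8, 70, 12, 75, 10]

print(diagnose(grunt_limited))
print(diagnose(vram_limited))
```

The exact numbers don't matter; the point is only that averages alone hide the difference, which is why the minimums are what give the VRAM problem away.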
You stated that at 1440p (4K with DLSS Quality) the 3080 was killing the 6950 XT. You are wrong and clearly looked at the 3080 Ti 12GB numbers, because the 3080 is getting killed at 1440p with RT on (at least according to HUB) and is barely any faster on some other sites.
The 3080 destroys the 3060 in Hogwarts as well; whoever says or thinks otherwise is delusional.

Yes the problem is the vram... snip
I'm playing with RTX on a 3060 Ti at 3440x1440, what are you talking about, lol

3080 having to turn off RTX to conserve VRAM going forwards.
NV will do anything to get a sale.
Thanks for your input.

I'm playing with RTX on a 3060 Ti at 3440x1440, what are you talking about, lol
Especially considering a 3090 24GB couldn't maintain that day one (Cyberpunk, AC Odyssey etc.). Weird times we live in, friend.

Strange how the 3080 10GB, being nearly 2 and a half years old and soon to be replaced by a midrange 4060/4070, still has to maintain 4K max standards.
Could be worse, could have spent £1300 or £1600+ on the latest and greatest and have to drop the same settings as on a 2+ year old £650 GPU, including at 1440p and 1080p depending on the GPU.....

Especially considering a 3090 24GB couldn't maintain that day one (Cyberpunk, AC Odyssey etc.). Weird times we live in, friend.
I'll swap you my 3080 since it doesn't matter?

You might be laughing, but it's really not funny. If I didn't have the spare income to upgrade to a 4090, buying that 3090 instead of a 3080 would have been in the top 5 stupidest things I've ever done in my life. And don't get me wrong, I like my 3090, I have kept it, I've benched the crap out of it (it was a particularly good overclocker) and I don't plan on selling it - still, it was a stupendously stupid buy over a 3080.