10GB VRAM enough for the 3080? Discuss...

I'll swap you my 3080 since it doesn't matter?
SURE, if you can swap my 1200€ extra as well, absolutely.

@tommybhoy Unless of course you think 1200€ ain't worth it for marginally better-looking textures, WHICH IS THE FREAKING POINT ALL ALONG! That's all a 3090 gives you: marginally better-looking textures for 1200 bucks.
 
I suspect the average purchase price of a new 3080 10GB is probably north of £1k given what we were seeing throughout 2021. And now we see the stupid sheeple who sleepwalked into the most obvious planned obsolescence ever devised by a top-tier tech company.
Luckily I don't run 4K I guess, although I only paid a small amount over RRP just over two years ago. I would consider upgrading, but 1. I'm absolutely aghast at GPU pricing at the moment, 2. I need a white card to match my build aesthetic, and 3. I have a G-Sync-only monitor so I'm kind of stuck with Nvidia. This likely means another Gigabyte card or perhaps an Asus, but I am not spending more money than I've paid for previous CARS just to keep up with the Joneses. I'll wait.
 
Yes, finally, you've acknowledged that extra VRAM gets you better textures.
Basically what a better graphics card is meant to do: it has the resources to do that and offer better performance, without you having to sacrifice by downgrading settings in games...

I think the penny is starting to drop for some... or it's just another tactic to fish people in for more endless, mind-numbing repetition of the same things over and over.

Anyways... 3060 12GB crowned best cheap Nvidia 4K card of 2023 for new games that use more than 10GB of VRAM :cry::p...

 
I suspect that if the 3090 had had the same performance gap over the 3080 as the 4090 has over the 4080, this thread would probably never have existed.
It's better to use the 4070 Ti as an example, because the price gap between it and a 4090 is the same $800 as between the 3080 and 3090, yet one pair has a performance gap of 13% and the other a difference of 67%.

Sure, on a 3080 you had to compromise on VRAM, but you still got top-tier performance in every game bar one with optional textures over the two years since the card's release; the 4070 Ti, on the other hand, means you have to compromise on both performance and VRAM, and it is already struggling just a month in.
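Rough back-of-envelope on what that same $800 step actually buys in each pair, taking launch MSRPs ($699/$1,499 for the 3080/3090, $799/$1,599 for the 4070 Ti/4090) and the 13%/67% gaps quoted above at face value; street prices obviously varied, so treat this as a sketch, not benchmark data:

```python
# Back-of-envelope: dollars paid per 1% of extra performance for each $800 step-up.
# MSRPs are launch list prices; the performance gaps are the figures quoted above.
pairs = {
    "3080 -> 3090":    {"price_gap": 1499 - 699, "perf_gap_pct": 13},
    "4070 Ti -> 4090": {"price_gap": 1599 - 799, "perf_gap_pct": 67},
}

for name, p in pairs.items():
    dollars_per_pct = p["price_gap"] / p["perf_gap_pct"]
    print(f"{name}: ${p['price_gap']} for +{p['perf_gap_pct']}% "
          f"=> ~${dollars_per_pct:.0f} per extra 1%")

# 3080 -> 3090:    $800 for +13% => ~$62 per extra 1%
# 4070 Ti -> 4090: $800 for +67% => ~$12 per extra 1%
```

Same $800 gap either way, but the Ada pair returns roughly five times more performance per dollar, which is the point being made.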
 
Basically what a better graphics card is meant to do: it has the resources to do that and offer better performance, without you having to sacrifice by downgrading settings in games...
So where is the issue exactly? Basically you are blaming the 3080, a $699 MSRP card, for not offering the same performance as the 3090, a 2k MSRP card. Sorry, what? Especially considering the difference between them is marginally better textures, YIKES..

But don't worry, I feel for the people that wasted their money on a 3090. I am one of them. But come on, let's not pretend it was a better buy because it offers marginally better textures for a premium of only your entire savings account. Roflmao

Let me put it like this. If a 3090 owner could go into the settings, set textures to high and get 1,200 euros back, they would do so in a heartbeat. But they can't; they are stuck with 24GB that are useless for gaming.

Imagine going into that freaking menu and the ultra textures being locked behind a 1,200-euro paywall. I'm absolutely positive you'd instantly pull out your credit card to unlock them, lmao
 
Problem is, though, even reducing textures to low does not fix the performance...


Surely putting the main setting responsible for VRAM usage down to its lowest value should solve the problem? That's how this is supposed to work according to folks here, isn't it?

On a slightly separate note, based on my own testing between the ultra and high presets at both 3440x1440 and 4K, I can hardly see a difference between them.


However, what does fix/improve the performance considerably, as confirmed by a **** ton of people now (including myself and 2 others on this forum; I haven't checked since last night to see if others have noted the same)...


But nope, lack of VRAM is the sole cause of these problems here, nothing else at play...




The only way this would be a 100% genuine VRAM shortage is if none of the above were possible, i.e. if reducing textures actually fixed the issue and manually putting in changes yourself didn't boost the fps/performance more than changing settings does, and if the following weren't happening:

- GPUs with plenty of VRAM are also ******** the bed at 1080p, e.g. the 7900 XTX
- in cutscenes the fps plummets, but in gameplay it goes back up
- casting a spell causes the fps to shoot straight back up even though nothing else has changed/happened in the frame
- VRAM usage isn't any different between 1080p and 4K (also, even with DLSS performance mode enabled, VRAM usage didn't drop, another thing that goes against what you would normally expect; see the quick logging sketch below this list)
- people even at 720p are getting drops to 10-20 fps with everything on low
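For anyone who wants to sanity-check the VRAM point on their own system, a quick way is to log memory.used from nvidia-smi while switching resolution or texture quality in-game and see whether it actually moves. A minimal sketch, assuming nvidia-smi is on your PATH; the log file name and one-second interval are arbitrary choices:

```python
# Minimal VRAM logger: polls nvidia-smi once a second and appends timestamped
# usage to a CSV, so runs at 1080p vs 4K (or ultra vs low textures) can be compared.
import csv
import subprocess
import time

def vram_mib(gpu_index=0):
    """Return (used, total) VRAM in MiB for the given GPU via nvidia-smi."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            f"--id={gpu_index}",
            "--query-gpu=memory.used,memory.total",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(","))
    return used, total

with open("vram_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        used, total = vram_mib()
        writer.writerow([time.strftime("%H:%M:%S"), used, total])
        f.flush()
        time.sleep(1)
```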

The sooner people start to call out bad optimisation and dodgy **** like the above, the better for the gaming industry; otherwise, keep enjoying paying through the nose for the best hardware to overcome obvious game development issues, and even then it still isn't enough...


Sorry, I keep forgetting: computerbase, PCGamesHardware, TPU and Jansn aren't trusted sources any more, only HUB is.
 
It worked for me. I'm playing on a 3060 Ti with DLSS Quality at 3440x1440; my settings are everything ultra except shadows (low) and textures (high), with RT reflections ultra, RT AO ultra, RT shadows off. I get 65 to 75 fps running around Hogwarts. Also, 16GB of system RAM and a 3700X. Of course I get traversal stutters, but those happen on every system.

The problem is that when you change settings the game bugs out, or maybe it's because of the lack of VRAM; it drops to 14 fps for me. For example, if I turn textures down to the lowest setting, the game goes down to 10-15 fps. I need to go into the menu a few times for the fps to come back to normal (loading the last save also seems to work).
 
Basically what a better graphics card is meant to do: it has the resources to do that and offer better performance, without you having to sacrifice by downgrading settings in games...

I think the penny is starting to drop for some... or it's just another tactic to fish people in for more endless, mind-numbing repetition of the same things over and over.

Anyways... 3060 12GB crowned best cheap Nvidia 4K card of 2023 for new games that use more than 10GB of VRAM :cry::p...

What penny is dropping? That it's a duopoly market? Yes, it is... OK, what now? Who isn't saying that? We are getting screwed; everyone knows that. Who in this thread isn't saying that? There's all this talk, but I can't see anyone who was saying, what, 2.5 years ago, that the 3080 was not a good buy given the market. Should they have bought a 3060 12GB back then instead? :confused:
 
Sad thing is, everyone is pointing fingers at the wrong people... the ones responsible are the developers. Just maybe, if they optimised their game, it would run pretty well on all GPUs (given what they are capable of) and a 7900 XTX would also be able to hit 60+ fps at 1080p. But nope, you simply need to go out and buy better hardware, as these big bad corporate hardware companies planned it all along...

I think we have all said time and time again too that, yes, it would have been great to have had the 3080 with 12 or even 16GB from the start, but how much would that have cost? How much were people willing to pay for the extra VRAM? Given that when the 3080 12GB launched, it launched at £1,200+, there was no way in hell I was going to pay that, especially when the difference versus a 3080 10GB in 99% of games is <5%. That, and £650 was already more than I was willing to spend (all the previous GPUs I bought were £300-400).
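Putting rough numbers on that last point, using the ~£1,200 3080 12GB launch price, the £650 I paid for the 10GB card and the <5% average difference; street prices swung around a lot, so this is only a sketch:

```python
# Rough value check: the 3080 12GB's ~£1,200 launch price vs a £650 3080 10GB.
price_10gb = 650     # roughly what the 10GB card cost
price_12gb = 1200    # approximate 3080 12GB launch price
perf_uplift_pct = 5  # <5% average difference in the vast majority of games
extra_vram_gb = 2    # 12GB vs 10GB

premium = price_12gb - price_10gb
print(f"£{premium} premium => £{premium / perf_uplift_pct:.0f}+ per 1% of performance, "
      f"or £{premium / extra_vram_gb:.0f} per extra GB of VRAM")
# -> £550 premium => £110+ per 1% of performance, or £275 per extra GB of VRAM
```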




Having said all that, Hogwarts is running fantastically well for me now. I get the same fps drops/stutters when entering different rooms and in some parts around Hogsmeade, but as stated by every reviewer so far, this also happens on the best of the best systems. One thing which I and others have noted is that after the tutorial stuff and returning to Hogwarts for the second time, the performance is considerably better. All in all, despite its obvious issues, especially around RT, it plays pretty well in my experience. WB have certainly come a long way since the Batman: Arkham Knight PC release ;)
 
Game doesn’t run well on Nvidia? Blame devs for not using the magical optimise-everything button.

Game doesn’t run well on AMD? Well, you should have bought Nvidia.

The next time we see a game with low RT performance on AMD we can blame lazy devs for not pressing the optimise-everything button :cry:

Sadly, AMD's lack of RT performance is all down to AMD and not game devs :p ;)
 
The pathetic attempts at trolling regarding Hogwarts are truly a sight to behold. Also, HU creating their own clickbait ‘tests’ when all the other sites are showing the exact opposite shows how much integrity they have. Spoiler alert: none. Gotta drive the clicks from rabid AMD fanbois.

:rolleyes:
 
There were also people that bought the 3080 12GB or Ti models. These also don't have the same issues, as per the thread topic.
 
Also, HU creating their own clickbait ‘tests’ when all the other sites are showing the exact opposite shows how much integrity they have

Yeah, it's a bit weird how their results are completely different to everyone else's so far. Steve, being as stubborn as he is, will probably do a follow-up video explaining why their results are different, like they did with Halo Infinite.

There were also people that bought the 3080 12GB or Ti models. These also don't have the same issues, as per the thread topic.

Except they do, unless you're sticking your fingers in your ears and only paying attention to HUB and none of the other reviewers or users.




Just noticed TPU updated their findings for the 7900 XTX, so to get 60+ fps at 1080p you just have to drop RT to low:

The ray tracing performance hit can be reduced by lowering the ray tracing quality setting from the "Ultra" default, which helps a lot with performance, especially on AMD. RX 7900 XTX can now reach 60 FPS with ray tracing "low," at 1080p. Not impressive, but still a huge improvement over 28 FPS.
 
Sadly the 3080 lacks VRAM. That is all down to Nvidia, not game devs ;)

In all seriousness though, what's the difference here?

- AMD GPUs show a bigger regression in "every" RT title than Nvidia does; even in their very own "sponsored" games they take more of a hit than Nvidia


It's clear that AMD are just a gen behind in the RT area, which is not surprising given they were late to the party and, for them, it's not the focus it is for Nvidia

- VRAM has affected how many titles to date? Is it consistent across the board? And can you genuinely say that there is absolutely nothing at fault with Hogwarts itself? Bearing in mind the absolute **** ton of evidence all pointing towards game issues...


Again, computerbase's comments:

However, despite the decent frame times, Hogwarts Legacy has a pretty big problem with frame pacing. Regardless of the hardware, i.e. even on a Core i9-12900K and a GeForce RTX 4090, the game keeps hitching. This is particularly extreme in Hogwarts when ray tracing is activated, where there can be severe stuttering. This is probably not due to shader compile stutters; rather, the game seems to be reloading data at these points. The phenomenon occurs reproducibly, for example when moving to a different part of the castle or changing floors, and the slower the hardware, the longer the hitches become and the more frequently they occur.

But nah, the 4090 just needs more VRAM, right?
 