
Is the extra VRAM really that beneficial?

Status
Not open for further replies.
Not sure who they'd blame if they (AMD) just stopped making consumer cards. Nvidia, maybe... nah, you can't blame a company for choosing not to do something that would benefit the consumer rather than the profit-making corporation.

Arguing about putting an extra $30 of VRAM on a $700 card is absolutely hilarious: 4-5% extra cost for what might mean a lifespan that is 20% longer. Also, that is 8GB extra, not 4GB, so per 4GB it works out closer to 2.5%.
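The cost maths above can be sanity-checked; a minimal sketch, where the $30 figure for the extra 8GB is the poster's assumption, not a verified BOM cost:

```python
# Sanity check of the cost argument above. The $30 figure for the
# extra 8GB of VRAM is the poster's assumption, not a verified BOM cost.
card_price = 700          # USD
extra_vram_cost = 30      # USD, for 8GB extra

pct_of_price = extra_vram_cost / card_price * 100
print(f"Extra VRAM is {pct_of_price:.1f}% of the card price")  # ~4.3%

# The extra is 8GB, not 4GB, so the per-4GB share is half that:
print(f"Per 4GB: {pct_of_price / 2:.1f}% of the card price")   # ~2.1%
```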

I do love reading the comments as well; it's almost as funny as anti-social media sometimes.

If we are referring to now with 40xx and going forward? Completely agree.

But as noted before, it simply wasn't possible back when the 3080 and 3090 released.

If you want proof of what happens when there is a lack of competition and then very good competition, just look at the CPU market. Whether people like it or not, we are ultimately in this boat because there is little to no competition in the GPU market. Simply undercutting competitor products by £50-100 at launch, while only having more VRAM and significantly better performance in 1-3 raster-based games such as COD, is not the way to be competitive when you are severely lacking in a number of other ways.

HUB was reporting on this as well; their conclusion was basically that VRAM needs to match the current generation of consoles, or you're going to run into a shortage, because that's the baseline for game devs. Anyone arguing anything else is just ******* in the wind out of sheer bloody obstinacy. Nvidia do it for reasons of planned obsolescence, amongst other things, to force people onto new generations of cards, especially at the lower end.



They've covered it in the past as well:

Don't disagree, although it's such a stupid thing to "only" point fingers at Nvidia for skimping on VRAM when games are obviously released in **** state, only for them to get "fixed" and magically work well on lower-VRAM GPUs...

PC gamers always complain about how **** games run on PC, need upscaling etc., and think the only way to do things is simply to brute-force past such issues rather than address them. This is why DF are by far the best in the business: they point out such flaws, which ultimately exposes the issues with games and forces/helps devs to solve them, TLOU again being the perfect example. I dread to imagine what games would be like if it weren't for Alex and his in-depth insight into what causes the performance issues.

Essentially, yes, point fingers at Nvidia and give them **** for it now, but the main culprits are game devs. The fact that bang4buck still notices textures loading in on Hogwarts proves that even with all the VRAM in the world, the game still exhibits issues usually associated with running out of VRAM. It's obviously not the GPU/VRAM; it's how the devs have designed the game.

"Planned obsolescence" is always a silly argument these days; it applies to basically every company and product out there (even AMD). Capitalism is a bitch.
 
This thread is strange.

It's almost as if people are trying to give justification for low vram.

Bus width matters more, and it tends to be given to high-VRAM cards, but not always.

Running 4K native with a number of games at or close to 120 FPS, with decent texture resolution and draw distance to go with it (unlike Cyberpunk), yes, VRAM and bus width matter more.

We all should be fighting for more for lower prices.
 
This thread is strange.

It's almost as if people are trying to give justification for low vram.

Bus width matters more, and it tends to be given to high-VRAM cards, but not always.

Running 4K native with a number of games at or close to 120 FPS, with decent texture resolution and draw distance to go with it (unlike Cyberpunk), yes, VRAM and bus width matter more.

We all should be fighting for more for lower prices.

If you could read objectively, you would see no one is doing that.
 
Yup, the fact people are still dodging the question of whether the extra VRAM on the 3090 was beneficial enough to justify the cost says it all, well, except for hum and therealdeal agreeing :)

We say the extra vram cost is ludicrous and not worth it.

They understand. Extra vram is pointless...

:cry::cry::cry:
 
Until you run out. :D

Yeah, and if this were a common enough thing I would take note. But it has yet to happen to me, and based on everything I have seen from the graphs and videos presented here or elsewhere, it is unlikely to until next-gen cards are here :D
 
Yup, the fact people are still dodging the question of whether the extra VRAM on the 3090 was beneficial enough to justify the cost says it all, well, except for hum and therealdeal agreeing :)
The 3090 was significantly more expensive in general; its bus width matters more, but more VRAM is more beneficial, yes.

Spending an extra £1000 isn't, but that's not people's argument; that's clutching at straws for low VRAM, using a super-expensive component as the basis of your argument or others'.

I'd rather argue for more VRAM than justify low VRAM for the money.
 
Bus width matters more, and it tends to be given to high-VRAM cards, but not always.
Absolutely. Which is probably why you have 3080 owners here saying that these cards are still smashing games for them, a 320-bit bus on the 10GB model (384-bit on the 12GB variant), compared to the 4080 and 4080 Super, which both have 256-bit buses.

People often go on about the 10GB of VRAM in the 3080, but they never mention the bus width comparison. You seem knowledgeable on this: do we have reliable numbers anywhere comparing 256-bit to 384-bit bus width performance?
 
The 3090 was significantly more expensive in general; its bus width matters more, but more VRAM is more beneficial, yes.

Spending an extra £1000 isn't, but that's not people's argument; that's clutching at straws for low VRAM, using a super-expensive component as the basis of your argument or others'.

I'd rather argue for more VRAM than justify low VRAM for the money.

As already discussed earlier in the thread, for something to truly be beneficial, people also need to be willing to spend the extra on it. So far, as shown by the ones who have answered (including yourself now), the VRAM simply wasn't beneficial enough to justify the extra cost back then, and even now it's arguable depending on your use case. The fact that people keep buying GPUs with low VRAM speaks for itself even more.

Again, there's no excuse nowadays for charging £500+ for 8GB, and all GPUs should be coming with at least 12GB at the very minimum. But in the context of the thread title, back when we only had the Ampere and RDNA 2 GPUs, it was a different scene to what we have now. As noted too, games are in better shape now than what we had in 2023, speaking from experience as well as from the evidence in such videos.
 
That's called reframing, because no one is arguing against more vram. Who doesn't want more vram if the extra cost associated with it is sensible?
It really isn't. I get the 3090 argument, but the general price increase was really its halo positioning, and Nvidia knew it. It was a beast, but it still carried a large bus width along with the RAM, which would be amazing in games that actually have huge draw distances that aren't false backgrounds.

The price increase of this GPU wasn't because of its RAM, but the extra VRAM is beneficial, yes; you have an answer to the topic from someone at least.

I don't care about vendor; I just care about us consumers getting more for our money and never giving these companies any justification to give us less for more money.
 
Absolutely. Which is probably why you have 3080 owners here saying that these cards are still smashing games for them, a 320-bit bus on the 10GB model (384-bit on the 12GB variant), compared to the 4080 and 4080 Super, which both have 256-bit buses.

People often go on about the 10GB of VRAM in the 3080, but they never mention the bus width comparison. You seem knowledgeable on this: do we have reliable numbers anywhere comparing 256-bit to 384-bit bus width performance?
I will need to take a look for this specific test, actually; I just took a quick look and it's all old stuff.

High resolutions and high-res textures, with long draw distances and high-resolution assets (including shadows), really want a wide bus, which you can feel in World of Warcraft's open world when moving at high speed: real traversable world you can actually get to, not false-background scenery.
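For context on why bus width alone isn't the whole story: peak memory bandwidth is bus width times per-pin data rate. A minimal sketch using the published stock specs of the cards discussed (illustrative only; real-world performance also depends on cache sizes and much else):

```python
# Peak theoretical memory bandwidth: bus width (bits) / 8 gives bytes
# per transfer, times the effective per-pin data rate (Gbps).
# Figures below are published stock specs, not measured performance.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 3080 10GB": (320, 19.0),  # GDDR6X
    "RTX 3090":      (384, 19.5),  # GDDR6X
    "RTX 4080":      (256, 22.4),  # GDDR6X, faster per pin
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# The 4080's narrower bus is partly offset by faster memory per pin
# (and a much larger L2 cache, which this formula doesn't capture).
```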
 
Assuming normal prices (though not necessarily), it made more sense to buy the 3080, save the difference, then around the 3090 Ti's time sell the 3080 and put the savings toward a 4080 or 4090 next gen. Repeat as needed.

Games won't be using over-the-top VRAM assets and such, simply because they're built for consoles... That's it. Edge cases are just that, edge cases, and are to be treated as such. Do you fall into an edge case? Then make your purchases accordingly. Splitting hairs one way or another won't change much.
 
It really isn't. I get the 3090 argument, but the general price increase was really its halo positioning, and Nvidia knew it. It was a beast, but it still carried a large bus width along with the RAM, which would be amazing in games that actually have huge draw distances that aren't false backgrounds.

The price increase of this GPU wasn't because of its RAM, but the extra VRAM is beneficial, yes; you have an answer to the topic from someone at least.

I don't care about vendor; I just care about us consumers getting more for our money and never giving these companies any justification to give us less for more money.

This doesn't make sense, imo. The 3080 had a big bus also and was around 15% or so slower.

What games actually are you on about? The only game that comes to mind before the 4090 release was Far Cry 6 with its optional texture pack. So what are these games you are referring to?
 
Yup, the fact people are still dodging the question of whether the extra VRAM on the 3090 was beneficial enough to justify the cost says it all, well, except for hum and therealdeal agreeing :)
We answered this on the '10GB enough' thread that got locked, and on previous threads that also got locked; the answers you are looking for are there, and you just don't acknowledge them or brush them off. This is why many of us have got tired of this topic: people who know why it benefits them, and how, are just ignored by the loud minority that keep crying "prove it". It was proven many times, but the reply is always the same... prove it...

So, as was stated before, the people in favour of low VRAM, or who state there is no need for more, can argue this till they go blue; honestly, it just feels like trolling to me now, not a real, sensible discussion. Also, not everyone is using their GPU just for gaming; GPUs are not purchased for gaming only. So, as you see, the topic is dead really when people only look at gaming, and even gaming has shown there isn't enough VRAM on most GPUs that should have had more from day one. I showed this with facts on one thread, and even showed it with AMD's test methodology for making low-VRAM Nvidia cards struggle against their own cards that were really slower but had more VRAM.


I will find the thread if you want, and the video...
 
The 3090 was significantly more expensive in general; its bus width matters more, but more VRAM is more beneficial, yes.

Spending an extra £1000 isn't, but that's not people's argument; that's clutching at straws for low VRAM, using a super-expensive component as the basis of your argument or others'.

I'd rather argue for more VRAM than justify low VRAM for the money.

Also, 3090s had NVLink, a very important feature for people wanting to use them for professional work (to pool the CUDA cores and 48GB of VRAM), and some games supported dual NVLinked GPUs. No other cards could do that apart from the 3090/Ti, not even the 4090...
 
We answered this on the '10GB enough' thread that got locked, and on previous threads that also got locked; the answers you are looking for are there, and you just don't acknowledge them or brush them off. This is why many of us have got tired of this topic: people who know why it benefits them, and how, are just ignored by the loud minority that keep crying "prove it". It was proven many times, but the reply is always the same... prove it...

So, as was stated before, the people in favour of low VRAM, or who state there is no need for more, can argue this till they go blue; honestly, it just feels like trolling to me now, not a real, sensible discussion.

Still haven't answered the question..... :cry:

We all know your use case for more VRAM regardless of extra cost is more justified, because you actually "need" it for workload reasons along with the NVLink. But for 95% of people on this forum it is all about gaming, so bearing that in mind:

has the extra VRAM proved beneficial enough to justify the extra cost?

So far we have:

- humbug
- realdeal
- raegun

All basically agreeing it hasn't been beneficial enough to justify the extra cost. So what is your take? Simple yes or no and why.

Where exactly are myself and others ignoring others' claims? (Although people who post the usual xyz and don't back up such claims with anything beyond "trust me bro" style messages, I will disregard, especially when videos showcase exactly what is happening, e.g. Daniel Owen's 12GB one showing most games barely break 8/9GB dedicated VRAM usage and don't face the same issues that would be observed on 8GB GPUs, as shown in his other video.) I think you're missing the point we are putting across. Myself and the others making the same point have stated many times that nowadays there is no excuse for £500+ GPUs with less than 12GB, so there is no defending happening here, contrary to what you and others are insinuating.
 