Nvidia planned obsolescence VRAM thread

Since better textures are about the easiest way to make a game look better, I never buy the "that card is too weak to need more than X VRAM" argument.

Not only do Nvidia get to decide what each SKU should have, they have also prevented AIBs from making double-VRAM cards, so planned obsolescence and market segmentation is the most probable answer.

Problem is, these threads tend to get closed by partisan denial posters - almost like it happens on purpose.

Why do you think some of their new cards like the 4070 are 12GB? Because it is just enough for them to look good on review day. They know for a fact that if they made that an 8GB card, the reviewers would be bound to test this game.

They give you as little as they can. VRAM to them is no longer the selling feature for GPUs that it was when the 10 series came out. That "need" has been replaced with RTX.
 

Cards like the RTX 4050 6GB will have less VRAM than the RTX 3050 8GB and RX 6600 8GB. Textures are one of the easiest ways to make games look better. Plus, PCMR enthusiasts on tech forums are in denial about where game development has shifted over the last decade.

The reality is that games are also being developed with consoles in mind. The consoles can probably allocate more than 8GB of VRAM. The PS5 has 16GB of unified GDDR6 (plus 512MB for background tasks), and the Xbox Series X has 10GB of higher-bandwidth GDDR6 and 6GB of lower-bandwidth GDDR6. That means probably up to 10GB is used as VRAM.

Then both have very high-speed SSDs, which can be used to rapidly stream textures into VRAM too. So when it comes to textures, these lowish-VRAM cards are at a disadvantage, especially as most gaming PCs (and the PC ports of games) are not really set up to use things such as DirectStorage properly. Most SSDs in gaming PCs are probably DRAM-less QLC PCI-E 3.0 and PCI-E 4.0 drives.
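To put rough numbers on why textures eat VRAM so quickly, here's a back-of-envelope sketch in Python. The texture resolutions, BC7 compression rate, material count and maps-per-material are all illustrative assumptions, not figures from any particular game.

```python
# Back-of-envelope texture VRAM maths. All sizes, formats and counts
# below are illustrative assumptions, not measurements from any game.

def texture_bytes(width, height, bits_per_pixel, mip_chain=True):
    """Bytes for one texture; a full mip chain adds roughly 1/3 on top."""
    base = width * height * bits_per_pixel / 8
    return base * 4 / 3 if mip_chain else base

MiB, GiB = 2**20, 2**30

# BC7-compressed colour data is 8 bits per pixel (uncompressed RGBA8 is 32).
one_2k = texture_bytes(2048, 2048, 8)   # ~5.3 MiB with mips
one_4k = texture_bytes(4096, 4096, 8)   # ~21 MiB with mips

# Hypothetical scene: 300 resident materials, each with albedo,
# normal and roughness/metallic maps at 2K.
resident = 300 * 3 * one_2k

print(f"one 2K texture: {one_2k / MiB:.1f} MiB")
print(f"one 4K texture: {one_4k / MiB:.1f} MiB")
print(f"300 materials x 3 maps at 2K: {resident / GiB:.1f} GiB")
# ~4.7 GiB on textures alone, before geometry, render targets and
# the framebuffer - and before anything is bumped up to 4K.
```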
 
At the end of the day, if developers make games that can't run on the most popular cards and setups, they won't sell many games.
I suspect they are making them for the most popular setups - aren't the PS5 and Xbox using something like 16GB? I'd hazard a guess that they'll sell far more copies of games like The Last of Us on consoles than PCs.
 

The Xbox Series X's higher-bandwidth GDDR6 pool is 10GB, so I would say the consoles can probably allocate up to 10GB of VRAM. Plus they have very high-speed SSDs, unlike most gaming PCs, so they can rapidly stream textures into main memory.

This is why the PC ports of these games probably need more VRAM, as DirectStorage isn't properly integrated into most of them. It would also mean that a lot of the slower DRAM-less SSDs might not make the cut either, and certainly not SATA SSDs!
 
Since better textures are about the easiest way to make a game look better, I never buy the "that card is too weak to need more than X VRAM" argument.

Textures are one of the easiest ways to make games look better.
Really feel this needs to be repeated over and over again.

There is a reason why PC mods have always gone for better textures (yes, modders don't have access to the 3D models the studios had, so remaking higher-polygon-count models is very hard). Yet, strangely, far too many supposed PCMR types aren't into that? Strange. If you have a game where a 4090 doesn't struggle: first thing, look for a texture pack. Unless PCMR types are just too partisan, or like to argue!

Of course, for a sufficiently powerful card, adding extra rendering steps may mean being able to improve things without needing more textures, but that is not what we are talking about here.
 

Because PCMR does not mean hardcore gamer, despite what Nvidia marketing was trying to push over a decade ago on forums. Plenty are more benchmarkers and hardware enthusiasts (although there is nothing wrong with that, as it's fun in its own right), so they upgrade very quickly and will never see the issues affect them.

It's worse for mainstream gamers, who tend to be more budget-limited, i.e. they buy cheaper hardware or keep more expensive hardware for longer. This is where you start to see the penny-pinching in hardware becoming annoying.

The irony is that MS is now starting to slowly allow mods on the Xbox. If MS/Sony started to allow mods in more games, and added better support for keyboard and mouse, I could see quite a few PC gamers moving to a console.
 
They know for a fact that if they made that an 8GB card, the reviewers would be bound to test this game.
Therein lies the problem though: I wouldn't rate a reviewer as any good if they're benchmarking with a game that has the kind of reviews this one has on Steam.

If the game is always going to perform like this then fair enough, but reviewing newly released games is solely about revenue for the website/channel. My takeaway from HUB is that if you own an 8GB card, don't buy the game yet. In fact, don't buy it with a 24GB card either.

Ah yes, HU is at it again, blaming everything on VRAM amount.
:rolleyes:

The game crashes on 24GB VRAM cards just as much.

Some people can't work out the obvious - posting just to bait people, I imagine - what's funnier is the mindset of many on a 'hardware enthusiasts' forum! Gibbo'll be rubbing his hands with so many believers. All sides of the business - the game dev, NV & AMD, and OCUK - win from this mindset. Then the same folk moan when they can't afford the high end and feel entitled to it costing peanuts.

A high proportion of 10,000 people on Steam say it's rubbish. OCUK hardware enthusiasts watch one Aussie geezer and be like... 'upgrade time'. :cry:
 

From what I have seen of the game, nearly all of its problems seem to be VRAM related. I.e. people whose videos I have watched have simply stumbled across the "fix" without even realising it.

Let me use an extreme example here. Or, in other words, a very likely one that could well have led to all of the poor reviews. The secret here? It's in the stats. Steam shows that 90% of their user base doesn't even have 8GB of VRAM, with each step up the ladder (users over 8GB and so on) showing increasingly tiny numbers, literally coming down to 1% at the end.


He did a video leading up to that based on the 6500 XT, is it? The PCIe x4 card. And he managed to get it perfectly playable on that too. Now, what he did not realise* was that it was ALL because of VRAM use. No one had realised it until HWUB put it under the microscope and did a lot of testing. Which, as he points out, is a massive ball-ache due to it taking hours to build the shaders.

* This all leads back to many previous points I have made about this. There is no 100% science to measuring VRAM. The fact is? You can't, and Nvidia know this only too well. Why? Because if you could prove it 100%, they would have been outed YEARS ago. It is the only thing we cannot accurately measure. I have also mentioned how, until FCAT came around, there was no way to accurately measure stutter in games, or fake frames, runt frames, etc. Now we can, and that has all been sorted out and fixed.

All we can do is have a sixth sense and pick it out when it happens. And that is very hard, and Nvidia KNOW this. It is only when you put a whole ton of cards through the same scenario that you can ultimately prove what is going on, as HWUB have done quite a few times now.
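As an aside on the measurement point: the numbers that overlays and monitoring tools report come from allocation queries, not from what the game actually touches each frame. A minimal sketch of such a query, assuming Nvidia's NVML Python bindings (pip install nvidia-ml-py) and a single Nvidia GPU at index 0:

```python
# Minimal sketch: what tools can and cannot tell you about VRAM.
# Uses NVML via the pynvml bindings; assumes an Nvidia GPU at index 0.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
info = pynvml.nvmlDeviceGetMemoryInfo(handle)

# NVML reports what has been *allocated*, not what the game is actively
# touching each frame - which is why "VRAM used" numbers in overlays
# can't prove a card is genuinely short of memory.
print(f"total:     {info.total / 2**30:.1f} GiB")
print(f"allocated: {info.used / 2**30:.1f} GiB")
print(f"free:      {info.free / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```

An engine can happily over-allocate, so these figures alone can't settle the argument either way.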

The most annoying thing here? I bet that Nvidia know exactly how to tell how much VRAM is being used. As such, they have weaponised it. Why do they put too much on low-end cards? Because they know the core won't be able to keep up. Why do they not put enough on mid-range and up? Because they don't want them lasting too long. It really is as crushingly simple as that. AMD? They are not focused solely on RT, and as such they give you more than enough. Nvidia, on the other hand? DO NOT want you disabling it to get more life out of your card. They want your card to become a paperweight so you rush out and buy another.

The problem with that as I see it? Well, firstly they are going to get called out a lot, and secondly, in the past 3 years the world has changed drastically. People no longer shout "Shut up and take my money". Instead they are being far more careful, and as such? They are starting to watch videos all of the way through, as in the OP.


Edit. As for the game being fixed? I have no doubt it absolutely will be, in the same way they fixed BLOPS III using too much VRAM on my Fury X. They will either hide/nerf texture settings on low-end cards or downgrade the textures until they fit. As for the comments about people not buying the game because of this? That is nonsense. Who here didn't buy Crysis? Who here never said "Does it run Crysis?" No one and everyone, dude. We then spent YEARS chasing the technological dragon trying to get the bloody thing to run. Because as humans that is what we do. And Nvidia understand this all too well.
 

Cheers for the reply Andy,

Crysis was a serious graphical masterpiece compared to anything else around at the time, though. The Last of Us, I'm afraid, isn't. Nor was FC6, nor was Hogwarts. They are games that have stretched the boundaries of the current game engines, which are mostly soon to move on a generation. Could be.

It's just that the recommended spec is 8GB - that's not minimum spec; recommended means you really need 8GB to play it properly. 24GB doesn't do that currently. Post #21 shows there is near as makes no difference between high and ultra settings on higher-tier cards.

Saying NV has weaponised VRAM is a tad strong. I'd agree games ran out of VRAM, but really, in the grand scheme of things, the only ones that have are games tested upon release. Like you say, it will be fixed; I dunno how much the textures get lowered etc., but post #21 shows near as makes no difference in performance between high and ultra at 1080p, though that could be down to a CPU bottleneck. But I'd still expect much higher performance from a 4090 at 1080p. Plus, textures at 1080p saturating 8GB... I just don't think that's right. The game, compared to all others at 1080p, will really have to be a visual wow to demand that much, and it would be apparent at higher resolutions.

It's only AAA games in the last year or two that have been sent out in terrible states. Maybe VRAM is just the first thing to highlight this. Heck, I remember when BF4 had a memory leak and stuttered when you reached your card's limit. Crash - start again; this isn't much different from that. If people think that TLOU really is next-level gfx, even at 1080p, then I'm all ready for some screenshots. These will need to be saved for when the game is fixed, as if, like you say, the textures get downgraded (not optimised), then the difference in graphics should be easily apparent.

I just wouldn't advise people to upgrade anytime soon based on day-1 release performance, only because all games touted as such before have gone on to be fixed, look great, and run on cards that showed an issue on release. Giving game devs infinite amounts of VRAM to soak up poorly written games rushed out to meet shareholder demands will only make it more expensive for the consumer.

Really, games coming from the Unreal 5 engine and other next-gen engines will be the proper place for seeing how much VRAM game engines need.

As you say, seeing as most have <8GB, Naughty Dog won't be selling many copies, which isn't a great way to sell or market your game when you want MSRP for it. I too want this game.

This VRAM debacle will be much better judged when the games on Unreal engines start to release; of course they will run less than optimally at the beginning - just in time before many are drawn into next-gen gfx.

BF4: I saw loads of people buying i7s during the beta because it ran better with hyper-threading. On release it ran fine on the i5s they had all swapped out. The CPU got the blame back then.

Right, off to take the boys to the pub as they have 2 weeks off for Easter; ground's too wet anyway to do anything. Hopefully no drunken posts later :D apologies in advance.
 
The 4090 doesn't run it perfectly - it stutters like Arkwright and runs at <60% GPU utilisation from what I've seen. In fact, a 4090 struggled with 1440p.

Like you say, everyone needs a 24GB card - top advice.

I have seen someone mention that the 522-series NV drivers are giving better results.

Really this thread should be closed, as it's another VRAM merry-go-round.
Runs flawlessly on my 4090, getting between 80-115fps at 4K native ultra, or 190fps at 1440p native ultra, both at 100% GPU utilisation :)
 
7 years ago, when the £330 GTX 1070 came out with 8GB, was that just too much, too soon?

Yes.

Pascal was a very cheap card to make. As such? They wanted to bolt on more stuff so they could charge you more. That is what Nvidia do best: charging you for things you don't need, and removing things you do.

I bought a 2080Ti in March of 2020 for £1600. It was the Kingpin model. Normally I would not be so insane, but the money literally fell into my lap and I had to get rid of it. Very long story, not for here. However, I can still use it perfectly well. Was it too expensive? Hell yes. But at least I paid up to stop it becoming a paperweight.

My last two GPUs? A 6700XT Strix used for £350, and a 6800XT Strix LC from OCUK B-grade. It was a display model. I paid £729, which at the time was an absolute steal given the RRP of the card was around £1100. Do remember the context, though: that was June of last year. However, at no point have I felt bad about that in any way, shape or form, because it's a beautiful card, runs super cool and won't run out of VRAM any time soon. My next GPU buy will hopefully be an Arc. Well, the next lot, if they don't pull the plug, which seriously worries me.
 
Runs flawlessly on my 4090, getting between 80-115fps at 4K native ultra, or 190fps at 1440p native ultra, both at 100% GPU utilisation :)
See, for me the game stutters/judders despite the fps staying high, even when using a controller.
When using a mouse it's constant judder though.
The weird thing is that the frametime graph doesn't show it anywhere near as much as what my eyes can see.
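That lines up with how averages hide pacing problems: a run of fast frames with occasional long ones can keep the fps counter high while still juddering visibly. A toy sketch with made-up frame times, purely to illustrate:

```python
# Why high fps can still judder: compare average fps with
# frame-to-frame pacing. The frame times below are invented to
# illustrate the effect, not captured from the game.

frame_times_ms = [7, 7, 7, 30, 7, 7, 7, 30, 7, 7]  # hypothetical capture

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# Pacing: how much each frame's delivery time differs from its
# neighbour. The eye picks up these swings even when the average
# (and a smoothed frametime graph) looks healthy.
swings = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

print(f"average fps: {avg_fps:.0f}")                     # ~86 fps
print(f"worst frame-to-frame swing: {max(swings)} ms")   # 23 ms hitch
```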
 
Crysis was a serious graphical masterpiece compared to anything else around at the time, though. The Last of Us, I'm afraid, isn't.

In fairness, at ultra settings this game is a masterpiece. As a game it is one of the best I have ever played, and I don't say that lightly.
 
Since better textures are about the easiest way to make a game look better, I never buy the "that card is too weak to need more than X VRAM" argument.

Not only do Nvidia get to decide what each SKU should have, they have also prevented AIBs from making double-VRAM cards, so planned obsolescence and market segmentation is the most probable answer.

Problem is, these threads tend to get closed by partisan denial posters - almost like it happens on purpose.

This hits the nail on the head. IDK if you realised it, but the first line - "better textures are about the easiest way to make a game look better" - is absolutely spot on in this case. This game is years old. They have not changed the game mechanics AT ALL; all they have done is exactly what you describe.

And it is one of those games? Where that is all it needs. Everyone needs to play this, even if they never had a PlayStation, because its story and execution are about as good as it ever gets. The fact I rate this game in the same category as Half-Life, Mario 64 and literally a couple of others? Says it all. "Game-changing, definitive" is a superlative I could use. And sure, some of it hasn't aged terribly well, but then neither has HL or Mario 64. That doesn't stop them from being genre-defining legends of code.

As for the last line? Again, that sums it up perfectly. As long as you create a reason for doubt, people will doubt it. Until you can categorically prove it, like the stutter in Crossfire? There will be ten 8GB GPU owners there to argue with the one. And they know that too.

Jen is nothing but a glorified salesman now. A genius one at that, but that is his role now, more than ever. And salesmen are not the nicest people you will ever meet, especially American ones. Capitalism means cutting proverbial throats. Be it the competition's? Or yours? They don't care.

To say that this isn't all 100% deliberate is just mad. You can't leave a genius in doubt. Ever. He knows exactly what he is doing.
 
Lol what? Are you suggesting that only textures were upgraded from the original game?
This is a proper remake, not some remaster.
Literally everything was redone except for the storyline.
 
The issue isn't just at 4K. At 1080p, the 1% lows for all the 8GB cards are poor from all vendors using the ultra quality preset; however, dropping to high seems to make this playable for the most part.

RTX 3060 12GB: 39 min FPS
A770 16GB: 40 min FPS
RX 6700XT 12GB: 50 min FPS

RTX 3060Ti, RTX 3070, RTX 3070Ti 8GB: >16 min FPS at 1080p

At least the RTX 3080 can still manage 1080p, for now... :cry:
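For anyone wondering how a "1% low" figure like the minimums above is usually derived: take the slowest 1% of frames in a capture and average them. The exact methodology varies by reviewer, so the sketch below (with an invented capture) is only illustrative.

```python
# Illustrative "1% low" calculation: average fps over the slowest 1%
# of captured frames. Capture data below is hypothetical.

def one_percent_low(frame_times_ms):
    """Average fps over the slowest 1% of captured frames."""
    worst = sorted(frame_times_ms, reverse=True)  # slowest first
    n = max(1, len(worst) // 100)                 # top 1% by count
    avg_worst_ms = sum(worst[:n]) / n
    return 1000 / avg_worst_ms

# Hypothetical capture: mostly 10 ms frames with occasional 40 ms spikes,
# e.g. from VRAM thrashing.
capture = [10] * 990 + [40] * 10
print(f"average fps: {1000 / (sum(capture) / len(capture)):.0f}")  # ~97
print(f"1% low fps:  {one_percent_low(capture):.0f}")              # 25
```

This is why a card can post a healthy average while its 1% lows collapse: the spikes barely move the mean, but they dominate the worst 1%.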
 