NVIDIA ‘Ampere’ 8nm Graphics Cards

The console specs are not directly comparable to PC specs. Differences in APIs, resolution, lack of modding, plus techniques/cheats like checkerboarding will effectively reduce the amount of grunt and memory needed by the consoles.

In other words you need more on PC to run these "ports" properly, as you've come to expect on your 1440p/4k screen, with ultra settings :p

I think it stands to reason that if the consoles have had a major upgrade, continuing to use the same amount of VRAM on PC GPUs that we've had since 2015 is at best a gamble, and at worst planned obsolescence from nVidia.

That's just it though: most gamers aren't using cards with more than 8GB of VRAM. The Steam hardware survey has a breakdown by VRAM: 21.47% of people have 8GB, 4% have 11GB, and a whopping 74.44% have 6GB or less. https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
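For what it's worth, here's a rough sketch of that tally using the survey percentages quoted above; the bucket labels are just how I've grouped them, and the numbers will drift from month to month:

```python
# Rough tally of the Steam Hardware Survey VRAM buckets quoted above.
# These percentages are a snapshot and will drift from month to month.
vram_share = {
    "6GB or less": 74.44,
    "8GB": 21.47,
    "11GB or more": 4.00,
}

more_than_8gb = vram_share["11GB or more"]
up_to_8gb = vram_share["6GB or less"] + vram_share["8GB"]

print(f"More than 8GB of VRAM: ~{more_than_8gb:.1f}% of surveyed users")
print(f"8GB or less:           ~{up_to_8gb:.1f}% of surveyed users")
```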

My own anecdotal experience, across a huge sample of games at 8GB, is that I've never run into a VRAM limit and rarely even come close; to do so would be an extreme example, and that's while I run modded-to-hell games like GTA V and Skyrim/Fallout with insane texture pack overhauls. Running into a VRAM limit before running into unplayable FPS because the GPU is choking on all those assets is a very rare and specific thing, and the lack of examples so far in this thread is, I think, a good testament to that.

The PS5 will have something a lot like a 1080 in it, and its VRAM usage is not likely to exceed 8GB for the same reasons I've mentioned. 10GB for the high-end Nvidia cards is going to be perfectly fine for almost all gamers in almost all circumstances.

Not sure if you've been following this launch, but costs are already going up :p That's mostly to do with nV's reported 60% profit margin tho :p

So now, as well as an £800 GPU and a new 1kW PSU, we'll also need a brand new best-in-class NVMe drive, plus 32GB of fast system RAM, so nV can cache the video assets on other parts of your PC and avoid having to eat into their profits by putting a decent amount of VRAM on their cards.

Great for them, not so great for us :p

I think most of the increase in costs is because, to maintain the large performance increases gamers are used to in a world where shrinking transistors is getting ever more difficult, the additional horsepower has to come from larger die sizes, which means higher costs. Plus probably a bit of gouging, because they're pretty much the industry leader in that performance bracket at this point and there's not much in the way of an alternative.

You won't need any of that extra stuff; games aren't bottlenecked by those things. Trust me, I've tried to find circumstances where RAID 0 across two fast NVMe drives actually provides a benefit in games, and it's practically non-existent. The best I can find is that Subnautica doesn't lag for a few seconds when streaming in new biomes, and that's mostly down to how badly optimised the game is more than anything. All those lovely prefetching/streaming/caching techniques are ubiquitous today and have been for ages; this isn't new.
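If anyone wants to test this themselves, a crude sketch of the kind of comparison I mean is below. The file paths are hypothetical, and it only measures raw sequential read throughput (use a file bigger than your RAM, or a cold cache, or the OS cache will skew it), not the decompression and driver work a game does on top:

```python
import time

def time_read(path, chunk_mb=16):
    """Stream a large file from disk and report size and elapsed time."""
    chunk = chunk_mb * 1024 * 1024
    total_bytes = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            total_bytes += len(data)
    elapsed = time.perf_counter() - start
    return total_bytes / (1024 ** 3), elapsed

# Hypothetical test files, one per volume (single NVMe vs the RAID 0 pair).
# Use a file larger than system RAM, or the OS cache will hide the difference.
for label, path in [("single NVMe", "D:/bench/big_asset_pack.bin"),
                    ("RAID 0 pair", "E:/bench/big_asset_pack.bin")]:
    gib, secs = time_read(path)
    print(f"{label}: {gib:.1f} GiB in {secs:.1f}s ({gib / secs:.2f} GiB/s)")
```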
 
Well, Moore's law (really an observation) is basically starting to fail, and it will fail completely in the long run, and by 'long run' I mean not that far from now. Most of the doubling in speed for the same cost that we've enjoyed in the past has come from shrinking the transistors per unit area. Whereas, while that's still happening, most of the performance increase now comes from larger die sizes and more aggressive clocking. I think that's something we'll all eventually have to come to terms with: increasingly we won't be able to expect big leaps at the same cost. Those days are getting closer and closer to being over.

In all honesty, I'm just a consumer. I have one set criterion for making purchases: a 50% increase in performance over whatever I currently have, for around the same price. I have no real interest in how GPUs work, or in how the companies manage to get that performance. They are just tools and a means to an end. If it takes longer to meet my criterion, then so be it, as it will save me money, so I don't mind!

I don't really class it as stingy; that sort of implies Nvidia makes off with more cash or something like that. If they put more VRAM on the cards they wouldn't just eat that cost for us like a champ, they'd pass it on to the consumer with higher video card prices. Their interests are aligned with their customers': they're not going to dump unnecessary amounts of VRAM onto a card if they don't think it'll be used. And as I said, based on how slow the GPUs in the next-gen consoles are (about the speed of a high-ish end card from two generations ago), with 8GB that is mostly unused, I think games needing 10GB will be the exceptions; 8GB, I think, is a realistic maximum. Devs who load that bad boy up with 10GB of high-end assets are going to find they have 10fps.

The GPU in the new Xbox is apparently on par with an RTX 2080, so fairly powerful as console GPUs go.
 

I'd have to agree somewhat. I don't think I've ever seen anyone max out VRAM usage and still have a playable experience at the end of it. Back when I had 980 SLI, many of those games were benchmarked and tested, and hold up even today. I never ran out, and those cards had 4GB. I came close, but still had a great 4K 60Hz experience.

Most examples of high VRAM usage come with unplayable framerates.

However, I'd still have to say that 8GB on a 2080 Ti-class card is a joke going forward into what could be two years or more of next-gen games, where these requirements are only going to increase.
 
What's confusing to me is the consumers that actually defend this and want nVidia to squeeze them for every last penny! "Please Sir, can I have less?"

It's confusing because you think it's true when it's not. Nvidia, while a market leader, aren't a monopoly; silly decisions like that would open them up to losing market share to AMD. This all comes from the faulty assumption that you can keep loading up the same card with more and more assets from newer games and somehow that doesn't impact your frame rate. The cards are going to ship with an amount of VRAM sufficient not to bottleneck the GPU; the only reason you'd need more VRAM in future is if you magically had a faster GPU.

This line of argument basically says: new games will use more VRAM, therefore we'll run out and need to buy a new card. But, in the main, that's wrong. New games will demand more VRAM, which puts more demand on the GPU, which the GPU won't be able to deliver, so you'll have to drop some settings to maintain a playable frame rate; and when you do that, the VRAM requirements will drop.
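As a toy illustration of how settings and VRAM move together: even just the render targets scale with resolution, before you touch textures at all. The numbers below are a simplified back-of-envelope model (an assumed six targets at 4 bytes per pixel), not any particular engine:

```python
# Back-of-envelope: VRAM used by a handful of render targets at different
# resolutions, assuming six targets at 4 bytes per pixel (a simplification).
def render_target_mb(width, height, num_targets=6, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * num_targets / (1024 ** 2)

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB just in render targets")
```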
 

Reverse-extrapolating your points, the new console GPUs will only be able to use so much VRAM before they run out of grunt.

We know in flops where the console GPUs sit (roughly) and how much VRAM an equivalent PC graphics card currently ships with.
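For reference, the FP32 figure is just shader count x 2 ops per clock x clock speed, so you can sanity-check the console numbers yourself; the CU counts and clocks below are the widely reported ones, so treat them as assumptions:

```python
# FP32 TFLOPS = compute units x shaders per CU x 2 ops/clock (FMA) x clock (GHz) / 1000.
def tflops(compute_units, shaders_per_cu, clock_ghz):
    return compute_units * shaders_per_cu * 2 * clock_ghz / 1000

# Widely reported next-gen console figures (assumed here, not confirmed).
print(f"Xbox Series X: ~{tflops(52, 64, 1.825):.1f} TFLOPS")
print(f"PS5:           ~{tflops(36, 64, 2.23):.2f} TFLOPS")
```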
 
Well, you've rejected people's experiences as outliers, and you seem to have rejected previous occurrences of this same issue (i.e. the 780 Ti 3GB when the new consoles had 8GB), which has been discussed previously.

nVidia have done this before. Yes, it's not new :p

And going forward there will be many more of these outliers: games using 8+ GB of VRAM, especially when modding, etc.

For those buying a new card each gen it's a moot point, maybe. For the rest of us, knowing there's an impending bottleneck around the corner is concerning.

Also, the Steam stats are not a great indication of anything, btw. They include all the laptops running Plants vs Zombies or <insert 2D indie game here>. I wouldn't be surprised if iGPUs are well represented in that survey, too.

The average Steam Survey hardware spec is not 100% relevant to people paying £600+ for a dGPU to play demanding 3D games.

e: Please explain to people using a 2080 Ti at the moment and hitting 10+ GB of VRAM usage how they need to drop their settings because their card can't handle the game at the settings they're currently using :p They seem to be happy enough with those settings, but clearly it's too demanding, because the VRAM usage dictates that it's too demanding (as per your argument).
 
Nvidia being the market leader didn't stop the 8800GT 256MB running out of VRAM. In fact, skimping on VRAM is excellent, as it forces you to upgrade sooner, which works well for margins.

But you know what, let the people who want low VRAM get those GPUs, and then spend more money long term. The rest of us will make sure we have enough VRAM and as a result can mod, and turn up features, with less of a performance hit.
 

Well the market leader has nothing to prove. So if they can make more money by gimping the specs, they will.

When 3DFX was king and had the mindshare, Nvidia had to prove themselves, so they innovated and had to create products that were superior.

Now Nvidia can release products with limited RAM yet people will still preorder in desperation. Mindshare.
 

That's fine. I mean, in the terms you care about, you're going to have to wait longer and longer before you get that increase for the same price. You have that expectation because it's been established by years of it being something we could reasonably expect, so we're used to it. But the mechanisms that allowed that to happen physically cannot continue into the future, so while you might not care about those mechanisms (and that's completely fine), your expectations will have to change, as will everyone's. For most of us that's a real bummer, but we're just on the very tip of starting to feel it now, and it's only going to get worse.

The rumored specs are an AMD RDNA 2 GPU at about 12 TFLOPS, which is a hair over a 1080 Ti; it's faster than the PS5, but barely. Even if it were as fast as a 2080, that's still an 8GB card.


Yep. Honestly, none of what I'm saying about the relationship between VRAM, GPU and performance is particularly controversial; it's just that people have got into a weird mindset about how they think about these things. People have kind of scrambled to find examples that disprove the norm, like FS2020 using 12+GB of VRAM, but at totally unplayable settings, which if anything just demonstrates what I'm saying. An RTX Titan barely breaks 25fps at 16.3 TFLOPS of FP32 compute, versus a rumoured 20 TFLOPS for the 3090; sorry, but that's not going to put your 4K ultra FS2020 into 60fps territory. Maybe a 30fps 99% minimum, still essentially unplayable. You're going to need an RTX 4090 before something like FS2020 at 4K ultra becomes even remotely playable, at maybe something like a dodgy 45fps, at which point we can justify 12-16GB of VRAM.
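Putting rough numbers on that, with naive linear scaling against FP32 compute (which is generous, and the 3090 figure is still a rumour):

```python
# Naive scaling: assume frame rate scales linearly with FP32 compute.
# Optimistic, since bandwidth, CPU and engine limits all get in the way.
rtx_titan_fps = 25.0          # the FS2020 4K ultra figure quoted above
rtx_titan_tflops = 16.3
rumoured_3090_tflops = 20.0   # rumour, not a confirmed spec

estimated_fps = rtx_titan_fps * (rumoured_3090_tflops / rtx_titan_tflops)
print(f"Estimated FS2020 4K ultra on a 3090: ~{estimated_fps:.0f} fps")
# ~31 fps -- still nowhere near 60, which is the point being made.
```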
 

This was way after 3DFX, and was during the ATI era. I get what you are saying, but people are now using Stockholm syndrome to say low VRAM has no effect. I remember people making market-leader statements back then, and I argued with them. I told people to buy the 8800GT 512MB, or any of the 512MB cards such as the 9600GT or HD3870. In the end the 8800GT 256MB's performance started to fall off a cliff. The 8800GT 512MB, OTOH, stayed relevant for a very long time.

One of the reasons having decent VRAM quantities is useful is the PC's ability to use higher quality textures, and things like modding, which are more PC-centric. 8GB is probably the minimum I would go for nowadays if I wanted any sort of reasonable lifespan. Even with my GTX1080 8GB, some games are starting to hit the 8GB limit.

I remember characterising a problem with ROTR under DX12. In some areas it was the reason why Pascal DX12 performance could be a bit ropey in the game, i.e. the card could literally run out of VRAM.

That is another problem with the move towards DX12/Vulkan/RT: I can see VRAM usage starting to rise.
 
BTW, I hope Greta does not catch wind of Nvidia's "500W" Ampere, otherwise there might be some problems!! Vega barely escaped!! :p
That goddess can be appeased in winter, knowing the 500W beast will warm us rather than burning coal; in the summer you must promise to underclock it to 100W and save the planet.

I love Greta :=)
 

Yep, that's exactly what will happen, which is why the console designers gave them 16GB of shared memory: between the CPU and the GPU they won't exceed that. If anything, that's just another data point that proves my case, that designers are assigning enough memory to service their GPUs, but no more.
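As a rough sketch of how a 16GB shared pool gets carved up: the OS reserve and the GPU-optimal split below are the commonly reported Series X figures, so treat them as assumptions rather than anything confirmed:

```python
# Rough carve-up of a 16GB shared console memory pool.
# The OS reserve and GPU-optimal split are the commonly reported Series X
# figures, assumed here for illustration.
total_gb = 16.0
os_reserve_gb = 2.5        # reserved for the system/OS
gpu_optimal_gb = 10.0      # the portion on the faster bus, aimed at the GPU

game_budget_gb = total_gb - os_reserve_gb
print(f"Game-available memory: {game_budget_gb:.1f} GB")
print(f"Of which GPU-optimal:  {gpu_optimal_gb:.1f} GB "
      f"(the rest covers CPU/game system data)")
```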


I've not rejected that experience; I'm simply saying that those kinds of experiences are outliers and, as such, Nvidia isn't going to care about them, because there's no profit motive to care. The Steam stats are a fantastic overview of the big picture, which is what Nvidia cares about, and they demonstrate precisely what I'm saying: games aren't hammering VRAM like crazy, and most gamers are at around 6GB or less. The survey lists the specific cards, including Intel integrated GPUs, and I tallied those: it's 9.52% of the total. You can sort of infer that from the 512MB-1024MB VRAM band anyway, as iGPUs tend not to borrow more system memory than that. So that does affect the stats, but it's relatively minor.

The top 20 or so most popular GPUs in that survey are GTX/RTX 1xxx and 2xxx series cards; Steam isn't just laptops playing PvZ, it's one of the largest gaming platforms in existence, with a really decent sample size that shows a huge buy-in and the majority of gamers using mid-range GPUs or above from modern product lines. You'll have to come to terms with the fact that very high-end dGPUs are a small, elite community, that within that community an even smaller number of people ever run into VRAM limits, and that it's not in Nvidia's financial best interest to change a line of products and make them even more expensive to serve what is a group of outliers. If there really is sufficient demand for that stuff, I'm sure they (or some AIB) will meet it, but it's going to come at some huge additional cost, because making it worth it to redesign and modify manufacturing lines, plus the additional cost of the memory itself, will add a serious premium to cards that are already insanely expensive. Even among this community, which is oriented far more towards the high-end stuff, people are already bailing out based on price; what do you think an even bigger mark-up is going to do?
 
Nvidia is capping RAM because GDDR6X is so expensive, and while commodity memory prices in general may be falling, I'd bet there isn't a glut of super-fast GDDR6X anywhere to be found.
The Nvidia design forces 8/16 or 10/20; those increments would cost $150-$200 to buy from Micron, and once you start adding Nvidia margins that will add hundreds to the GPU price.
People would truly lose their minds if the 3080 20GB version cost $1200+, which I expect it would, for little measurable performance gain.
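Using those ballpark figures, the arithmetic looks something like this; both the memory cost range and the margin multiplier are assumptions:

```python
# Ballpark: what doubling the VRAM might add to the shelf price.
# Both inputs are assumptions: the $150-$200 BOM cost for the extra GDDR6X
# quoted above, and a rough 2x multiplier from BOM cost to retail price once
# Nvidia/AIB/retailer margins stack up.
extra_memory_bom_usd = (150, 200)   # cost of going 10GB -> 20GB
bom_to_retail = 2.0                 # assumed overall margin multiplier

low, high = (cost * bom_to_retail for cost in extra_memory_bom_usd)
print(f"Estimated retail price increase: ${low:.0f} - ${high:.0f}")
```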

I fully expect the 3090 24GB to be $2000, with a cheaper $1400 12GB version available later. They want to differentiate the 3090 from the 3080 Ti, which will come out with 11GB.
They have allegedly spent $150 on a cooler to stop the 3090 going nuclear.
I just don't see any likelihood that a 3090 with a fully functional die and 2.4x the super-expensive memory compared to the 3080 will cost 'just' $600 more.

When GDDR6X gets less expensive, you may see higher memory cards released, or you may even see a 3080 with 20GB of plain old GDDR6.

Until Nvidia sorts out a wider memory bus or a more flexible memory architecture, they just can't make the margins they want even at the latest crazy prices if they add in all that memory.
They'll let the AIBs take the 'gouging' hit.

Can't wait for the announcement, I'll be bringing popcorn.
 
Nvidia will have spent a huge amount on market research before designing the cards they have; it's not like they are clueless about what games are coming. They know more than we do about what those games will need to run.

All this speculation about VRAM is just playing with your bellybutton fluff until the Tuesday reveal.
They know what they are doing far better than any of us here.
 