NVIDIA ‘Ampere’ 8nm Graphics Cards

Whether or not you think the new consoles are crap, it is indisputable that they are a massive jump from the PS4/Xbox One. Or do you want to dispute that?

PC GPUs have been using 8-11 GB to play last gen's "console ports", and we've been maxing them out in several titles, it would seem. Especially with mods.

You think the new gen of games targeting the new consoles will be less demanding?

I didn't say crap. They're going to be middle of the road GPU-wise when they launch in the holiday period; the best PC GPU looks to be somewhere in the neighborhood of twice as fast as what they're using at rasterization, and way faster at ray tracing. They will be a massive jump from the prior generation of consoles, but I don't see that fact as particularly relevant to what I'm saying.

To sum up, what I was trying to convey is that developers frequently make multi-platform games, they frequently target the lowest common denominator, which is the consoles, and the PC just gets more or less a clone of that, meaning the demand from the majority of PC games over the next 5-6 years is fairly predictable. It was different back in the day, before multi-platform titles: PC games showed a much more granular increase in visuals, as they tended to conform to a much faster graphics card cycle of every 12-18 months rather than 5-6 years. You just don't see that as much these days. We don't have to guess: we know the PS5 specs, we know how much memory it will have, and realistically its games won't target more than 8-10 GB of VRAM's worth of assets.

Usage above 8 GB right now is very rare and tends to be an extreme outlier. I know people have said "well, FS2020 uses 12 GB" and all this stuff, but Nvidia don't care about a few exceptions that stand out. OK, so you have 300 Skyrim mods and you need 11 GB of VRAM; that's an exception, it's not the norm. Putting 2x more VRAM onto a card than you need, to satisfy some vanishingly small number of customers, is not a good sales strategy for them. It's a trade-off: VRAM costs money, which makes products more expensive, which hurts sales. If VRAM were free or nearly free, I'd agree, just slap more on the card to cover the exceptions, but it's not.
 
I still remember the 8800 GT 256 MB, and how it was eventually beaten by slower GPUs such as the 9600 GT, which had more VRAM.
I think the other point is that having more VRAM gives the game/engine more options.

You don't need a blisteringly fast NVMe drive à la the consoles to stream from, if the game has the option of simply caching more of its assets.
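The caching point above can be sketched with a toy Python model (all the millisecond costs, asset counts, and cache sizes here are made-up illustrative numbers, not measurements of any real game or drive):

```python
import random

DISK_MS, VRAM_MS = 50.0, 0.1  # assumed costs: streaming from disk vs. a VRAM cache hit

def frame_load_time(needed_assets, cache):
    """Total time spent fetching this frame's assets: misses hit slow storage."""
    return sum(VRAM_MS if a in cache else DISK_MS for a in needed_assets)

random.seed(1)
assets = list(range(100))           # hypothetical asset IDs
frame = random.sample(assets, 20)   # assets the current frame happens to need

small_cache = set(assets[:40])      # smaller card: only 40% of assets resident
big_cache = set(assets)             # more VRAM: everything already cached

print(frame_load_time(frame, small_cache))  # misses stall on the slow medium
print(frame_load_time(frame, big_cache))    # all hits: near-zero stall
```

With the bigger cache, every fetch is a hit, so a slow drive never enters the frame loop at all, which is the argument being made.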

Whichever way people want to spin it (to support nVidia), cheaping out on VRAM is a bad thing.

e: I think people are also going to remember the times nVidia has cheaped out before, and how it bit them (the customer, that is) in the ass sooner rather than later. "Fool me once," and all that jazz.
 
Yeah, it was a gimped card against the 320 MB and 640 MB 8800 GTS cards.
I think the other point is that having more VRAM gives the game/engine more options.

You don't need a blisteringly fast NVMe drive à la the consoles to stream from, if the game has the option of simply caching more of its assets.

Whichever way people want to spin it (to support nVidia), cheaping out on VRAM is a bad thing.

The GTX 1060 3GB is another one. I tend not to like skimping on VRAM too much, especially since one of the great things about playing on a PC is modding games, and this can use up more VRAM too.
 
I didn't say crap. They're going to be middle of the road GPU-wise when they launch in the holiday period; the best PC GPU looks to be somewhere in the neighborhood of twice as fast as what they're using at rasterization, and way faster at ray tracing. They will be a massive jump from the prior generation of consoles, but I don't see that fact as particularly relevant to what I'm saying. To sum up, what I was trying to convey is that developers frequently make multi-platform games, they frequently target the lowest common denominator, which is the consoles, and the PC just gets more or less a clone of that, meaning the demand from the majority of PC games over the next 5-6 years is fairly predictable.
I don't get your argument at all.

You say the target is the lowest common denominator which is the consoles.

You agree that the consoles are just about to take a massive jump forwards.

This means the lowest common denominator is about to take a massive jump forwards. The bar is about to be raised.

Yet you say this is not relevant to PC gaming? That 8 GB is as good going forwards (with the raised bar) as it was for the last 5-6 years (the last console generation).

I can't help but see this as a massive contradiction.
 
You'd better let Mark Cerny know pronto that they've ballsed up the design. They only went and put too much shared memory in it; they should have stuck with 8 GB! :p :p

This reads an awful lot like nV shilling, btw. Especially the bit where you say more memory is bad because it will reduce frame rates :p :p

I'll bet that if AMD ships more VRAM, people will not only benefit from it, they'll probably be irritated by Jensen's cheapness.

Nothing I've said is specific to Nvidia; it's about the relationship between VRAM and GPU, and how much VRAM is appropriate for any given amount of GPU horsepower. Realistically it tops out, and people can cite extreme examples like FS2020 at 4K ultra, but that's not even playable, which is kind of the point. Accusations of shilling? That's all in your head, I'm afraid.

Nowhere did I say that more memory will reduce frame rates; that's a really horrible straw man of what I actually wrote. I said that as you fill memory with more and more game assets, the GPU has to do more work, and so the frame rate drops because it takes longer to render each frame. There's a relationship between VRAM, the GPU, and performance. You can't dump 2x the assets into your game and expect the same frame rate. You don't just put as much VRAM onto a video card as you can possibly cram on there; you pick an amount that is appropriate for the GPU. FS2020 is actually a perfect example. People are saying "ah-hah, but you can get 12 GB of usage out of FS2020". Yeah, but not at a playable frame rate. No one is going to do that; they're going to use in-game settings the GPU can run at a playable frame rate, which means dropping settings, which means less VRAM is needed.

VRAM is not the limiting factor in video card evolution; the GPU is. It's easy to put more memory onto cards. We could put 5x more memory onto a video card today if we really wanted to, but we can't put a GPU on there that's 5x faster; the limitation is the evolution of the GPU. So we simply put enough VRAM on the card to service the GPU's needs. The only reason you'd ever put anything into VRAM is so the GPU can perform a calculation based on it, and the GPU can only run so fast, so there's a ceiling on the useful amount of VRAM to put on a card.
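That ceiling argument can be put into rough numbers. A toy sketch of the reasoning (the per-gigabyte shading cost and the overdraw factor below are pure assumptions for illustration, not real GPU figures):

```python
# Toy model: if the GPU can only shade so many gigabytes of unique assets per
# frame at a target frame rate, there's a ceiling on how much resident VRAM
# it can actually exploit. All constants are illustrative assumptions.

TARGET_FPS = 60
FRAME_BUDGET_MS = 1000 / TARGET_FPS   # ~16.7 ms available per frame
SHADE_MS_PER_GB = 4.0                 # assumed cost to render 1 GB of unique assets
WORKING_SET_FACTOR = 3                # assets kept resident but not drawn every frame

renderable_gb = FRAME_BUDGET_MS / SHADE_MS_PER_GB   # GB the GPU can touch per frame
useful_vram_gb = renderable_gb * WORKING_SET_FACTOR # plus nearby/soon-needed assets

print(round(renderable_gb, 2), round(useful_vram_gb, 1))  # 4.17 12.5
```

Under these made-up constants the useful ceiling lands around 12.5 GB; the point of the sketch is only that the ceiling scales with GPU speed, not with how much memory you could physically solder on.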
 
I don't get your argument at all.

You say the target is the lowest common denominator which is the consoles.

You agree that the consoles are just about to take a massive jump forwards.

This means the lowest common denominator is about to take a massive jump forwards. The bar is about to be raised.

Yet you say this is not relevant to PC gaming? That 8 GB is as good going forwards (with the raised bar) as it was for the last 5-6 years (the last console generation).

I can't help but see this as a massive contradiction.

No, I'm saying that it IS relevant to PC gaming, deeply relevant, which is why I brought it up. You're thinking of this in terms of "leaps" and in some strange relative way to last gen, which is getting you confused. It's better to just look at the specs of the next-gen consoles; we actually know what the specs are. Look at how much memory the console GPU will realistically have access to (at most 8-10 GB) and understand that devs will target this amount (or lower) when developing their games, because generally speaking they're going to design their game for the lowest common denominator. Not ALL devs do that, but most do.

No, I'm saying that 10 GB will be enough going forwards; everything suggests to me that Nvidia have picked an amount of VRAM at 10 GB (assuming that rumour is true) that is appropriate for the vast majority of circumstances. People will find exceptions to that, but they're just that, exceptions, things that Nvidia generally speaking don't care about. To make the argument "why not just put more VRAM on there?" you have to acknowledge the downside of what you're asking for, which is increased cost. If you look at just the pros and not the cons, then obviously that argument would make sense. What I'm saying is there are pros and cons, more VRAM = more expense, and there's no clear win that satisfies everyone, so you aim for the best trade-off, which is enough VRAM for all general cases and the exceptions be damned.
 
I said that as you fill memory with more and more game assets, the GPU has to do more work, and so the frame rate drops because it takes longer to render each frame.
Which isn't necessarily true, is it?

If you cache assets, you don't need to use them any more than if you loaded everything "just in time". It just means they're available when needed, without delay and without needing to be streamed from a much slower medium.
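That distinction, resident versus actually drawn, can be shown with a minimal sketch (the cost constants are assumed, illustrative values):

```python
def frame_time_ms(drawn_gb, base_ms=5.0, shade_ms_per_gb=4.0):
    """Toy model: render cost scales with what the GPU actually draws
    this frame, not with how much merely sits resident in VRAM."""
    return base_ms + shade_ms_per_gb * drawn_gb

# Identical scene, 2 GB of assets actually drawn, under two cache sizes.
time_small_cache = frame_time_ms(2.0)  # e.g. only 4 GB resident in VRAM
time_big_cache = frame_time_ms(2.0)    # e.g. 12 GB resident: extras just wait

print(time_small_cache, time_big_cache)  # 13.0 13.0 -> same frame time
```

The residency figures deliberately don't appear in the formula: in this model, assets sitting idle in VRAM cost nothing per frame, which is the counter-argument being made here.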
 
No, I'm saying that 10 GB will be enough going forwards; everything suggests to me that Nvidia have picked an amount of VRAM at 10 GB (assuming that rumour is true) that is appropriate for the vast majority of circumstances. People will find exceptions to that, but they're just that, exceptions, things that Nvidia generally speaking don't care about.
nVidia did what they have often done: picked an amount which will (just about, hopefully) suffice until the next gen of cards, but which by then will be starting to become inadequate.

If you buy a 6GB or 8GB card (or probably even 10GB) in 2020, basically expect to replace it when the 4000 series rolls along.

If you wanted a card to last a few years, yeah, joke's on you.
 
I don't understand why people are obsessed with GPUs they can't afford and get so upset about it. I'm more interested in the new entry level cards. Will they have RTX? Or will they be called GTX 2650 and GTX 2660? Will 1440p finally be the new budget resolution this gen?

Probably because people dream of owning a Ferrari while all they can afford is a Fiat Panda?
 
Cheers PrincessFrosty, surprised it took so long for someone to reiterate what I said a few pages ago. I know some on here are extremely knowledgeable, but some are in the bigger-equals-better camp.

The core will generally let you down first if you really load up a game with tonnes of assets in VRAM. Just having more VRAM is not a get-out-of-jail-free card.

You also forget that a colossal multinational like Nvidia chose this amount. If they don't know what they are doing, then who else do you propose does?
 
No, I'm saying that it IS relevant to PC gaming, deeply relevant, which is why I brought it up. You're thinking of this in terms of "leaps" and in some strange relative way to last gen, which is getting you confused. It's better to just look at the specs of the next-gen consoles; we actually know what the specs are. Look at how much memory the console GPU will realistically have access to (at most 8-10 GB) and understand that devs will target this amount (or lower) when developing their games, because generally speaking they're going to design their game for the lowest common denominator. Not ALL devs do that, but most do.
During the last console gen, PC had 8-11 GB and people saw their VRAM get used up by these "console ports", even though the consoles had way less in terms of GPU grunt and memory/VRAM.

During the next console gen we're sticking with 8 GB on PC, even though the consoles are now upgraded...

What I'm saying is that the PC probably needs more than to just barely match console spec. Especially if you want to mod your games.

We've had 8 GB on PC since before 2015 and now things are moving on. The consoles are moving on.

If nVidia refuses to move on, there should be nobody justifying it. They're only extracting every last $$$ and giving you the bare minimum in return.
 
Cheers PrincessFrosty, surprised it took so long for someone to reiterate what I said a few pages ago. I know some on here are extremely knowledgeable, but some are in the bigger-equals-better camp.
Well then history is going to repeat itself.

It's not like we haven't already given several examples of nVidia cards of the past that have been gimped through lack of VRAM.

But hey, it's your money. If you want to pay more for less, I can't argue with that.
 
I don't want to keep a GPU for 3-4 years; I think the longest-serving was my GTX 1080, at 2.5 years or so.

Like I've experienced before and expect to again, the amount of VRAM will not limit me first; core grunt/speed will.
 
Do the new consoles have any dedicated system RAM? Or are they going to have to use a portion of the 16 GB of GDDR as system RAM?

If they have to use it as system RAM as well, then it might not leave much more than 8 GB as VRAM.
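The split can be roughed out with simple arithmetic; only the 16 GB total comes from the thread, and the OS reserve and game-side figures below are assumptions for illustration:

```python
# Rough budget for a unified-memory console. The 16 GB total is the known
# spec; the OS reservation and CPU-side share are assumed example values.
total_gb = 16.0
os_reserve_gb = 2.5       # assumed OS/system reservation
game_cpu_side_gb = 4.0    # assumed game code, logic, audio: the "system RAM" role

vram_like_gb = total_gb - os_reserve_gb - game_cpu_side_gb
print(vram_like_gb)       # 9.5 -> roughly the 8-10 GB figure discussed above
```

Varying the two assumed shares is what produces the 8-10 GB range people keep quoting for GPU-usable memory.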


I don't understand why people are obsessed with GPUs they can't afford and get so upset about it. I'm more interested in the new entry level cards. Will they have RTX? Or will they be called GTX 2650 and GTX 2660? Will 1440p finally be the new budget resolution this gen?

It's got nothing to do with being able to afford it or not; it's got more to do with perceived worth.

Having a card that is 40-50% faster than its predecessor at roughly the same price = worth it. This is how it's been for a very long time.

Having a card that is only 30% faster than its predecessor with a price increase of 20-60% = not worth it, even if I can afford the new price.
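That value judgement is easy to put into a quick formula (a hypothetical perf-per-price ratio, not any official metric):

```python
def value_ratio(perf_gain_pct, price_increase_pct):
    """Performance-per-price relative to the old card: >1.0 means better value."""
    return (1 + perf_gain_pct / 100) / (1 + price_increase_pct / 100)

print(round(value_ratio(45, 0), 2))   # 1.45 -> 45% faster at the same price
print(round(value_ratio(30, 40), 2))  # 0.93 -> 30% faster but 40% pricier
```

Anything below 1.0 means you're paying proportionally more per frame than last generation, which is exactly the "not worth it, even if I can afford it" case above.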
 
Nowhere did I say that more memory will reduce frame rates; that's a really horrible straw man of what I actually wrote. I said that as you fill memory with more and more game assets, the GPU has to do more work, and so the frame rate drops because it takes longer to render each frame. There's a relationship between VRAM, the GPU, and performance. You can't dump 2x the assets into your game and expect the same frame rate.
Running out of VRAM will cause a more significant reduction in frame rates (and potential hitching issues) than increasing the amount of work the GPU has to do due to an increase in asset quality. Your reasoning also seems to be based on the idea that as soon as asset quality pushes usage past 8 GB of VRAM, render times will increase to the point where the frame rate is unplayable on current GPUs.

No, I'm saying that it IS relevant to PC gaming, deeply relevant, which is why I brought it up. You're thinking of this in terms of "leaps" and in some strange relative way to last gen, which is getting you confused. It's better to just look at the specs of the next-gen consoles; we actually know what the specs are. Look at how much memory the console GPU will realistically have access to (at most 8-10 GB) and understand that devs will target this amount (or lower) when developing their games, because generally speaking they're going to design their game for the lowest common denominator. Not ALL devs do that, but most do.

The current generation of consoles realistically had 4-6 GB available to the GPU (8 GB total), yet we have people reporting around 8 GB used in current AAA titles. We are about to double how much RAM is available to the console GPU, as well as add asset streaming direct from their SSDs. Do you honestly think this won't increase VRAM usage for PC games?
 