NVIDIA ‘Ampere’ 8nm Graphics Cards

Just lol.

"Cheaper price". Hilarious. Cheaper than a Ferrari, I'll give you that. Or a private jet. Or a villa in Monaco.

No, cheaper than what the card would cost if they added more memory to it, obviously.

Also you are again adopting two positions at once: firstly that I "crippled" the spec above by not having a fast NVMe (etc.), then also that those things aren't necessary.

No, I didn't say that. I was very specific and careful with my words. I said that the system was crippled for non-vRAM-related reasons. Games have a minimum spec that covers all components of the computer, not just the vRAM or the GPU. They need a certain minimum CPU speed, a certain amount of disk space, and a certain amount of system RAM. If you put in less than you need of any of these components, it doesn't matter what your vRAM is: the game is going to be crippled. I'm not sure why this even needs saying. How long have you been gaming? This is super basic stuff.

I think I'll just leave you to your PR work, whilst waiting for AMD to deliver an alternative to nV's fleecing. We're not going to agree on this, and you've got your defensive position to hold re nVidia.

I've said it before and I'll say it again: I'm talking about the relationship between the speed of a GPU and the amount of vRAM that specific GPU needs. This has absolutely nothing to do with Nvidia; the same argument applies to all GPUs. If AMD release a GPU that's about the same speed as a 2080 and has 10GB of vRAM, this argument holds true for exactly the same reason. There's only one person in this conversation trying to make this about one side or the other, and that's you. And it's not that subtle: you keep talking about Nvidia fleecing people, and Jensen's leather jacket, and all this stuff. So I guess I shouldn't be all that surprised.

e: Also you pointed to the fast NVMe in the new consoles and said, "This is the way things are going," whilst again saying you don't need one in a PC.

Yes, I pointed out that IN FUTURE these techniques will be used more and more, because they're so effective at getting more bang for your buck, and that the future consoles that aren't out yet will have hardware specifications which allow developers to use the same effect/technique to a greater degree.

It's really confused messaging on your part.

No it's not, it's really clear. But when you echo back to me what you think my point is, it's nothing like what I actually said. So the interpretation of what you THINK I said that exists in your head is the problem, not what I'm actually saying. It's not contradictory at all to say both "we've been using this technique for decades" and "we're going to leverage it even more in future".

e2: You can also bet your ass that the gen after this (40xx) will have more VRAM, as an incentive to upgrade, and nobody will be saying, "Nah, you'll never need more than 8 GB".

See, this is why I know you don't understand what I've actually said: I NEVER said, and no one in this thread has ever said, that you'll never need more than 8GB. That is a BLATANT and obviously deliberate straw man of what I've said. I've said that for every given GPU there is a maximum amount of useful vRAM. That amount obviously differs per GPU, and as GPUs get faster, the amount of useful vRAM also goes up. So obviously future GPUs will need more vRAM.

Guys, is this a known troll or something? Am I just being trolled here?
 
Yes, I pointed out that IN FUTURE these techniques will be used more and more, because they're so effective at getting more bang for your buck, and that the future consoles that aren't out yet will have hardware specifications which allow developers to use the same effect/technique to a greater degree.
Ah OK, so 8 GB VRAM will be fine for last console gen's games. Anything a few years old is fine.

But what many of us are pointing out is that 8 GB has a very short shelf life GOING FORWARDS.

We've already said if you're going to upgrade every gen then don't worry about it.

For those people who want their £800 investment to last more than 1 year/18 months - DO WORRY ABOUT IT :p

Is this clear now?
 
The less VRAM you have, the fewer assets you can use in your scene (potentially affecting things like draw distance etc.).

Yes, but there's a cap on how many assets you can have in your scene, because GPUs aren't infinitely fast: the more assets in a scene, the more demand there is on the GPU to render it, and thus the lower the frame rate. When you add too many assets to your scene, your frame rate becomes unplayable.

The more you need assets that aren't in VRAM (you say "intelligent swapping" but it's really just a cache miss), the more you depend on the speed of the medium you're getting those assets from, and the speed of the link between the GPU and that medium. If they're in system RAM you're going to have a better time than if they're on the disk only.

NOPE.

This is not a cache miss. If you only fetched an asset once it was already missing from vRAM, you'd get bad stuttering as you loaded it from a slower disk; the whole point is to avoid ever doing that. GPUs now have unified memory, which allows the GPU to intelligently shuffle data between disk and vRAM based on what is needed and when. And even before unified memory, developers simply did this in their game engines manually: if they knew assets would be needed in the near future, they'd preload them into vRAM before they were needed, and assets no longer needed were dropped from vRAM.
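To make that concrete, here's a minimal sketch of the manual preloading technique described above. It's illustrative Python, not any real engine's API; Asset, VramStreamer and the tick/preload/evict names are all invented, and the "prediction" is simply handed in by the caller. Unified memory just moves this bookkeeping from the game engine into the driver/hardware.

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Asset:
    name: str
    size_mb: int


class VramStreamer:
    """Invented illustration of engine-side vRAM streaming, not a real API."""

    def __init__(self, capacity_mb: int):
        self.capacity_mb = capacity_mb
        self.resident: dict[str, Asset] = {}  # assets currently in vRAM

    def used_mb(self) -> int:
        return sum(a.size_mb for a in self.resident.values())

    def preload(self, asset: Asset) -> None:
        # Load an asset into vRAM *before* any frame needs it, so rendering
        # never stalls on a disk read (the stutter a naive on-demand fetch
        # would cause).
        if asset.name in self.resident:
            return
        if self.used_mb() + asset.size_mb > self.capacity_mb:
            raise MemoryError(f"not enough vRAM for {asset.name}")
        self.resident[asset.name] = asset  # real engines do an async upload here

    def evict(self, name: str) -> None:
        # Drop assets the upcoming scenes no longer reference.
        self.resident.pop(name, None)

    def tick(self, soon_needed: list[Asset], no_longer_needed: list[str]) -> None:
        # Called on region changes with the engine's predictions.
        for name in no_longer_needed:
            self.evict(name)
        for asset in soon_needed:
            self.preload(asset)


# As the player approaches a new area its assets are streamed in and the
# previous area's assets are dropped, keeping residency under the cap.
streamer = VramStreamer(capacity_mb=10_240)  # e.g. a 10GB card
streamer.tick([Asset("forest_textures", 2_048)], [])
streamer.tick([Asset("city_textures", 3_072)], ["forest_textures"])
print(streamer.used_mb())  # 3072
```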

With more VRAM you can have more complex scenes and better draw distance.

Yes, but IF, and ONLY IF, your GPU is fast enough to render all of those things at a playable frame rate.

There are two limitations here: how many assets you can put in a scene such that your GPU still provides a playable frame rate, and how many assets you can fit into vRAM. You're talking about vRAM, which matters, but you're ignoring the fact that the GPU has to actually process these assets in order to render them.

Software isn't magic that makes getting assets from a slower medium somehow as fast as having them already in VRAM. Not having enough VRAM is a limitation. I think everyone can see this.

I never said it wasn't a limitation; that is an extreme straw man of what I have been saying. Please stop misrepresenting my points like this. I never ever said that lack of vRAM can't be a limitation, I said that there's a MAXIMUM useful amount of vRAM any GPU can use. I'm just going to stop there and actually ask you: do you understand the difference between these two things? Because if not, this discussion isn't going to go anywhere.
 
Ah OK, so 8 GB VRAM will be fine for last console gen's games. Anything a few years old is fine.

But what many of us are pointing out is that 8 GB has a very short shelf life GOING FORWARDS.

We've already said if you're going to upgrade every gen then don't worry about it.

For those people who want their £800 investment to last more than 1 year/18 months - DO WORRY ABOUT IT :p

Is this clear now?

Which game engine do you develop for now, I forget? You seem to know an awful lot about what we need in the future.
 
It certainly has :D


Says the guy with two 2080 Tis :D
 
Which game engine do you develop for now, I forget? You seem to know an awful lot about what we need in the future.

Likewise, the same question to you. You can't disprove what he said either.

Or do you just run a quad core CPU currently? Run only 8GB of system RAM?

Use a 500W PSU? Have an HDD?

Using the logic in this thread, getting anything more than that is wasting money, as most games still get OK FPS with a quad-core CPU, 8GB of RAM, an HDD and a 500W PSU.

OTOH, a lot of people here are buying systems with 6-12 cores, 32GB of RAM, NVMe SSDs (SATA is old hat now), over-500W PSUs, etc. to "future-proof" their systems.

The moment people say more VRAM is useful to future-proof the system, that all seems to go out of the window and it becomes some luxury.
 
You do understand that by adding more memory the card will cost more.

As Jensen would say "the more you buy, the more you save".

In all seriousness though, if you're not happy with 10GB on a 3080 then just wait and buy AMD instead, presuming they have 12/16GB options, as putting your money where your mouth is will be the only way to make Nvidia take note.
 
Which game engine do you develop for now, I forget? You seem to know an awful lot about what we need in the future.
Better to over-spec than under-spec, no?

Would be pretty horrid if your £800 investment didn't last very long. Unless you've got the money not to care.

In which case why aren't you buying a £1400 3090 anyhow?

Ultimately we believe some games are already hitting the limit right now, and there's no reason to suggest this won't happen more often in future.

In fact it would be entirely logical for this to happen more often in future.

If you have evidence to disprove this assumption, I'm all ears.
 
Likewise, the same question to you. You can't disprove what he said either.

Or do you just run a quad core CPU currently? Run only 8GB of system RAM?

Use a 500W PSU? Have an HDD?

Using the logic in this thread, getting anything more than that is wasting money, as most games still get OK FPS with a quad-core CPU, 8GB of RAM, an HDD and a 500W PSU.

OTOH, a lot of people here are buying systems with 6-12 cores, 32GB of RAM, NVMe SSDs (SATA is old hat now), over-500W PSUs, etc. to "future-proof" their systems.

The moment people say more VRAM is useful to future-proof the system, that all seems to go out of the window and it becomes some luxury.

:confused: I'm not the one making any claims.
 
Ah OK, so 8 GB VRAM will be fine for last console gen's games. Anything a few years old is fine.

But what many of us are pointing out is that 8 GB has a very short shelf life GOING FORWARDS.

We've already said if you're going to upgrade every gen then don't worry about it.

For those people who want their £800 investment to last more than 1 year/18 months - DO WORRY ABOUT IT :p

Is this clear now?

I understand what you're saying, I get the objection completely... but I have addressed this specific point. Still, I'll give it one more go.

The way you're thinking about this is that future games demand more vRAM, and yes, generally speaking that's true. But, importantly, they also demand more processing power from the GPU; those two things are intrinsically linked: as you increase demand on vRAM, you also increase demand on the GPU. The reason is that assets put into the scene need to be in vRAM, but they also increase the complexity of the scene, which increases the load on the GPU, which in turn lowers the frame rate.

OK, so let's walk through how this might actually look for future titles. You buy the game, you install it, you run it, you go to the settings and you crank them all up to ultra. You find the expected vRAM slider is telling you that you'll need 12GB of vRAM; OK, so you play the game. But the game is getting 20fps, because all those extra assets are tanking your frame rate. So you go back to the settings screen and start turning settings down and testing again, to try to get your frame rate high enough to be playable. Maybe you turn off AA and SSAO and a few other things. Boom, your frame rate is now playable, but your memory usage is now maybe 8GB instead, well inside the 10GB of the card.

So instead of thinking about this in terms of unreleased games that require more vRAM, let's think about it in terms of how much vRAM we need to service a GPU of a specific speed or compute ability. Because this is, generally speaking, game agnostic. This relationship exists across all games, so you don't need to worry about what future games are going to do; what you care about is how much vRAM is enough to service the GPU in the card. If you get that right, future games won't be a problem. Yes, the vRAM requirements will climb, but so will the GPU requirements, and if you've picked the right amount of vRAM for your GPU, the GPU will give out faster than the vRAM will.

So... a real, actual, testable case right now: FS2020. At 4K Ultra it clocks in at 12.5GB of vRAM usage. Uh-oh, we only have 10GB on a 3080. But if you look at the benchmarks at those settings, a Titan RTX clocks in at ~28fps, and it has a rough FP32 compute of 16.3 TFLOPS, which is higher than a 3080. So in a real-world example we have benchmarks for, a 3080 is going to run into GPU limitations in FS2020 LONG before it runs into vRAM limitations. If you have a 3080 and you play FS2020 at 4K Ultra, you're going to get somewhere in the neighbourhood of 25fps. You'll have to drop the settings to increase the frame rate, and when you drop the settings, the vRAM requirements will drop too.
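For what it's worth, here's the same back-of-envelope maths as a tiny script, using only the figures quoted above. The 3080 throughput value is a hypothetical stand-in for the claim that the Titan RTX's 16.3 TFLOPS is the higher of the two, and linear fps-per-TFLOP scaling is a big simplification:

```python
# Back-of-envelope version of the FS2020 argument, using the post's figures.
titan_rtx_fps = 28.0        # quoted 4K Ultra benchmark
titan_rtx_tflops = 16.3     # Titan RTX rough FP32 compute
assumed_3080_tflops = 15.0  # hypothetical figure, per the claim above that
                            # the Titan RTX is higher than a 3080

# Crude assumption: fps scales linearly with FP32 throughput.
est_3080_fps = titan_rtx_fps * assumed_3080_tflops / titan_rtx_tflops
print(f"estimated 3080 fps at 4K Ultra: {est_3080_fps:.0f}")  # ~26

# The conclusion of the argument: the card is GPU-bound (~25fps) long
# before the 12.5GB vRAM figure bites, and lowering settings to fix the
# frame rate also lowers the vRAM demand.
```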
 
:confused: I'm not the one making any claims.

You are though - you asked him to prove his claims. So you need to show your credentials to disprove what he is saying. The same goes for the others.

So I am asking you: can you promise 8GB of VRAM will be fine for the next 5 years, at QHD and 4K?

Will there be no game from now to early 2025 which will run out of VRAM on an 8GB GPU?? That includes modded games. 4-5 years is an acceptable lifespan for a GPU, especially one over £450.

See, if this was promised, people wouldn't be so concerned. The issue is that AMD/Nvidia can pretty much up-spec the next generation with more VRAM, to sell more GPUs, as prices will drop.

On this very forum, people are buying CPUs, RAM, SSDs, PSUs, coolers, etc. which are all "overspecced" and yet no one bats an eyelid. Overclocking CPUs with £200+ motherboards and £100 CPU coolers, to get another 10% performance.

I don't see some of the people in this thread going into those threads and saying it's all a waste of money and to "prove" you need more than 4 cores running at stock, need that fast 32GB kit, need that over-500W PSU, etc.
 
You are though - you asked him to prove his claims. So you need to show your credentials to disprove what he is saying. The same goes for the others.

So I am asking you: can you promise 8GB of VRAM will be fine for the next 5 years, at QHD and 4K?

Will there be no game from 2020 to 2025 which will run out of VRAM on an 8GB GPU?? Will 10GB be enough? 11GB?? 12GB??

On this very forum, people are buying CPUs, RAM, SSDs, PSUs, coolers, etc. which are all "overspecced" and yet no one bats an eyelid. I don't see some of the people in this thread going into those threads and saying it's all a waste of money and to "prove" you need more than 4 cores, need that fast 32GB kit, need that over-500W PSU, etc.

nice edit :rolleyes:

Again, I never claimed anything, whereas they are stating facts, so it is perfectly reasonable to ask them to clarify how they know this.
 
I understand what you're saying, I get the objection completely... but I have addressed this specific point. Still, I'll give it one more go.

The way you're thinking about this is that future games demand more vRAM, and yes, generally speaking that's true. But, importantly, they also demand more processing power from the GPU; those two things are intrinsically linked: as you increase demand on vRAM, you also increase demand on the GPU. The reason is that assets put into the scene need to be in vRAM, but they also increase the complexity of the scene, which increases the load on the GPU, which in turn lowers the frame rate.
Are you claiming it's a linear relationship (always)? That there is a linear relationship between VRAM usage and required GPU performance to render a frame?

Such that any increase in VRAM requirement must result in a proportional (linear) increase in GPU stress?

I'm curious.

The fact that there are sometimes multiple VRAM configurations of the same card would seem to call this into question, e.g. the 480 4GB and the 480 8GB. In such cases would you claim that the 8GB variant is of no value? That the 4GB card must be equally as viable as the 8GB card?

I'm not putting words into your mouth here, I'm clearly just asking a question(s).
 
Using the logic in this thread, getting anything more than that is wasting money, as most games still get OK FPS with a quad-core CPU, 8GB of RAM, an HDD and a 500W PSU.

OTOH, a lot of people here are buying systems with 6-12 cores, 32GB of RAM, NVMe SSDs (SATA is old hat now), over-500W PSUs, etc. to "future-proof" their systems.

The moment people say more VRAM is useful to future-proof the system, that all seems to go out of the window and it becomes some luxury.

Again, this is not a great way of thinking about the situation. Demands on vRAM for future games do not go up in isolation; they go up in tandem with the demands on GPU speed. Games require more vRAM, but they also require more GPU horsepower. If you future-proof your video card for future vRAM requirements by putting a lot on there, but you don't future-proof it against GPU requirements, then that extra vRAM is pointless because you can't make effective use of it. If you can stuff all of FS2020's assets into vRAM but your frame rate is 10fps, what's the point? You've sure got enough vRAM, you just can't play the game.

A better way to think about it, which transcends future titles and the increasing demands on vRAM, is to simply ask: do I have enough vRAM to service the GPU itself when the GPU is running flat out? If the answer is yes, then you'll always have enough vRAM for your games. The fact that demand goes up in future doesn't really matter when you think about it this way, and this is how you think when deciding how much vRAM to put on a video card.
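As a sketch of that way of thinking, here's an entirely invented settings ladder for a hypothetical game and GPU, just to show the shape of the check (which limit, frame rate or vRAM, bites first):

```python
# Illustrative only: made-up numbers for one hypothetical game on one
# hypothetical GPU. fps and vRAM demand rise and fall together, so you
# check which limit you hit first.
PLAYABLE_FPS = 60

# (setting level, vRAM needed in GB, fps this GPU manages at that level)
settings_ladder = [
    ("low",    4.0, 140),
    ("medium", 6.0, 100),
    ("high",   8.0, 70),
    ("ultra", 12.0, 25),  # the GPU gives out here before the vRAM does
]

def max_useful_vram(ladder, playable_fps):
    """Highest vRAM demand among the settings the GPU can actually run."""
    playable = [vram for _, vram, fps in ladder if fps >= playable_fps]
    return max(playable) if playable else 0.0

print(max_useful_vram(settings_ladder, PLAYABLE_FPS))  # 8.0 -> 10GB is plenty
```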
 
nice edit :rolleyes:

Again, I never claimed anything, whereas they are stating facts, so it is perfectly reasonable to ask them to clarify how they know this.
It's somewhat unknowable, because we're dealing with future events.

The nature of any £800 purchase (for most of us) is that we don't really want to be gambling on whether the product will be fit for purpose (going forwards).

There are signs that it might not be. Nothing is guaranteed.

I sold my crystal ball on the MM last year, so I'm doing what everyone else is doing - speculating. Discussing. Extrapolating.

e: Also we're dealing with a somewhat bad-faith actor in nVidia :p They've screwed us before, and they'll screw us again. As often as we let them :p

The reality is nVidia isn't a non-profit that makes cards to enrich our lives by making our games as good as they can be for the money.

nV wants to sell you as little as possible for the money they can make from you.

Generally the only thing stopping us getting screwed to the max is competition from multiple vendors for your hard-earned.

But nVidia do like a good fisting. Let's put it that way.
 
Are you claiming it's a linear relationship (always)? That there is a linear relationship between VRAM usage and required GPU performance to render a frame?

Such that any increase in VRAM requirement must result in a proportional (linear) increase in GPU stress?

I'm curious.

The fact that there are sometimes multiple VRAM configurations of the same card would seem to call this into question, e.g. the 480 4GB and the 480 8GB. In such cases would you claim that the 8GB variant is of no value? That the 4GB card must be equally as viable as the 8GB card?

I'm not putting words into your mouth here, I'm clearly just asking a question(s).

I linked to an earlier study which compared the 2GB and 4GB versions of the GTX 960 and R9 380. This was an era when the GTX 980 and Fury series had 4GB of VRAM, and the GTX 980 Ti had 6GB.

The GTX 960/R9 380 were significantly slower GPUs than the GTX 980/Fury X, but benefited from the same amount of framebuffer.

The 2GB versions still had worse FPS and worse frametimes in a number of games:
https://www.computerbase.de/2015-12...mm-geforce-gtx-960-frametimes-gta-v-1920-1080

People back then said 4GB was pointless for a GTX 960/R9 380!! :o:D

We all saw how the Fury X lasted, as it started to hit VRAM limitations.
 