
NVIDIA ‘Ampere’ 8nm Graphics Cards

These cards might be $1400, but have you forgotten the OCUK markup on top? The $1400 will be more like £2000 for the 3090. Not a chance. It just works, lol.
 
You literally said your GTX1080TI couldn't use 11GB of VRAM.

If a GTX1080 at under 100% utilisation can use more than 8GB of VRAM at qHD, why are you so adamant your GTX1080TI has too much VRAM?

Do you think every GPU is capable of using an unlimited amount of vram?

If not, how do you calculate how much vram a given GPU is capable of using?

How?
 
I honestly don't think my 1080Ti can actually *use* all 11GB of VRAM.
Do you think every GPU is capable of using an unlimited amount of vram?

If not, how do you calculate how much vram a given GPU is capable of using?

How?

Stop trolling. I answered the question you asked before, and once it was proven, you started waffling about Fermi. You ended up buying the GPU with the second-largest amount of VRAM of the Turing generation, because you wanted as much VRAM as possible.

Yet if you truly believed what you said, you could have sold your GTX1080TI and got an RTX2070 Super for no extra cost, which is now slightly quicker. Yet you hung onto it, and the only people I know who did the same wanted the extra VRAM on the GTX1080TI.

Since the RTX3070 is faster than the GTX1080TI, I expect you will buy one, and not an RTX3090 or any GPU with more than 11GB of VRAM. Put your money where your mouth is.

You are now deflecting from what you said. Either read what I said, or I am going to ignore you.
 
Nobody has said that an 8800 GT should have come with 20 GB of VRAM. Or a 3DFX Voodoo2.

I believe we were talking about the 3070, 3080, 1080Ti, 2080Ti, etc. The latter two actually come with 11 GB of VRAM, which you said they "couldn't possibly" use.


Okay. At least now we are getting to my point.

How do we calculate how much vram a given GPU can actually make use of?

Anyone care to hazard even a guess? No one has answered what I thought was a rather simple question.
 
These cards might be $1400, but have you forgotten the OCUK markup on top? The $1400 will be more like £2000 for the 3090. Not a chance. It just works, lol.

Only clown cards will be close to $1400. The Founders is $1500, so expect decent AIB cards to be $1500 to $1800.
 
The 3070 is supposed to be 2080 Ti performance yet has 3GB less VRAM.
Will it match 2080 Ti performance, though? It's rumoured to have 25% fewer shaders and only 230W. I wouldn't be surprised if it's closer to a 2080 Super in performance than a 2080 Ti, granted the RT performance should be better.
 
Do you think every GPU is capable of using an unlimited amount of vram?

If not, how do you calculate how much vram a given GPU is capable of using?

How?
You can't calculate an answer to a question that doesn't make sense.
 
No chance.. Will be more like £1600 to £1800.. at least.


I paid 1300 for my 2080 Ti. If these new cards are more expensive, I'm out, and I hope the 30-35% is not true or it's going to be a big killer for me. And I'll not be pre-ordering this time; the wait was a killer.
 
Okay. At least now we are getting to my point.

How do we calculate how much vram a given GPU can actually make use of?

Anyone care to hazard even a guess? No one has answered what I thought was a rather simple question.

Not a bad read, but it seems that 4-8GB is OK. I thought we would have pushed beyond this a while back.
 
Not a bad read, but it seems that 4-8GB is OK. I thought we would have pushed beyond this a while back.

That is with a 5500XT at 1080p, if you read the slides properly. It's not so hot at qHD/4K, especially with some games with very high-res textures, or if you mod games.

But it's funny how they bought a GPU with 11GB of VRAM a few years ago and are now saying 11GB is way too much. I wonder why they didn't sell the GTX1080TI and get a free upgrade to an RTX2070 Super 8GB or something else.

The reason the GTX1080TI has held its value so well is that for certain modders it actually is better value because of its large framebuffer, and for some of these slightly older games, Turing doesn't really give you a performance uplift.

It reminds me of back in the day, when 8800GTX 768MB owners were telling people an 8800GT 256MB was more than enough, while they themselves had cards with tons of VRAM for the era.
 
Stop trolling. You ended up buying the GPU with the second-largest amount of VRAM of the Turing generation, and yet if you truly believed what you said, you could have sold your GTX1080TI and got an RTX2070 Super, which is now slightly quicker.

You are now deflecting from what you said. Either read what I said, or I am going to ignore you.

My point is: if my 1080Ti can actually use 11GB of VRAM (I have never seen mine use even 8GB, and it will probably be retired before ever actually making use of that much), then why not demand 24GB of VRAM?

Why stop at any particular amount of VRAM and decide that that number, whatever it is, is enough?

How do you calculate how much VRAM a given GPU can make use of?

It's like demanding that every sports car ever sold come with massive carbon-ceramic brakes because they *need* them... and then running a sports car in a 24hr endurance race with regular brakes to demonstrate that sports cars need to be able to run under conditions that almost no one who buys the car will ever encounter.
 
The point about the 2080Ti stands. The 3070 is supposed to be 2080 Ti performance yet has 3GB less VRAM.

Sure it's cheaper, but your argument is that VRAM is paired scientifically to GPU grunt and NVIDIA don't give you more than you need.

Halo products are slightly different because there's a class of consumer out there who simply wants the best of the best, and they literally do not care about the price, so it's just about jacking up the hardware for big numbers. But that market segment is very small because not many people have huge amounts of disposable income. In the case of the 3090, that's a hilariously dumb amount of vRAM to have on a card. 24GB is going to be almost 2x more vRAM than it can ever realistically use. Half of that would be 12GB, and we've seen what happens when a game (FS2020 @ 4K ultra) uses that much vRAM: unplayable frame rates.

The other piece of this puzzle is that high-end cards are often used for non-gaming applications like small-scale GPGPU super computers, non-real-time rendering, CAD-like applications and things like that. You can make an argument for more vRAM in those cases, but they are not gaming cases. There has always been a class of cards designed for those audiences, like the Quadros, and there's just some product crossover. I think part of the reason people think the 3080 should have something like 16GB of RAM is because the 3090 has 24GB, and that's kind of in between those values. In reality, my guess is the 3090 will not make legitimate use of more than about 12.

Part of the reason some older cards have way more memory than they need tends to be limitations of the architecture. For various reasons related to how you multiply up things like the memory bandwidth, you're locked into multiples of a certain amount of vRAM, which is why the 2080Ti came out with 11GB. That's kind of a janky number, and an unnecessary amount of vRAM for the card in general, but the next step down would probably have been too little; it's better to over-provision slightly when the only other option is to under-provision.
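To make those multiples concrete, here is a rough back-of-envelope sketch (my own illustration, not anything from NVIDIA's spec sheets), assuming one memory chip per 32-bit channel and common GDDR6 densities of 1GB or 2GB per chip:

```python
# Rough sketch: why VRAM capacity comes in fixed multiples of the bus width.
# Assumption: one memory chip per 32-bit channel, with 1GB or 2GB per chip
# (the common GDDR6 die densities). Doubling can also come from clamshell
# mode (two chips per channel), as on the 3090.

def possible_vram_gb(bus_width_bits, chip_gb=(1, 2)):
    """Return the VRAM capacities (in GB) a given bus width allows."""
    chips = bus_width_bits // 32          # one chip per 32-bit channel
    return [chips * density for density in chip_gb]

# 2080 Ti: 352-bit bus -> 11 channels -> 11GB or 22GB
print(possible_vram_gb(352))   # [11, 22]
# 3080: 320-bit -> 10GB or 20GB; 3090: 384-bit -> 12GB or 24GB
print(possible_vram_gb(320))   # [10, 20]
print(possible_vram_gb(384))   # [12, 24]
```

This is why the jump from 11GB isn't to 12GB or 16GB on the same card: the next options on a 352-bit bus are 11 or 22, and anything else means redesigning the memory interface.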

Lots of fast system RAM and ultra-fast NVMe disks are good things, and if you look at the next-gen consoles this is the way they are going: they are going for 5.5GB/sec SSDs because they're more interested in streaming assets off the disk in time for them to be needed in vRAM. You've given an example where you've crippled a PC by taking away the fast NVMe drive and system RAM, but the hell of it is, games today and over the last 10 years have been using these techniques to stream assets from slower disks, like HDDs or, more realistically, the SATA SSDs which are common now, and that has not been a problem in the slightest. I don't have buttery-smooth area transitions in GTAV as I fly around the island because I have a fast NVMe setup; you get the same smooth performance with a traditional ~700MB/sec-capped SATA disk. The engine takes care of that, no problem.

But in future these techniques will be pushed harder; the architecture of the consoles was specifically designed around making more use of them to escape the limitations of the hardware. That's why we don't need 200GB of vRAM on a video card to play GTA V: we don't cache all the assets, we do it intelligently, and the less vRAM we use for caching, the more we can use for storing things we need to render right now. Stuffing things into vRAM you don't need, that the GPU isn't going to do a calculation on any time soon, is a complete waste. The whole point of vRAM is so the GPU can fetch the assets it needs, do a calculation on them, and write the outcome back to vRAM.

It's the same reason you wouldn't have, like, 16GB of L1/L2 cache on a CPU, and the reason the OS will write contents of RAM it doesn't expect to be used any time soon into the pagefile on the disk. You have tiered memory at different speeds, and you shuffle bits back and forth in and out of faster memory intelligently, so the super-fast memory feeding the processor doing the calculation (the most expensive memory) is only as big as it needs to be.
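That shuffling is essentially a cache-eviction policy. As a toy illustration only (the class, asset names, and sizes here are all made up; real engines are far more sophisticated), a least-recently-used cache over a fixed VRAM budget looks like this:

```python
# Toy sketch of tiered caching: keep only the assets needed soon in
# (simulated) VRAM, and evict the least recently used asset when the
# budget is exceeded. All names and sizes are invented for illustration.
from collections import OrderedDict

class VramCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.assets = OrderedDict()   # asset name -> size in MB, oldest first

    def load(self, name, size_mb):
        """Fetch an asset into 'VRAM', evicting cold assets until it fits."""
        if name in self.assets:
            self.assets.move_to_end(name)   # touched: now most recent
            return
        while self.used_mb + size_mb > self.budget_mb and self.assets:
            _, evicted_mb = self.assets.popitem(last=False)  # drop coldest
            self.used_mb -= evicted_mb
        self.assets[name] = size_mb
        self.used_mb += size_mb

cache = VramCache(budget_mb=8192)
cache.load("city_block_textures", 3000)
cache.load("airport_mesh", 3000)
cache.load("cloud_volumes", 3000)   # over budget: city_block_textures evicted
print(list(cache.assets))           # ['airport_mesh', 'cloud_volumes']
```

The point of the sketch is just that a small, well-managed budget can serve a much larger asset set, which is exactly why "how much can the game's working set fill" matters more than "how big is the whole game".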
 
Doesn't mean it actually needs it. Can't say it chokes on only 8GB on an RTX 2080. Fine-tuning the settings for better performance also drops vRAM usage. I think the Manhattan area is the worst, but at this point I wouldn't look at MSFS 2020 as a landmark in optimization.

I understand, but there is really no way to know how much memory the game actually "needs". I'm just reporting what the game is doing, whether it needs it who knows.

Another example is RAM: the game doesn't need you to have 32GB, 16GB is fine, but with 32GB the game will use over 20GB for itself, and benchmarks show that with a 32GB kit the 1% lows are 10fps higher than with a 16GB kit.

Yes, it's not well optimized... for the CPU. Nothing wrong with the GPU and memory side; its bottlenecks are CPU-based.

And while, yes, it's just one game to compare so far, it's the first next-gen game we have. We have more next-gen games coming very soon.

In any case, I think Flight Simulator is a good benchmark for next gen using the right settings.

Set the game to 4K Ultra (every slider maxed out), lock the framerate to 30fps, and then fly over New York. Does the game hold a solid 30fps or drop lower? If you can hold a solid 30fps, then your PC is next-gen ready, welcome :)
 
My point is: if my 1080Ti can actually use 11GB of VRAM (I have never seen mine use even 8GB, and it will probably be retired before ever actually making use of that much), then why not demand 24GB of VRAM?

How do you calculate how much vram a given GPU can make use of?

It's like demanding that every sports car ever sold come with massive carbon-ceramic brakes because they *need* them... and then running said sports car in a 24hr endurance race with regular brakes to demonstrate that sports cars need to be able to run under conditions that almost no one who buys the car will ever encounter.

I literally pointed out scenarios from my own usage where a GTX1080 ran out of VRAM at less than 100% utilisation at qHD. The same happens to GTX1080TI users in certain games when modded.

So now you are telling me, a person who runs such games and knows others who do, that we don't need extra VRAM when we hit those limits. I ran the tests and saw the results. Next time, because YOU have never experienced it, don't go and tell others who have that they are inventing it.

There are 1000s of games, and each runs differently, so if you want to stick your head in the sand, be my guest.

I also don't like people who invest in 11GB cards and then tell others that more than 8GB isn't needed.

If you are that adamant, sell your GTX1080TI and get an RTX3070 8GB. Put your money where your mouth is.

Or are you one of those people who will buy an RTX3080 20GB or RTX3090 and say it was nothing to do with VRAM, but you wanted maximum performance... which also conveniently increases VRAM over what you have now.

You can't calculate an answer to a question that doesn't make sense.

Save your sanity.
 
I bought 32GB of RAM when I built my old 3770K rig. I just retired that rig last year and never got anywhere near using even half of the RAM. (Although I'm sure people could have quoted examples where some computer, somewhere, used 32GB of RAM.)

I don't know how much VRAM a given graphics card can actually make use of. I don't even know how to calculate such a metric. I do know that I would not spend extra money for, say, 32GB of VRAM on any GPU today.
Well, I'm still on my i5 3570K and recently bought another 16GB of DDR3.
Nice for VMs, databases, etc., but what persuaded me to get it (only about £30 used) was playing heavily modded Skyrim SE. This new mod organiser, Wabbajack, now makes playing with 100s of mods stable and easy without being a load-order expert.
I'm due an upgrade sometime, but thinking back to that makes me think mATX is probably about as small as I'm willing to go, as having 4 DIMMs is just so useful when keeping a rig for years.
 
People have been arguing on this forum over how much VRAM is enough for over a decade, I do not expect it will end any time soon :p
 