
NVIDIA ‘Ampere’ 8nm Graphics Cards

Just realised my Antec/Seasonic TP-650 is 10 years old in a couple months.

"Supposed" to replace them after a few years anyhow, or so I thought.

That old chestnut, heh. :p

I've got an Antec Neo 550 fully modular PSU that is from 2007 in the system sat near me, it has been running almost 24/7 for 13 years, and it's not had an easy life when it comes to load. Very few on/off cycles and a constant(ish) temperature is probably the main reason it is still rock solid, that and it gets cleaned out with an air compressor every 6 months.
 
I honestly don't think my 1080Ti can actually *use* all 11gb of RAM.

Try some modded games with ENBs and run them at 4K with 4K and higher resolution texture assets.

Let me put it another way. Do you think my old GTX 560 would have any use for 11gb of vram?

I do not think it would.
My GTX1080 had more than enough grunt to run the game. But when it ran out of VRAM, the FPS tanked.

[attached chart: gjlWVWz.jpg (FPS traces from seven runs in the same part of the map)]


The VRAM was teetering on the edge of 8GB under DX12. The moment it went slightly over 8GB of actual usage, the FPS literally halved and the game was also more stuttery when I panned.

I think it was fixed later on, but a lot of what people think is the GPU running out of grunt can be VRAM limitations.

Your GTX1080TI can use all 11GB, as texture mods are not reliant on your GPU. I run modded Fallout 4, which is capped at 60FPS. With some of the 4K and 8K texture mods, once it goes over 8GB, performance crashes in the same way.

GPU utilisation was nowhere near 100% in those scenarios with a GTX1080. The GPU wasn't the problem, the amount of VRAM was.

If I had a GTX1080TI, I would see more consistent performance at QHD due to the extra VRAM.

Your chart has no explanation for the different color traces.

And you never answered my question. Do you think my old GTX 560 could use 11gb of vram?

YOU said that your GTX1080TI 100% can't use 11GB of VRAM, and showed zero evidence. When I pointed this out you quickly started talking about a 10-year-old GPU with no driver support, to deflect from your own original comment.

Those lines are the results of 7 different runs in the same part of the map.

VRAM usage was at 7.95GB at QHD, and even a slight variation in path could push it just over 8GB as the assets would load in at slightly different times. Even at 7.95GB of VRAM usage, the GPU wasn't at 100% utilisation. At over 8GB of usage, GPU utilisation dropped under 50%.

I have tested VRAM limits on modded games. Fallout 4 was capped at 60FPS at QHD. A GTX1080 can easily push 60FPS on that engine at QHD with a reasonable CPU. It's not near 100% GPU usage unless you push an ENB. Even at close to 8GB of VRAM usage, GPU usage wasn't near 100%, and when it breached 8GB, GPU usage plummeted.

So if a GTX1080 at nowhere near 100% utilisation can have its performance crash when it goes over 8GB of VRAM usage, do you honestly think that if my GTX1080 had 11GB of VRAM, performance would still crash?

No, it wouldn't. A GTX1080 8GB can certainly use 11GB of VRAM in certain scenarios, if it had the chance to.

Guess what, 4K renders 2.25 times as many pixels as QHD. So at 4K, 8GB wouldn't have been enough.
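
For reference, the pixel arithmetic (plain Python, nothing more than multiplication; frame buffers and screen-space buffers scale with this ratio, while textures largely don't, so real VRAM usage grows by somewhat less than the full 2.25x):

```python
# Rough pixel-count comparison between QHD (2560x1440) and 4K UHD (3840x2160).
qhd = 2560 * 1440      # 3,686,400 pixels
uhd = 3840 * 2160      # 8,294,400 pixels

print(f"QHD pixels: {qhd:,}")
print(f"4K pixels:  {uhd:,}")
print(f"4K / QHD ratio: {uhd / qhd:.2f}x")   # 2.25x
```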

Edit!!

If that is not enough for you, then put your money where your mouth is and buy a 6GB/8GB card.
 
my MSI Afterburner

Doesn't mean it actually needs it. Can't say it chokes on only 8GB on an RTX 2080. Fine-tuning the settings for better performance also drops vRAM usage. I think the Manhattan area is the worst, but at this point I wouldn't look at MSFS 2020 as a landmark in optimization.

Also, modding and running unoptimized assets/code is not the same thing as a normal game.

With that said, 1440p would probably be my absolute limit for 8GB.
 
That old chestnut, heh. :p

I've got an Antec Neo 550 fully modular PSU that is from 2007 in the system sat near me, it has been running almost 24/7 for 13 years, and it's not had an easy life when it comes to load. Very few on/off cycles and a constant(ish) temperature is probably the main reason it is still rock solid, that and it gets cleaned out with an air compressor every 6 months.
With the release of the Zen3 CPUs I've decided on a total rebuild anyhow. It was always going to be CPU/Mem/Mobo, and with a PSU that old I might as well do the full monty. Helps that I'm not fond of the current case :p Oh and the monitor needs an upgrade too... (Just checking, how many kidneys are essential again?)

Only the GPU is likely to be re-used, because Jensen's full monty is a bit further than I'm willing to go :p And AMD are so silent I can only assume they aren't making GPUs anymore :p
 
There seems to be this pervasive misunderstanding that vRAM is chosen to future-proof the card against what upcoming games will demand, and that's wrong. The purpose of vRAM is to feed the GPU data to do work on; you can't just keep loading data into the game world without an impact on your performance. If there's a floor to GPU performance (a minimum acceptable frame rate) then there's an equivalent ceiling to useful vRAM. The amount of vRAM to put on a card is directly proportional to the amount of work the GPU can do, and that's how you make vRAM decisions. The probable exception is cards aimed more at CAD or other non-gaming work like professional rendering.
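
To make that floor/ceiling relationship concrete, here's a minimal back-of-envelope sketch in Python. It assumes, purely for illustration, that the GPU has to touch its working set roughly once per frame, so the useful working-set ceiling is about memory bandwidth divided by the minimum acceptable frame rate; real engines reuse and stream data, so treat the output as a rough upper bound on what can be actively used per frame, not a hard VRAM limit.

```python
# Back-of-envelope: if the GPU touches its working set about once per frame,
# the largest working set it can service at a given frame rate is roughly
# memory bandwidth / frame rate. Illustrative assumption, not a hard rule.

def working_set_ceiling_gb(mem_bandwidth_gb_s: float, min_fps: float) -> float:
    """Approximate per-frame working-set ceiling in GB."""
    return mem_bandwidth_gb_s / min_fps

# Example: a card with roughly 484 GB/s of memory bandwidth (about GTX 1080 Ti
# territory) at a 60 fps floor can touch at most ~8 GB of data per frame.
print(working_set_ceiling_gb(484, 60))  # ~8.07
```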

The 2080 Ti was released with 11GB of RAM. Current speculation puts the 3070 at around 2080 Ti performance but with only 8GB of RAM.

We can conclude one of two things. Either the 3070 comes with too little RAM or the 2080ti comes with too much RAM.

Considering that you have been arguing that the 3080 doesn't need more than 10GB of VRAM, would I and everyone else here be right in assuming that you believe the 2080 Ti came with too much RAM?
 
The PCIe 4.0 thing also ties in with the memory thing.
CB did some PCIe 4.0 scaling tests: the RX 5700 showed no difference, the RX 5500 XT 4GB did (100% vs 112%), but the most interesting result was the RX 5600 XT:
[chart: Gx8s3yo.png (RX 5600 XT PCIe 3.0 vs 4.0 scaling)]

https://www.computerbase.de/2020-02/amd-radeon-pcie-3.0-4.0-test/
Okay, it's only 6%, but it looks like 6GB isn't quite enough VRAM and PCIe 4.0 helps to fill it from main memory.
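
As a rough illustration of why the PCIe generation matters once VRAM overflows, here are the theoretical x16 link bandwidths (approximate figures; real-world throughput is lower), compared against the roughly 288 GB/s an RX 5600 XT class card has on-card in its reference 12 Gbps, 192-bit configuration:

```python
# Theoretical PCIe x16 bandwidth per direction (approximate).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding  -> ~0.985 GB/s per lane
# PCIe 4.0: 16 GT/s per lane with 128b/130b encoding -> ~1.969 GB/s per lane
LANES = 16

pcie3 = 8 * (128 / 130) * LANES / 8    # ~15.75 GB/s
pcie4 = 16 * (128 / 130) * LANES / 8   # ~31.5 GB/s
vram_bw = 288                          # ~GB/s, 192-bit GDDR6 @ 12 Gbps

print(f"PCIe 3.0 x16: ~{pcie3:.1f} GB/s")
print(f"PCIe 4.0 x16: ~{pcie4:.1f} GB/s")
print(f"On-card VRAM: ~{vram_bw} GB/s")
```

Either way, spilling over the bus is an order of magnitude slower than local VRAM, which is why performance drops once the card overflows, but drops a bit less on PCIe 4.0.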
BTW, also regarding memory: the last time TPU had a Tahiti card and a GK104 card in a review, the difference between the two was huge.
https://www.techpowerup.com/review/nvidia-geforce-gtx-1080/26.html
[chart: 1FRAC3V.png (performance summary from the TPU GTX 1080 review)]


The 770 with only 2GB is way behind the 280X. In fact, Tahiti has overtaken the GTX 780 at 1440p and 4K. Pity there's no 4GB 770 or 680 in there, but the 780 indicates that it's not just a VRAM thing.
 
I really doubt there'll be any pre-orders tomorrow.

Have there ever been pre-orders before any reviewers even have the cards?

I think the 20 series were up for preorder when they were announced. The NVIDIA stream was running late, so they appeared on the site with pricing before the pricing was actually announced.
 
Your chart has no explanation for the different color traces.

And you never answered my question. Do you think my old GTX 560 could use 11gb of vram?
Of course it could, if there was any software which took advantage of it.

I don't think you understand how VRAM works. 11GB of data isn't being used constantly, it's stored there because it's quicker than accessing it from system RAM. No different than having data in system RAM vs getting it from disk or SSD.
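
That caching behaviour is easy to picture with a toy model. Below is a small, purely illustrative Python sketch (the asset names and pool sizes are made up, not from any real engine): a game that cycles through more assets than fit in VRAM keeps evicting and re-fetching them over the bus, while a larger pool turns most of those fetches into cheap hits.

```python
from collections import OrderedDict

def simulate_vram(pool_size, access_pattern):
    """Toy LRU model of VRAM as a cache: returns (hits, misses).
    A miss stands in for an expensive fetch from system RAM over PCIe."""
    cache = OrderedDict()
    hits = misses = 0
    for asset in access_pattern:
        if asset in cache:
            hits += 1
            cache.move_to_end(asset)          # mark as recently used
        else:
            misses += 1
            cache[asset] = True
            if len(cache) > pool_size:
                cache.popitem(last=False)     # evict least recently used
    return hits, misses

# Hypothetical workload: cycling through 10 "assets" over and over.
pattern = [i % 10 for i in range(1000)]
print(simulate_vram(8, pattern))    # small pool: constant evictions
print(simulate_vram(11, pattern))   # larger pool: everything stays resident
```

With the smaller pool the cyclic access pattern thrashes (almost every access is a miss), while the larger pool holds the whole working set; that's the sense in which "unused" VRAM still pays off.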
 
Same people will swear up and down that water isn't wet if you had that position before them. Totally impervious to evidence to the contrary.

Agreed, and then some of them hypocritically end up telling others VRAM isn't important while buying GPUs with tons of VRAM themselves. I remember the same lot with 8800GTX 768MB cards telling others the 8800GT 256MB was fine, as it couldn't use 512MB of VRAM, etc. In the end those 8800GT 256MB cards quickly fell off a cliff, and all those 8800GTX owners had more than enough VRAM to make their cards last that bit longer.

Of course it could, if there was any software which took advantage of it.

I don't think you understand how VRAM works. 11GB of data isn't being used constantly, it's stored there because it's quicker than accessing it from system RAM. No different than having data in system RAM vs getting it from disk or SSD.

Look, I pointed it out to them in two scenarios with my GTX1080 8GB at only QHD. In both cases the GPU wasn't near 100% usage, and yet when I went past 8GB the performance went down the drain, as it was caching into system memory. Imagine those scenarios at 4K??

Then they started talking about 10-year-old GPUs which don't have support in many games.

Let them believe what they want, there's no point really.
 
I honestly don't think my 1080Ti can actually *use* all 11gb of RAM. I just don't think the GPU and bus can move that much data quickly enough.

We need some sort of metric that compares GPU "power" and VRAM bandwidth to find the cutoff point for capacity.

We all (instinctively) know there's a point where the GPU can't use an infinite amount of vram, but I don't think I've seen a reliable way to calculate where that point is from one card to another.
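
One crude candidate for that metric, sketched in Python below: take the floor-on-frame-rate / ceiling-on-VRAM idea from earlier in the thread and turn it into numbers per card. The bandwidth figures are approximate published reference specs and the touch-it-once-per-frame assumption is a big simplification, so this is a rough cutoff estimate, not a definitive formula.

```python
# Crude cutoff estimate: working-set ceiling ~= memory bandwidth / target fps.
# Bandwidth figures are approximate reference specs (GB/s).
cards = {
    "GTX 560":     128,
    "GTX 1080":    320,
    "GTX 1080 Ti": 484,
    "RTX 2080 Ti": 616,
}

TARGET_FPS = 60

for name, bw in cards.items():
    ceiling = bw / TARGET_FPS
    print(f"{name:12s} ~{ceiling:4.1f} GB touchable per frame at {TARGET_FPS} fps")
```

By that crude measure the 560 tops out around 2 GB per frame while the 1080 Ti sits near 8 GB, which at least gives a card-to-card comparison even if the absolute numbers are debatable.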

It's fairly easy: games use a variable amount of vRAM depending on the settings, we know this relationship, it's uncontroversial. As you turn graphical options up, vRAM usage goes up and frame rate goes down. We have a floor on frame rate and thus a ceiling on vRAM. People have been saying "but but but there's this one game and it uses loads of vRAM"; yes, true, and it's completely unplayable at the settings where it demands that much vRAM. All the arguments I've seen so far in this thread have been kind of emotional ones about what people feel like they deserve, or ones based on some weird guess about dates, like "it's 2020 and therefore we should have a card with at least X amount of vRAM". It's like... pardon? What has the year got to do with anything? How about we test games to find out what the relationship between vRAM and GPU usage is, and then assign cards an appropriate amount of vRAM based on that relationship. Which is how cards have always been produced: GPU manufacturers don't want to put any more vRAM onto their cards than absolutely necessary, because it's expensive, it drives up the cost of the product and thus it sells less, especially if you have competition.

Having looked into this further since the other night, it's also a bit of an unknown what vRAM usage is actually necessary: games will allocate a whole bunch based on what is available and then internally manage what they put in there. They don't always fill what they actually reserve, and it turns out some of the extremely few examples of large vRAM usage, like Resident Evil going over 8GB, aren't actually using it. Benchmarks with cards with different amounts of vRAM can kinda confirm this; it's not a performance hit, you can assign way less than 8GB and it's fine.
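
On that allocated-vs-used point: most overlays (MSI Afterburner included) report what the driver has allocated, not what the game is actively touching. Here's a minimal monitoring sketch, assuming the nvidia-ml-py package (imported as pynvml) is installed; it just polls the same driver-level counters, so the "used" figure is still allocation, which is exactly the caveat above.

```python
import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    for _ in range(10):                        # sample for ~10 seconds
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"VRAM allocated: {mem.used / 2**30:5.2f} GiB of "
              f"{mem.total / 2**30:.2f} GiB | GPU busy: {util.gpu:3d}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```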
 