
NVIDIA ‘Ampere’ 8nm Graphics Cards

Permabanned
Joined
27 Sep 2019
Posts
2,570
Don't know how to check, please advise



With maxed out settings, perhaps. Some games can be even worse than that. I don't mind using lower settings though - like using High instead of Ultra - but I fear that with only a 40% bump, even lowered settings may not reach 120fps. I may have to settle for 100fps (if the rumour is true).
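Rough maths behind that worry below - the starting frame rates are just numbers I've picked for illustration, not benchmarks:

```python
# Back-of-envelope: what a rumoured +40% uplift does to some assumed
# current frame rates. The starting fps values are made up for illustration.
uplift = 1.40
for current_fps in (60, 72, 85, 100):
    projected = current_fps * uplift
    print(f"{current_fps:>3} fps today -> ~{projected:.0f} fps with +40%")
# e.g. ~85 fps today only lands around ~119 fps, so a locked 120 fps needs
# either settings drops or a bigger-than-rumoured jump.
```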


I do not know of a way to see it visually; neither GPU-Z nor the fancy GPU tab in Windows 10's Task Manager shows it.

You can read up on it and see it's quite a big topic with a lot of arguing, like here.

The more VRAM you have, the more some game engines will use - much like Windows, which uses X RAM and caches another XX RAM; again, using some and caching/reserving the rest.

You are nearly using all your VRAM above, but if you popped in a 16GB card it would no doubt suck up 14-15GB of it - again, some used, some cached.

Some games have those settings (the ones that warn you before you enable them) about preloading/caching video textures etc., telling you not to use them on a card with less VRAM. Well, I find those the worst - they can make even a 12GB card run like crap. Best to leave them off and let the textures load normally, especially on an SSD.
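If anyone wants an actual number rather than a graph, one rough way (just a sketch - it assumes an NVIDIA card and the pynvml Python bindings, and it only shows what's allocated, not the used-vs-cached split, because the driver doesn't expose that) is to ask the driver via NVML:

```python
# Sketch: query total/used VRAM and per-process usage via NVML.
# Assumes an NVIDIA GPU and `pip install nvidia-ml-py` (pynvml).
# Note: this reports allocation, not "actively used vs cached" -
# which is why the numbers look so high on big cards.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

# Per-process numbers may come back empty on Windows under WDDM.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    used_str = f"{used / 2**30:.1f} GiB" if used else "n/a"
    print(f"pid {proc.pid}: {used_str}")

pynvml.nvmlShutdown()
```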
 
Soldato
Joined
18 Feb 2015
Posts
6,484
Using or partially caching?

Thing about it is, right now there are few games where you can make use of it because it's generally not supported, but when it is you can see the VRAM just getting chewed up like crazy, and it's actually doing something; see here: https://www.computerbase.de/2019-02...amm-call-of-duty-wwii-8-gb-vs-16-gb-3840-2160

But in general you can see this in COD games & Wolfenstein 2/YB (when you manually enable the highest image streaming option), as well as various other games where you can mess around with setting up LODs & streaming manually (e.g. Unreal 4 open games a la Fallen Order). Combine that with high-res textures too (which, by the way, the COD & Wolfenstein games don't quite have)? It's rough. Don't get me wrong, 11 GB is still enough for 4K for what's currently out there (more or less), but who's buying >$1000 GPUs just for a year or two?

The more important reason you want more VRAM is that in the very near future the console base is moving from the awful 8 GB of DDR3 (Xbox One), or the slightly less awful 8 GB of GDDR5 (PS4), to 16 GB of GDDR6 - and likely not the slowest kind either. So on PC, where you can increase LOD & the like even further (especially if games are moddable or otherwise modifiable), that will make a TON of difference. And RT increases VRAM usage as well, so that's another factor to consider.
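To put a very rough number on how quickly high-res textures and higher LODs eat VRAM (illustrative figures only, assuming BC7-style compression and full mip chains - not taken from any particular game):

```python
# Back-of-envelope VRAM cost of texture resolution (illustrative only).
# Assumes ~1 byte/texel (BC7-class compression) and a full mip chain
# (~1.33x overhead); real games mix formats, so treat as ballpark.
def texture_mib(size, bytes_per_texel=1.0, mip_factor=1.33):
    return size * size * bytes_per_texel * mip_factor / 2**20

for size in (1024, 2048, 4096):
    per_tex = texture_mib(size)
    print(f"{size}x{size}: ~{per_tex:.1f} MiB each, "
          f"~{per_tex * 1000 / 1024:.1f} GiB for 1,000 resident textures")
# Doubling texture resolution roughly quadruples the cost, which is why
# bumping streaming/LOD settings fills a big card so quickly.
```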
 

GAC

Soldato
Joined
11 Dec 2004
Posts
4,688
That 16 gigs will be shared memory, don't forget. And I think they only have 14 gigs of it to start with - 2 gigs goes to the OS and such.
 
Soldato
Joined
6 Feb 2019
Posts
17,589
I do not know of a way to see it visually; neither GPU-Z nor the fancy GPU tab in Windows 10's Task Manager shows it.

You can read up on it and see it's quite a big topic with a lot of arguing, like here.

The more VRAM you have, the more some game engines will use - much like Windows, which uses X RAM and caches another XX RAM; again, using some and caching/reserving the rest.

You are nearly using all your VRAM above, but if you popped in a 16GB card it would no doubt suck up 14-15GB of it - again, some used, some cached.

Some games have those settings (the ones that warn you before you enable them) about preloading/caching video textures etc., telling you not to use them on a card with less VRAM. Well, I find those the worst - they can make even a 12GB card run like crap. Best to leave them off and let the textures load normally, especially on an SSD.

I'll have a look tonight. The game is eating a fair bit of system RAM too - while playing Warzone, the game's .exe was taking 9GB of system RAM for itself.
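If you want to double-check that system RAM figure, here's a quick sketch using the psutil Python package - the process-name filter is just a guess on my part, so adjust it to whatever the game's .exe is actually called:

```python
# Sketch: report system RAM used by a game's process via psutil.
# Assumes `pip install psutil`; the process-name substring is a guess.
import psutil

TARGET = "ModernWarfare"  # hypothetical substring of the game's .exe name

for proc in psutil.process_iter(["name", "memory_info"]):
    name = proc.info["name"] or ""
    if TARGET.lower() in name.lower():
        rss_gib = proc.info["memory_info"].rss / 2**30
        print(f"{name} (pid {proc.pid}): {rss_gib:.1f} GiB resident")
```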
 
Soldato
Joined
18 Feb 2015
Posts
6,484
That 16 gigs will be shared memory, don't forget. And I think they only have 14 gigs of it to start with - 2 gigs goes to the OS and such.

True, same as now. Either way, the base target is moving up considerably, so you'll want a proper increase on desktop as well - not same old, same old.
 
Soldato
Joined
8 Feb 2004
Posts
3,703
Location
London
40% is a huge jump over the current top-end gen, if the rumour is true. Let's wait for benchmarks. Given the world situation, I'm guessing we'll be waiting until 2021 for these to hit the mainstream.
 
Soldato
Joined
28 May 2007
Posts
10,069
40% is a huge jump over the current top-end gen, if the rumour is true. Let's wait for benchmarks. Given the world situation, I'm guessing we'll be waiting until 2021 for these to hit the mainstream.

It's not huge at all. It's half decent and to be expected, but nothing special. We used to get jumps of 70-100% every year and now we get around 30-40% every 2 years. Sometimes I wish I could forget the past so this would seem exciting. I think on the AMD side we are going to get a mega bump, but only because they are so far behind; still, that's more exciting because competition is coming back.
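Just to put the compounding into perspective (using the percentages quoted above, over a rounded 4-year window):

```python
# Compare cumulative speedup over 4 years: the old ~70-100%/year cadence
# vs ~30-40% every 2 years (figures taken from the post above).
def cumulative(gain_per_step, steps):
    return (1 + gain_per_step) ** steps

old_low, old_high = cumulative(0.70, 4), cumulative(1.00, 4)   # yearly jumps
new_low, new_high = cumulative(0.30, 2), cumulative(0.40, 2)   # every 2 years

print(f"Old cadence over 4 years: {old_low:.1f}x - {old_high:.1f}x")
print(f"New cadence over 4 years: {new_low:.1f}x - {new_high:.1f}x")
# Roughly 8x-16x then, versus about 1.7x-2x now.
```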
 
Soldato
Joined
20 Aug 2019
Posts
3,031
Location
SW Florida
The 780ti>980ti>1080ti all gave us performance increases at the same price point.

Nvidia would like everyone to forget that and just pretend the 2080Ti is normal "progress".
 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,570
Location
Greater London
It's not huge at all. It's half decent and to be expected, but nothing special. We used to get jumps of 70-100% every year and now we get around 30-40% every 2 years. Sometimes I wish I could forget the past so this would seem exciting. I think on the AMD side we are going to get a mega bump, but only because they are so far behind; still, that's more exciting because competition is coming back.

The 780ti>980ti>1080ti all gave us performance increases at the same price point.

Nvidia would like everyone to forget that and just pretend the 2080Ti is normal "progress".
+1

Kind of makes me cringe that people think 40% is awesome :(

That is why the 2080Ti turned out to be an epic fail as far as I am concerned. Not only did it offer so little extra performance over the 1080Ti, the price jumped up so much on top of that.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
This whole Samsung 10nm rumor is obvious BS. The 10nm process just wasn't suitable for large GPUs; if it were, Turing would have been fabbed on 10nm. Besides that, 7nm EUV is a mature process with high yields. Moreover, the EUV variant is much cheaper and easier to design and fab for than regular 7nm DUV, and it also acts as a stepping stone to 5nm EUV, as 7nm DUV is a bit of a dead end (although there might be a 6nm node from TSMC which is DUV).
As for capacity, after the summer Apple will move to 5nm and TSMC will be desperate for 7nm EUV customers - Nvidia being one of their primary ones.
 
Soldato
Joined
20 Apr 2004
Posts
4,365
Location
Oxford
The 780ti>980ti>1080ti all gave us performance increases at the same price point.

Nvidia would like everyone to forget that and just pretend the 2080Ti is normal "progress".

The 780ti and 980ti had competition; the 1080ti did not, so Nvidia knew they could up the price of the 2080ti (even if the BOM is higher), as customers of that class of card had no alternatives.
 