
NVIDIA ‘Ampere’ 8nm Graphics Cards

TNA · Caporegime · Joined 13 Mar 2008 · Posts: 27,518 · Location: Greater London
Since people kept mentioning their unhappiness at another round of 8GB cards, I've made a point of looking at the usage now, after gaming for a few hours.

AC Odyssey @ 4K max settings peaked at 75% VRAM usage on my 11GB 2080 Ti. Guess it could be a problem sooner than I realised if the majority of next-gen cards are still using 8GB of VRAM.

That's 8.4GB of VRAM usage.
Try Final Fantasy 15 if you have it. When I played it over a year ago on my Titan XP it used 11.5GB of VRAM. So what will next-gen exclusive games like Resident Evil 8, coming out next year, need?
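A quick back-of-the-envelope check of those figures, as a minimal sketch (the GB values are the ones quoted in the thread; the percentages are simply recomputed from them for illustration):

```python
# Recompute the quoted VRAM figures as a share of different card sizes.
# 8.4GB, 11GB, 11.5GB and 8GB come from the posts above; 12GB is the
# rumoured 3080 Ti capacity mentioned later in the thread.

def vram_share(used_gb: float, card_gb: float) -> str:
    pct = 100.0 * used_gb / card_gb
    return f"{used_gb:.1f}GB of {card_gb:.0f}GB = {pct:.0f}% used"

print(vram_share(8.4, 11))   # AC Odyssey peak on a 2080 Ti: ~76%
print(vram_share(8.4, 8))    # the same load on an 8GB card: over 100%, so it would spill
print(vram_share(11.5, 12))  # the FF15 figure against a 12GB card: ~96%
```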
 
Soldato · Joined 16 Aug 2009 · Posts: 7,740
Nvidia know full well Turing didn't sell in the volumes of previous generations. Maybe they predicted it and upped the margins, Apple-style, to compensate.

Do they need to, though? It seems to me they've changed their business model, going from high-volume "stack 'em high, sell 'em (not so) cheap" to low-volume, high-prestige products, Apple-style as you say, after having disposed of the competition. They know people will buy them regardless, albeit in lower numbers, and with the presumably high margins and lower production and material costs it's likely very profitable.
 

HRL · Soldato · Joined 22 Nov 2005 · Posts: 3,026 · Location: Devon
Try Final Fantasy 15 if you have it. When I played it over a year ago on my Titan XP it used 11.5GB of VRAM. So what will next-gen exclusive games like Resident Evil 8, coming out next year, need?

Sadly I don’t own any FF games.

It would seem a little remiss of Nvidia if the bulk of their new 4K-destined cards only come with 8GB of VRAM.

I’m looking at the 12GB 3080 Ti personally, which should be OK, or at least it looks that way at the moment.

Do we think the next-gen consoles are going to push VRAM usage higher, then?
 
Associate · Joined 21 Apr 2007 · Posts: 2,485
Do they need to, though? It seems to me they've changed their business model, going from high-volume "stack 'em high, sell 'em (not so) cheap" to low-volume, high-prestige products, Apple-style as you say, after having disposed of the competition. They know people will buy them regardless, albeit in lower numbers, and with the presumably high margins and lower production and material costs it's likely very profitable.

I guess not, whilst there's little to no competition, but all that's really changed kinda sucks for the consumer. As a business model, sure, it makes sense, as long as you don't **** off the people who buy your products.
 

TNA · Caporegime · Joined 13 Mar 2008 · Posts: 27,518 · Location: Greater London
Sadly I don’t own any FF games.

It would seem a little remiss of Nvidia if the bulk of their new 4K-destined cards only come with 8GB of VRAM.

I’m looking at the 12GB 3080 Ti personally, which should be OK, or at least it looks that way at the moment.

Do we think the next-gen consoles are going to push VRAM usage higher, then?

I reckon so, yes. Games are going to make full use of the console memory, which is more than 8GB. We will always have the option of using lower settings, but paying £500 for a GPU alone and still needing to do that is not on when a whole console does it for that price.
 
Soldato · Joined 19 May 2012 · Posts: 3,633
The more I think about the 3xxx series, the more I think I should just ride my RTX 2080 until it's a bit irrelevant.

At the same time, if NVIDIA follow the same model, with the 3080 Ti lasting and remaining king for as long as the 2080 Ti did, then I might just jump in.
 
Caporegime · Joined 18 Oct 2002 · Posts: 39,299 · Location: Ireland
We're gonna have the now-standard GPU launch: hilariously bad prices coupled with the usual convenient "lack of supply", so be prepared to tack a good few quid onto whatever the supposed launch price is.

Maybe they can throw in millipede this time around.
 
Soldato · Joined 6 Feb 2019 · Posts: 17,565
It looks like Nvidia messed up, or maybe they planned it this way.

https://www.igorslab.de/en/chipsize-leaked-vidia-ampere-ga102-in-sams-8-nm-lpp-always-more-likely/

Samsung's "8nm" process is really nowhere close to being as efficient as TSMC's 7nm or 7nm+. That's why Ampere is so power-hungry, with several high-end cards featuring TDP specifications of over 300W.

Perhaps Nvidia got cocky and decided to take the cheap option so they could maximise margins.

However, if Nvidia still manages to stay ahead of AMD on a vastly inferior process again, that will be very interesting indeed. On the flip side, if AMD wins the next generation, it will be egg on Jensen's face for taking the cheap option with Samsung instead of the industry-leading process node from TSMC.
 
Soldato · Joined 15 Oct 2019 · Posts: 11,687 · Location: UK
It looks like Nvidia messed up, or maybe they planned it this way.

https://www.igorslab.de/en/chipsize-leaked-vidia-ampere-ga102-in-sams-8-nm-lpp-always-more-likely/

Samsung's "8nm" process is really nowhere close to being as efficient as TSMC's 7nm or 7nm+. That's why Ampere is so power-hungry, with several high-end cards featuring TDP specifications of over 300W.

Perhaps Nvidia got cocky and decided to take the cheap option so they could maximise margins.

However, if Nvidia still manages to stay ahead of AMD on a vastly inferior process again, that will be very interesting indeed. On the flip side, if AMD wins the next generation, it will be egg on Jensen's face for taking the cheap option with Samsung instead of the industry-leading process node from TSMC.
These cards would have been designed a couple of years ago, before AMD even made the jump to 7nm, so maybe Nvidia did get complacent, as AMD were in no man's land at the time.
 
Man of Honour · Joined 13 Oct 2006 · Posts: 91,053
It looks like Nvidia messed up, or maybe they planned it this way.

https://www.igorslab.de/en/chipsize-leaked-vidia-ampere-ga102-in-sams-8-nm-lpp-always-more-likely/

Samsung's "8nm" process is really nowhere close to being as efficient as TSMC's 7nm or 7nm+. That's why Ampere is so power-hungry, with several high-end cards featuring TDP specifications of over 300W.

Perhaps Nvidia got cocky and decided to take the cheap option so they could maximise margins.

However, if Nvidia still manages to stay ahead of AMD on a vastly inferior process again, that will be very interesting indeed. On the flip side, if AMD wins the next generation, it will be egg on Jensen's face for taking the cheap option with Samsung instead of the industry-leading process node from TSMC.

nVidia spent a lot of time evaluating price, availability, yields and so on; whatever decision was made won't have been made without a lot of planning.

I'd also be careful when it comes to people on the periphery of the industry, as they might not have the complete picture of nVidia's plans here.
 
Caporegime · Joined 17 Mar 2012 · Posts: 47,568 · Location: ARC-L1, Stanton System
It looks like Nvidia messed up, or maybe they planned it this way.

https://www.igorslab.de/en/chipsize-leaked-vidia-ampere-ga102-in-sams-8-nm-lpp-always-more-likely/

Samsung's "8nm" process is really nowhere close to being as efficient as TSMC's 7nm or 7nm+. That's why Ampere is so power-hungry, with several high-end cards featuring TDP specifications of over 300W.

Perhaps Nvidia got cocky and decided to take the cheap option so they could maximise margins.

However, if Nvidia still manages to stay ahead of AMD on a vastly inferior process again, that will be very interesting indeed. On the flip side, if AMD wins the next generation, it will be egg on Jensen's face for taking the cheap option with Samsung instead of the industry-leading process node from TSMC.

[image attachment: vPsWigv.png]
 
Associate · Joined 16 Jan 2010 · Posts: 1,415 · Location: Earth
I guess Nvidia are looking to drip-feed marginal performance gains for as long as they can. It's the best way to drain everyone's wallet repeatedly.
 
Soldato · Joined 18 May 2010 · Posts: 22,372 · Location: London
Well, this only happens when a company feels no pressure.

Look at Intel prior to Ryzen: lazy, lazy increases year on year for, what, a decade?

AMD and the other players need to step up; we need a GPU "Ryzen moment" from another player.
 