
Nvidia recognises 10GB isn't enough - RTX 3080 gets a VRAM upgrade

Discussion in 'Graphics Cards' started by Grim5, 13 Oct 2021.

  1. Grim5

    Sgarrista

    Joined: 6 Feb 2019

    Posts: 9,220

  2. LtMatt

    Vendor Rep

    Joined: 12 Jul 2007

    Posts: 36,489

    Location: United Kingdom

  3. HRL

    Wise Guy

    Joined: 22 Nov 2005

    Posts: 2,410

    Location: Devon

    And there was me thinking Nvidia knew what they were doing. Now I’m not so sure!

    Would be a bit annoyed if I had a 3080 and they revised the card, added more VRAM and released it as a replacement SKU.
     
  4. robfosters

    Caporegime

    Joined: 1 Dec 2010

    Posts: 41,751

    Location: Welling, London

    Doesn’t bother me. I’m a UW gamer and 10GB does just fine.
     
  5. HRL

    Wise Guy

    Joined: 22 Nov 2005

    Posts: 2,410

    Location: Devon

    Fair enough, but it looks like Nvidia now disagree.
     
  6. varkanoid

    Capodecina

    Joined: 31 Dec 2007

    Posts: 12,606

    Location: The TARDIS, Wakefield, UK

    It's called a Super; they have done it before. It's not like it's a surprise.

    Is that a typo?
     
  7. robfosters

    Caporegime

    Joined: 1 Dec 2010

    Posts: 41,751

    Location: Welling, London

    I do think, though, that upgrades like these should be offered as a reasonably priced step up for 3080 owners.
     
  8. HRL

    Wise Guy

    Joined: 22 Nov 2005

    Posts: 2,410

    Location: Devon

    Is it? Understandable if it's branded a Super instead, but I read it as them just replacing the SKU with the upgraded VRAM.
     
  9. shankly1985

    Capodecina

    Joined: 25 Nov 2011

    Posts: 20,337

    Location: The KOP

    £££

    Does anyone really believe Nvidia didn't know how much VRAM is required? Really, Nvidia, you work with game developers every day.
     
  10. Grim5

    Sgarrista

    Joined: 6 Feb 2019

    Posts: 9,220


    Nvidia has done it before - the GTX 1060, for example.
     
  11. LtMatt

    Vendor Rep

    Joined: 12 Jul 2007

    Posts: 36,489

    Location: United Kingdom

    Good spot, maybe it's a 2060 Super with 12GB?
     
  12. Bill Turnip

    Gangster

    Joined: 1 Oct 2020

    Posts: 203

    10 or 12GB, you'd almost certainly need to turn down the same settings whichever card you go for, so unless they discontinue the 3080 completely (it's arguable they basically have already) and raise the price, I don't see the point. Think I've spotted the point, now I've written it down...
     
  13. dualsense1673

    Perma Banned

    Joined: 30 Sep 2021

    Posts: 145

    Location: Minas Morgul

    yup
    2GB 960 (the 4GB 1050 Ti managed to make use of the full 4GB, so the 960 could have too)
    3.5GB 970
    2GB 770 + 3GB 780
    3GB alternate 1060
    4GB 3050
    8GB 3070
    10GB 3080

    list goes on

    their tricks never end :)

    no proper DX12/Vulkan support, hardware- or software-wise, for Kepler and Fermi
    semi-DX12-capable Maxwell cards
    gimped async support for Pascal cards - fake async that saw no benefit from actual async compute in games
    gimped FP16 on Pascal so their workstation/compute cards can get away with it
    no sampler feedback streaming hardware unit on Ampere/Turing cards (they're practically not capable of the same things as the PS5/Xbox Series X)

    important distinction: sampler feedback and sampler feedback streaming are completely separate things. sampler feedback streaming IS the actual technology that saves memory; sampler feedback is just one part of it. there needs to be dedicated hardware - the PS5 and Series X have it now. RDNA2 is also gimped in this aspect. I don't know how crucial, beneficial or integral this tech will be for future console games, but we shall see about that.

    and of course, sampler feedback itself is gimped in Ampere and Turing cards - only feature level 0.9, while all the other next-gen counterparts (RDNA2, PS5, Xbox Series X) have 1.0 sampler feedback.

    I'm pretty sure there are more hidden gimps behind the scenes that are yet to be revealed.
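    The memory-saving argument for streaming can be made concrete with some back-of-the-envelope arithmetic. The sketch below is illustrative only - the texture size, compression rate (BC7 at 1 byte per texel), and the assumed 15% resident fraction are assumptions, not measurements from any real game or GPU:

    ```python
    # Back-of-the-envelope sketch: why feedback-driven texture streaming
    # can shrink residency. Illustrative numbers, not measured figures.

    def mip_chain_bytes(width: int, height: int, bytes_per_texel: float) -> int:
        """Total bytes for a full mip chain (each level is 1/4 the previous)."""
        total = 0
        while width >= 1 and height >= 1:
            total += int(width * height * bytes_per_texel)
            if width == 1 and height == 1:
                break
            width = max(1, width // 2)
            height = max(1, height // 2)
        return total

    # A 4096x4096 BC7-compressed texture uses 1 byte per texel.
    full = mip_chain_bytes(4096, 4096, 1.0)

    # With sampler-feedback-driven streaming, only the tiles/mips the shader
    # actually sampled stay resident. Assume (illustratively) 15% is needed.
    resident = int(full * 0.15)

    print(f"fully loaded : {full / 2**20:.1f} MiB")
    print(f"streamed     : {resident / 2**20:.1f} MiB")
    ```

    Multiply that gap across the hundreds of textures in a scene and it's clear why the poster treats streaming hardware as significant for memory budgets.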
     
  14. Jay-G25

    Gangster

    Joined: 8 Sep 2020

    Posts: 452

    10GB/12GB is absolutely fine for 4K unless you are including Far Cry 6 - are there any other games? I am playing on a 3090, and over the last year, with all the games I have played at 4K ultra settings, I have never seen actual VRAM use go over 10GB; most will allocate around 10GB but only use 8 to 9GB while playing. Going forward it may cause some issues, but as of now there are very, very few games that actually require over 10GB at 4K settings, in my experience.
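    The allocated-versus-used distinction above is worth spelling out: engines typically reserve a large pool up front and sub-allocate from it, so monitoring tools report the reservation, not the working set. A toy sketch of that pattern (a hypothetical class for illustration - no real engine's API is implied):

    ```python
    # Toy illustration of why "VRAM allocated" overstates "VRAM used":
    # an engine grabs a big pool up front, then sub-allocates from it.
    # Hypothetical class for illustration only.

    class TexturePool:
        def __init__(self, budget_bytes: int):
            self.budget = budget_bytes   # what monitoring tools tend to report
            self.used = 0                # what the frame actually touches

        def upload(self, nbytes: int) -> None:
            if self.used + nbytes > self.budget:
                raise MemoryError("pool exhausted")
            self.used += nbytes

    GiB = 2**30
    pool = TexturePool(budget_bytes=10 * GiB)  # engine reserves ~10 GiB
    pool.upload(int(8.5 * GiB))                # frame actually streams ~8.5 GiB

    print(f"allocated: {pool.budget / GiB:.1f} GiB, used: {pool.used / GiB:.1f} GiB")
    ```

    A tool that reads the pool size shows 10GB "in use" even while the frame only touches 8.5GB, which matches the allocate-10/use-8-to-9 pattern described above.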
     
  15. _Cereal_

    Gangster

    Joined: 23 Oct 2019

    Posts: 476


    If you got a 3080 FE for 650, there's no reason to be annoyed at all, since there aren't any alternatives in that price range and it's still a bloody good card.
     
  16. Shaz12

    Hitman

    Joined: 25 Apr 2017

    Posts: 828

    That 2GB isn’t anywhere near material enough, to be honest. I doubt 3080 owners are annoyed. It’s more likely that 3080 Ti and 3090 owners would be irritated by the 3090 Ti.
     
  17. Perfect_Chaos

    Mobster

    Joined: 26 Aug 2004

    Posts: 4,818

    Location: South Wales

    Would rather knock the settings down slightly for higher frame rates anyway.
     
  18. Bill Turnip

    Gangster

    Joined: 1 Oct 2020

    Posts: 203

    It'll be interesting to see the RRP increase for 2GB of RAM though...
     
  19. Alt0153

    Gangster

    Joined: 11 Jan 2021

    Posts: 101

    At least $100 per GB, cause we all know that's the BOM cost.
     
  20. RavenXXX2

    Capodecina

    Joined: 6 Oct 2007

    Posts: 19,370

    Location: North West

    10GB Geforce is enough for me, will take that over a 16GB Radeon any day of the week.