Nvidia recognises 10gb isn't enough - RTX3080 gets a VRAM upgrade

Discussion in 'Graphics Cards' started by Grim5, 13 Oct 2021.

  1. Mesai

    Hitman

    Joined: 8 Oct 2020

    Posts: 854

    Not sure what the point would be unless it’s to align production with the other 12/24 configs.

    For 4K you’re better off with a 6900xt, 3080ti or 3090.
     
  2. tyler_jrb

    Mobster

    Joined: 24 Aug 2013

    Posts: 4,549

    Location: Lincolnshire

    That will very likely be on a card with more than 10GB though? So it will use more. I've yet to see a card run out of VRAM in my experience (unless we're talking about the botched 970).

    Have seen some gameplay of the 3080 playing FC6 at 4K maxed settings just fine. 8K perhaps? Surely then we're talking slideshow levels of performance even for a Ti/3090.

    I used to play 4K/60 back in the day with SLI'd 4GB 980s, using something stupid like 4076MB/4096MB. Never gave me an issue though and ran a solid 60fps.

    More is better obviously. 10GB should be more than enough targeting 4K/60Hz though.
     
  3. Finners

    Mobster

    Joined: 27 Mar 2009

    Posts: 2,818

    I bought a 3080 knowing that 10GB is/will be on the low side, but at RRP in the current climate it felt like a decent (not great) value-for-money purchase. I'd be very surprised if a 3080 12GB comes in anywhere near £650. Nvidia release Supers or Tis when, I'm assuming, their contracts with the foundries change. TSMC and Samsung have said they are raising prices, so those will of course be passed on to us, on top of the extra 2GB of VRAM.
     
  4. LtMatt

    Caporegime

    Joined: 12 Jul 2007

    Posts: 36,210

    Location: United Kingdom

    Please can you show me that gameplay with a link?
     
  5. Chuk_Chuk

    Mobster

    Joined: 12 May 2014

    Posts: 2,830

    That has less to do with the cost of the extra 2GB on the BOM and more to do with Nvidia regretting they ever released the 3080 at £650.

    Jensen can rest easy knowing they didn't make the mistake of launching the card with 12GB in the first place (assuming the rumours are true).
     
  6. tyler_jrb

    Mobster

    Joined: 24 Aug 2013

    Posts: 4,549

    Location: Lincolnshire

     
  7. LtMatt

    Caporegime

    Joined: 12 Jul 2007

    Posts: 36,210

    Location: United Kingdom

    Last edited: 13 Oct 2021
  8. willhub

    Capodecina

    Joined: 3 Jan 2006

    Posts: 24,020

    Location: Chadderton, Oldham

    And the 12GB 3080 will probably be the price of a 3080 Ti, so might as well go with the 3080 Ti.

    It seems like I was only using about 8GB of VRAM in Far Cry 6, and that was with the HD texture pack.
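    (VRAM figures like the ones quoted in this thread usually come from an overlay or from `nvidia-smi`. As a minimal sketch of pulling the used/total numbers out of that kind of "used MiB / total MiB" text, assuming the line format shown below — the sample line is illustrative, not a real capture:)

```python
import re

def parse_vram(line: str) -> tuple[int, int]:
    """Extract (used_mib, total_mib) from an nvidia-smi style
    'used MiB / total MiB' fragment."""
    m = re.search(r"(\d+)\s*MiB\s*/\s*(\d+)\s*MiB", line)
    if not m:
        raise ValueError(f"no VRAM figures found in: {line!r}")
    return int(m.group(1)), int(m.group(2))

# Illustrative sample row in nvidia-smi's table layout (hypothetical values).
sample = "| 30%   62C    P2   220W / 320W |   8013MiB / 10240MiB |  97% |"
used, total = parse_vram(sample)
print(f"{used} MiB of {total} MiB in use ({100 * used / total:.0f}%)")
```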
     
  9. tyler_jrb

    Mobster

    Joined: 24 Aug 2013

    Posts: 4,549

    Location: Lincolnshire

    I'm guessing said 3080 doesn't have enough VRAM to support the texture pack? Either way, shoddy design work by Ubisoft as usual, lol, looking at the state of those dashboard textures and 30fps cutscenes.
     
    Last edited: 13 Oct 2021
  10. LtMatt

    Caporegime

    Joined: 12 Jul 2007

    Posts: 36,210

    Location: United Kingdom

    Tyler, I've got to be honest with you. Your original post scares me beyond belief and I saw it before your edit.

    Perhaps even more worryingly, your edit is not that much better, although it does move the goalposts a bit.

    I will pray for you tonight regardless.

    [IMG]

    [IMG]

    Those are captures from compressed video, both at 4K.
     
  11. bemaniac

    Mobster

    Joined: 30 Jul 2006

    Posts: 2,993

    I feel pretty annoyed for 3080 customers who run games maxed out at 4K but have to lower textures in Far Cry 6 to medium.

    This was my concern before, during and after the launch, and why I was almost relieved I couldn't secure one: the 3090 was much easier to obtain at launch RRP, and it also seems to have been cost-effective given I've gone above 10GB many times and use the card at least daily. I bought a random brand that was available, and was also lucky it was not an EVGA or a Gigabyte, as there seem to be so many duds to dodge in the GPU industry now.
     
  12. Nexus18

    Caporegime

    Joined: 4 Jun 2009

    Posts: 25,121

    @tyler_jrb

    Worth reading the last 3-4 pages of the Far Cry thread in the GPU subforum to see the "issues", as well as the Ubi thread ;)
     
  13. Darujhistan

    Soldato

    Joined: 28 Oct 2011

    Posts: 5,219

    NV being NV, cheaped out at 10GB, and lo and behold a year later here's the 2GB we robbed you of.

    That will be £900 please.
     
  14. rumple9

    Gangster

    Joined: 28 May 2010

    Posts: 237

    The card is pretty pointless when the 4000 series is out in 12 months.
     
  15. chroniclard

    Capodecina

    Joined: 23 Apr 2014

    Posts: 20,496

    Location: Hertfordshire

    I bought a 3080 for 1440p, should be good for a while. :p
     
  16. Mesai

    Hitman

    Joined: 8 Oct 2020

    Posts: 854

    You can still do 4K, just not Godfall or FC6, which is no real loss. Majority of 3080 owners are unlikely to be 4K users.
     
  17. Nexus18

    Caporegime

    Joined: 4 Jun 2009

    Posts: 25,121

    Can we also please drop the godfall argument too :cry:

    Would hardly say 3080 was struggling here :p

    Oh wait, sorry, according to some "not the correct settings" :cry:
     
  18. LtMatt

    Caporegime

    Joined: 12 Jul 2007

    Posts: 36,210

    Location: United Kingdom

    He still can’t figure it out. :D
     
  19. Nexus18

    Caporegime

    Joined: 4 Jun 2009

    Posts: 25,121

    Post links and educate me then my boi :D
     
  20. mmj_uk

    Caporegime

    Joined: 26 Dec 2003

    Posts: 25,691

    The compression looks a lot worse in the Nvidia shot though, so I'm not sure it's a fair comparison. If you can find a similar situation in the same video, that would take the video compression factor out of it and be a lot more scientific.