
NVIDIA ‘Ampere’ 8nm Graphics Cards

Discussion in 'Graphics Cards' started by LoadsaMoney, 4 Oct 2019.

  1. robfosters

    Caporegime

    Joined: 1 Dec 2010

    Posts: 40,362

    Location: Welling, London

    Sorry, is the 3080 standard 2X8 pin for power?
     
  2. Kelt

    Capodecina

    Joined: 14 Nov 2007

    Posts: 12,255

    Location: With the færies wearing black cherries for rings

    This is my thinking.

    One really well optimised game is COD WW2.

    I can run that maxed out at 3440x1440 (apart from using FXAA, my old eyes are happy with that) and it maintains an easy locked 60fps.

    Afterburner shows the highest memory usage as around 7.5GB; cached or fully utilised, who knows.

    10GB gives a little more leeway; games might never exceed it in the two or three years I plan to keep it, but it's nice to have anyway.
     
  3. Kelt

    Capodecina

    Joined: 14 Nov 2007

    Posts: 12,255

    Location: With the færies wearing black cherries for rings

    No, some cards are using 3x8.
     
  4. dante6491

    Soldato

    Joined: 30 Jun 2006

    Posts: 6,077

    Location: London

    Likewise. Vega56 to a 3080 I suspect. Would like to be able to fully utilise my LG 27GL850 so something that pushes everything at high detail at above 140fps.
     
  5. Th0nt

    Capodecina

    Joined: 21 Jul 2005

    Posts: 13,924

    Location: N.Ireland

    I want to see AMD equivalent first before I pull the trigger.
     
  6. IvanDobskey

    Sgarrista

    Joined: 2 Feb 2010

    Posts: 9,095

    Location: East Midlands

    Will be waiting for proper reviews and more options before I make any decision.
     
  7. robfosters

    Caporegime

    Joined: 1 Dec 2010

    Posts: 40,362

    Location: Welling, London

    But will those come with an adaptor to convert an 8 pin to 2X8 pin?
     
  8. bicpug

    Gangster

    Joined: 30 Jul 2013

    Posts: 204

    Will we start seeing reviews now, or do they only appear once the cards are on sale?
     
  9. XeNoNF50

    Wise Guy

    Joined: 11 Jan 2016

    Posts: 2,241

    Location: Surrey

    I've genuinely got 13 fans currently in my case lol
     
  10. Poneros

    Soldato

    Joined: 18 Feb 2015

    Posts: 5,542

    Yup. Already sold my V64 & been suffering without it for about 2 months now. I was actually happy with the performance overall, but lacking DX12_2 annoyed me greatly & I want to start using CUDA programs. If you stay at 1440p, imo you don't need more than a 3070.

    The extra speed can help - slightly. It's a question of how much extra VRAM you really need. If you run out of it, the GPU starts juggling more memory around and requests more through system RAM & storage. In practice this means slight stutters, major stutters, or downright freezing - depending on the game & the amount required.
    Worst case scenario (look at HBCC off):
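    The spill-over effect described above can be sketched as a toy model (pure Python; the costs per GB are made-up illustrative numbers, not real GPU timings) - anything that fits in VRAM is cheap to touch, anything spilled over PCIe to system RAM or storage pays a much larger penalty, which is what shows up as frame-time spikes:

    ```python
    # Toy model of VRAM oversubscription. Assets resident in VRAM are cheap
    # to access; anything spilled to system RAM/storage pays a large latency
    # penalty per GB, which appears as stutter. Numbers are illustrative only.

    VRAM_GB = 10          # card's VRAM capacity (e.g. a 3080)
    VRAM_COST_MS = 0.1    # assumed per-GB access cost when resident in VRAM
    SPILL_COST_MS = 3.0   # assumed per-GB cost when fetched over PCIe

    def extra_frame_time_ms(working_set_gb: float) -> float:
        """Extra per-frame cost for touching a working set of this size."""
        resident = min(working_set_gb, VRAM_GB)
        spilled = max(working_set_gb - VRAM_GB, 0.0)
        return resident * VRAM_COST_MS + spilled * SPILL_COST_MS

    # Cost stays flat until the working set exceeds VRAM, then climbs steeply.
    for gb in (7.5, 10, 11, 14):
        print(f"{gb:>5} GB working set -> {extra_frame_time_ms(gb):.2f} ms extra")
    ```

    The point of the model is the kink at 10GB: below it the cost grows gently, above it every extra GB costs an order of magnitude more, which is why the failure mode is stutter rather than a smooth slowdown.
    
    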


    Reviews are probably going to launch when the cards launch, or a day before.
     
  11. robfosters

    Caporegime

    Joined: 1 Dec 2010

    Posts: 40,362

    Location: Welling, London

    Surprised it doesn’t take off!
     
  12. ihatelag

    Wise Guy

    Joined: 13 Apr 2006

    Posts: 1,140

  13. Chuk_Chuk

    Mobster

    Joined: 12 May 2014

    Posts: 2,760

    I believe you end up with hitching and large drops in frame rates.
     
  14. 3t3P

    Mobster

    Joined: 20 Dec 2006

    Posts: 3,749

    N1 cheers
     
  15. Born_2_Kill_83

    Underboss

    Joined: 10 Nov 2006

    Posts: 8,324

    Location: Lincolnshire

    Think I'll go 3080 now and brave the 10GB VRAM (which probably won't matter), since I want Cyberpunk performance, then take a £100-200 hit on my 3080 and upgrade to the Ti when it comes out.
     
  16. ALXAndy

    Capodecina

    Joined: 23 Apr 2010

    Posts: 10,734

    Location: West Sussex

    It's not my problem you cannot find your own posts to back up what you are saying.

    Not sure what I am supposed to have lost, BTW. Please point it out without twisting what Jen or I said and I am all ears.
     
  17. TNA

    Capodecina

    Joined: 13 Mar 2008

    Posts: 19,237

    Location: London

    I have yet to read up about it properly. Wondering if it is the same as AMD's HBCC or something better?

    Based on what I read I may end up with either a 3080 or a 3070S/Ti that has 16GB.

    Wonder if Cyberpunk will make use of it on launch?


    Lol. That cracked me up Gregster. Did you not pay double that around 2 years ago for a much inferior GPU? :p:D

    That said, funnily enough I do kind of agree with you; it should have been kept under £600.


    Haha :D


    I have had that comment made about me a few times from girls. The funny thing is I am not even that tall?


    You finally upgrading that 980 SLI setup?

    What you going for?


    I would not be feeling so sorry for him, likely before the year is out he will be rocking a 4 figure Ampere GPU.


    Yeah yeah. Keep crying. Thought you were gonna stop posting? Lol.
     
  18. shamus21

    Wise Guy

    Joined: 29 Aug 2004

    Posts: 2,218

    Location: Alpha centauri

    Wait for all the benchmarks first philmo
     
  19. deuse

    Capodecina

    Joined: 17 Jul 2007

    Posts: 23,489

    Location: Solihull-Florida


    I did look at the Zotac 3090, but their boost clock is slightly down - not by much.

    I will go with the NV shop.
    Then if things do go pear-shaped, it's only me to NV... NV to me.
    No messing about sending it back through another company.
     
  20. Bravetart

    Gangster

    Joined: 31 Jul 2009

    Posts: 344

    Location: Reading