
Are NVidia clearing the decks for a new high end card?

Discussion in 'Graphics Cards' started by Kaapstad, Jan 18, 2020.

  1. LoadsaMoney

    Caporegime

    Joined: Jul 8, 2003

    Posts: 28,002

    Location: In a house

    Once you go Black..........

    :D
     
  2. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 12,911

    Location: London

    ...you never go back?

    :p
     
  3. scubes

    Wise Guy

    Joined: May 28, 2007

    Posts: 1,746

    Location: Saturn’s moon Titan

    I think devs are too lazy with current tech. They should maximise what the PC can do, but lazy-ass devs don't think that way anymore..
     
  4. Th0nt

    Soldato

    Joined: Jul 21, 2005

    Posts: 6,754

    Location: N.Ireland

    Lazy-ass devs also don't want to contemplate the off chance that there are users who may want to use multi-GPU.
     
  5. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 12,911

    Location: London

    I think it comes down to time versus reward. The extra time needed to do these things won't generate enough extra sales to make it worthwhile, sadly, imo.
     
  6. EsaT

    Soldato

    Joined: Jun 6, 2008

    Posts: 6,789

    Location: Finland

    Well, if you paid £100+ for a game, I'm sure they would spend more resources on squeezing all the possible graphics candy out of the hardware.
    But because the huge majority of the player base won't pay that much for a game, any more than they'll pay Nvidia's pumped-up high-end prices, developers won't do that.
     
  7. BigBANGtheory

    Hitman

    Joined: Apr 21, 2007

    Posts: 786

    The three big warning signs for me this time around are:

    1. Nvidia selling ray tracing performance improvements over general rendering/rasterisation
    2. Price movement on the tiers relative to tangible performance gains
    3. Delaying a high-end Ti part unnecessarily
     
  8. scubes

    Wise Guy

    Joined: May 28, 2007

    Posts: 1,746

    Location: Saturn’s moon Titan

    To be honest, I think the console ports are a joke. They make them look worse than on the consoles, when the PC could do a hell of a lot better. They should be ashamed of themselves. And the prices of the games themselves are a joke ...
     
  9. BigBANGtheory

    Hitman

    Joined: Apr 21, 2007

    Posts: 786

    Console games are like printer cartridges: sold to cover the licence/tax of the platform.
     
  10. Th0nt

    Soldato

    Joined: Jul 21, 2005

    Posts: 6,754

    Location: N.Ireland

    It's not necessarily about cost though. It is possible to build a game that taps into whatever resources are available - think Ashes of the Singularity, or a tool like Blender. Since the GPU makers have shifted the responsibility for utilising multi-GPU onto the game makers, you could argue the reason we are not seeing good scaling and monster performance improvements is narrow thinking and an industry mindset of churning projects out as fast as possible to move on to the next one (a rough sketch of what that responsibility looks like follows at the end of this post).

    Just as multi-core CPUs are finally catching on (unfortunately adoption does take time), the GPU scene is going to revisit this at some point. To me, though, the whole point of having it was not just for the big spenders who ran SLI Titans because they could; it was about plopping in another GPU of the same model a year or two down the line, when they were much cheaper, and potentially doubling your candy.
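
    To make that concrete, here is a minimal sketch of the kind of per-device bookkeeping an explicit multi-GPU path forces onto the application, written in plain CUDA. The shade kernel and the even split across devices are illustrative assumptions, not any engine's or API's actual code.

        // Minimal multi-GPU sketch: enumerate every device and hand each an
        // equal slice of the work. Purely illustrative, not engine code.
        #include <cstdio>
        #include <cuda_runtime.h>

        // Stand-in for real per-pixel work.
        __global__ void shade(float* px, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) px[i] = px[i] * 0.5f + 0.5f;
        }

        int main() {
            int devs = 0;
            cudaGetDeviceCount(&devs);   // every visible GPU, not just GPU 0
            if (devs == 0) return 0;

            const int n = 1 << 20;
            const int chunk = n / devs;  // naive even split; a real engine must load-balance

            for (int d = 0; d < devs; ++d) {
                cudaSetDevice(d);        // subsequent calls target this device
                float* px = nullptr;
                cudaMalloc(&px, chunk * sizeof(float));
                cudaMemset(px, 0, chunk * sizeof(float));
                shade<<<(chunk + 255) / 256, 256>>>(px, chunk);
                cudaDeviceSynchronize(); // serialised per device for brevity; real code overlaps
                cudaFree(px);
            }
            printf("split %d items across %d GPU(s)\n", n, devs);
            return 0;
        }

    All of the splitting, synchronisation and per-device memory management above is exactly the work SLI's driver used to hide, which goes some way to explaining why few studios take it on voluntarily.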
     
  11. bru

    Soldato

    Joined: Oct 21, 2002

    Posts: 7,192

    Location: kent

    Rumour has it that NVidia's Hopper architecture, which is supposed to come after Ampere, will be an MCM (multi-chip module) design. If that's the case, I think it will work completely differently from the way current SLI does, even though I'm not sure how. But NVidia have plenty of very clever people who are probably working on it already, so I'm sure they will sort it out.:)
     
  12. Chrisc

    Wise Guy

    Joined: Jun 29, 2016

    Posts: 1,122

    Location: Up Norf

    Same, I have a £500 limit. I may go over slightly, but it will have to be something special. I'm hoping the 5800 will be around the £500 mark with a decent increase over a 2070 Super.
     
  13. CuriousTomCat

    Hitman

    Joined: Nov 22, 2018

    Posts: 792

    Exactly, devs are lazy. I think stagnation in the GPU market is a good thing for the gaming industry. It means developers have to spend more time optimising for the GPUs available. Instead of thinking "so our code requires x GPU. So be it, whatever", they'll now have to spend months and months getting their games looking just as amazing on the mainstream GPUs available.
     
  14. Gregster

    Caporegime

    Joined: Sep 24, 2008

    Posts: 37,822

    Location: Essex innit!

    My second Zotac card and, in truth, I'm very impressed. Good cooling, quiet, and it overclocks fairly well. I am becoming a fan of Zotac.
     
  15. Troezar

    Mobster

    Joined: Aug 6, 2009

    Posts: 3,446

    Hopefully, although realistically my 5700 XT is very good at 1440p and I can't see myself going to 4K anytime soon. I may hold out for the 6000 series - 7nm EUV?
     
  16. Chrisc

    Wise Guy

    Joined: Jun 29, 2016

    Posts: 1,122

    Location: Up Norf

    I'm on a Vega 56, which doesn't quite have the legs for a solid 1440p experience. I was tempted by the 5700 XT, but I'm not sure it offers enough of a jump to justify the cost.
     
  17. LoadsaMoney

    Caporegime

    Joined: Jul 8, 2003

    Posts: 28,002

    Location: In a house

    ;) :D
     
  18. aoaaron

    Wise Guy

    Joined: May 19, 2012

    Posts: 2,257

    I agree, especially number 3. I think Nvidia will likely just release the Ti alongside all the other cards now, though, as the likelihood of a 3080 blowing the 2080 and 2080 Ti out of the water is quite low as long as AMD don't bring anything to the table.

    Their pricing strategy was spot on last gen (sadly for us) and probably maximised the number of 2080 Ti owners, to the extent that it's now pretty much normal to have a 2080 Ti on enthusiast forums, whilst before it was still a bit of a rarity.
     
  19. Th0nt

    Soldato

    Joined: Jul 21, 2005

    Posts: 6,754

    Location: N.Ireland

    Yeah, it sounds likely. It's a shame we have had regression over the past couple of years because nobody wants to take the hot potato and do the donkey work to get it working as it should.
     
  20. Poneros

    Mobster

    Joined: Feb 18, 2015

    Posts: 3,497

    Considering they're using TU104 chips for even the 2060 now (the same chip as the 2080, but obviously with parts disabled), I'd say the answer to the title question is a definite yes.