
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Discussion in 'Graphics Cards' started by Gregster, Nov 8, 2019.

  1. JediFragger

    Capodecina

    Joined: Oct 18, 2002

    Posts: 22,332

    Location: y0 Momma's a$$

    Yes, the bull****ting industry :D
     
  2. EastCoastHandle

    Wise Guy

    Joined: Jun 8, 2018

    Posts: 1,291

    Last edited: Jun 26, 2020
  3. BigBANGtheory

    Wise Guy

    Joined: Apr 21, 2007

    Posts: 1,219

    Probably not this time around, sadly
     
  4. Grim5

    Suspended

    Joined: Feb 6, 2019

    Posts: 3,588

    DLSS 2.0 does not require per-game training; it's now a generalised AI training system that runs 24/7, learning and adapting - the output is imported into a game patch or the launch files. So in theory, if a developer asked (or paid) for it, Nvidia could go back to an older DLSS game, provide the latest training data and improve the visuals - and if they patched the game frequently, the image quality would keep getting better as time goes on.

    The main downside at this point is that each game still needs to be individually patched/injected to make DLSS 2.0 work. But then there are the rumours of DLSS 3.0, which (in the rumours) doesn't require any game updates and can be injected into any game via the Nvidia driver - so, like game-ready drivers, each new driver would simply contain the DLSS 3.0 profiles to inject into whatever games are supported for that release.

    Hopefully that rumour is true, because that's what the end game is - being able to support all or most games without the developer having to lift a finger.
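    Roughly, the difference between the two models looks like this. A minimal sketch - every name in it is hypothetical, purely to illustrate where the per-title data lives in each scheme; none of it is Nvidia's actual API:

    [CODE=python]
    # DLSS 2.0 style: the network weights/profile ship inside each game's
    # patch, so every title has to be updated individually.
    game_patches = {
        "Control": "weights_2020_06.bin",
        "Death Stranding": "weights_2020_06.bin",
    }

    def upscale_dlss2(game, frame):
        """Upscaling only works if the game itself shipped the weights."""
        weights = game_patches.get(game)
        return apply_network(weights, frame) if weights else frame

    # Rumoured DLSS 3.0 style: the driver carries one set of weights plus
    # per-game profiles, so a driver update alone can light up new titles.
    driver_profiles = {"Cyberpunk 2077", "Fortnite"}
    DRIVER_WEIGHTS = "weights_latest.bin"

    def upscale_dlss3(game, frame):
        """The driver injects upscaling for any profiled game - no game patch."""
        return apply_network(DRIVER_WEIGHTS, frame) if game in driver_profiles else frame

    def apply_network(weights, frame):
        # Stand-in for running the super-resolution network on a frame.
        return f"{frame} upscaled with {weights}"

    print(upscale_dlss2("Control", "frame#1"))        # patched title -> upscaled
    print(upscale_dlss3("Cyberpunk 2077", "frame#1")) # driver profile -> upscaled
    [/CODE]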
     
    Last edited: Jun 26, 2020
  5. humbug

    Caporegime

    Joined: Mar 17, 2012

    Posts: 34,013

    I think he's right - DirectML will become the standard, given Microsoft are making it the standard.

    This seems to me a continuation of the age-old battle of ecosystems. As with PhysX and Gameworks, Nvidia - with their own versions of RT acceleration methods and resolution scaling - are trying to set a standard that locks AMD out, trying to create a world where AMD can't "run games properly", forcing them into the 'basics' section of the graphics supermarket.

    Nothing wrong with that; it's a tactic, and Nvidia have been working very hard on it for about a decade. But it won't work. It was Nvidia who had to run after game devs, paying them to use PhysX and Gameworks, and AMD had to do the same with their own ecosystem - remember TressFX in Tomb Raider? Whatever that audio thing was in Thief? Global Illumination in Dirt Showdown? That was the first time one of these vendors pushed ray-traced lighting technology in a game, and it shows the difference between Nvidia and AMD when it comes to marketing: Nvidia are very good at communicating what something is and why it matters..... AMD just go "We brought something a bit left-field to the masses, yay PC socialism".

    Despite AMD's ineptitude in marketing, Nvidia for all their brilliance haven't managed to get any ecosystem they created to stick either, because game developers just don't go in for vendor-specific features unless they are paid to do so by said vendor - even with Nvidia screaming "But, but..... we own nearly the whole market, AMD don't exist!"

    DirectML is different, because the thing game developers care most about is consoles - that's where they get the most sales - and if Microsoft, the ones with the real influence, tell them "use DirectML to get your games looking good and running well on the consoles", that is what they will do.

    Nvidia will get a lot of attention when they megaphone their brand connection with Cyberpunk 2077, waxing lyrical about their ecosystem in the game........ and only Nvidia fans will care!
     
    Last edited: Jun 27, 2020
  6. CAT-THE-FIFTH

    Capodecina

    Joined: Nov 9, 2009

    Posts: 19,721

    Location: Planet Earth

    The "next generation" console version of Cyberpunk 2077 is only coming in 2021 according to CDPR. So this delay is more for Ampere. So I have little faith that Pascal,Turing or RDNA1 GPUs will look OK at launch. I would loved to be proven wrong,but I suspect Ampere will be the GPU of choice for the game. Then in 2021,probably things will look "better" for others. Nvidia will make sure they will have a magical driver fix months later,and CDPR some magical update during the same period.
     
  7. Besty

    Mobster

    Joined: Oct 18, 2002

    Posts: 3,471

    I don't know what the preview articles said, but given history, I think this new AMD card (5900XT) will be 2080 Ti speed for £499 inc VAT. I doubt it will be 15% quicker than a 2080 Ti, but I don't really care if it is.

    Considering the cheapest dual-fan 2080 Ti I can find is £1,077 inc VAT.
     
  8. Grim5

    Suspended

    Joined: Feb 6, 2019

    Posts: 3,588

    Oh geez
     
  9. CAT-THE-FIFTH

    Capodecina

    Joined: Nov 9, 2009

    Posts: 19,721

    Location: Planet Earth

    CDPR are on record saying the version for the new generation consoles - i.e. the one made with RDNA2 in mind - is coming in 2021. So the delay isn't for the new consoles, it's for Ampere. Then look back at the launch of W3. Maxwell did the best by far, but Kepler and GCN-based cards had problems. These took months to solve, because CDPR had basically put in stuff made specifically with Maxwell in mind just two months before release. Months later, Nvidia released "better drivers" and CDPR put in extra menu options, etc. Nvidia even bundled W3 with some of its Maxwell GPUs.

    So IMHO Ampere is the priority for Cyberpunk 2077 (at least in 2020), and the other GPUs will have to wait. Don't you think it's weird that CDPR is making the RTX 2080 Ti look rather weak in its demos? They are actively working with Nvidia on this game, so their marketing wouldn't want to undersell their best GPU months before launch. Now imagine if they demo the new RTX 3080 Ti and it's twice as quick... I see where this is going.

    I'd love to be wrong, but money talks! :p
     
    Last edited: Jun 27, 2020
  10. EastCoastHandle

    Wise Guy

    Joined: Jun 8, 2018

    Posts: 1,291

    I wouldn't consider any game on the PC that was specifically released for hardware I didn't have.
     
  11. Calin Banc

    Wise Guy

    Joined: Aug 14, 2009

    Posts: 1,061

    Played on an R290, and I can't remember any major issue except the AA/tessellation issue with Gameworks. Tessellation already had a solution at driver level: AMD at the time provided, through their own control panel, an option to reduce tessellation to levels selected by the gamer.
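    Conceptually the override is trivial - the driver just clamps whatever factor the game requests to a user-selected cap. A sketch of the idea only, not AMD's actual driver code:

    [CODE=python]
    # Sketch of a driver-level tessellation override: clamp the factor the
    # game requests to a cap the user picks in the control panel.
    USER_TESS_CAP = 16   # e.g. selectable as 8/16/32/64 by the gamer

    def effective_tess_factor(requested: int) -> int:
        """Tessellation factor the GPU actually ends up using."""
        return min(requested, USER_TESS_CAP)

    print(effective_tess_factor(64))  # a 64x Gameworks request gets capped -> 16
    print(effective_tess_factor(8))   # sane requests pass through -> 8
    [/CODE]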

    I'm not sure why CDPR would deliberately cripple a huge user base for a tiny bit of money. Why not just make the game truly next-gen, go all-in on RT, and not sell it to current console players or players outside the RTX 2xxx series? :)
     
  12. LePhuronn

    Mobster

    Joined: Sep 26, 2010

    Posts: 4,989

    Location: Stoke-on-Trent

    Eh? So CDPR shouldn't hold back Cyberpunk purely for Ampere, but should restrict it purely to 2080 Ti owners?

    I'm not sure if it's this new coffee I'm trying or your astounding logic that's making my head spin with confusion.
     
  13. CAT-THE-FIFTH

    Capodecina

    Joined: Nov 9, 2009

    Posts: 19,721

    Location: Planet Earth

    They could lock it down, but it's much easier to let people try to run it with what they have, find it doesn't perform that great, and buy a new GPU. These are the people who obviously want to run it as soon as possible and can't wait.

    Then they wait until all the fans have bought it, fix performance a bit for the others, and mop up the sales from the second wave of buyers, etc.

    Well, they did it with W3. So many bought new Maxwell GPUs to play it. They said it was delayed by three months for bug fixing, then two months before launch dropped in Gameworks features, some of which replaced stuff that was done via other methods (look at the consoles). All that did was add even more "bugs" to the game.

    AMD had some words to say about it at the time.

    W3 was also bundled with Maxwell GPUs - it was a big selling point to buy Maxwell. Then, months later, Nvidia introduced fixed drivers and CDPR made options available in the menus.

    Despite the delay, CDPR were more worried about plonking in Gameworks stuff than fixing performance on Kepler. They knew very well what state the game was in, and as developer and publisher didn't seem to care about the huge installed user base of Kepler owners; AMD owners luckily had driver hacks to minimise the damage.

    CDPR specifically said the next generation console version of Cyberpunk 2077 will be a 2021 release, and since we know those consoles will probably launch this year, if CDPR had financial motivations they should be working on that too, not delaying it. So the next generation consoles will (apparently) be running the current generation console version:

    https://www.videogameschronicle.com/news/full-next-gen-cyberpunk-2077-wont-be-a-launch-game/

    So they are prioritising the PC version over the next generation console version, and loads of people are going to be buying those consoles. It wouldn't surprise me if more of the consoles are sold in 2020 than some of these high-end next generation GPUs.

    The lack of noise from AMD is concerning - I think they know this is going to be a heavily Ampere-focused title at launch.

    You could ask the installed-user-base question about so many features, though, which ran very poorly on the majority of GPUs. UE4 without optimisation runs better on Nvidia, as it has Nvidia-specific features at the engine level (unoptimised UE4-based titles can show some of the biggest performance deltas in favour of Nvidia). Yet you could argue consoles are AMD-based, and AMD still has a decent share of the dGPU market (there is a UE4 branch for AMD which was recently introduced).

    You talk about tessellation - it was shown years ago that Nvidia was overusing tessellation for many effects. When AMD started introducing driver-level optimisations they were attacked for cheating, IIRC, but it showed how overused it was.

    Yet many Nvidia users couldn't adjust tessellation; I still remember my HD5850, with some tessellation tweaks in software, running some tessellation-heavy games better than a GTX 460, etc. The same with Gameworks features - the fact AMD had to push driver hacks shows you how poorly optimised these features were for most of the existing user base. The same goes for PhysX: they made the CPU branch run with x87 instructions, which hammered CPU performance, so they could show it ran better on a GPU - yet it ran really poorly on many GPUs, especially mainstream ones. In some games, with PhysX deactivated, ALL the physics effects would disappear (people on here did some analysis), even the normal CPU-generated ones. Basically it screwed people over, especially Nvidia users.
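    As a loose analogy for the x87 point - scalar legacy instructions versus packed/vectorised ones doing identical maths - here is the same per-object update done one element at a time and then in bulk. Hypothetical workload, but the mechanism is the point: same results, very different throughput.

    [CODE=python]
    # Same physics maths twice: element-by-element (like a legacy scalar x87
    # code path) vs vectorised in bulk (like packed SSE). Identical output,
    # very different throughput.
    import time
    import numpy as np

    pos = np.random.rand(100_000).astype(np.float32)
    vel = np.random.rand(100_000).astype(np.float32)
    dt = np.float32(1.0 / 60.0)

    t0 = time.perf_counter()
    scalar = [p + v * dt for p, v in zip(pos, vel)]   # one value at a time
    t_scalar = time.perf_counter() - t0

    t0 = time.perf_counter()
    vectorised = pos + vel * dt                       # whole array at once
    t_vector = time.perf_counter() - t0

    print(f"scalar: {t_scalar:.4f}s  vectorised: {t_vector:.4f}s")
    # The gap is typically an order of magnitude or more - the same kind of
    # free performance the old x87 CPU path was leaving on the table.
    [/CODE]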

    Why do you think this is done? To sell more next generation GPUs and higher-end GPUs - and too many developers seem fine with it, so whatever is being provided is good enough for them to go along. This is also the very reason consoles seem to do "much better" than you would expect from their limited hardware: developers try their best to get the most out of them. On PC, both AMD and Nvidia want to sell more GPUs, so it's not in their interest to optimise for old ones if they can get away with it.

    Sponsorships are not just for "brand awareness" but also a vehicle to sell more GPUs. I wouldn't be surprised if Ampere has Cyberpunk 2077 bundled with it.

    I could be wrong - I hope I am wrong, and the game runs fairly OK on Pascal, Turing, RDNA1, RDNA2 etc. - but I am not going to bet on it!
     
    Last edited: Jun 27, 2020
  14. weldon855

    Hitman

    Joined: Oct 21, 2013

    Posts: 845

    Location: Ild

    Can we merge this thread and the Ampere one and just call it Cyberpunk?
     
  15. humbug

    Caporegime

    Joined: Mar 17, 2012

    Posts: 34,013

    Tom's Hardware wrote an article many years ago on PhysX in Metro 2033, comparing PhysX performance on a Phenom II 1100T vs a GTX 480: the 6-core Phenom ran PhysX better than the GTX 480, with a chart showing all 6 CPU cores evenly loaded.

    From that point on, Nvidia changed their approach to PhysX running on CPUs and gimped the crap out of it.
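    The underlying point is easy to see: rigid-body-style updates are independent per object, so the work splits cleanly across however many cores you have, which is why the chart showed all six Phenom cores evenly loaded. A toy sketch of that scaling - hypothetical workload, nothing to do with the real PhysX SDK:

    [CODE=python]
    # Toy illustration of CPU physics scaling across cores: each body's
    # update is independent, so a pool of 6 workers loads 6 cores evenly.
    from multiprocessing import Pool

    DT = 1.0 / 60.0
    GRAVITY = -9.81

    def step_body(body):
        """Integrate one rigid body for a single 60 Hz tick."""
        x, y, vx, vy = body
        vy += GRAVITY * DT
        return (x + vx * DT, y + vy * DT, vx, vy)

    if __name__ == "__main__":
        bodies = [(0.0, 100.0, 1.0, 0.0)] * 60_000
        with Pool(processes=6) as pool:   # one worker per core
            bodies = pool.map(step_body, bodies, chunksize=10_000)
        print(bodies[0])
    [/CODE]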
     
  16. EastCoastHandle

    Wise Guy

    Joined: Jun 8, 2018

    Posts: 1,291

    @CAT-THE-FIFTH
    Well said. A lot of detailed examples.
    I forgot about that. It's not hard to believe that CP2077 performance on Big Navi will also be sabotaged. :eek::eek::eek:.
     
  17. CAT-THE-FIFTH

    Capodecina

    Joined: Nov 9, 2009

    Posts: 19,721

    Location: Planet Earth

    It worries me that the version for the new generation consoles is delayed, as it would help with RDNA2 optimisations. I really hope AMD, with its new-found riches, understands it needs to get onboard with CDPR and devote resources to this game. It will be a relevant game in reviews for years - W3 is still popular today, especially due to the replayability of open-world RPGs.

    If not, Big Navi will be a big flop, as the whole tech press will concentrate on Cyberpunk 2077 performance for years.

    Sorry! :( I just feel Big Navi will end up a big flop if Cyberpunk 2077 does not run well on it. AMD really needs to get onboard and make sure it runs OK, and I really hope they understand how damaging poor performance in Cyberpunk 2077 would be for the perception of RDNA2-based GPUs.
     
  18. LoadsaMoney

    Caporegime

    Joined: Jul 8, 2003

    Posts: 28,865

    Location: In a house

    I reckon these specs for it are well off :p

     
  19. humbug

    Caporegime

    Joined: Mar 17, 2012

    Posts: 34,013

    Of course it will. I don't know, but IMO Nvidia will have practically paid for the game's development and stuffed it full of their crap, sabotaging not just performance on Navi but, to a lesser extent, even their own GPUs, to make you buy the more expensive cards to get the performance you want.

    This sort of crap has been going on for a long time, and AMD have been guilty of doing it too; it's just that Nvidia have more money and are better at it. It's why I hate black-box vendor ecosystems: we always lose, and the really ##### up thing about it is that most of us think we are winning through loyalty to this or that brand.

    AMD pushing consoles to be ever more competitive with desktops is both a blessing and a curse. They are - albeit for their own selfish reasons - making sure the new 'agnostic' technologies they are developing filter into desktop, and AMD are good, very good, at streamlining advanced technologies so they work, and work well, with minimal hardware resource expenditure, which should keep hardware prices down. But more and more people will simply move over to console, which shrinks the market, driving up prices.

    AMD haven't yet given up on the desktop enthusiast space - they are still trying to compete here - but it's also clear they no longer see their future here.
     
  20. CAT-THE-FIFTH

    Capodecina

    Joined: Nov 9, 2009

    Posts: 19,721

    Location: Planet Earth

    Maybe it will magically run well on all GPUs and be massively scalable. You can spend your Ampere money on gold-plating your RTX 2070 then, propa big money style! :p

    It's just really weird - you don't see many of these really big games sponsored by AMD anymore, so maybe they are just minimising their expenditure on desktop. It wouldn't surprise me if RDNA2 was actually instigated by requirements MS and Sony had, and we are just getting a refreshed line because it was cost-effective to do so.