
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Discussion in 'Graphics Cards' started by Gregster, Nov 8, 2019.

  1. random_matt

    Hitman

    Joined: Jul 30, 2012

    Posts: 676

    Something had better land this June - I need to buy.
     
  2. Boot$y

    Associate

    Joined: Jan 10, 2020

    Posts: 5

    Same here, I was looking forward to an upgrade before Cyberpunk.
     
  3. Jacky60

    Gangster

    Joined: Jan 16, 2010

    Posts: 483

    Location: Earth

    AMD should just release the card ASAP without any fanfare or song and dance, otherwise Nvidia will p on their parade if they have any warning at all.
    I suspect Nvidia already has a card ready to mess up any launch, as their Turing process is so mature, but I hope AMD get a few weeks before any such launch,
    never mind Ampere. Giving six months' notice of a product release when you're the underdog seems naïve at best.
     
  4. 4K8KW10

    Soldato

    Joined: Sep 2, 2017

    Posts: 7,455

    I was also surprised by the lack of any news on ray tracing and the new hardware capabilities, except for the new SmartShift feature.

    But it is easy to explain: they would not like to negatively impact the sales of their current offerings.

    Do not forget that Nvidia has 7 or 8 cards which are faster than the RX 5700 XT.
     
  5. BS Dave

    Mobster

    Joined: Oct 19, 2004

    Posts: 2,771

    Location: London

    Whatever happens, Nvidia will want to release before the next-gen consoles. If they come out and are less than, say, a 2070 Super, I can see a lot of people jumping ship.
     
  6. Panos

    Capodecina

    Joined: Nov 22, 2009

    Posts: 12,649

    Location: Under the hot sun.

    Given where prices are going atm, NV & AMD would have to sell their flagships in the sub-£700 range. That won't happen due to yields & costs, especially if NV goes for over-520mm2 chips. They cannot go smaller without removing the RT & Tensor cores while maintaining at least the same performance for the "2080Ti" equivalent.
    Yes, they could go for RT-capable CUDA cores, which makes sense for next year's MCM GPUs (like AMD going for everything integrated with RDNA2), but that would make all Turing RTX cards redundant in a single stroke, as nobody is going to keep developing for the dedicated custom RT & Tensor cores of "past GPUs" rather than "future GPUs".

    Which means the next round of consoles is going to be a bargain, especially the top-end Xbox (it seems there will be 2 of these), as it would guarantee a 5-year life cycle of stable performance, with visuals getting better and more optimized as time goes by, in DX12 also.

    Also, everything is in the details. MS is targeting 8K 30fps and true 4K 60fps. Sony doesn't have the "true 4K" target, so it could more likely use AMD GPU scaling and RIS, reducing the cost by not having to manufacture something bigger than an RX 5700 (no XT).

    <Sarcasm>
    And SONY has already started the marketing with "30% less power than competitor". I won't be surprised if they hire Greta to point fingers at bad American corporations using more electricity and destroying the planet :D
    </Sarcasm>

    And that might also explain the gap between the 2 consoles.
     
  7. Doogles

    Mobster

    Joined: Mar 22, 2014

    Posts: 3,917

    Talking of consoles, I remember when they (Xbox One/PS4) promised 1080p/60fps; we all know the best they managed in most titles was 1080p/30fps or 720p/60fps. With the new consoles I'm sure some titles will run 4K/60fps, but only the undemanding/ported older games; within a year or two you'll see 1080p/60fps being the goal, with upscaling to 4K.

    Even if Ampere isn't that great, it'll still decimate anything the consoles run; it's just the pricing that might be an issue. Hopefully RTX 2080 Ti performance for £499 is possible, and maybe if AMD can do something to bring competition back into this market we'll start seeing massive gains again.
     
  8. humbug

    Caporegime

    Joined: Mar 17, 2012

    Posts: 32,715

    The RTX 2080 released at £750 with GTX 1080 Ti-level performance. I very much doubt we will see RTX 2080 Ti-level performance for £500 in the next generation of GPUs - this is Nvidia....
     
  9. Grim5

    Mobster

    Joined: Feb 6, 2019

    Posts: 2,612

    PC hardware will always be faster than consoles; it's as guaranteed as the sky being blue - except in the UK, where it's mostly grey, but blue like the sea ;)
     
  10. Panos

    Capodecina

    Joined: Nov 22, 2009

    Posts: 12,649

    Location: Under the hot sun.

    Which again is grey in the UK, given the clouds :p
     
  11. Doogles

    Mobster

    Joined: Mar 22, 2014

    Posts: 3,917

    Very true, but at this point AMD still hasn't had anything to compete with the top end for years. If they manage to pull a Ryzen in the GPU sector you'll see prices freefall; if you'd said in 2015 that you could get an 8-core CPU for under £100 on a £40 motherboard, you'd have been laughed at! While the R9 290X was a loud illegitimate child, at least it was better than a 970 and able to get on par with the 980.
     
  12. 4K8KW10

    Soldato

    Joined: Sep 2, 2017

    Posts: 7,455

    There are some discrepancies between the results in review 1 and review 2 - R9 290X vs GTX 780.
    Later, the R9 Fury X was on par with the fastest GeForce, the GTX 980 Ti.

    edit: or the uber mode confused me a bit......

    https://www.techpowerup.com/review/amd-r9-290x/27.html

    https://www.techpowerup.com/review/nvidia-geforce-gtx-780-ti/27.html

    https://www.techpowerup.com/review/amd-r9-fury-x/31.html
     
  13. KungFuSpaghetti

    Wise Guy

    Joined: Apr 7, 2017

    Posts: 1,130

    Who cares? That's all history and a far cry from today's landscape. Even on 7nm, the best AMD manage is the Radeon VII and the 5700 XT, which is still slower than some 10-series cards. That's the sad reality, hence the utter gouging by Nvidia.
     
  14. Doogles

    Mobster

    Joined: Mar 22, 2014

    Posts: 3,917

    https://youtu.be/0G5oe-Cc994?t=251

    What I'm saying is that in some games, and also later on (because Nvidia doesn't support older cards so well with driver updates), the 290X was able to rival the 980 while being cheaper. It was priced to compete with the 970 (which the 290X beat easily) and it could get on par with the 980 - not in every game, but pairing a 980 vs a 290X now, the 290X will come out on top more often than not, because Nvidia stopped supporting their older cards with performance updates.

    The 290X was never a true high-end card that could completely dominate Nvidia's flagship, but it was enough to give people the option of buying from a competitor without having to worry too much about performance loss. Now, with the 2080S/2080 Ti, AMD has nothing to compete against them; even the 5700 XT (ignoring the driver problems) struggles to get near the 2070S, which would've been beaten back in 2014 by the 290 for less.

    The Fury was a whole different story: limited heavily by the 4GB of VRAM, and HBM being difficult for AMD to buy (also expensive) didn't help at all. On top of those issues, some of those reference AIO coolers had pump-noise problems which put people off.

    Uber mode was on the 2nd BIOS (IIRC), which meant the reference fan spun faster to keep the card cool, as those reference coolers were terrible; quiet mode was on the 1st BIOS (again IIRC), which made the fan run slower but also hit performance because it ran into that 94C limit :D
     
  15. Phixsator

    Mobster

    Joined: Oct 10, 2012

    Posts: 3,272

    I'm sorry, what? The 5700 XT is right there nipping at the butt hairs of the 2070S, and for a whole lot less money. Techspot had it just 2% slower on average at 1440p than the 2070S back in July, so "struggles" is a bit of a strong word imho, even if Forza isn't an instant win for AMD anymore.


    EDIT: Techspot not HWUB.
     
  16. james.miller

    Capodecina

    Joined: Aug 17, 2003

    Posts: 18,058

    Location: Woburn Sand Dunes

    Techspot's review of the 970 in 2019 was an interesting read: 1% slower than the R9 290 (non-X) over 33 games, and noted for its more consistent performance and lower power requirements. If the 970 is 1% slower than a non-X, where does that put it in relation to the 290X? 4 or 5% behind? That's not knocking on the door of the 980; 980s were around 30% faster than 970s.

    Don't forget the 290/290X came first, and they were $500 cards originally. They weren't priced to match the 970 until the 970 was released, and at the time performance was pretty close; so although AMD did drop the prices of the 290s, it was Nvidia who forced AMD to drop their prices to that level.
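    The 1%-vs-4-or-5% reasoning above can be sanity-checked with some back-of-envelope arithmetic. A quick sketch in Python; the percentage deltas are this post's claims, plus an assumed ~4% lead of the 290X over the 290, not measured data:

    ```python
    # Relative-performance arithmetic, normalized to the R9 290 (non-X) = 1.00.
    # All deltas are the post's claims / assumptions, not benchmark results.
    r9_290 = 1.00             # baseline: R9 290 (non-X)
    gtx_970 = r9_290 * 0.99   # "1% slower than the R9 290" over 33 games
    r9_290x = r9_290 * 1.04   # assumed: 290X roughly 4% ahead of the 290
    gtx_980 = gtx_970 * 1.30  # "980s were around 30% faster than 970s"

    # How far the 970 trails the 290X, as a percentage of the 290X.
    gap_vs_290x = (r9_290x - gtx_970) / r9_290x * 100
    # How far the 980 leads the 290X under the same assumptions.
    gap_980 = (gtx_980 - r9_290x) / r9_290x * 100

    print(f"970 trails the 290X by ~{gap_vs_290x:.1f}%")  # ~4.8%
    print(f"980 leads the 290X by ~{gap_980:.1f}%")
    ```

    Which lands right in that "4 or 5%" range for the 970 vs 290X, and still leaves the 980 well clear of both.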
     
  17. pandem0nium

    Mobster

    Joined: Dec 5, 2010

    Posts: 2,739

    Location: Solihull

    The new high end really needs HDMI 2.1 and VRR support.

    Remember the Fury X only having HDMI 1.4? And AMD recommending a DisplayPort adapter to connect the card to 4K 60Hz TVs - an adapter which took a good 6+ months to come to market, IIRC.
    Then the Vega cards lacked HDCP 2.2 support even though they had HDMI 2.0.
     
  18. Sargatanas2511

    Mobster

    Joined: Oct 26, 2013

    Posts: 3,299

    Location: Scotland

    I wouldn't say that's the best AMD can manage; it's just what they are willing to release while focusing more heavily on their CPU segment. Plus there's a long time left on 7nm for AMD to bring out much better cards.

    The 5700 XT is not far at all behind the 1080 Ti either - the gap is pretty much negligible now.
     
  19. JediFragger

    Capodecina

    Joined: Oct 18, 2002

    Posts: 21,932

    Location: y0 Momma's a$$



    Got mine at 2GHz with a small mem overclock too, does well :cool:
     
  20. Jacky60

    Gangster

    Joined: Jan 16, 2010

    Posts: 483

    Location: Earth

    Noteworthy that the only card of the three you can use in mGPU is the 1080 Ti. 1080 Ti SLI pi$$es all over all the alternatives where it's supported, at a far better price. I'm not surprised mGPU is no longer supported; it's the best way to get decent fps at a decent price, hence the duopoly won't allow it. AMD/Nvidia (in no particular order) are having a laugh by dropping support for the best-value option in the GPU space. I'm running RDR2 all ultra except MSAA at 2560x1440 at 79fps. That's with two cards from March 2017! Both of mine steaming ahead at 1962/1949MHz core clocks consistently in games.