
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status
Not open for further replies.
Caporegime
Joined
18 Oct 2002
Posts
32,618
My bad. So the RT cores are useful, but it sounds like only in an RTX context. Tensor cores are a waste of space.

RT cores could be used for things like 3D sound.

The tensor cores provide rapid 16-bit float support which would otherwise require additional CUDA cores. They also get used in professional apps, and the newest DLSS is actually pretty decent.
 
Associate
Joined
21 Apr 2007
Posts
2,485
I'm getting a bit bored with the same tedious negative responses that get trotted out by people, either because they don't understand what is going on and/or because AMD can't do it yet. When you actually see the techniques in action in Quake 2 RTX and understand how they can be applied to more modern applications, it is almost mind-blowing that we are pretty much there now, not still waiting for another 10 years.

Not really sure what that means tbh, but you can hardly expect a group of consumers to be wowed by "some tech", no matter how innovative, when it costs a lyrical **** ton and delivers so poorly. I get that RTX and ray tracing could be a very good thing, and when it's ready, affordable and supported I'll happily buy into it, but don't ask me to separate a vision of things to come from the 'pay now, get garbage' elephant in the room. It's basically a similar problem to early-adoption VR, only worse, because in many cases you're out of options.

Not having a go per se: you're 'bored with the same tedious negative responses', I'm frustrated with the lack of product and the continuous product/perf drain dressed up as a Welsh sheep on a stag night in Bristol. You can't separate the two; it's all bundled into one fur ball of a mess. There is no technology discussion on consumer products without price/value/perf - it's not a tech demo, it's being sold right now. It's overpriced and under-delivers no matter how clever it might be.
 
Soldato
Joined
21 Jul 2005
Posts
20,020
Location
Officially least sunny location -Ronskistats
It's also a typically pompous statement, claiming that other people 'don't understand'. Let's face it, if AMD had come out with it first it would have been trolled to the rafters as another over-hyped feature that doesn't quite deliver yet. Just because nvidia released it doesn't make it immune to criticism; they basically rushed it so the marketing could stamp it and advertisers could adopt it.
 
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
RT cores could be used for things like 3D sound.

The tensor cores provide rapid 16-bit float support which would otherwise require additional CUDA cores. They also get used in professional apps, and the newest DLSS is actually pretty decent.
In gaming tho, the Tensor cores are largely a solution looking for a problem, are they not? DLSS is a problem that nVidia invented to use the Tensor cores in a gaming product ;) And then tried to sell as the new must-have.

I don't think all that many people are sold on the idea that AI-accelerating cores are essential in a gaming card.
 
Soldato
Joined
6 Feb 2019
Posts
17,566
In gaming tho, the Tensor cores are largely a solution looking for a problem, are they not? DLSS is a problem that nVidia invented to use the Tensor cores in a gaming product ;) And then tried to sell as the new must-have.

I don't think all that many people are sold on the idea that AI-accelerating cores are essential in a gaming card.

Ray tracing is a problem that was invented so AI scaling can be used? There are many ways to cut the mustard, and I agree I'm not sure DLSS is the right answer. But I don't think Nvidia invented ray tracing to give DLSS a hope - ray tracing and its poor performance have existed for a long time. Disney has been using ray tracing in their animated movies going back to the 90s.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,058
Not really sure what that means tbh, but you can hardly expect a group of consumers to be wowed by "some tech", no matter how innovative, when it costs a lyrical **** ton and delivers so poorly. I get that RTX and ray tracing could be a very good thing, and when it's ready, affordable and supported I'll happily buy into it, but don't ask me to separate a vision of things to come from the 'pay now, get garbage' elephant in the room. It's basically a similar problem to early-adoption VR, only worse, because in many cases you're out of options.

Not having a go per se: you're 'bored with the same tedious negative responses', I'm frustrated with the lack of product and the continuous product/perf drain dressed up as a Welsh sheep on a stag night in Bristol. You can't separate the two; it's all bundled into one fur ball of a mess. There is no technology discussion on consumer products without price/value/perf - it's not a tech demo, it's being sold right now. It's overpriced and under-delivers no matter how clever it might be.

I get what you are saying, but VR hasn't yet cracked what it needs to be a convenient and seamless replacement for 2D gaming, and especially in the early days it was a teaser of something with a lot of hurdles to overcome, where the solutions still needed breakthroughs that didn't exist. A lot of the negativity was aimed at people hyping up its imminent dominance when many realised it wasn't coming soon; far fewer people rubbished VR itself. Ray tracing is in a different boat, because we have finally cracked the hurdles of bringing it to games with viable performance. Sure, there are limitations in terms of implementation and affordable hardware, and the performance is somewhat tenuous this generation, and I'd be somewhat more forgiving if those were the main focus of people's gripes. But more often it is along the lines of "why do we need ray tracing anyway, it just tanks performance" - not criticism of price and delivery this generation, but writing the whole thing off, with no interest in educating themselves on the subject, or quite often simply sour grapes because AMD doesn't have a solution yet. Those same people will quickly change their tune when AMD does, which is quite frankly boring.

It's also a typically pompous statement, claiming that other people 'don't understand'. Let's face it, if AMD had come out with it first it would have been trolled to the rafters as another over-hyped feature that doesn't quite deliver yet. Just because nvidia released it doesn't make it immune to criticism; they basically rushed it so the marketing could stamp it and advertisers could adopt it.

Not my intention to be pompous, but quite frankly I have no time for those who don't understand, have no interest in trying to understand, and would rather just be negative. On the other point, I suspect a lot of nVidia fanboys would have been negative if AMD had come out with it first, and it would have been just as tiresome. Ray tracing has been something of a holy grail in graphics rendering for a long time, and many of us with an interest in the tech thought we'd still be another decade away from even rendering Quake 2 with something approximating a full ray-trace solution, let alone any other games.

I think this is something people aren't realising the significance of. Sure, Quake 2 looks incredibly dated and the engine has massive geometry limits, but the path-tracing implementation in Quake 2 is used for all the effects on show - not just reflections. All GI, all indirect lighting (where used) and shadows are implemented through these techniques, and unlike Minecraft the implementation isn't relying on optimisations that exploit the relatively low detail of the game. The path tracer is relatively untroubled by scene complexity - you can more than quadruple the amount of detail in a scene compared to the stock Quake 2 maps and barely lose 2% performance - though additional light sources that use full analytic evaluation, and additional bounce passes, will tank performance quite quickly, so you have to be conservative with their use.
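To make that a bit more concrete, here's a toy sketch of the core idea in Python - nothing to do with the actual Quake 2 RTX code, and it skips proper importance sampling and weighting, but it shows how shadows and GI simply fall out of one bounce loop rather than being separate hacks:

Code:
# Toy path tracer: one bounce loop gives shadows and GI "for free".
# Generic illustration only - not the Quake 2 RTX implementation.
import math
import random

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):
    length = math.sqrt(dot(a, a))
    return mul(a, 1.0 / length)

class Sphere:
    def __init__(self, centre, radius, albedo, emission=(0.0, 0.0, 0.0)):
        self.centre, self.radius, self.albedo, self.emission = centre, radius, albedo, emission

    def intersect(self, origin, direction):
        # Nearest positive hit distance along a normalised ray, or None.
        oc = sub(origin, self.centre)
        b = dot(oc, direction)
        disc = b * b - (dot(oc, oc) - self.radius * self.radius)
        if disc < 0.0:
            return None
        t = -b - math.sqrt(disc)
        return t if t > 1e-4 else None

SCENE = [
    Sphere((0.0, -100.5, -1.0), 100.0, (0.8, 0.8, 0.8)),                         # "floor"
    Sphere((0.0, 0.0, -1.0), 0.5, (0.7, 0.3, 0.3)),                              # diffuse ball
    Sphere((0.0, 2.0, -1.0), 0.5, (0.0, 0.0, 0.0), emission=(10.0, 10.0, 10.0)), # area light
]

def trace(origin, direction, depth=0):
    if depth > 4:                     # cap the bounce count, as the RTX games do
        return (0.0, 0.0, 0.0)
    hit = None
    for obj in SCENE:
        t = obj.intersect(origin, direction)
        if t is not None and (hit is None or t < hit[0]):
            hit = (t, obj)
    if hit is None:
        return (0.0, 0.0, 0.0)        # black environment
    t, obj = hit
    point = add(origin, mul(direction, t))
    normal = norm(sub(point, obj.centre))
    # Diffuse bounce: random direction in the hemisphere around the surface normal.
    bounce = norm((random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1)))
    if dot(bounce, normal) < 0.0:
        bounce = mul(bounce, -1.0)
    incoming = trace(point, bounce, depth + 1)
    # Shadows and indirect light emerge automatically: a bounce that is blocked
    # never reaches the emissive sphere, so it contributes nothing.
    return tuple(obj.emission[i] + obj.albedo[i] * incoming[i] for i in range(3))

# Average a few samples for a single camera ray aimed at the diffuse ball.
samples = [trace((0.0, 0.0, 1.0), (0.0, 0.0, -1.0)) for _ in range(256)]
print([round(sum(s[i] for s in samples) / len(samples), 3) for i in range(3)])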
 
Last edited:
Caporegime
Joined
17 Feb 2006
Posts
29,263
Location
Cornwall
Ray tracing is a problem that was invented so AI scaling can be used? There are many ways to cut the mustard, and I agree I'm not sure DLSS is the right answer. But I don't think Nvidia invented ray tracing to give DLSS a hope - ray tracing and its poor performance have existed for a long time. Disney has been using ray tracing in their animated movies going back to the 90s.
No, I didn't say ray-tracing, did I? I mentioned the Tensor cores, which are the AI-accelerating cores that nV has tried to re-purpose in a consumer gaming card by inventing DLSS.

Ray-tracing is entirely separate. You don't need Tensor cores for it. As you yourself say, RT has been done in software for decades. Heck I remember RT demos in the early 90s. You just had to wait several hours/days for a single scene to render.

The point is the Tensor cores are there because nV's focus is not just gaming now. They are big in the AI space, and they wanted to sell/re-purpose compute-heavy cards for consumer/gaming.

Don't forget there are Tensor cores and RT cores.

nVidia's website said:
The GeForce RTX 2060 features 1,920 CUDA cores, 240 Tensor Cores that can deliver 52 teraflops of deep learning horsepower, 30 RT Cores that can cast 5 gigarays a second

Three different types of cores. CUDA + Tensor + RT.
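To put the "gigarays" figure in context, a quick bit of arithmetic - taking the quoted 5 gigarays/s at face value and assuming a 1080p/60 target, which is my assumption rather than anything from nVidia's marketing:

Code:
# Rough per-pixel ray budget implied by the quoted "5 gigarays a second".
rays_per_second = 5e9
pixels = 1920 * 1080                      # one 1080p frame
fps = 60
print(rays_per_second / (pixels * fps))   # ~40 rays per pixel per frame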
 
Last edited:
Soldato
Joined
9 Nov 2009
Posts
24,828
Location
Planet Earth
No, I didn't say ray-tracing, did I? I mentioned the Tensor cores, which are the AI-accelerating cores that nV has tried to re-purpose in a consumer gaming card by inventing DLSS.

Ray-tracing is entirely separate. You don't need Tensor cores for it. As you yourself say, RT has been done in software for decades. Heck I remember RT demos in the early 90s. You just had to wait several hours/days for a single scene to render.

The point is the Tensor cores are there because nV's focus is not just gaming now. They are big in the AI space, and they wanted to sell/re-purpose compute-heavy cards for consumer/gaming.

Don't forget there are Tensor cores and RT cores.



Three different types of cores. CUDA + Tensor + RT.

Agreed - even the addition of RT cores wasn't done entirely for gaming; it was done to get into the VFX market. People forget that at the Turing launch, Nvidia launched a full line of commercial cards with the fully enabled GPUs and started talking about $200 billion VFX markets - even investment sites were talking mostly about VFX and not gaming. Most of the initial talk at the Turing reveal was about VFX and not so much about pure gaming.

This explains why at launch you hardly saw many games with RT, why it took months for some of the games which had it to get it optimised, and why DLSS was so hit and miss. Even developers had poor access to cards until launch - if these were all features meant only for gaming, most of them would have had prototype cards to try before launch and would have hit the ground running. The fact that the software stack was so poorly developed is telling - I suspect Turing was pulled forward because AMD was meant to launch Navi at the start of last year according to rumours, but didn't.

You also need to consider this: Nvidia would have had to have three lines - one for gaming, one for RT use in VFX markets and one for DP computing. They basically combined the first two lines, which saves on R&D, which is a significant expenditure. Otherwise it makes little sense to sell such huge GPUs for gaming; traditionally, when we had large gaming GPUs it was primarily for cards which pulled dual duty, i.e. gaming and commercial usage.

It's also a typically pompous statement, claiming that other people 'don't understand'. Let's face it, if AMD had come out with it first it would have been trolled to the rafters as another over-hyped feature that doesn't quite deliver yet. Just because nvidia released it doesn't make it immune to criticism; they basically rushed it so the marketing could stamp it and advertisers could adopt it.

People do understand, yet a lot of dorks (us included) on hardware enthusiast forums don't understand that graphics are only one part of a game. 75% of computer gaming revenue is from consoles and phones, which don't push graphics, and a large percentage of the remaining 25% is from MMOs and twitch shooters which don't push graphics either, and many have cartoonish art styles that can scale down to slower systems. Even on Steam, look at how many people have anything better than a GTX 1070? Oh, wait, that is because most PC gamers are still constrained by price. On this forum most people and their friends will have better-than-average hardware and upgrade more often. It's like going on a watch-collecting forum... how many on there will be talking about £20 Casio watches? :p

If graphics were the singularly important thing, why didn't Crysis sell more copies, etc.? Graphics are important, but it's quite telling that most of the computer gaming market isn't driven forward purely by graphics. It's driven forward by games which don't even look the best with current rasterised techniques but are apparently fun or involving in some way. Even a game such as The Witcher 3 sold well because it is a good game, and even Cyberpunk 2077 will live or die on how the gameplay and story hold up. Nice graphics are the icing on the cake, but they don't make the game. Vast numbers of people who ran Crysis or The Witcher 3 probably never ran them at max settings either.

Most of us on forums like this are hardware enthusiasts and dorks who like talking about hardware and the technical aspects of games tech, but most gamers don't care. If they cared so much about the technically best platforms with the nicest-looking, best-running games, phones and consoles would not be making so much money from games - they are inferior to a decent gaming PC.

People also don't understand that there is a huge mass of graphics cards which can't do ray tracing very well, and developers, with the increasing dumbing down of games, want to expand their player base and sell more games. Are they going to just ignore that so many cards are poor at ray tracing?

No, they are not, so for the immediate future it is going to be another "max graphics" option and not essential to most games - and how many times have we had these features integrated at the behest of AMD or Nvidia, who provide support in some way? AMD and Nvidia need to keep finding ways to sell new graphics cards. Ray tracing will become a standard part of PC graphics when a critical mass of graphics cards can do it OK, and that will mean even £150 ones. If the new consoles can do it OK, it will be a step towards this too. Pretty much the same as with any of the new features we have seen introduced in the last 20 years.
 
Last edited:
Caporegime
Joined
18 Oct 2002
Posts
32,618
In gaming tho, the Tensor cores are largely a solution looking for a problem, are they not? DLSS is a problem that nVidia invented to use the Tensor cores in a gaming product ;) And then tried to sell as the new must-have.

I don't think all that many people are sold on the idea that AI-accelerating cores are essential in a gaming card.
No, because on Turing RTX cards any game that uses FP16 shaders will use the Tensor cores. Remember all that marketing AMD gave about rapid packed maths? Well, on these cards the equivalent FP16 work is done on the Tensor cores.
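A quick way to see what FP16 throughput buys you, if you happen to have PyTorch and a CUDA-capable card lying around - this is a compute benchmark rather than a game shader, and the sizes are arbitrary, but on GPUs that have Tensor cores the FP16 matrix multiply should get routed through them:

Code:
# Rough FP16 vs FP32 matmul throughput comparison (requires PyTorch + a CUDA GPU).
import time
import torch

def bench(dtype, size=4096, reps=20):
    a = torch.randn(size, size, device="cuda", dtype=dtype)
    b = torch.randn(size, size, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(reps):
        a @ b
    torch.cuda.synchronize()
    # 2 * n^3 floating-point operations per matrix multiply.
    return reps * 2 * size ** 3 / (time.time() - start) / 1e12

print("FP32:", round(bench(torch.float32), 1), "TFLOPS")
print("FP16:", round(bench(torch.float16), 1), "TFLOPS")  # Tensor-core path on Volta/Turing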


And DLSS was not a tacked-on idea to make use of Tensor cores. Nvidia had been researching deep-learning-based super-resolution techniques for years before Turing was released. Nvidia's early research papers used Pascal & Maxwell GPUs, and later papers used Volta to highlight the benefits of acceleration through Tensor cores.
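To make "deep-learning super resolution" a bit more concrete: the basic recipe is a small convolutional network that predicts sub-pixels, followed by a pixel-shuffle upscale. The sketch below is a generic ESPCN-style toy in PyTorch - emphatically not Nvidia's actual DLSS network, whose architecture and training data are proprietary - just to show the shape of the idea:

Code:
# Minimal ESPCN-style super-resolution network - a generic illustration, not DLSS.
import torch
import torch.nn as nn

class TinySR(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv2d(64, 32, kernel_size=3, padding=1), nn.ReLU(),
            # Predict scale*scale sub-pixels per colour channel...
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            # ...then rearrange them: (N, 3*s*s, H, W) -> (N, 3, H*s, W*s).
            nn.PixelShuffle(scale),
        )

    def forward(self, low_res):
        return self.body(low_res)

# A small random "frame" purely to show the shapes: 270x480 in, 540x960 out.
frame = torch.rand(1, 3, 270, 480)
print(TinySR(scale=2)(frame).shape)   # torch.Size([1, 3, 540, 960])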


It would have been trivial for Nvidia not to add Tensor cores. In the Turing GTX GPUs like the 1660 Ti, the 16-bit FP support is done with dedicated FP16 units instead. Nvidia evidently believes that Tensor cores will have plenty of uses in gaming. Time will tell if this is true.

There are potentially many more use cases, such as accelerating physics or AI.
 
Soldato
Joined
18 Oct 2002
Posts
21,358
Location
Cambridge, UK
Forgive me if I'm not "up to date", but isn't it the case that the DLSS implementation is pretty terrible, and RTX performance brings even the 2080 Ti to its knees in the very few titles that have RTX enabled? That's a very high-level summary - or am I "way off the mark"?

Neither of these features really makes me want to rush out and buy a 2080 Ti ;) or pay the price premium Nvidia are asking for - as far as I'm aware the 2080 Ti is approx 30% faster than the 1080 Ti?

I want and expect more from a GPU for a "Bag-a-sand" ;)
 
Soldato
Joined
18 Feb 2015
Posts
6,484
Forgive me if I'm not "up to date", but isn't it the case that the DLSS implementation is pretty terrible, and RTX performance brings even the 2080 Ti to its knees in the very few titles that have RTX enabled? That's a very high-level summary - or am I "way off the mark"?

Neither of these features really makes me want to rush out and buy a 2080 Ti ;) or pay the price premium Nvidia are asking for - as far as I'm aware the 2080 Ti is approx 30% faster than the 1080 Ti?

I want and expect more from a GPU for a "Bag-a-sand" ;)

DLSS is worth using in 2 games (Wolfenstein YB, FF XV), debatable in 2 (Control & Deliver Us the Moon), and mostly **** in the other ones (aka not better than reducing resolution yourself). And yes, ray tracing certainly brings even a 2080 Ti to its knees, at least at the resolutions that card should be playing at (4K, or QHD at >100 fps). So right now it's absolutely not a compelling selling point for anyone but graphics enthusiasts (like myself, who even does dumb things like pseudo-RT through ReShade), plus most of the RT implementations themselves are kinda half-arsed and not worth much in the grand scheme of things. I mean really, who tf cares about accurate reflections in BF V? Certainly not anyone actually playing the game rather than taking screenshots. Then you top all that off with the fact that it takes many months for either or both RTX features to actually appear in these games, and it's even more of a let-down. Yeah, maybe RT is really nice in Control, but if you really wanted to play the game, would you have waited the 6+ months it took for RT to arrive in something like SotTR? As much as I am a fan of playing at the highest quality possible, I don't believe in that kind of restraint. If you really want to play something, you'll play it as soon as you can, not a year from now when it finally gets implemented. They're nice additions right now, for graphics nuts like myself, but not a must-have by any stretch. I'm sure it will get better with time, at least for RT, but it's still years away from proper support overall.
 
Soldato
Joined
20 Apr 2004
Posts
4,365
Location
Oxford
I'm sure it will get better with time, at least for RT, but it's still years away from proper support overall.

History repeats itself. I remember very, very similar conversations and comments some 19-20 years ago with pixel shaders - again nvidia had supporting hardware first, with ATI following 6 months later, and it took a good 18 months before there was good support, with Morrowind, and that was with a single effect. Again it sucked up a lot of performance at the time for not much visual gain.

A generation of hardware later from both camps, games got the ball rolling, then the likes of Far Cry 1 and later DX9 with PS 2.0 sealed the deal.
 