The Raytracing thread

RT is more for the augmented-reality side of VR, as you need RT to blend reality and VR together convincingly.

This doesn't answer his question, though. Right now, the answer to that question is yes, it will. GPUs really need to be orders of magnitude faster, given the ultra-low latency that VR demands.
 
The denoising algorithms make use of the tensor cores, I've been told.

I wonder whether denoising is required when doing ray tracing in real time, and whether it'll look like a turd without it, à la some of their demo vids.


The person I quoted was talking specifically about ray tracing; ray tracing doesn't use tensor cores.
Yeah, I got that that's what you meant, but then you shouldn't really say RTX; you should say ray tracing, as it just adds to the confusion, which is why Nvidia had to clarify that RTX doesn't mean ray tracing.

Plus, if the method they've used to do ray tracing requires the result to be denoised, then while the ray tracing calculations themselves don't need the tensor cores, the whole process of doing ray tracing does.

Edit: if the tensors have to be used for ray tracing, I wonder what impact DLSS would have, if any, while ray tracing is being used. :confused:
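
As a rough illustration of why that denoising step matters: at around 1 sample per pixel, the raw ray-traced image is mostly noise, and a reconstruction filter is what makes it presentable. Below is a toy numpy sketch of the principle only; Nvidia's actual denoiser is reportedly a far more sophisticated trained network, so don't read this as their method.

```python
# Toy illustration: a 1-sample-per-pixel Monte Carlo estimate is very
# noisy, and a simple spatial filter recovers a usable image. Real-time
# denoisers are far more sophisticated; this box filter just shows the idea.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a fully converged render: a smooth gradient "scene".
H, W = 64, 64
clean = np.linspace(0.2, 0.8, W)[None, :].repeat(H, axis=0)

# The noisy 1 spp estimate: unbiased, but with heavy per-pixel variance.
noisy = np.clip(clean + rng.normal(0.0, 0.25, size=(H, W)), 0.0, 1.0)

def box_denoise(img, radius=2):
    """Naive denoiser: average each pixel with its neighbours."""
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img)
    k = 2 * radius + 1
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

denoised = box_denoise(noisy)

# Error against the converged render, before and after filtering.
print("MSE noisy:   ", np.mean((noisy - clean) ** 2))
print("MSE denoised:", np.mean((denoised - clean) ** 2))
```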
 
I watched a YT video demonstrating that RT tech is quite old, and that a $150 tablet was able to produce 6 gigarays/s years ago. Nvidia's heavy push for this as an incredible new product feels pretty empty imo, and DLSS actually seems much more appealing.

Given that they've pushed a deal on third-party customers to take old 10-series stock if they want first dibs on 20-series stock, plus massive price increases and the delineation of GTX vs RTX, it just feels like they're desperately trying to shift old stock after the mining boom. And the product itself really doesn't excite me that much; it's just a 'meh' performance increase with some interesting new hardware.


I'm sure I'm not the only one who's not at all impressed with the new series, and I'm not buying a new card or monitor until the price reflects the performance.

I've never been a fanboy of either side; I just chose the best performer for my budget. But all of Nvidia's recent actions just make me want to buy from them less and less, which is a shame; I think they'll end up really damaging the PC market. But what do they care? Graphics cards aren't really what they want to make any more.

Sorry WantoN, but I'm going to call you out on this video that you've been talking about in a lot of the Nvidia RTX threads recently.

The guy in the video has his facts wrong. I don't know quite where he got his 6 gigarays performance number from, but it clearly wasn't from Imagination, because according to them the GR6500 does:

Unmatched real-world ray tracing performance: Up to 300 MRPS (million rays per second), 24 billion node tests per second and 100 million dynamic triangles per second at 600 MHz

Taken from here.

https://www.imgtec.com/blog/powervr-gr6500-ray-tracing/

That's far short of the 6 gigarays the video talks about. And to anyone who actually watches the video: do you really want to say that their demo looks anything like the BFV video? Nah, didn't think so. It is very impressive to be able to do ray tracing at all on a small SoC, but to try to pass it off as similar to, or as powerful as, the upcoming RTX cards (the 2070 in this case) is just plain wrong.


One other thing: I'll be damned if I can find a tablet or a phone with the 6XT GR6500 in it for $150. In fact, I cannot find one using that SoC at all, but maybe my Google-fu has failed me. :)
 

Thanks for that mate, I always like to learn more. From what little reading I did, the chip was mysteriously canned; it's in the obsolete listings on their website. Perhaps a quiet IP purchase? Who knows.

It improves my opinion of the tech. Nvidia have clearly iterated on and improved it. While it remains old tech that they're touting as a revolution in gaming, at least they've moved it forward considerably.

Nice to know someone’s paying attention anyway, and thanks again for taking the time to correct me.

If you come across anything else I’d be eager to read, I just cba to research anymore as I’ve made my decision. DLSS remains the most impressive and truly proprietary tech for RTX imo. The premise of better IQ than standard AA methods without the overhead is exciting. As I’ve said before, it surprises me that more wasn’t made of it.
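
To put some rough numbers on the DLSS premise, here's a back-of-envelope sketch of the idea as I understand it: shade fewer pixels internally, then reconstruct up to native resolution. The internal resolution is my own assumption for illustration, and the bilinear resize is only a crude stand-in for Nvidia's trained network, which is the whole point of the tech.

```python
# Back-of-envelope for the DLSS premise: shade fewer pixels, then let a
# reconstruction step fill the gap back to native resolution. The real
# reconstruction is a trained network; bilinear resize is a crude stand-in.
import numpy as np

native = (3840, 2160)   # 4K output target
render = (2560, 1440)   # assumed internal render resolution

shaded_native = native[0] * native[1]
shaded_internal = render[0] * render[1]
print(f"Pixels shaded at native 4K: {shaded_native:,}")
print(f"Pixels shaded at 1440p:     {shaded_internal:,}")
print(f"Shading work saved:         {1 - shaded_internal / shaded_native:.0%}")

def bilinear_upscale(img, out_h, out_w):
    """Stand-in reconstruction: plain bilinear resize of a greyscale frame."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

low = np.random.default_rng(1).random((144, 256))  # tiny low-res frame
up = bilinear_upscale(low, 216, 384)               # scaled up 1.5x per axis
print("Upscaled frame shape:", up.shape)
```

The interesting bit is the saving: shading well under half the pixels and spending tensor-core time on reconstruction instead is where the claimed "better IQ without the overhead" would come from.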
 
The basic idea of using ray tracing in gaming is not new; the gaming industry has always aimed to emulate the movie industry's graphics.

300 Mrays/s was clearly not enough for modern gaming; 6 gigarays/s is a 20x improvement in four years.
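
For a sense of scale, here's the back-of-envelope ray budget each of those figures buys per pixel at 1080p/60, taking both quoted peak numbers at face value (which you probably shouldn't; they're best-case marketing figures, likely measured differently):

```python
# Rough ray-budget arithmetic: rays per pixel per frame at 1080p/60
# for the two peak figures quoted in this thread.

def rays_per_pixel_per_frame(rays_per_second, width=1920, height=1080, fps=60):
    """Average number of rays available per pixel each frame."""
    return rays_per_second / (width * height * fps)

gr6500 = 300e6   # PowerVR GR6500 quoted peak: 300 Mrays/s
rtx2070 = 6e9    # RTX 2070 quoted peak: 6 Grays/s

print(f"GR6500:   {rays_per_pixel_per_frame(gr6500):5.1f} rays/pixel/frame")
print(f"RTX 2070: {rays_per_pixel_per_frame(rtx2070):5.1f} rays/pixel/frame")
print(f"Ratio:    {rtx2070 / gr6500:.0f}x")
```

A couple of rays per pixel per frame is nowhere near enough for full ray tracing, which is exactly why the RTX approach is hybrid rendering plus denoising rather than brute force.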

Someone having an idea to do something and someone later actually achieving that thing are obviously two very different things.

It's like saying that the guy who invented the jet engine didn't really achieve anything because he was just iterating on what the Wright brothers did.
 
If you come across anything else I’d be eager to read, I just cba to research anymore as I’ve made my decision. DLSS remains the most impressive and truly proprietary tech for RTX imo. The premise of better IQ than standard AA methods without the overhead is exciting. As I’ve said before, it surprises me that more wasn’t made of it.

Have to say that I'm also very excited for this mode, the benefits of which could be quite tasty!!
 
I've got to admit that the idea of doing ray tracing well enough to actually be able to use it in-game is very exciting, even in the hybrid rendering era that we look to be getting with the RTX cards.
We all know the idea isn't new, but I do believe this is the first time one of the major players in the graphics market has dedicated their card line-up (so far) to it.
Think back to the ZX81 and spending hours entering pages of code just to get that wonderful machine to actually draw a curve. Or Wolfenstein 3D, which brought us one of the first 3D-esque first-person shooters. Or Quake, which switched us away from sprites in that glorious polygon fashion and has now developed into these wonderful games that an awful lot of us love to play.
Ray tracing is coming to a game near you very soon.

It's just like hybrid cars: they'd been around since as early as 1901, but they didn't really catch on until Toyota brought us the Prius in 1997.

It's early days, but it is going to happen, and if Nvidia hadn't been asking such ridiculous prices I would have jumped on the bandwagon myself. I was prepared to go to £500 absolute max for an 1170 (dammit Nvidia, learn to count :p), but I will just have to stick with my 970 for another year or so until, hopefully, the 21xx arrives (that being the next number in line; are you listening, Nvidia? :D). Maybe I'll be able to snag a 2160 for my £500 by then. :eek:
 
It's like saying that the guy who invented the jet engine didn't really achieve anything because he was just iterating on what the Wright brothers did.

I agreed with you entirely up until then. I gave Nvidia their due and stated that what they had done was impressive; but it isn't new technology, as you yourself agree. Surely it would be analogous if someone invented a jet engine and someone else iterated on it, improving it by a factor of 20: if they made out that the jet engine was new, I would say the same thing. So we're both on the same track and it's essentially a non-issue. The only difference seems to be that I don't approve of how the Turing launch is being handled, and if I'm wrong about that then we agree on everything.

Have to say that I'm also very excited for this mode, the benefits of which could be quite tasty!!

Quite, mate. Like I said, for me it's the most impressive thing, yet the attention is all on RT. I won't be buying it (as I've banged on about more than enough), but DLSS alone could really, really improve fidelity, especially in future generations. Very exciting indeed.

I've got to admit that the idea of doing ray tracing well enough to actually be able to use it in-game is very exciting, even in the hybrid rendering era that we look to be getting with the RTX cards.
We all know the idea isn't new, but I do believe this is the first time one of the major players in the graphics market has dedicated their card line-up (so far) to it.
Think back to the ZX81 and spending hours entering pages of code just to get that wonderful machine to actually draw a curve. Or Wolfenstein 3D, which brought us one of the first 3D-esque first-person shooters. Or Quake, which switched us away from sprites in that glorious polygon fashion and has now developed into these wonderful games that an awful lot of us love to play.
Ray tracing is coming to a game near you very soon.

It's just like hybrid cars: they'd been around since as early as 1901, but they didn't really catch on until Toyota brought us the Prius in 1997.

It's early days, but it is going to happen, and if Nvidia hadn't been asking such ridiculous prices I would have jumped on the bandwagon myself. I was prepared to go to £500 absolute max for an 1170 (dammit Nvidia, learn to count :p), but I will just have to stick with my 970 for another year or so until, hopefully, the 21xx arrives (that being the next number in line; are you listening, Nvidia? :D). Maybe I'll be able to snag a 2160 for my £500 by then. :eek:

Bang on mate (and not just because of the epic wave of nostalgia I felt reading it). As I've said before, seeing true innovation, rather than just moar fps, is a wonderful thing. Unfortunately, holding back reviews until just before release, and charging a fortune just so they can shift old stock, has turned me completely off. I'm not a fanboy, nor am I one to cut off my nose to spite my face, but I can't buy into the cards. Had the cards been sensibly priced I'd have been clamouring for a Ti (I'd have happily dropped, say, £700 to £800 for one), even with the other shenanigans. Regardless, I'm really hoping ray tracing takes off: a) for the people who've shelled out crazy prices for these things, and b) because we don't get enough game-changing new technology these days, and DLSS and ray tracing (regardless of the spin and mickey-taking) are exactly that, and should be commended as such.

One thing I can't really agree with is people complaining that there aren't many games that support it yet. Well, of course there aren't; but unlike AMD, Nvidia have the money, connections and clout to make wide adoption possible. There aren't many games when a console launches either, but they come.
 
Smart?! Lol Isn't this the maximum they can achieve with their current knowledge and quantity of transistors? :confused:

You have to start somewhere. Developers won't develop games using Ray Tracing until hardware exists that can do it. So at least Nvidia have taken that first step. And, yes, it's smart to do it now, they have very little competition at the moment. They can take a few risks to get Ray Tracing up and running, and they can charge higher prices. By the time 7nm cards are out Nvidia will have one generation of Ray Tracing hardware up and running. They couldn't have done it at a better time.
 
You have to start somewhere. Developers won't develop games using Ray Tracing until hardware exists that can do it. So at least Nvidia have taken that first step. And, yes, it's smart to do it now, they have very little competition at the moment. They can take a few risks to get Ray Tracing up and running, and they can charge higher prices. By the time 7nm cards are out Nvidia will have one generation of Ray Tracing hardware up and running. They couldn't have done it at a better time.

This post sums up my thoughts :)
 
Have to say that I'm also very excited for this mode, the benefits of which could be quite tasty!!

Why are you on 1440p? Most good games have no AA any more. It's a sad fact that many big games look garbage compared to older Source games, because MSAA worked really well: the blur wasn't there, unlike the options in PUBG, Dota 2, Overwatch, etc.


But we're dawning on 4K now, so why would you want this at 4K? Really, it's already a niche focused on older resolutions, and I think it was made solely so people can ray trace at 1080p. Why they put effort into this I don't know, because RT at 1080p is silly, and thus this tech, and possibly the entire generation, might be silly as well.


I would much rather have seen a straight-up shrink of Pascal with 2x the performance, and a whole new low-usage, minimal driver suite and UI like AMD had ages ago.
 
You have to start somewhere. Developers won't develop games using Ray Tracing until hardware exists that can do it. So at least Nvidia have taken that first step. And, yes, it's smart to do it now, they have very little competition at the moment. They can take a few risks to get Ray Tracing up and running, and they can charge higher prices. By the time 7nm cards are out Nvidia will have one generation of Ray Tracing hardware up and running. They couldn't have done it at a better time.
Agreed. Makes sense.

I hope AMD will have an answer for this in the next couple of years, because unlike PhysX this will make a big difference, and if AMD cannot provide it, there will be even more reason to stick with Nvidia, which is not good for AMD's market share or, as a result, our pockets. Hopefully Intel are watching closely and will have such tech in their cards.
 