
Ray Tracing on an AMD Graphics Card beats Nvidia

I think Nvidia tried to cash in early knowing this was coming...

Microsoft DXR has been coming for a while now, along with some developers trying to create their own software solutions - but software-only ray tracing is still vastly inferior to having dedicated hardware for it.
 
There are no licensing fees to use GTX, Radeon or GeForce.

RTX is just a marketing name.

Hmm, I'll take your word for it but will certainly look into it. Effectively what you're telling me is that all of Nvidia's APIs and dev tools are free to use, with no licensing costs or additional costs to include them? Was that also the case for PhysX? You can probably tell I'm sceptical, given they want to charge manufacturers to use the "G-Sync Compatible" branding, then there was GPP, and a poor uptake of the PhysX API - which, if it was free to use, feature-rich and easy to develop with as people in here are telling us, then why was that? I think I need to do some research.

I guess what I need to know is: do Nvidia effectively have their own API that leverages DX12's "DXR", or do they flat out use the DX12/Vulkan APIs and leverage their use in the driver? I see RTX as Nvidia proprietary, much like G-Sync, but perhaps I'm totally wrong.
 
Was that also the case for PhysX?

The basic dev tools and API were free to use - originally, if you wanted access to the full source, you had to license it for around $50K IIRC (these days the source is released for free).

EDIT: The reason developers tended not to use it came down to a few things: the potential for splitting their audience if it was built into the game at a fundamental level; a relatively high performance impact compared to other physics engines if your game only made relatively simple use of physics; and a level of uncertainty over the way nVidia handled it, leaving some developers in doubt about its future - i.e. whether nVidia would lock it down, stop supporting older versions, etc.
 
The basic dev tools and API were free to use - originally, if you wanted access to the full source, you had to license it for around $50K IIRC (these days the source is released for free).

I wonder what model, if any, they use for RTX? I've updated the post above slightly, as I simply don't know. I would be amazed if they weren't trying to monetise its feature set and tools to devs.
 
It sits on top of MS DXR but I'm not quite sure where nVidia are going with it at the moment.

https://developer.nvidia.com/rtx/raytracing

https://developer.nvidia.com/rtx/raytracing/dxr/DX12-Raytracing-tutorial-Part-1

Goes into a bit of detail.
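For what it's worth, that tutorial builds on the standard DXR entry points rather than anything vendor-locked. As a minimal sketch (my own, not lifted from the tutorial; assumes Windows 10, a recent SDK with the DXR types, and linking d3d12.lib), this is roughly how an application on any vendor's hardware asks D3D12 whether ray tracing is exposed at all:

```cpp
// Minimal sketch, not production code: query whether the installed GPU/driver
// exposes hardware DXR. Build assumptions: Windows 10, recent SDK, d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // The ray tracing capability lives in the OPTIONS5 feature block.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5)))
        && options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
    {
        std::printf("DXR is supported by this device/driver.\n");
    }
    else
    {
        std::printf("No DXR support exposed here.\n");
    }
    return 0;
}
```

As far as I can tell, the capability check, the acceleration structures and DispatchRays are all plain D3D12; "RTX" is what Nvidia call their hardware/driver implementation sitting underneath it.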

Random: I was just reading about OptiX, and there seem to be requirements to inform Nvidia as the most basic thing, as well as other requirements that aren't clear for publicly released software. It's about as clear as mud, but they do have their own OptiX API that leverages DXR/Vulkan, which is really what you would expect, as you can't monetise somebody else's API. Well, I mean you could, but I guess it is somewhat frowned upon.
 
I don't think RTX is going to last long once the more universal DXR starts to creep into games, tbh. Why would anyone bother?

All Nvidia's proprietary things have died off - PhysX, 3D Vision - and G-Sync is on its deathbed.
 


Well, G-Sync isn't on its deathbed as such; they're just going to end up merging it with FreeSync or something similar.
 
But now that GeForce owners can use FreeSync, why would they pay £100-200 extra for G-Sync?

As before, it depends what your use case is and what evolution of features we see from each standard. There are still areas where VESA Adaptive-Sync falls down and G-Sync manages better or works properly, which I'd happily pay a small premium for personally.
 
OK, so it looks nice, but I don't quite get the thread title.
In what way is that video showing anything beating NVIDIA? Let's wait to see how well each side's cards perform before declaring a winner and a loser.
 
Tech reviewers are completely bloody useless and just take whatever nVidia say as fact; they don't even bother to do any of their own research.

Ray tracing in Cryengine is not a tech demo and it is not coming in Cryengine 5.5 - it's been in Cryengine for years, as far back as 3.4 (Crysis 3), and has been improved ever since. As for 5.5: if he would just spend two minutes looking, he would see that Cryengine 5.5 has been out for about 3 months. https://www.cryengine.com/roadmap


Tim at Hardware Unboxed
https://youtu.be/Wn4MwnJhSvc?t=246

Me messing about with it in Cryengine 3.8 (2016)

Here: https://youtu.be/exW1SJUSr90?t=29

And here: https://youtu.be/exW1SJUSr90?t=120

Hunt Showdown (Crytek, Cryengine 5.2)

Video...

https://youtu.be/mU7HWCVASH0?t=181

 
You have to understand, if a site/channel has a sizeable number of views/clicks then it's almost 100% barely tech-literate, because that's the only way to have such a following with these topics. Even the ones like Gamers Nexus are mostly posing. In reality all these outlets are just outsourced marketing & nerd infotainment of sorts. The really technical stuff you wouldn't even know about, let alone would it see this kind of traffic. Overclock.net is perhaps the closest thing to an exception, and that's not really an outlet anyway. And the epitome of this is obviously Hardware Canucks & copycats, which are essentially just shooting B-roll for affiliate links.

Meh.
 
Total Illumination in Cryengine 3.8, a 3-year-old engine. According to Tim at Hardware Unboxed, Total Illumination is coming in Cryengine 5.5... for ##### sake!!!!

Docs... https://docs.cryengine.com/pages/viewpage.action?pageId=25535599

Here it is in action - I just recorded this in the Sandbox Editor.

Also notice the ammo box in the foreground take on the colour of the grass behind it. What Total Illumination does is fire off thousands of rays every frame and use those to calculate indirect lighting. With it off, you get standard cube-map overlaid lighting; with it on, you get true ray-traced illumination based off environmental light sources.
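To make that concrete, here's a tiny CPU-side sketch of that kind of gather - purely illustrative and nothing like the engine's actual implementation; the toy scene, colours and ray count are all made up. Lots of rays go out over the hemisphere above a surface point and get averaged, so a point sitting next to green grass picks up a green tint, exactly like the ammo box:

```cpp
// Minimal sketch of a diffuse indirect-lighting gather at one surface point.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Toy "trace": rays heading towards +x hit a patch of green grass,
// the rest escape to a blue sky. A real engine would traverse the
// scene's acceleration structure (voxels/BVH) here instead.
static Vec3 traceRay(Vec3 dir)
{
    return (dir.x > 0.3f) ? Vec3{0.1f, 0.6f, 0.1f}   // grass bounce
                          : Vec3{0.3f, 0.5f, 1.0f};  // sky radiance
}

int main()
{
    const int numRays = 1024;                 // "thousands of rays" per point
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);

    Vec3 indirect{0.0f, 0.0f, 0.0f};
    for (int i = 0; i < numRays; ++i)
    {
        // Cosine-weighted direction on the hemisphere around an up-facing
        // normal (0,0,1) - the usual choice for diffuse (Lambertian) gathering.
        float u1 = uni(rng), u2 = uni(rng);
        float r = std::sqrt(u1);
        float phi = 6.2831853f * u2;
        Vec3 dir{r * std::cos(phi), r * std::sin(phi), std::sqrt(1.0f - u1)};

        indirect = add(indirect, traceRay(dir));
    }
    indirect = scale(indirect, 1.0f / numRays); // Monte Carlo average

    std::printf("Indirect light estimate: %.3f %.3f %.3f\n",
                indirect.x, indirect.y, indirect.z);
    return 0;
}
```

A real engine traces against a voxelised or BVH copy of the scene and typically spreads the samples across frames, but the principle is the same.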

Ray tracing in games is nothing new and you don't need expensive specialised hardware to run it. All nVidia are doing is making it so they can sell you expensive hardware, by claiming you need RT cores to run it at reasonable performance. You don't - it's BS!

Most modern hardware has more than enough floating-point throughput to run the calculations just fine; we are not in 2004.
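For a rough sense of scale (the resolution, frame rate and rays-per-pixel below are just assumptions I've picked for illustration, not figures from any engine or benchmark):

```cpp
// Back-of-the-envelope ray budget - illustrative numbers only.
#include <cstdio>

int main()
{
    const long long width = 1920, height = 1080;  // 1080p
    const long long raysPerPixel = 4;             // assumed sample count
    const long long fps = 60;

    const long long raysPerFrame  = width * height * raysPerPixel; // ~8.3 million
    const long long raysPerSecond = raysPerFrame * fps;            // ~0.5 billion

    std::printf("Rays per frame : %lld\n", raysPerFrame);
    std::printf("Rays per second: %lld\n", raysPerSecond);
    return 0;
}
```

Whether general compute shaders or dedicated RT cores get through a budget like that fast enough is exactly the argument here.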

1440p when YouTube is done encoding it.

 
Would be funny, though, if Intel just came along and launched a card that lays the smack-down on both Nvidia and AMD in RT and gaming performance :p

One can dream right? :D

I think NVIDIA have much more to fear from Intel's entry than AMD do. They're both the market leader in terms of units and revenue (excluding consoles) and the dishonest monopolist, with very dodgy marketing to boot. Firstly, Intel are huge, and if they want to they can completely drown out NVIDIA's marketing efforts. Secondly, no-one does dishonest, abusive monopolist practices better than Intel - hard as NVIDIA might try.

Also, Intel are absolutely bound to throw their hat in with what AMD are doing on ray tracing and virtually everything else. That is, general-purpose hardware and, with Intel's might, finally a big push for Vulkan - they've heavily distanced themselves from MS on the API side in recent years.

RTX is really inefficient, both in absolute terms and in cost/size of silicon. A 754mm² Turing GPU minus RTX would be immensely faster using the DX or Vulkan RT APIs with its compute-based shaders than the 2080 Ti could ever be using RTX. The whole thing is a joke, and NVIDIA would have known it very early in the design and simulation phase - long before they got any test silicon. However ... marketing and dishonesty, and selling a big price tag - that's how they justified it.

Problem is, with Intel entering, they're not going to be able to pull the wool over so many people's eyes. Because JHH and fellow execs can't stand to lose face, I expect the 2020 NVIDIA cards to also have RTX ... and it will hurt NVIDIA badly.

Intel will face advantages and disadvantages as a new entrant. Clean slate - no legacy, no backwards compatibility, no continuity - so they can design purely for performance/efficiency/cost. Their 10nm process is likely to be a significant disadvantage: it might be a tiny, tiny bit denser than TSMC/Samsung 7nm EUV, but judging by projected Zen 2 clocks (not on EUV) vs projected Intel 10nm clocks, they're unlikely to be able to clock near the competition, all else being equal.
 
@pmc25 AMD are not doing anything with ray tracing - that's down to game/engine developers and has nothing to do with the GPU architecture. And again, you don't need RT cores to do anything like what's in Metro Exodus, TR or BF5. nVidia use Turing for machine learning; they are also selling those GPUs to retail as gaming GPUs, but to make sure you buy them they are all but claiming ray tracing as their own and insisting you need these GPUs to run it, much like they did with PhysX.

Modern GPUs have more than enough compute power to ray trace in games; it's just a lot of work for game/engine developers to implement. Cryengine are one of the few who have, but I don't think they are alone.
 