
Ray Tracing on an AMD Graphics Card beats Nvidia

I am running FreeSync on my 4K TV over HDMI at 60Hz, with HDR enabled (in games that support it). Nvidia can't even do G-Sync or FreeSync over HDMI.

Who knew? I certainly didn't. Should be interesting while all this settles. Nvidia's RTX will certainly make them a few quid, but my bet is that it lasts no more than a few generations before RTX gets baked into GTX, or vice versa, and dedicated hardware starts to die out.
I don't disagree :). Nothing to do with what NV says. As you said, it would work if cards had enough computational grunt, but they don't. At the moment, having cores optimised for RT workloads is required. In the future we may not even need CPUs either.

I guess that is the six-million-dollar question: is it a grunt thing, or is it Nvidia trying to once again carve out a new market? Time will tell, I guess. Interesting times, nonetheless.
 
You believe it's required because Nvidia would have you believe it's required. I have read nothing in any article, or in the DXR spec, that suggests it's impossible for a card with enough computational grunt. This whole thing reminds me of PhysX, and look what happened to dedicated PhysX hardware. Besides, Havok was always vastly superior to PhysX and more commonly used anyway. As time progresses and architectures improve, dedicated RT hardware will also, IMO, die out.

No, you don't need specialised hardware, but ray tracing tends to use massive amounts of a fairly narrow range of maths (mostly vector stuff), which makes dedicated hardware for it hugely efficient and almost certainly the future, as far as that area of graphics goes, over ever-burgeoning general-purpose compute hardware.

Can't agree on Havok being vastly superior. There were some areas of basic physics it did better as a software solution, but PhysX was a vastly more feature-advanced physics engine that could do complex fluid dynamics, soft-body physics, etc. that Havok could only dream of. It also tended to work better when properly done for gaming, albeit at the expense of having a bit of a "bouncy" feel. Havok tended to impede the player more in the style of its simulation for some reason, with physics objects more likely to get caught up on the player or block them, which might at a technical level have been more realistic physics-wise, but isn't necessarily a good thing from a gameplay perspective.
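The "narrow range of maths" point is easy to see in code: the inner loop of a ray tracer is dominated by dot products and a bit of arithmetic, which is exactly what fixed-function hardware accelerates well. A minimal sketch in Python (illustrative only; real GPUs traverse BVHs of triangles, not lists of spheres):

```python
# Minimal ray-sphere intersection: the core of ray tracing is little more
# than vector subtraction, dot products, and solving a quadratic.

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def hit_sphere(origin, direction, centre, radius):
    """Return distance along the ray to the nearest hit, or None on a miss."""
    oc = sub(origin, centre)
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * a * c      # discriminant of the quadratic
    if disc < 0:
        return None               # ray misses the sphere entirely
    t = (-b - disc ** 0.5) / (2 * a)
    return t if t > 0 else None

# Ray from the origin along +z toward a unit sphere centred 5 units away.
print(hit_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # hits at t = 4.0
```

A hardware RT core effectively does this kind of intersection test (against boxes and triangles) billions of times per second, which is why dedicating silicon to it pays off.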
 
I was surprised NV went with the RTX branding, and I don't like the RTX name for that reason. It would have been better to keep GTX, with RT as a mention. In the future RT will just happen, same as PhysX now (like you said).
RT is new, exciting tech, hence its use in the branding as marketing. Can't blame them, I suppose, being the first to bring it to market, but it does leave a bit of a bad impression too.
 

I totally knew you would pick me up on PhysX; I'm fairly sure we had some pretty interesting conversations about it back then as well. I guess when I say "vastly superior" I'm talking about games built around Havok as an engine vs PhysX: some of the best titles of our generation were Havok games at their very core, whereas PhysX always seemed tagged on. At a deep technical level I'll bow to your experience, as I have zero PhysX experience. Havok, though, I used to love developing with; it strikes me as a vastly under-appreciated engine.

Just look at this list... https://www.havok.com/showcase/

For something people tend to know little about that's a really impressive list.
 

Won't disagree that when it comes to games actually using physics, Havok took the lead, but the physics in games falls so far short of what the PhysX API is capable of that it is almost criminal :(

For instance https://www.youtube.com/watch?v=6dATi4-wb3o

That doesn't even touch on the effects you can do with fluids, cloth, more advanced destruction, etc.
 

I'll have a watch, buddy. It's a real shame, but it will happen again with RT, mark my words! Nvidia will somehow manage to ruin it with crazy licensing models, and an "inferior" open-source or cheap implementation will become the de facto standard, because dev houses will refuse to license it if the alternative offers 80% of the features for 10% of the price.
 
I think I understand what you mean.

PhysX definitely had potential in terms of capability, had it been implemented correctly and meaningfully. It was a bit of a shame that Nvidia's PhysX generally gave off a strong feeling of being "shoehorned" into games, and the effects always felt like they were there for the sake of being there. Havok, on the other hand, despite supposedly being weaker in capability and performance, was always used in ways that felt much more natural and organic, and didn't feel out of place in the games that used it.

Also, Nvidia blocking users from using PhysX whenever an ATI/AMD GPU was detected, even if they had a dedicated Nvidia graphics card, didn't help either and further alienated users.
 

I find it a bit frustrating, as I know what, say, Quake 2 would look like with even just a straight-up realtime replacement for the baked lightmaps and general lighting, which I think would give people an appreciation for what can be done even with the relatively limited hardware on Turing. But it would require reworking 12,000 lines of code in the renderer to replace it with something like Vulkan, as well as building in a realtime functional replacement for the lightmaps, and I simply don't have the time to get up to speed on that.

(With all due respect to the current RT implementation done with Quake 2, as it is a massive undertaking, it misses the mark by a long way compared to what could be done.)
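The "realtime functional replacement for the lightmaps" idea boils down to this: instead of sampling a texture of light values baked at build time, you evaluate the lighting per surface point every frame. A toy Lambertian sketch in Python (all names are hypothetical; the real Quake 2 renderer works on lightmap texture pages, not single points):

```python
import math

# Toy contrast between a baked lightmap value and a per-frame realtime
# evaluation of the same diffuse term. Purely illustrative.

def lambert(normal, light_dir):
    """Clamped Lambertian diffuse term; both inputs are unit vectors."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, d)

# "Baked" path: intensity computed once offline and stored per surface point.
BAKED_VALUE = lambert((0.0, 1.0, 0.0), (0.6, 0.8, 0.0))

# "Realtime" path: recomputed every frame, so lights can move freely.
def shade_realtime(normal, light_vec):
    length = math.sqrt(sum(c * c for c in light_vec))
    unit = tuple(c / length for c in light_vec)
    return lambert(normal, unit)

# Same answer as the baked value for the original light direction...
print(shade_realtime((0.0, 1.0, 0.0), (3.0, 4.0, 0.0)))
# ...but it keeps working when the light moves, which a baked map cannot do.
print(shade_realtime((0.0, 1.0, 0.0), (0.0, 2.0, 0.0)))
```

The cost is that the baked path is a single texture fetch, while the realtime path (with shadowing via ray tracing) has to be paid per pixel per frame, which is where the RT hardware comes in.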


Yeah, that was the big problem: no dev wanted to fundamentally build their engine around it, given the uncertainty and the potential for blocking out a large proportion of their potential market.
 

If Nvidia doesn't sort this out, I will be moving to AMD once they have something that can beat the 2080 Ti.
I'm planning to grab a 120Hz OLED 4K HDR TV this year and don't want to be stuck at 60Hz with Nvidia.

I have zero confidence that TV makers will implement DisplayPort in their I/O, so the only option is for Nvidia to support Adaptive-Sync/FreeSync over HDMI.

As for those BFG Nvidia G-Sync LED TVs: nah, I want OLED, and those TVs will never get sold here anyway.
 
Just to be clear, the TV would still need to support FreeSync (same as a monitor) in order for it to work.

Samsung make FreeSync monitors, so it wasn't too hard for them to get it working on their newer TVs as well. Not too sure about Sony or Panasonic, but LG should have no excuse, since they make FreeSync monitors too; it is still up to them whether they include it as a feature on their TVs.
 

As someone who had to work with both PhysX and Havok (during the Intel days and post-Intel days), PhysX as a tech was much easier to work with and had better support in my experience (in middleware rather than in games themselves).
 

One of the problems with comparing them was that PhysX, being a more complete physics simulator, was often a bit slower for simpler cases. I can't remember exact numbers off the top of my head, but if you built them into the base game and had, say, 20 rigid-body objects on screen, Havok might have only taken 1-2ms to process while PhysX might have been sitting at 6-7ms; but as you turned it up, Havok would start to choke and either rate-throttle the simulation or start running into the hundreds of ms per frame, while PhysX might have still been sub-16ms. (Garbage numbers, but they should illustrate my point.)
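That crossover (low fixed overhead with poor scaling vs higher fixed overhead with near-linear scaling) is easy to model. A throwaway sketch with invented constants, purely to illustrate the shape of the trade-off; none of these numbers are measurements of either engine:

```python
# Invented cost curves: engine A has a cheap baseline but quadratic growth
# in object count; engine B pays a higher fixed cost but scales roughly
# linearly. Constants are made up for illustration only.

def engine_a_ms(n_bodies):   # Havok-like in the anecdote above
    return 0.5 + 0.002 * n_bodies ** 2

def engine_b_ms(n_bodies):   # PhysX-like in the anecdote above
    return 5.0 + 0.01 * n_bodies

for n in (20, 100, 500):
    print(f"{n:>4} bodies: A = {engine_a_ms(n):6.1f} ms, B = {engine_b_ms(n):5.1f} ms")
```

At 20 bodies, engine A wins comfortably; by a few hundred bodies the curves have crossed and A is blowing the frame budget while B barely notices, which is exactly the pattern described above.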
 

Yes, that is correct; 2019 LG TVs do support variable-refresh gaming: https://www.flatpanelshd.com/news.php?subaction=showfull&id=1546474656
 
No licensing required: ray tracing is part of both DirectX (DXR) and Vulkan.
 