
Poll: Ray Tracing - Do we care?

Ray Tracing - Do you care?


  • Total voters
    183
  • Poll closed.
Soldato
Joined
29 May 2006
Posts
5,353
All very clever, and no doubt Nvidia will sell loads of Turing cards to the gullible thinking that their games will now look like this. I'm sure one day they will, but it's going to take numerous generations of Nvidia RTX cards before they come anywhere near that.
It's not just for games though. It's a great development for developers, and it's great for end consumers like me who like to mess around with ray tracing at home. It could also benefit those without ray tracing cards: many of the faked, pre-baked shadow and light effects in current games are done via ray tracing during development.

Take game engines like Unreal or Unity, for example: many devs use ray tracing to set up all the lights and shadows, and once they have it looking good they pre-bake it for games. Having a good RT card during development will speed this up and allow more time to be spent getting better lights and shadows set up for non-RT games. As D.P. said, better RT is great for content creation. Even if you do not use RT directly, you should see the benefits from it.
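The pre-bake workflow described above can be sketched very roughly: for each lightmap texel, cast a shadow ray toward the light and store the resulting direct lighting in a static map the game reads later. This is a toy illustration, not how Unreal or Unity actually implement it; all names (`bake_lightmap`, `occluded`) are made up for this sketch.

```python
# Toy sketch of lightmap baking: one shadow ray per texel, spheres as blockers.
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def occluded(point, light_pos, blockers):
    """Shadow-ray test: does a sphere blocker sit between the texel and the light?"""
    to_light = tuple(l - p for l, p in zip(light_pos, point))
    dist = math.sqrt(dot(to_light, to_light))
    d = tuple(x / dist for x in to_light)  # unit direction toward the light
    for centre, radius in blockers:
        oc = tuple(p - c for p, c in zip(point, centre))
        b = dot(oc, d)
        c = dot(oc, oc) - radius * radius
        disc = b * b - c  # discriminant of the ray/sphere quadratic
        if disc >= 0:
            t = -b - math.sqrt(disc)  # nearest hit along the ray
            if 1e-6 < t < dist:       # hit lies between texel and light
                return True
    return False

def bake_lightmap(texels, normals, light_pos, intensity, blockers):
    """Direct Lambertian lighting per texel; zero where the shadow ray is blocked."""
    baked = []
    for p, n in zip(texels, normals):
        if occluded(p, light_pos, blockers):
            baked.append(0.0)
            continue
        to_light = tuple(l - pp for l, pp in zip(light_pos, p))
        dist = math.sqrt(dot(to_light, to_light))
        l_dir = tuple(x / dist for x in to_light)
        baked.append(intensity * max(0.0, dot(n, l_dir)))
    return baked
```

With RT hardware the same shadow rays run in real time instead of in an offline bake, which is the speed-up the post is describing.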
 
Associate
Joined
15 Feb 2015
Posts
1,064
I care.

This looks to be the first time we've had something that provides real-time (or at least close to it) ray-tracing capabilities in consumer hardware.

Also, you've got to start somewhere, and by putting the capability out there, game developers now at least have the option to use it.

I don't expect anything to happen overnight but I think this is 'a good thing'.
 
Soldato
Joined
19 Feb 2007
Posts
14,343
Location
ArcCorp
For gaming specifically, no. Stick with the more traditional forms of game making. RT will eventually be useful, but for the next few years it will be a gimmicky option, much like PhysX and HairWorks, that will cripple most systems with not a lot of visual payback.
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
By the time ray tracing is common in games, and not a one-off or of limited use like in Metro Exodus, we will probably be into 2020 with a more powerful generation of cards.
Trying to jump on the bandwagon now, thinking it's future-proof, is going to be the big mistake.

That's the problem: they're always going to be looking for ways to make hardware do things differently to how it was done on the last gen and beyond. If they didn't, we'd soon hit a point where there'd be no point in upgrading.
In theory someone with a 1080 or 1080 Ti could spend the next 5-plus years playing at 1080p or 1440p comfortably, and they don't want that. One way or another they'll find a way to scupper performance.

That is the funny bit. Given that the consumer Vega 64 is exactly the same chip found in the Fire Pro WX 9100, and the latter is advertised for GPU-rendered ray tracing, who knows.

What we do know is that the V64 is great at compute tasks and has the grunt in that department, even if its gaming performance is lacking compared to a similar (compute) GPU.

Given AMD's history of over-engineering their GPUs ahead of their time, I wouldn't be surprised if a driver update came out enabling hardware ray tracing.

Also, we know that AMD has announced that RadeonRays 2.0 is backward-compatible all the way to the Hawaii-based FirePro, which used the 290X, and already supports ray tracing on DX12, Vulkan, Embree and OpenCL.

Yet that might not be possible with the GCN architecture, just as it isn't with the plain CUDA cores found in e.g. Pascal. However, we do know Vega is the last GCN card, as Navi has already been announced with a completely new core architecture, and its successor with another new core architecture again.

Time will tell.

It'll be great if Vega can handle ray tracing in games that use it, but if not, let's hope it's something we can switch off in a game's graphics menu, or something AMD can lessen the impact of in the driver features, as they have with tessellation.
As I've been proving to myself with my RX 480 at UW1440, a tweak in a game's main menu settings is all I've needed to make a game run a lot better, and usually it can be done for a very small visual penalty. So if that continues to be the case with upcoming ray tracing games, not going RTX is not that big a deal.

Didn't I read that lower-end Nvidia GPUs, like the 1060 replacement, will still use the GTX moniker? Does that mean they don't have RT capability, or is it still there and GTX is only being used to set the high end apart from the lower end?

For gaming specifically, no. Stick with the more traditional forms of game making. RT will eventually be useful, but for the next few years it will be a gimmicky option, much like PhysX and HairWorks, that will cripple most systems with not a lot of visual payback.

Sounds about right.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
That's the problem: they're always going to be looking for ways to make hardware do things differently to how it was done on the last gen and beyond. If they didn't, we'd soon hit a point where there'd be no point in upgrading.
In theory someone with a 1080 or 1080 Ti could spend the next 5-plus years playing at 1080p or 1440p comfortably, and they don't want that. One way or another they'll find a way to scupper performance.



It'll be great if Vega can handle ray tracing in games that use it, but if not, let's hope it's something we can switch off in a game's graphics menu, or something AMD can lessen the impact of in the driver features, as they have with tessellation.
As I've been proving to myself with my RX 480 at UW1440, a tweak in a game's main menu settings is all I've needed to make a game run a lot better, and usually it can be done for a very small visual penalty. So if that continues to be the case with upcoming ray tracing games, not going RTX is not that big a deal.

Didn't I read that lower-end Nvidia GPUs, like the 1060 replacement, will still use the GTX moniker? Does that mean they don't have RT capability, or is it still there and GTX is only being used to set the high end apart from the lower end?



Sounds about right.

We shall see, and we shall also see what its performance will be like, though I am baffled why AMD has published the performance of the 9100 doing real-time rendering and ray tracing at the same time (360M rays/s), while Nvidia has only published ray tracing on pre-baked graphics (6G rays/s on the smaller Quadro, but no real-time rendering).
Kinda annoying when we can't have the same scenario to use as a reference for comparisons.
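As a rough sanity check on the two throughput figures quoted above (360M rays/s and 6G rays/s), here is the back-of-envelope arithmetic for how many rays per pixel per frame each would allow at 1080p and 60 fps. Purely arithmetic; the input numbers are the thread's claims, not verified specs.

```python
# Rays-per-pixel budget implied by each quoted throughput at 1080p / 60 fps.
pixels = 1920 * 1080  # ~2.07M pixels at 1080p
fps = 60

for name, rate in (("360M rays/s", 360e6), ("6G rays/s", 6e9)):
    per_pixel = rate / fps / pixels
    print(f"{name}: {per_pixel:.2f} rays per pixel per frame")
```

Even the larger figure works out to under 50 rays per pixel per frame, which helps explain why the two vendors quote such different test scenarios.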
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
Kinda annoying when we can't have the same scenario to use as a reference for comparisons.

That's how Nvidia like it. Having the latest and greatest features performing best on their high-end products reinforces the halo image they want products like the --80 Ti and Titan to have, and if they can have said feature impacting negatively on the competition, all the better, even if it also hurts their own lower-end and last-gen cards, as we've seen from them in the past.

I'm more than happy to stay with Vega and FreeSync rather than give my money to them. It's a shame more people won't do the same and stick with their 9/10 series or RX cards. The way they are pricing cards higher and higher will continue up until the point where people say enough is enough, and by that time they won't really care, as they'll have moved their focus elsewhere, with AI etc.
As consumers we've brought this on ourselves.
 
Soldato
Joined
18 May 2010
Posts
22,376
Location
London
No.

They will make games as they are currently making them. RT will be another layer which can be toggled on/off in the graphics settings.

That's how I see it being done.

No need to go out of their way to accommodate this or that card. You either enable or disable it depending on your hardware.

A bit like how in Wolfenstein the devs recommended Nvidia users turn off some settings in the graphics options, as they were tuned for Vega hardware.
 
Last edited:
Caporegime
Joined
18 Oct 2002
Posts
32,618
The RTX cards won't save content creators any work if said creators have to accommodate mid-tier RTX cards and lower-tier GTX cards at the same time; it's creating work!


You misunderstand.
Content creators do something like place a light in a game level and tune some parameters. Currently they either don't get any real-time feedback, or the real-time feedback is a very poor approximation with tighter limits on the number of lights and effects. They then render the static lightmaps offline; this might use a GPU's compute capabilities, but even so it can take several minutes.
With RTX, since the changes can be viewed in real time, the content creator can just push a slider up and down and see immediately what it looks like, at the same quality as the final lightmaps. They can add more lights and more complex effects and still see it all in real time. Creating the static lightmaps is then near-instantaneous.

Everyone then wins, even though the game is not using real-time RT in the slightest.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
That's how Nvidia like it. Having the latest and greatest features performing best on their high-end products reinforces the halo image they want products like the --80 Ti and Titan to have, and if they can have said feature impacting negatively on the competition, all the better, even if it also hurts their own lower-end and last-gen cards, as we've seen from them in the past.

I'm more than happy to stay with Vega and FreeSync rather than give my money to them. It's a shame more people won't do the same and stick with their 9/10 series or RX cards. The way they are pricing cards higher and higher will continue up until the point where people say enough is enough, and by that time they won't really care, as they'll have moved their focus elsewhere, with AI etc.
As consumers we've brought this on ourselves.

It's more complicated than that, too. See my post here.
LG 34GK950G, 3440x1440, G-Sync, 120Hz

Basically, Nvidia users are going to be stuck with G-Sync SDR for years to come with their "faster" graphics cards, while the rest will have fewer FPS (not that it matters with *sync) but better image quality at a lower price.
 
Caporegime
Joined
17 Mar 2012
Posts
47,659
Location
ARC-L1, Stanton System
That's the problem, They're always going to be looking for ways to make hardware do things differently to how it's done on the last gen & beyond, If they didn't we'd soon hit a point where there'd be no point in upgrading.
In theory someone with a 1080 or 1080ti could spend the next 5 plus years playing 1080p or 1440p comfortably and they don't want that. One way or another they'll find a way to scupper performance.



It'll be great if Vega can handle ray tracing in games that use it but if not let's hope it's something we can switch off in a games graphics menu or something AMD can lessen the impact of in the driver features as they have with tessellation.
As I've been proving to myself with my RX480 at uw1440, a tweak in a games main menu settings is all I've needed to do to make a game run a lot better and usually it can be done for a very small visual penalty so if that continues to be the case with upcoming ray tracing games not going RTX is not that big a deal.

Didn't I read that lower end Nvidia gpu's like the 1060 replacement will still use the GTX moniker? Does that mean they don't have the RT capability or is it still there and GTX is only being used to set the high end apart from the lower end?



Sounds about right.

AMD's cards can handle ray tracing just fine, every bit as well as nVidia's cards.

The worry is that, with nVidia pushing it and their almost complete domination of the graphics space, they are going to 'GameWorks' it, i.e. deliberately make it run badly on competitors' GPUs, which from 2020 will include Intel.

Lots of games already use Ray Tracing.

Killzone 4 (2013) for Real-Time Reflections.
Crysis 3, also from 2013, for Real-Time Reflections and Global Illumination based lighting.
Dirt Showdown again: Real-Time Reflections and Global Illumination based lighting.

The last two are AMD-sponsored titles, and AMD are the ones who initially brought ray-traced effects into those games. Remember that? They were especially loud about it with Dirt Showdown.

It's nothing new. nVidia are not even the first, but they will claim it as their own and make sure they are best at it; that's the real concern.
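The "Real Time Reflections" effects listed above all rest on the same basic step: reflect the incoming view direction about the surface normal and trace the new ray to find what the surface mirrors. A minimal sketch of just that step; the `reflect` helper is made up for illustration, not taken from any of those games.

```python
# Reflect a view direction d about a unit surface normal n: r = d - 2(d.n)n.
def reflect(d, n):
    """Mirror direction d about unit normal n."""
    k = 2 * sum(di * ni for di, ni in zip(d, n))
    return tuple(di - k * ni for di, ni in zip(d, n))

# A ray looking straight down at an upward-facing floor bounces straight up:
print(reflect((0, 0, -1), (0, 0, 1)))
```

A renderer then traces the reflected ray into the scene; the expensive part is that trace, which is what dedicated RT hardware accelerates.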
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
AMD's cards can handle ray tracing just fine, every bit as well as nVidia's cards.

The worry is that, with nVidia pushing it and their almost complete domination of the graphics space, they are going to 'GameWorks' it, i.e. deliberately make it run badly on competitors' GPUs, which from 2020 will include Intel.

Lots of games already use Ray Tracing.

Killzone 4 (2013) for Real-Time Reflections.
Crysis 3, also from 2013, for Real-Time Reflections and Global Illumination based lighting.
Dirt Showdown again: Real-Time Reflections and Global Illumination based lighting.

The last two are AMD-sponsored titles, and AMD are the ones who initially brought ray-traced effects into those games. Remember that? They were especially loud about it with Dirt Showdown.

It's nothing new. nVidia are not even the first, but they will claim it as their own and make sure they are best at it; that's the real concern.

True. Also, RadeonRays is under GPUOpen and works the same on AMD and Nvidia cards, as it supports CUDA programming.
And that shows up the sad and dangerous monopolistic Nvidia tactics.
 
Soldato
Joined
18 May 2010
Posts
22,376
Location
London
Nvidia cannot and will not defeat the Open Source community.

It just won't happen. By trying to black-box the technology, they will always lose out to the open source alternatives (as long as those are comparable in quality).

They would make it a runaway success by open-sourcing it though, which would include AMD.
 
Caporegime
Joined
17 Mar 2012
Posts
47,659
Location
ARC-L1, Stanton System
True. Also, RadeonRays is under GPUOpen and works the same on AMD and Nvidia cards, as it supports CUDA programming.
And that shows up the sad and dangerous monopolistic Nvidia tactics.

Unfortunately yes.

Again, I'm not the best hobbyist to make demos about this, but from this timestamp... this is real-time reflections; this is ray tracing.

https://youtu.be/exW1SJUSr90?t=2m1s
 
Permabanned
Joined
12 Sep 2013
Posts
9,221
Location
Knowhere
Imho, even if someone like LG wanted to use the G-Sync HDR module, they are being realistic about it adding an extra ~$800 to the price of the monitor. Including an active fan.

And I wrote $800 because the whole module doesn't cost the "$500" everybody quotes.
It seems many forgot to read the same article they are quoting: the "$500" is the estimated price for the FPGA only, assuming Nvidia can buy them at an 80% discount off the street price. The rest of the board's BOM costs another $250-$300 according to the same article. And that's without adding Nvidia's profit margins and manufacturing and design costs.

I've seen a lot of posters quoting G-Sync module component costs like these, but I don't believe they're even close to accurate. Where are these figures from? They don't make sense.

So, what would be preferable from now on? Having a series of G-Sync HDR monitors at £2000+ while their FreeSync siblings are at half the price?
Or do you want the prices to be diluted, with the G-Sync monitors sold at a loss (~£1500) and the FreeSync ones at inflated prices to cover the damage, effectively having FreeSync users pay the cost of the G-Sync modules?

A monitor company increasing the price of their adaptive sync monitors to cover the overheads of their G-Sync monitors is never going to happen, so I'm not sure what point this question is meant to make. It'll never be an option.

AMD's cards can handle Ray Tracing just fine, very bit as well as nVidia's cards.

The worry is that with nVidia pushing it and their almost complete domination of the graphics space they are going to 'Game Works' it IE deliberately making it so it runs bad on competitors GPU's, which from 2020 will include Intel.

Lots of games already use Ray Tracing.

Killzone 4 (2013) for Real Time Reflections
Crysis 3 also from 2013 for Real Time Reflections and Gloabal Illumination based lighting.
Dirt Showdown again, Real Time Reflections and Gloabal Illumination based lighting.

The last two are AMD sponsored titles and they are the ones who initially brought in Ray Traced effects in those games, remember that? especially loud about it with Dirt Showdown.

Its nothing new, nVidia are not even the first but they will claim it as their own and make sure they are best at it, that's the real concern.

That's what I'm expecting to happen too, hence the part about it being as damaging to their older cards' performance as it is to AMD's. It'll be how they entice Pascal owners to upgrade; we've seen it before and we'll undoubtedly see it again.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
True. Also RadeonRays is under GPUOpen it works the same on AMD and Nvidia cards, as it supports Cuda programming.
And that shows the sad and dangerous monopolistic Nvidia tactics.


Nvidia's RTX technology uses the industry-standard Microsoft DXR API.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
I've seen a lot of posters quoting G-Sync module component costs like these, but I don't believe they're even close to accurate. Where are these figures from? They don't make sense.

The FPGA used on the HDR G-Sync module has a street price of $2600. The rest of the parts on the module cost $250-300, as it's a bit more beefed up than the normal G-Sync module, which costs $250. Do the maths.
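Doing the maths with the thread's own figures: a $2600 street-price FPGA bought at the assumed 80% discount mentioned earlier, plus $250-$300 for the rest of the board. All inputs are claims from this thread, not verified prices.

```python
# Implied G-Sync HDR module cost from the poster's figures.
fpga_street = 2600
fpga_cost = fpga_street * 20 // 100   # paying 20% of street price = $520
board_low, board_high = 250, 300      # quoted rest-of-board BOM range

total_low = fpga_cost + board_low
total_high = fpga_cost + board_high
print(f"Implied module cost: ${total_low}-${total_high}")
```

That range lands right on the ~$800 figure quoted earlier in the thread, before Nvidia's margin and design costs are added.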
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
Nvidia cannot and will not defeat the Open Source community.

It just won't happen. By trying to black-box the technology, they will always lose out to the open source alternatives (as long as those are comparable in quality).

They would make it a runaway success by open-sourcing it though, which would include AMD.


Nvidia have no desire to defeat open source; they are big proponents of it. Nvidia are pushing OpenGL and Vulkan far more than AMD is.

With ray tracing, Nvidia has already provided an open source materials library, MDL, which is used by the likes of Adobe and Unreal.

RTX is being supported in open source projects like Blender.
 