
Poll: Ray Tracing - Do we care?

Ray Tracing - Do you care?


  • Total voters: 183
  • Poll closed.
Caporegime
Joined
23 Apr 2014
Posts
29,524
Location
Bell End, near Lickey End
If we go by their RTX on/off BF comparison, with ROTTR.

Here we have RTX on (light source behind Lara, off screen, just like the tank's flame in BF, which was off screen, to the left):-

rise-of-the-tomb-raider-shadow-quality-003-medium.png


Same shot, with the RTX off :-

RTX_Off.png


:D

RTX off is only available on an OLED panel ;)
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
The entire point of doing rays on the GPU is because CPUs suck at it:
amd-radeonrays-2.jpg


NB: Don't try and convert between AMD's Mrays and Nvidia's Gigarays; they're not even using the same measurement system, like things would ever be that simple lol.

You are correct. AMD counts 360 Mrays on the FirePro 9100, but with real-time rendering as well (if anyone bothers to look at the page, it displays multiple numbers and some fairly technical detail).
Nvidia counts their 10 Gigarays using pre-baked (rasterized) graphics, as in the Star Wars video, which was 1080p @ 24fps.

However, I doubt Nvidia will be selling the RTX 2080 Ti with the same performance as the RTX 8000 at 1/10 of the price. Something else is fishy here.
At least AMD charges a big premium over the Vega 64 due to more HBM memory and some extra support. Not 10 times over, though.
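To put those headline ray counts in context, here's a quick back-of-envelope sketch. The rays-per-pixel figures are illustrative assumptions, not vendor specs:

```python
# Rays/sec needed just to shade every pixel once per frame.
# All rays-per-pixel values are assumptions for illustration.

def rays_per_second(width, height, fps, rays_per_pixel):
    """Total rays per second for a given resolution, frame rate and ray budget."""
    return width * height * fps * rays_per_pixel

# 1080p @ 24fps with a single ray per pixel:
print(rays_per_second(1920, 1080, 24, 1) / 1e9)   # ~0.05 Gigarays/s

# Push it to 60fps with 4 rays per pixel and the budget balloons:
print(rays_per_second(1920, 1080, 60, 4) / 1e9)   # ~0.50 Gigarays/s
```

Even a modest per-pixel budget eats a surprising share of a quoted "10 Gigarays" figure once you add bounces, shadow rays and denoising samples, which is presumably part of why the marketing numbers and real-game frame rates look so different.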
 
Soldato
Joined
11 Jan 2014
Posts
2,754
30+ fps @ 1080p is really good for ray tracing. A 32-core CPU would probably get under 1 fps. Ray tracing will be used for effects like shadows and reflections for the first few years, as that's what it's best at. It would be interesting if they got Cinebench to use it; that would give a good CPU vs GPU comparison.

The last game I ran at 30 fps with all the graphics settings maxed out was Oblivion on a 1900XT. I could live with some lower frame rates if we got a move away from multiplayer towards more single-player, open-world, visually stunning games. It's early days for the software and drivers, so hopefully things move on.
 
Associate
Joined
14 Dec 2016
Posts
958
I'm all for it; it's lovely looking and exciting... but I don't think we are anywhere near ready for mainstream ray tracing yet.

We just don't have the power to make it feasible.

I do expect we will by the end of the decade though, which will be nice!

This is my argument: by the time this stuff is mainstream, these RTX cards will be EOL. Adding it via patches to games is rubbish; they did the same with DX12, adding it as patches to games, and that was rubbish. What is going to make this any different? Oh wait... it's Nvidia, so it's bound to be much better... I'm not convinced myself. I can't see this being very good for a couple of years; once it's baked into games from inception, and the hardware is there to support it fully, then it's going to be great.

But as I say above, I can guarantee everyone buying these cards is going to laud Nvidia's bravado at heralding us into a new age of consumer tech via the godlike power of RTX, whereas if AMD were pulling this stunt, there would be people with pitchforks and torches camped outside their offices.

Let's be honest here: this is the same fiasco as DX12, but even worse, as this is actually added as an extra to the cards, shovelled out the door, and is the main selling point of the cards themselves. At least DX12 was just an added bonus, for all intents and purposes.
 
Caporegime
Joined
17 Mar 2012
Posts
47,734
Location
ARC-L1, Stanton System
Ray tracing is nothing new, and it's never needed special proprietary hardware to run.

I don't understand why we now need massive dedicated hardware to do the same thing that was done 10 years ago.

Can anyone explain that?
 
Associate
Joined
11 Jun 2013
Posts
167
30 fps @ 1080p? Is that like 40 teraflops a frame? The 1080/1080 Ti's are looking better by the hour. Think I'll keep mine where it is for now... Seen a lot of Ti's go on the bay for around 350ish right after the broadcast; gonna be some ****** off people in a month or 2 ^_^
 
Associate
Joined
14 Dec 2016
Posts
958
30 fps @ 1080p? Is that like 40 teraflops a frame? The 1080/1080 Ti's are looking better by the hour. Think I'll keep mine where it is for now... Seen a lot of Ti's go on the bay for around 350ish right after the broadcast; gonna be some ****** off people in a month or 2 ^_^

Quick, hoover up all the cheap 1080 Ti's for £350, then once the cat is out of the bag with this new card, relist them all for £450 :)
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
AMD offer no rivalry. Nvidia dominate and screw us all over.

Nvidia cannot be beaten; it's like the Borg.
So, since the RTX 2080 Ti is not powerful enough for you, AMD provides you with a solution.
Buy a Threadripper CPU and you will be able to fully utilize 4 of those 2080 Tis running with NVLink.

Jensen said yesterday that they will guarantee better mGPU performance and scaling from now on.

I personally blame AMD for not competing: instead of investing money to make an expensive GPU, they are forcing everyone to buy multiple RTX 2080 Tis and buy into their HEDT platform for support. And we all know it costs AMD only $150 to make a 2920X/2950X, giving them the highest profit margin.

Bad AMD..... Scoundrels of the free market....

/sarcasm
 
Caporegime
Joined
4 Jun 2009
Posts
31,109
Yup, it amazes me to see people praising this as the next greatest thing ever!!!!! zOMG!!!! Just people showing their true colours now ;)

It's just like every other gimpwork effect, i.e. performance heavy and not necessarily any better than when game developers do it properly themselves via the game engine (seriously, BF 5's in-game shadows/reflections look like they have all but removed the engine's own effects here; it looks terrible compared to previous games :o)
 
Soldato
Joined
8 Jun 2018
Posts
2,827
If we go by their RTX on/off BF comparison, with ROTTR.

Here we have RTX on (light source behind Lara, off screen, just like the tank's flame in BF, which was off screen, to the left):-

rise-of-the-tomb-raider-shadow-quality-003-medium.png


Same shot, with the RTX off :-

RTX_Off.png


:D

LOL, so true... so true...
Would it have been possible to have dedicated ray tracing cards, separate from the normal GPUs, in the same way a PhysX card used to be?

With NVLink being used now, would the above have been possible?

This would have kept the die size and cost down on the main gaming GPU, and left the choice and expense up to the end user as to whether they wanted to use ray tracing.

Maybe the above is not practical...
Well, yeah, it would have made more sense. Heck, I wouldn't be surprised to see something similar on consoles if Sony/MS take ray tracing seriously enough.
But we did have that at one time with the Ageia PhysX card. Nvidia wouldn't have any of it, and to this day the card is nothing more than a relic, as it's completely shunned via drivers.
 
Soldato
Joined
1 Jun 2013
Posts
9,315
There's a lack of any serious advancement from Nvidia (and why should they bother, in the face of indifferent and lacklustre competition from AMD); they are simply selling the same old architecture. But how to sell the same old same old with nothing new? Invent a new unique selling point: "RAY-TRACING!!!". It's got a geeky name, everyone's heard of it, no one really knows what it means or how it's going to work, but it's "new and improved" to sell the 20xx series cards that are little more than rebadged Pascal. Sure, the developers get some new development tech to play with before it's useful, plus help and cash from Nvidia to implement ray tracing as spot effects, just like PhysX, but it's mostly marketing to try and get people to buy overpriced old tech in the face of the bursting crypto-currency bubble.
 
Last edited:
Caporegime
Joined
18 Oct 2002
Posts
33,188
There's absolutely nothing wrong with proprietary hardware; Nvidia's CUDA cores are proprietary, and proprietary hardware can speed up open-source software with no problems. There is no problem adding ray tracing specific cores. Why have faster GPU cores today than the ones from 10 years ago? Obvious answer: because they are faster. If a GPU 10 years ago cut performance by 90% doing 10 ray traced beams per scene, and a 1-year-old GPU drops 70% of its performance doing the same, then adding accelerators that are better optimised for ray tracing and reduce that performance loss to, say, 40% is worth doing, as long as the new hardware takes up less space (by a fair margin) than just adding more normal GPU cores to reach the same performance level.
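The trade-off described above can be sketched in a few lines. The loss percentages are the post's own illustrative numbers, not benchmarks, and the base frame rate is a made-up figure:

```python
# Toy model: frame rate left over after paying a ray tracing cost.
# Loss percentages are the post's illustrative figures, not real data.

def fps_with_rt(base_fps, loss_pct):
    """Frame rate after losing loss_pct percent of performance to ray tracing."""
    return base_fps * (100 - loss_pct) / 100

base = 100  # hypothetical fps with ray tracing disabled

print(fps_with_rt(base, 90))  # 10-year-old GPU: 90% loss -> 10.0 fps
print(fps_with_rt(base, 70))  # recent GPU:      70% loss -> 30.0 fps
print(fps_with_rt(base, 40))  # with RT cores:   40% loss -> 60.0 fps
```

If the accelerator block that buys the jump from a 70% loss to a 40% loss is smaller than the extra shader cores needed to reach the same frame rate, the die space is well spent; that's the whole argument.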

Making hardware that can accelerate industry-standard ray tracing code, without impeding what other hardware makers are doing and without forcing software makers to cater to code locked in to one manufacturer, wouldn't be a bad thing at all. I doubt Nvidia has done that, though. The hardware can likely accelerate most general ray tracing code, as their GameWorks version won't be that far removed from industry standards; just with some code baked in that makes it run worse on other hardware, or not run at all.

Adding new hardware for accelerating features isn't a bad thing in general for anyone. For tessellation to even start being programmed for, the hardware needed to be there; then, once game devs started planning to program for it, adding more hardware to actually use it effectively became worthwhile. I won't hit Nvidia for adding accelerators to push the industry; I will hit them for pushing software lock-ins, and for generally pushing flash over substance with most of their GameWorks effects. PhysX being added in after the fact, and running like **** on CPU and AMD hardware on purpose for years, held the industry back years.


The question for Nvidia users is: how much die space does the ray tracing take up, is it mostly wasted or does it bring value yet, and will Nvidia actually implement it in a way that is useful, or will it kill performance and provide bad-looking effects that stand out from the rest of the scene in such a way as to actually worsen the overall image? (Having shiny lighting in one part of a scene and very different lighting in the rest can look worse than 'worse' lighting applied evenly across the scene.)

I suspect the answer is: card prices are increasing, performance gains don't look great, and the performance hit for using ray tracing still looks huge, for effects that aren't being implemented well.

On ray tracing itself: since the really early 90s, ray tracing has been held up as the ultimate quality effect, and compared to early/mid-90s rasterisation it was night and day. The thing is, ray tracing hasn't moved on, because in reality it can't move on; it's the end game and it can't really be done better. But the other side, rasterisation, with every trick learned over 25 years, means the gap is now exceptionally small.

Humbug is talking about how the reflection was more accurate if you look carefully... but the other 99% of the scene looked pretty much the same. Moreover, when you're actually moving and playing a game, and not spending time just staring at a reflection under a bridge, it's nearly unnoticeable. That is the same thing I said about PhysX: estimating where a piece falls, at what rate, and where it ends up really makes no difference to getting a realistic outcome, because you can estimate well enough that the two look nearly identical.

If you round gravity up to 10 m/s² in a physics engine to make calculations easier, you can't physically feel the difference when you play. If a piece of wall falls 3 inches to the left and bounces twice, or falls 3 inches to the right and bounces 4 times, again in reality it makes no difference. A huge amount of power to get the 'right' answer, when maybe 30% of the power can get you 99.8% of the same result, is just not particularly efficient.
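The gravity rounding is easy to check numerically. A minimal sketch, with an arbitrarily chosen drop height:

```python
import math

def fall_time(height_m, g):
    """Time to fall from rest through height_m metres: t = sqrt(2h/g)."""
    return math.sqrt(2 * height_m / g)

t_exact = fall_time(5.0, 9.81)   # real-world g
t_round = fall_time(5.0, 10.0)   # rounded g

print(f"exact:   {t_exact:.3f} s")
print(f"rounded: {t_round:.3f} s")
print(f"error:   {abs(t_exact - t_round) / t_exact:.1%}")  # ~1%
```

A roughly 1% timing error on a falling chunk of debris is well below what anyone can perceive mid-firefight, which is the point being made.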

The strange thing is that, since the early 90s when ray tracing was held up as the ultimate goal and a coming game changer, the fundamental belief hasn't changed, even though the amount of game-changing it would have achieved in the early 90s compared to now is obviously radically different.

That video Humbug posted of the gems: ultimately, for all the realistic lighting, the entire scene felt fake to me. It didn't feel like the real world; it seemed too shiny, everything was 'clean', the metal disc that the stand was rotating on looked fake, basically the whole thing looked fake. And the human eye couldn't tell real reflections in the gems from estimated ones in the first place, because the detail is far too small and there are so many slightly different reflections that, if each one was switched to a different edge, we really couldn't tell.
 
Last edited:
Soldato
Joined
6 Jun 2009
Posts
5,436
Location
No Mans Land
The lack of any serious advancement from Nvidia (and why should they in the face of indifferent and lacklustre competition from AMD) they are simply selling the same old architecture. But how to sell the same old same old with nothing new? Invent a new unique selling point of "RAY-TRACING !!!". It's got a geeky name, everyone's heard of it, no one really knows what it means or how it's gonna work, but it's "new and improved" to sell the 20xx series cards that aren't quite rebadged Pascal. Sure, the developers get some new development tech to play with before it's useful, and help and cash from Nvidia to implement ray tracing as spot effects just like physX, but it's mostly marketing to try and get people to buy overpriced old tech in the face of the bursting crypto-currency bubble.

tiz9p_zpsg7ju6omt.jpg


:p
 
Caporegime
Joined
17 Mar 2012
Posts
47,734
Location
ARC-L1, Stanton System
@drunkenmaster as far as I can tell, nVidia have added nothing to ray tracing. It already exists in some game engines to the same extent, some of them years old, and it does not need dedicated hardware to run.

It's as if nVidia have taken existing technology, recoded it to require their hardware, and then made it run really not very well at all, for some planned obsolescence.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
It's strange, though. Why ray tracing? This release from Nvidia seems to be an attempt to throw shade at Intel, as this was Intel's territory.
I wouldn't be surprised in the slightest if Intel's GPU can actually do RT at 1080p at or above 60fps.
 
Last edited:
Soldato
Joined
20 Jun 2011
Posts
3,675
Location
Livingston
Ray tracing = on-the-fly volumetric light rendering.

Amazing if you're rendering still images and have time to appreciate and manipulate composition, but as a global setting it is an entirely pointless performance hit.
 