Discussion in 'Graphics Cards' started by StarShock, Aug 16, 2018.
My take is that those who jump to the 20xx cards will soon be turning RTX off for decent fps, as fancy shadows, lighting and reflections won't mean that much in games if they kill the performance. Just my opinion.
Are there any videos that categorically show how amazing ray tracing is? i.e. a comparison of a game with and without it on?
From the clips I’ve seen it looks great, but it's probably a few years away from being useful. The hit on fps looks just too big to bother using.
I think it's more likely to be implemented in such a way as to not kill the game's performance.
Hopefully g-sync will help too.
I really don't think this is something devs or even NV want if it's going to make a game sluggish/unplayable. Hopefully from both the hardware and software side it'll be a decent implementation for now.
People need to be realistic though. 4K for example is a struggle anyway, so adding additional eye candy/realism at that res isn't going to be easy.
I'm working with a company on scientific calcs at the moment. We have different algorithms. For example, if you want an accurate result the calc could take 2 hours, but we have a short version that gives us pretty decent results after 30 mins. So while RT will no doubt be much better and more accurate in 10 years' time, I suspect this is just the first step, but it'll be a decent step.
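The accurate-but-slow vs quick-but-decent tradeoff is the same one Monte Carlo methods make (and path tracing is itself a Monte Carlo method). A toy illustration, not the poster's actual scientific code, assuming nothing beyond the Python standard library:

```python
import math
import random

def estimate_pi(samples: int, seed: int = 0) -> float:
    """Monte Carlo estimate of pi: the fraction of random points in the
    unit square that land inside the quarter circle, times four."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / samples

quick = estimate_pi(1_000)      # fast, rough answer
slow = estimate_pi(1_000_000)   # much slower, much closer to math.pi
```

Same algorithm, just a sample-count knob: the quick run is usable almost immediately, the long run converges on the accurate answer, which is roughly the relationship between today's reduced-ray RTX and a full offline render.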
Thing is IMO, you need to get innovations out there and used before improving on them. Waiting for them to be perfected is not the right way.
I found this
So yes, the reflections look great, but it doesn't seem to make a great deal of difference to the overall look of the game. I would prefer not to have a ridiculously low-resolution explosion/fire effect, for a start....
If it kills performance to the degree we have seen, it is just going to be another gimmick that isn't worth using until the hardware is actually powerful enough for it.
It'll be difficult to know until I see games utilising it to its fullest.
Not sure how I can have any opinion until then.
The more I look at it, I actually think it's kinda meh. Definitely a great start and I'm happy it's evolving, but currently I'm not feeling like I'm missing out on anything. In 5 years it will be something for everyone at all price brackets, at 120fps at 1440p.
I still don't think the RTX-off demos are a true comparison; look at the flamethrower reflection on the ground when they switch off RTX at around 2:40: there's literally no light whatsoever. I'm not knocking the reflections in the puddles, which look good, I just think the comparison is bad.
Actually this is the first time I've noticed what looks like compression 'macro blocks' in the flamethrower, I'm hoping that's just youtube being youtube.
I always revert to this video; he explains it well and basically says we are still a long way from the CPU/GPU power needed for full RT.
Then we should have movie, or TV, level graphics.
BFV certainly opened my eyes about how much power real-time ray tracing needs. When they were discussing BFV performance in demos at the Turing launch event, I thought the game was fully ray traced. I didn't realise they were only using it for reflections, and even to get that playable at a decent frame rate they had to reduce the number of rays. I wonder whether we'll see hardware capable of running a fully ray traced game in the next 2 years?
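The scale of the problem falls out of back-of-the-envelope arithmetic: even a single primary ray per pixel at 1080p/60fps is over 120 million rays per second, before any bounces, shadows or reflections, which is why cutting the ray count was necessary. A quick sketch:

```python
def rays_per_second(width: int, height: int, fps: int,
                    rays_per_pixel: float) -> float:
    """Ray budget needed for a given resolution, frame rate and
    per-pixel ray count (primary rays only, no bounces)."""
    return width * height * fps * rays_per_pixel

# One ray per pixel at 1080p/60fps:
full = rays_per_second(1920, 1080, 60, 1.0)      # 124,416,000 rays/s
# Tracing reflections for only a quarter of the pixels:
reduced = rays_per_second(1920, 1080, 60, 0.25)  # 31,104,000 rays/s
```

And that is before secondary bounces, which multiply the budget again, so a "full" real-time ray tracer needs billions of rays per second.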
It really should be possible to do a hybrid ray tracing implementation, with reduced ray coverage, that does more than just reflections for the kind of performance hit seen in BFV. I think people overlook just how impressive real-time lighting of even the quality of Quake 2/3's pre-baked shadow maps would be, never mind something higher resolution with additional bounce passes, real-time caustics, etc.
I mean even I managed to come up with a way (nothing particularly revolutionary) of doing the first pass dynamically in real time at 180-300fps that normally takes minutes to pre-bake :s
That is far, far short of a real ray tracing implementation (no bounce passes or material interaction, etc.) and has lots of accuracy issues without dedicated hardware; it is running on the CPU doing the equivalent of about 47 million rays per second.
EDIT: While perfectly possible to do there isn't even any falloff, etc. modelled in the above video but it was a crude proof of concept test just to see if it was possible at all.
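For context, that 47 million rays/second figure can be converted into per-pixel coverage at a given resolution and frame rate, which shows why it only stretches to a first lighting pass rather than full ray tracing:

```python
def rays_per_pixel_per_frame(rays_per_sec: float, width: int,
                             height: int, fps: int) -> float:
    """How many rays per pixel per frame a given ray throughput buys."""
    return rays_per_sec / (width * height * fps)

# ~0.38 rays per pixel per frame at 1080p/60fps:
coverage_1080p = rays_per_pixel_per_frame(47e6, 1920, 1080, 60)
# ~0.85 at 720p/60fps:
coverage_720p = rays_per_pixel_per_frame(47e6, 1280, 720, 60)
```

Under one ray per pixel per frame means rays have to be shared across pixels (or frames), which is exactly the reduced-coverage territory the posts above describe.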
If ray tracing is in the DirectX 12 specification through DXR, and Nvidia is just implementing a standard Microsoft DX12 feature, then AMD is surely joining the party soon.
Only if they have hardware to run it.
Pascal + RTX = 0
Having said that, I have seen RTX running in BFV, and to be honest the extra cost incurred cannot be justified on a bang-for-buck basis.
Yes, RTX is nice, but it should not add £300 to £400 to the cost of a high-end card like the 2080 Ti.
Yup, as I see it, DXR is just a set of instructions which can be called by software developers. The hardware drivers then do the job of taking those instructions and running them appropriately at the hardware level. For a ray trace instruction, nVidia has dedicated hardware to run it in RTX cards (hence the speed benefit). Non-RTX cards can still calculate the instruction, just in a less dedicated part of the chip (as long as the drivers are written for it, I presume). AMD will likely develop cards that are more efficient at those ray trace calculations.
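That dispatch idea can be sketched abstractly (hypothetical names throughout; the real DXR API is C++/HLSL and works through driver capability tiers, not a flag like this): the application calls one API entry point, and the driver routes it to dedicated units if they exist, otherwise to a slower shader-core fallback.

```python
class Gpu:
    """Hypothetical device descriptor; 'has_rt_cores' stands in for a
    real driver capability report (e.g. a DXR support tier)."""
    def __init__(self, name: str, has_rt_cores: bool):
        self.name = name
        self.has_rt_cores = has_rt_cores

def trace_rays(gpu: Gpu) -> str:
    """Single API entry point; the 'driver' picks the execution path."""
    if gpu.has_rt_cores:
        return "dedicated RT hardware"   # fast path on RTX-class cards
    return "shader-core fallback"        # same result, computed more slowly

# The game issues the same call either way:
rtx_path = trace_rays(Gpu("RTX 2080", True))
gtx_path = trace_rays(Gpu("GTX 1080", False))
```

The point of an API like this is exactly what the post describes: developers target one instruction set, and each vendor competes on how efficiently its hardware executes it.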
But the whole ray trace thing … I'm just not seeing it yet. I can't imagine it being the norm for a good while, nor that it needs to be the norm either. Other than for realistic 3D games or ray trace rendering, is it going to benefit non-3D work like photo/video encoding? I can't imagine it will.
Hence why I bought a 1080Ti at the RTX release … considering I do photo/video more than gaming, RTX seems a waste for me.
Interesting: Radeon VII has so much compute performance sitting unused in games; can't AMD push ray tracing acceleration through those transistors?
I think if I didn't game at 4k on a 65 inch TV, I absolutely would care about ray tracing. However, given the screen size I game at, I feel resolution is probably more important than ray tracing.
If it was a smaller screen, I'd prefer better reflections and lighting.
It's similar to 4k Skyrim vs 1080p Skyrim with ENB. I'd pick the ENB. Obviously RT isn't that drastic at present, but I think it will be very soon as more developers learn to use it (if they do).
I remember John Carmack talking to Ryan Shrout years ago about testing ray tracing on Fermi GTX cards. Obviously nowhere near the performance of what we have now, but I think it was something he was a proponent of for a long time.
Using the Titan V as a basis, even if they could, the results would be disappointing.
The Titan V is a bit slower than a 2060 at RTX workloads despite having hardware that can run RTX.
I wonder whether, further down the line, RTX-type cores will be developed to run non-RT instructions in a reasonably efficient manner for good performance.
It wouldn't surprise me if AMD came out with some sort of hybrid core that could do both, and leverage its sheer compute performance as well. That type of setup would mitigate the additional costs of dedicated parts of hardware on the chip.