
Really getting fed up with the posts stating RTX/DLSS does not work this gen.

It’s pants.

Maybe, but it has to start somewhere.
Just think where we'd be if AMD hadn't pushed TruForm when they did, or the first time vectors were used instead of sprites, or the first parallax-scrolling game.
These technologies have to start somewhere, and now that the hardware is just about capable of ray tracing, NVidia have introduced it to the world in the form of RTX.
It will only improve with future generations.
 
What game/s did you play with DLSS on?
Did you need the performance uplift?

I was talking more about ray tracing - if you want to spend extra to beta test ray-traced shadows, then good for you. It was unnoticeable in BF5 when I played it. DLSS made things look worse, so it was better left off. As I said, it’s just pants.
 
I don't think proper real-time ray tracing is anywhere near viable yet. It needs a huge leap in performance, which maybe won't come until we move on from silicon-based chips.
 
My major gripe with RTX is that it's a tech that runs contrary to the long-term trend of ever-increasing resolutions and monitor refresh rates. It looks to me like you can't have ray tracing without sacrificing resolution, frame rate or image quality (via DLSS) - or all three. Most of us aren't on 1680x1050 or 1080p 60 Hz monitors anymore. Those that are would, I suspect, be more awed by the jump in sharpness to 1440p or 4K at 100 Hz+ than by dolling up their soft-as-heck 60 Hz experience with ray tracing.

Throw in the rumours that NVidia's RTX tech will be obsolete and unsupported within 1-2 years, or simply massively overtaken in ray-tracing performance, and it sounds like this first-gen RTX series will soon have only its traditional rasterization performance going for it, except in Quake 2 RTX.
 

“Use ray tracing effectively”

That is still to be seen - all other attempts to do ray tracing using traditional cores have resulted in poor performance.

Clearly that video is just a hit piece.

Because in reality, AMD and Nvidia ray tracing will coexist. Nvidia is not using some proprietary software for ray tracing - it’s using the standard DX12 DXR feature as specified by Microsoft. The acceleration is on the hardware side, not the software side. Unless AMD uses a proprietary API to implement the software side, ray tracing will still work on Nvidia cards in the future because it’s enabled through the DX12 driver.
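
To put that in concrete terms, the check on the application side is completely vendor-agnostic: you create a D3D12 device and ask whatever driver is installed what raytracing tier it reports. A minimal C++ sketch of that query, assuming nothing beyond the standard Windows 10 SDK headers (this is just an illustration, not code from any actual game or driver):

    // Sketch only: report the DXR tier the installed driver exposes.
    // The call is identical whoever made the GPU. Link against d3d12.lib.
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        // Create a device on the default adapter - could be AMD, Nvidia or anything else.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
            return 1;

        // DXR support comes back through the standard feature-support query.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));

        if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            printf("DXR tier %d reported by the driver\n", (int)opts5.RaytracingTier);
        else
            printf("No DXR support reported by this driver\n");
        return 0;
    }

So if a future AMD driver starts reporting a raytracing tier through that same query, existing DXR titles should be able to light up on it without developers writing anything vendor-specific.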
 

Can't say I'd call this poor performance:


Especially as it's a 'not-even-trying, only-a-Vega 56' tech demo.

If NVidia had posted that tech demo running on a 2080 Ti, we'd all be salivating over Cyberpunk 2077's neon ray-traced cityscapes being exclusive to RTX GPUs in 2020. Instead NVidia are showcasing Minecraft and Quake 2 RTX. This sort of thing makes me wonder whether NVidia and the GeForce series could go the way of 3DFX and the Voodoo series, or whether too-big-to-fail applies to GPU manufacturers now.
 
That's not exactly good either. It's a tech demo, not a game, which would be more demanding. It's running at global medium settings, not high, not ultra, and only at 1080p and 30 fps. Old GTX cards achieve the same performance with no effort as well.
 
The other problem is that, while clever, that approach is pretty much at the limit of traditional techniques; it requires a lot of special-casing to make it work and will hit a dead end along with those techniques. You won't see it supplanting (proper) ray tracing in the long run.
 
I won't pretend to be an expert on that demo, but I remember talk pre-RTX launch that Vega and the upcoming 7nm Vega could be capable of ray tracing through sheer compute power in a way the old GTX cards could not. When we see the same thing running on a GTX 1070, then we can officially debunk that tech demo.

Obviously Vega is a different beast to the current Navi and to the RDNA 2.0 Navi coming to consoles with ray tracing, though I somehow doubt these upcoming cards copied NVidia's RTX cores, and I suspect they will be approaching the same challenge in a very different way.
 

Firstly, that demo is running on a GCN card, and we don’t yet know how GCN compares to RDNA for ray tracing - we don’t know whether, on a per-core basis, it’s worse, the same or better. It’s premature to suggest otherwise. What we do know is that AMD is keeping GCN for compute and RDNA for gaming, which suggests GCN would still be the compute king.

You won’t see it on GTX cards, or any other cards, until Crytek releases that demo publicly. We have seen GTX cards run actual games with similar performance to that demo, albeit games are more demanding than a demo.

The point I was making is that we should look at ray tracing like tessellation, not PhysX. That’s the mistake many people are making.

They think Nvidia’s way of having games utilise ray tracing is like PhysX, and that therefore it will die out.

They don’t realise that it’s actually more like tessellation, in that both AMD and Nvidia run the same tessellation in games and the actual performance just comes down to how it’s handled by the driver and GPU architecture; it’s not locked down. If ray tracing were locked down, it wouldn’t even run on GTX cards at all, just like PhysX did not run if you didn’t have a PhysX add-on card.

And then it’s again premature to say one side is better than the other - we just don’t know who will end up faster yet, but I believe both will be running the same ray tracing in the same games, and developers won’t need to do anything special for it; it just comes down to how the calculations are done by the card.
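
As a rough illustration of that point (again just a C++ sketch against the public DXGI/D3D12 headers, not anything lifted from a real engine), you can walk every adapter in the machine and ask each one the exact same question; the API doesn't care whose badge is on the card, only the driver's answer and the resulting speed differ:

    // Sketch only: enumerate every GPU in the system and print the DXR tier its driver reports.
    // Link against dxgi.lib and d3d12.lib.
    #include <dxgi.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc = {};
            adapter->GetDesc1(&desc);

            // Same device creation and the same capability query for every vendor.
            ComPtr<ID3D12Device> device;
            if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
                continue;

            D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
            device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));

            wprintf(L"%ls : DXR tier %d\n", desc.Description, (int)opts5.RaytracingTier);
        }
        return 0;
    }

That's the tessellation-style situation: one API surface, with the driver and the silicon deciding how fast it runs.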

I’d like to believe that console code translates into dominant gains for AMD on PC, but in the past it just hasn’t happened. When the PS4 came out with its 8-core x86 CPU, many people expected Bulldozer CPUs to get a big performance boost thanks to all those cores - it never happened. People thought AMD GPUs would get a big boost thanks to developers targeting a 7000-series AMD GPU - it never happened.
 