The Raytracing thread

So is Microsoft DirectX 12 behind Nvidia RTX? And can AMD use their own Radeon Rays for DirectX 12 ray tracing?

DirectX Raytracing isn't limited to Nvidia hardware, no. Nvidia are the first to implement a specific RT core as a separate co-processor in a desktop GPU (PowerVR had a go at doing the same for a mobile chip but it didn't really go anywhere). AMD can currently use their GCN cores to do some ray tracing via DXR too, but obviously the performance is going to be much less than if you have a dedicated set of RT cores.
 
Great bit of info there @Gregster cheers m8! Also having the cards for just 2 weeks already is *insanely* short, just mad!! Starting to get really excited for the RT possibilities now, the performance is going to end up being a non-issue I reckon (given the time to tune) :)
 
It will be interesting to see how Vega does. I think that can already do mixed precision execution like Turing's tensor cores. I guess it will be down to AMD's driver team to implement a path for Microsoft DXR that makes the most of Vega's abilities.
 
Honestly, I reckon 2K with reasonable framerates won't be too difficult given time. The RTX launch was a paper launch in every sense.
Odd considering how much time they've had to prepare for it, and that there wasn't really any pressure to launch at this time given how much stock of 10 series they have and that nothing can compete with them on performance.

They must be really keen to roll out ray tracing; they could have delayed the launch until DXR actually ships from Microsoft in October.
 
So is Microsoft DirectX 12 behind Nvidia RTX? And can AMD use their own Radeon Rays for DirectX 12 ray tracing?

Yeah, if an AMD GPU is used, DX12 calls into the appropriate hardware renderer, or falls back to a software renderer if no capable hardware is found.
 
It will be interesting to see how Vega does. I think that can already do mixed precision execution like Turing's tensor cores. I guess it will be down to AMD's driver team to implement a path for Microsoft DXR that makes the most of Vega's abilities.

That's what I want to know too. All those flops that Vega has going to waste have to be useful for something besides mining!!
 
Then they are talking this down. These cards have to do better. If not, short Nvidia stock NOW... they will be going down hard.

Actually I think if they got the games running at 60FPS at 1080p with full ray tracing, that would be pretty amazing. Then turn down the effects and use some kind of hybrid rendering, and they could probably get much higher framerates at higher resolutions too. Real-time ray tracing is the holy grail: it is easier to work with, but requires massive amounts of computational power.

I guess, at this moment in time, it's a question of how much of the "Real" in Real Time Ray Tracing we are willing to sacrifice for good frame rates.
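For anyone curious where all that computational power actually goes, here's a minimal sketch (plain Python, no GPU, purely illustrative and not from any of the engines discussed here) of the ray-sphere intersection test at the heart of every ray tracer. A test like this runs at least once per pixel per bounce, which is why the ray budget adds up so fast.

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return distance along the ray to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    The direction vector is assumed to be normalised.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c   # discriminant; a == 1 for a normalised direction
    if disc < 0.0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A ray fired straight down the z-axis at a unit sphere 5 units away:
hit = ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
# hit == 4.0 -- the ray enters the sphere's surface at z = 4
```

Real renderers do this against millions of triangles via acceleration structures (BVHs), which is exactly the traversal work the dedicated RT cores are meant to speed up.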
 
Yet they've chosen the RTX moniker for the entire range of cards... and if only the 2080Ti can adequately perform ray tracing to an acceptable level, Nvidia will just look stupid and it's going to cost them a lot of sales. It's going to be an absolute blood bath if this happens, which is why I'm sure there is more going on here that we aren't privy to, there has to be.
 
Unfortunately, this is new territory and will take time to get to grips with from a dev/vendor perspective. Nobody knows what to expect really, and there is nothing to say that the 2070 can't cope with 1080p or even 1440p using RT in time. No point going off on one until everything has been tweaked to the max and frames are still super low.
 
Exactly. I am in the (small) camp that expects performance to be better than is being suggested. The alternative just makes no sense to me. Out of the gate, sure, there will be teething problems... no new tech is immune to that, but I cannot fathom Nvidia committing such a grand public suicide act as bringing new tech to the marketplace knowing full well it isn't going to perform even remotely adequately unless consumers fork out £1200 plus for their top-end GPU! More so in light of the fact they have no competition and have absolutely zero reason to commit such a bizarre act of stupidity!! If they've been working on this for 10 years as claimed, they will clearly have confidence in its capabilities, and wouldn't have named an entire range of GPUs off the back of it otherwise. I am quietly confident this will turn out well for all of us. If it doesn't, Nvidia are going to suffer the most.
 
Yet they've chosen the RTX moniker for the entire range of cards... and if only the 2080Ti can adequately perform ray tracing to an acceptable level, Nvidia will just look stupid and it's going to cost them a lot of sales. It's going to be an absolute blood bath if this happens, which is why I'm sure there is more going on here that we aren't privy to, there has to be.

There won't be a blood bath. And I am pretty sure there will be sliders, you know: full ray tracing, partial ray tracing, etc. The Metro Exodus developer hinted at profiles for different cards.

You were FULL sure that there would never be a graphics card called RTX. Sometimes things are exactly as they appear to be. Real-time ray tracing needs bundles of power.
 
Exactly. I am in the (small) camp that expects performance to be better than is being suggested. The alternative just makes no sense to me. Out of the gate, sure, there will be teething problems... no new tech is immune to that, but I cannot fathom Nvidia committing such a grand public suicide act as bringing new tech to the marketplace knowing full well it isn't going to perform even remotely adequately unless consumers fork out £1200 plus for their top-end GPU! More so in light of the fact they have no competition and have absolutely zero reason to commit such a bizarre act of stupidity!! If they've been working on this for 10 years as claimed, they will clearly have confidence in its capabilities, and wouldn't have named an entire range of GPUs off the back of it otherwise. I am quietly confident this will turn out well for all of us. If it doesn't, Nvidia are going to suffer the most.

All the reasons you listed are precisely why they can do it now: no competition, so they can charge what they like.

Why do you keep insisting that Nvidia are going to fall if the ray tracing is only running at 1080p? There is more to the cards than just ray tracing. A lot of people are more interested in the DLSS technology. And if in normal games the 2080Ti performs 50% better than the 1080Ti, then with the added bonus of ray tracing and DLSS, these cards will be a success, even with the high prices. I think you are being overly dramatic with the doom and gloom. Nvidia will be fine and these cards will sell despite the prices. Ray tracing, even at 1080p, is a big carrot for a lot of people.
 
I’m collaborating with someone I know from the industry to do a piece on ROG next month that will focus on what this means for gamers and games in the near future specifically, and what we can expect. Looking forward to piecing it together, but obviously won’t be finished till the cards are here /twiddle thumbs


Not sure if this has been posted anywhere else yet, but PC Gamer uploaded their footage from Gamescom last night. Hoping that DICE give people the opportunity to try RTX features in the open beta in a few days, despite the fact that it'll tank performance on Pascal cards.

I already told you what they are targeting, and that's straight from one of the more experienced developers: 60fps at 1080p. And since non-RTX cards can only run the scenes in single digits at 1080p, that's a massive jump in performance.
But 1080p 60fps monitors are a decade old. Or is the monitor industry so bad that this standard is in any way acceptable? It runs on your 10-year-old monitor, really? :P
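Quick back-of-envelope on that 1080p/60fps target (the ray-per-pixel count here is an illustrative assumption, not a figure from Nvidia or the developers):

```python
# Back-of-envelope ray budget for a 1080p / 60fps ray tracing target.
# Assumes 1 primary ray per pixel plus ~2 secondary rays (shadow /
# reflection bounces) -- the exact counts are illustrative guesses.

width, height, fps = 1920, 1080, 60
rays_per_pixel = 1 + 2  # 1 primary + ~2 secondary (assumption)

pixels_per_second = width * height * fps
rays_per_second = pixels_per_second * rays_per_pixel

print(f"{pixels_per_second / 1e6:.1f} M pixels/s")  # 124.4 M pixels/s
print(f"{rays_per_second / 1e9:.2f} G rays/s")      # 0.37 G rays/s
```

Even this "decade-old" target means shading over a hundred million pixels a second, and every extra bounce or sample per pixel multiplies the ray count, which is why 1080p60 with ray tracing on is less trivial than it sounds.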
 
Well, the monitor industry is that bad. I have been looking to upgrade my 8+ year old monitor for years now and still haven't found anything worthwhile to upgrade to. Everything out today is worse in one area or another.

Plus not all games benefit from higher FPS, and it will be interesting to see what kind of use RT gets from indie devs.
 
I spent ages looking for a monitor and ended up going for a curved ultrawide, 1080p 160Hz VA panel a couple of years back. I tried the other popular gaming ones at the time and either the image quality wasn't good or there was some other issue (the ROG Swift I originally bought was just awful).
 