
RTX performance overhead - 9.2ms (ouch!)

The thing about RTX that I don't really understand is how they harp on about it being easier for developers to implement lighting, but until everyone is running RTX cards, they're going to actually have additional work because they will still have to implement pre-baked GI!
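The "double work" point can be made concrete: until RT hardware is universal, an engine effectively has to ship both a baked-lightmap path and a ray-traced GI path and pick one at runtime. A minimal sketch of that selection logic (all names are hypothetical; a real engine would query the graphics API for capability, not the GPU name):

```python
# Hypothetical sketch of the dual-path problem: the engine must ship
# both pre-baked lightmaps and a ray-traced GI path until RT-capable
# hardware is universal, and choose one at startup.

def supports_hardware_rt(gpu_name):
    # Stand-in capability check; real engines query the driver/API,
    # not a product-name string.
    return gpu_name.startswith("RTX")

def pick_gi_path(gpu_name, user_enabled_rt=True):
    if user_enabled_rt and supports_hardware_rt(gpu_name):
        return "ray-traced GI"       # the new RTX path
    return "pre-baked lightmaps"     # the fallback everyone still needs

print(pick_gi_path("RTX 2080 Ti"))  # ray-traced GI
print(pick_gi_path("GTX 1080"))     # pre-baked lightmaps
```

The point being: both branches have to exist, be authored for, and be tested, which is the "additional work" until the fallback can be dropped.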
 
The thing about RTX that I don't really understand is how they harp on about it being easier for developers to implement lighting, but until everyone is running RTX cards, they're going to actually have additional work because they will still have to implement pre-baked GI!


Yes, but in the future they won't have to bother with all the complex tricks and hacks to make things look just about OK, as everyone will have RTX-capable hardware. The problem with games dev is that the tricks and hacks are getting more and more complex, and break things like SLI; the artistic cost also increases exponentially. This isn't sustainable. RTX is a long-term cure for this problem. In the short term, it increases the visual quality massively for those with the hardware.


Also, on the content creation side, it allows artists to have real-time feedback on what baked lighting will look like, rather than doing an offline render and waiting an hour. This will reduce costs and improve the visuals for everyone.
 
This...

I think a lot of people are going to be very pleasantly surprised once this is out in the wild and beginning to mature.

The problem is that today's hardware isn't capable of running ray tracing at any sort of acceptable framerate, so what we are getting is a hybrid. As you say, using fake techniques and tricks to get the resolution and framerate up. If you were using full ray tracing and optimising for that, then, yes, I could see the time taken to render the frame reducing, but the more they have to resort to normal rasterization, the longer it is going to take, especially if you want there to be a noticeable visual difference between the game with ray tracing on and off.
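To put the thread title's 9.2 ms figure in perspective, here's some back-of-the-envelope frame-budget arithmetic. The 9.2 ms overhead comes from the thread title; the baseline framerates are illustrative, and the sketch assumes the RT cost adds serially to the frame time:

```python
# Rough frame-budget arithmetic: adding a fixed per-frame ray-tracing
# cost to a rasterized frame and seeing what happens to the framerate.

RT_OVERHEAD_MS = 9.2  # per-frame RT cost quoted in the thread title

def fps_with_overhead(base_fps, overhead_ms=RT_OVERHEAD_MS):
    """Framerate after adding a fixed per-frame overhead (serial cost assumed)."""
    base_frame_ms = 1000.0 / base_fps
    return 1000.0 / (base_frame_ms + overhead_ms)

for base in (144, 100, 60):
    print(f"{base:>3} fps raster-only -> {fps_with_overhead(base):.1f} fps with RT")
```

A 60 fps baseline (16.7 ms budget) drops to under 40 fps once 9.2 ms is tacked on, which is why the overhead stings most on already-demanding games.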
 
The thing about RTX that I don't really understand is how they harp on about it being easier for developers to implement lighting, but until everyone is running RTX cards, they're going to actually have additional work because they will still have to implement pre-baked GI!

Most 'AAA' games are targeted at the consoles anyway, so until consoles have ray-tracing-capable hardware that can run at least 30fps @ 4K (if not 60fps), ray tracing will be a niche optimisation. Currently it looks like the performance required for that would be at least twice that of the 2080Ti, and console hardware lags significantly behind top-end discrete PC GPUs. My guess is at least 5 years until we see RT as a standard setting for games. It's got to start somewhere though.
 
I think it seems clear to me now that buying into RTX isn't much different from putting money into an Nvidia Ray Tracing Kickstarter campaign, only minus any real perks other than some tech demos and low-framerate gaming experiences. Even a 2080Ti isn't going to get the benefit of it in this generation. Not sure how Nvidia will recover from the embarrassment and farce of all this though. I don't doubt their bank balance and shareholders will remain happy, as they are more than solvent, but their reputation will undoubtedly take a battering. That is unless devs can really pull something out of the hat, but based on all evidence to date, I'm not holding my breath on that one. :rolleyes:
 
Ray tracing is demanding shocker!

I think it seems clear to me now that buying into RTX isn't much different from putting money into an Nvidia Ray Tracing Kickstarter campaign, only minus any real perks other than some tech demos and low-framerate gaming experiences. Even a 2080Ti isn't going to get the benefit of it in this generation. Not sure how Nvidia will recover from the embarrassment and farce of all this though. I don't doubt their bank balance and shareholders will remain happy, as they are more than solvent, but their reputation will undoubtedly take a battering. That is unless devs can really pull something out of the hat, but based on all evidence to date, I'm not holding my breath on that one. :rolleyes:

Someone has to push things forward; devs are not going to bother unless someone puts in the hardware, software AND developer support.

The pricing is "poo" (though the die sizes of the chips are huge and the PCBs more complex, with better parts), but I appreciate the tech.

I and a few peeps will prob get the next gen of this tech. Reminds me of the GeForce 3 days: pricey cards with new pixel/vertex shaders which hardly got used, and people moaned about it at the time as the early uses were few and far between and mostly shiny water. Look where we are now with shaders...
 
Someone has to push things forward; devs are not going to bother unless someone puts in the hardware, software AND developer support.

The pricing is "poo" (though the die sizes of the chips are huge and the PCBs more complex, with better parts), but I appreciate the tech.

I and a few peeps will prob get the next gen of this tech. Reminds me of the GeForce 3 days: pricey cards with new pixel/vertex shaders which hardly got used, and people moaned about it at the time as the early uses were few and far between and mostly shiny water. Look where we are now with shaders...


Yes, I'm not dumping on ray tracing itself; I think it has great promise. I just think a reality check is in order for many of those who've jumped on the RTX bandwagon, and anyone expecting fully ray-traced gaming experiences this generation, especially at 1440p and above, is going to be in for bitter disappointment... if that's what they are indeed hoping for, but I know many people are more realistic. I just hope the backlash against Nvidia isn't too severe, as it won't bode well for future GPU releases if they keep offering hollow promises. But as you say, they're still going stronger than ever after all these years, and it's not the first time people have been up in arms about features not being utilised. The pricing this time around is something else though, that's for sure, with arguably the worst-value mainstream release ever (Titans excluded perhaps). I wouldn't exactly say Nvidia have free rein to keep pulling this kind of thing forever and a day and expecting us to just roll over. It's only their lack of competition that has created the situation we're in now. That cannot continue... at least it better not, or we're all screwed ha! :rolleyes:
 
So many people cannot afford a simple RTX 2070, let alone a higher model, so how on earth does Nvidia expect RTX to be adopted at any reasonable rate when the cheapest card with this feature is around 500 bucks? It's nothing more than fluff, just as PhysX once was.
 
The thing about RTX that I don't really understand is how they harp on about it being easier for developers to implement lighting, but until everyone is running RTX cards, they're going to actually have additional work because they will still have to implement pre-baked GI!

If you talk about the RT as additional work, then it's not. The beauty of ray tracing is that these effects happen naturally; the problem will always be the computational cost. The fact we're struggling to see reasonable performance with a sample count of just one or two samples per pixel tells you just how expensive it is. GI in Metro will be a real eye-opener. I can't think of a better developer to take it on, though.
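Why is one or two samples per pixel such a struggle? Monte Carlo estimates converge slowly: the noise (standard error) falls only as 1/sqrt(N), so halving the noise takes 4x the samples, which is why real-time path tracing leans so heavily on denoisers. A toy illustration (averaging uniform random "light samples" as a stand-in for real BRDF/light sampling; the numbers are illustrative, not from any engine):

```python
import random
import statistics

# Toy Monte Carlo: estimate a pixel's brightness by averaging N random
# light samples, and watch how the noise (spread across repeated
# estimates) shrinks with N. Real path tracers sample BRDFs and lights,
# not plain uniforms -- this just shows the 1/sqrt(N) convergence.

random.seed(42)

def pixel_estimate(n_samples):
    # each "light sample" is a random contribution in [0, 1)
    return statistics.mean(random.random() for _ in range(n_samples))

def noise(n_samples, trials=2000):
    # spread of the estimate across many trials ~ 1/sqrt(n_samples)
    return statistics.stdev(pixel_estimate(n_samples) for _ in range(trials))

for spp in (1, 2, 16, 64):
    print(f"{spp:>2} spp -> noise ~ {noise(spp):.3f}")
```

Going from 1 spp to 64 spp only cuts the noise by about 8x, which is the crux of the cost problem the post describes.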
 
Wouldn't it be nice if AMD developed a ray-tracing chiplet? Something they could bolt on to any GPU. It would be cheaper to manufacture, way more flexible, and customers could take it or leave it. I don't know if it's feasible or not, but wouldn't it be nice?
 
Sounds like someone trying to justify a 2080ti purchase.

Not really, as RTX won't be limited to Turing. The fact you're poo-pooing something that hasn't been publicly tested yet speaks more about your posturing than it does Gordy, in my opinion.

If ray tracing in real-time isn't something that reignites your interest then perhaps you don't really understand the implications. After all, he did refer to when it matures.
 
I'm of the view that the existing card line-up will probably be the shortest-lived GPUs in Nvidia's history before we see new ones. The whole range just makes so little sense from a cost/performance standpoint, yet this feels deliberate in order to clear out Pascal stock. Add to that the abundantly clear fact that even a 2080Ti isn't the all-singing, all-dancing ray-tracing solution Jensen made it out to be... not even close, and forget the 2070/2080, they aren't even in the ray-tracing race... and it all points to these cards being a stop-gap before 7nm lands. Given AMD are already far down that road with such products imminent, I feel it's a fairly safe bet that Nvidia won't be that far behind.
 
I'm of the view that the existing card line-up will probably be the shortest-lived GPUs in Nvidia's history before we see new ones. The whole range just makes so little sense from a cost/performance standpoint, yet this feels deliberate in order to clear out Pascal stock. Add to that the abundantly clear fact that even a 2080Ti isn't the all-singing, all-dancing ray-tracing solution Jensen made it out to be... not even close, and forget the 2070/2080, they aren't even in the ray-tracing race... and it all points to these cards being a stop-gap before 7nm lands. Given AMD are already far down that road with such products imminent, I feel it's a fairly safe bet that Nvidia won't be that far behind.
Here's hoping.
I think it is more likely that the rate of progress is slowing and manufacturing costs are increasing across most tech.
7nm = more cost
More RTX cores = more cost
 
I'm of the view that the existing card line-up will probably be the shortest-lived GPUs in Nvidia's history before we see new ones. The whole range just makes so little sense from a cost/performance standpoint, yet this feels deliberate in order to clear out Pascal stock. Add to that the abundantly clear fact that even a 2080Ti isn't the all-singing, all-dancing ray-tracing solution Jensen made it out to be... not even close, and forget the 2070/2080, they aren't even in the ray-tracing race... and it all points to these cards being a stop-gap before 7nm lands. Given AMD are already far down that road with such products imminent, I feel it's a fairly safe bet that Nvidia won't be that far behind.
I think it's way too early to tell at the moment. Some of my positivity comes from the belief that you'd be a fool to bet against Nvidia getting this right. I'd give it 80/20, i.e. a 20% chance these GPUs will, in the long run, be seen as a bit of a flop and NV got it wrong. If a £1099 GPU does turn out to be a stop-gap, I think there will be many irritated people. Based on the price alone, without considering other things, my guess is there won't be any kind of refresh for a long while, which is why I reckon they'll just bring 7nm along with the next gen.
All of the 20-series GPUs have potential; it's whether it's going to be realised. Must admit even I think "come on, get this DLSS and RT out there FFS" :p

Given Intel and NV are finding any reason for a price rise, a 7nm die-shrunk 2080 Ti would likely be £1500+ :p. Personally I reckon they'll wait for the new process to mature a bit, so fairly early 2020 for the next gen (1st half anyway). Definitely a Titan Turing at some point fairly soon.
 
Wouldn't it be nice if AMD developed a ray-tracing chiplet? Something they could bolt on to any GPU. It would be cheaper to manufacture, way more flexible, and customers could take it or leave it. I don't know if it's feasible or not, but wouldn't it be nice?
I was thinking the same thing. Hopefully they have some surprises in store for us.
 
Wouldn't it be nice if AMD developed a ray-tracing chiplet? Something they could bolt on to any GPU. It would be cheaper to manufacture, way more flexible, and customers could take it or leave it. I don't know if it's feasible or not, but wouldn't it be nice?


It is feasible, but it would be much more expensive. Achieving the performance Nvidia have managed required tight integration with the CUDA cores to do a lot of the heavy lifting.

TBH, I just don't see the point. Ray tracing is the future; it won't be a choice. The only question is that of a timeline and the exact path to full-scene ray tracing at multiple samples per pixel. AMD making some separate co-processor doesn't help.
AMD either need to get on board early and try to prevent Nvidia dominating the ray-tracing hardware-software design space like they did with CUDA in HPC, or they can sit back a few generations and hope to hit the ground running once the software is out there. The risk with the latter is that developers will be familiar with Nvidia's design choices, architecture and RTX libraries.
 