RTX performance overhead - 9.2ms (ouch!)

I'm of the view that the existing card line-up will probably be the shortest-lived run of GPUs in Nvidia's history before we see new ones. The whole range just makes so little sense from a cost/performance standpoint, yet this feels deliberate in order to clear out Pascal stock. Add to that the abundantly clear fact that even a 2080Ti isn't the all-singing, all-dancing ray tracing solution Jensen made it out to be... not even close, and forget the 2070/2080, they aren't even in the ray tracing race... and it all points to these cards being a stopgap before 7nm lands. Given AMD are already far down that road with such products imminent, I feel it's a fairly safe bet that Nvidia won't be that far behind.

They will have about a year before Nvidia and AMD 7nm GPUs come along, and it's not clear if either company's offering will aim to take on the high end until 2020.

Even at a year, that is a long time by traditional standards. There used to be a new GPU every 6-12 months.
 
Here's hoping.
I think it is more likely that the rate of progress is slowing and manufacturing costs are increasing across most tech.
7nm = more cost
More RTX cores = more cost

Hard to know what the cost implications are, but more to the point, I wonder how much more power 7nm cards would actually be able to provide? Clearly what's required for ray tracing far exceeds what even the 2080Ti can deliver. For a start, it needs to be something that those at 1440p and 4K can enjoy... it's just insane that you have a GPU being utilised for 4K gaming, yet it will barely manage 1080p with RTX... that just makes no sense on any level. I do see ray tracing as the future, but can 7nm deliver THAT much more power within a year? A small bump isn't going to cut it. If it does offer a significant improvement, what price will it come at, given we're already at ridiculous price/performance levels with the current cards? If they lower prices, are Nvidia just going to hold their hands up and say "yeah, you early adopters got screwed, sorry"... or are GPUs only going to get more and more expensive? That won't work, as there are a finite number of people who can afford a four-figure GPU.
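To put some rough numbers on the resolution problem, here's a back-of-envelope sketch (my own figures, assuming primary-ray cost scales roughly with pixel count, which ignores scene complexity, bounce counts and denoising):

```python
# Back-of-envelope: if ray cost scales roughly with pixel count,
# here's how the per-frame ray budget grows with resolution.

resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p pixel count as the baseline

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels -> {pixels / base:.2f}x the 1080p ray budget")
```

Under that crude assumption, 4K needs roughly four times the rays of 1080p per frame, which is why a small bump from a node shrink alone seems unlikely to bridge the gap.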

It all paints a fairly uncertain picture of what lies ahead to be honest. :rolleyes:
 
Sounds like someone trying to justify a 2080ti purchase.

Nope. I'm merely offering an opinion as someone who has a professional background in software development and several years' worth of 3D programming under my belt.

Like I said, I think that there's a good chance that people will be pleasantly surprised once in-engine implementations mature and the devs begin to implement aggressive cost-saving techniques such as decoupling the RT resolution from the native game resolution.

Like everyone else, I can't possibly know anything for certain, all I can do is speculate and offer my reasoning, which I believe I have done.
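To illustrate the decoupling idea mentioned above (a minimal sketch of my own, not any shipping implementation), the point is that once the RT resolution is fixed, ray cost tracks the RT buffer rather than the display:

```python
# Sketch: trace rays into a fixed-size RT buffer and composite the
# (upscaled) result into the native frame. The assumption here is
# that ray cost scales with the number of RT pixels traced.

def relative_ray_cost(rt_res, native_res):
    """Ray cost of tracing at rt_res, relative to tracing at native_res."""
    rt_pixels = rt_res[0] * rt_res[1]
    native_pixels = native_res[0] * native_res[1]
    return rt_pixels / native_pixels

# Tracing at 1080p under a 4K native frame:
cost = relative_ray_cost((1920, 1080), (3840, 2160))
print(f"{cost:.0%} of the full-res ray cost")  # 25%
```

So tracing at 1080p under a 4K frame would, under this assumption, cost a quarter of the rays of native 4K tracing, before counting the upscale/composite overhead.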
 
I don't doubt they will leverage the power of Turing to some extent (when that will be is the bigger unknown), but I don't believe it will be anywhere near what Jensen led us to believe. The evidence to date doesn't paint a very encouraging picture unless you're at 1080p... and let's face it, most people who have a 2080Ti are not, they're at 1440p/UW/4K... and it doesn't seem at all likely that ray tracing is going to be happening at those resolutions. Again, not to the extent that Jensen talked it up anyway. We may see some pretty effects and enhancements, but I think that'll be it. All I'm basing my speculation and reasoning on is the current evidence.
 
and it doesn't seem at all likely that ray tracing is going to be happening at those resolutions

Here's something to consider:

- DICE were achieving around 60 FPS at 1080p in Battlefield V after only three days with the cards.

- They've since stated that they've dialled back the use of the effects for the sake of realism and performance. So even if we ignore the fact that they've now had far longer to optimise the rendering pipeline, dialling back its usage should have increased performance somewhat. So let's take a punt and say that the figure now looks something more like 70 FPS.

- The next factor to consider is that this number was achieved with the RT cores running synchronously, in effect performing the ray tracing after the GPU had completed the rendering of the rest of the scene. From what the DICE devs have said, it would appear that the RT cores can also be run asynchronously. That would mean that once the geometry is uploaded, the ray tracing workload could be performed at the same time as the GPU is doing a lot of the other work (there's a toy frame-time model of this below).

- So bearing all of the above in mind, if they lock the RT res at 1080p, or at least offer a quality option of some sort (which is something DICE said they intend to do), then there's no reason why we couldn't be seeing something in the realm of 4K/60 FPS for some titles.

Personally, I expect that the end result is going to vary wildly depending upon the implementation, the experience of devs, and their effort in terms of optimisation.
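To make the synchronous-versus-asynchronous point concrete, here's a toy frame-time model. Every number in it is an assumption of mine: the 9.2ms is just borrowed from the thread title, the raster time is invented for illustration, and real overlap would also suffer shared-resource contention:

```python
# Toy frame-time model: RT run serially after raster vs. overlapped
# with raster on independent units (idealised: assumes no contention
# for memory bandwidth, caches or shaders).

raster_ms = 7.5   # assumed raster/shading time per frame
rt_ms     = 9.2   # assumed RT cost (figure borrowed from the thread title)

sync_frame  = raster_ms + rt_ms      # RT waits for raster to finish
async_frame = max(raster_ms, rt_ms)  # RT overlaps raster entirely

print(f"synchronous:  {sync_frame:.1f} ms -> {1000 / sync_frame:.0f} FPS")
print(f"asynchronous: {async_frame:.1f} ms -> {1000 / async_frame:.0f} FPS")
```

With these made-up numbers the serial path lands at ~60 FPS while the overlapped path approaches ~109 FPS, which is the kind of headroom that makes the 4K/60 speculation above less far-fetched.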
 
That's all well and good, but there's a lot of wishful thinking there. We shall only know for sure in time. I'd like to believe you're right... RTX succeeding benefits everyone.
 
but there's a lot of wishful thinking there

I suppose. But I wouldn't really call it wishful thinking so much as a brief explanation of my rationale as to why I think that there's a chance (even if it's a small one) that we could be in for a pleasant surprise.

As I've said, I really don't know what we should expect, none of us do. Hell, half of what the DICE devs have said could turn out to be utter nonsense for all I know.

But it seems a shame to me that the overarching discussion (I don't mean specifically here) has become somewhat myopic, one-sided and entirely 'anti-raytracing' in nature. I suspect that this is primarily because this release has been tainted by the astronomical price rises we've seen, which is entirely understandable I suppose. But if we put all of that aside for one moment, there are actually some solid reasons to be at least slightly optimistic.

I think at the very least, we can all agree that despite the criticism that NVidia has rightly earned over this release, if we want to continue to make good progress with real-time graphics, then raytracing is a technology that we desperately need to get out there in one form or another.
 
I do see some people taking that anti-ray-tracing approach, which is very shortsighted and demonstrates a lack of understanding of what it brings to the table. I for one believe it holds great promise for the future of gaming, even if there is no guarantee that RTX is what (eventually) ushers it into the mainstream.

The pricing has certainly played a MASSIVE role in this mess... had Turing launched at Pascal prices, we'd have avoided this debacle. There would still be moaning about the lack of RTX titles, but there would be much revelling in the 2080Ti at 1080Ti pricing... but that was never going to happen and clearly never will. But add to that the complete absence of content, and no real evidence that we're going to get much better than 1080p/60 FPS (your rationale aside), then the backlash is quite understandable I feel, and Nvidia (alongside developers) have quite a bit of ground to cover in order to win back consumer confidence.
 
then the backlash is quite understandable I feel

Oh absolutely, the backlash with regard to Nvidia themselves is completely understandable. I think their launch was terrible from just about every perspective. Pricing aside, the fact that it's been so obscenely rushed, leaving us without even RT-capable demos to play with, was always going to invite a huge amount of understandably negative speculation and criticism.
 
QUOTE="D.P., post: 32203523, member: 1058"]yes, but in the future they wont have to bother with all the complex ricks and hacks to make things look just about OK, as everyone will have RTX capable hardware.[/quote] That's many years away so not really a valid reason for spending 12 to 15 hundred on an RTX card today.

In the short term, it increases the visual quality massively for those with the hardware.
I bet it doesn't.


Also, on the content creation side, it allow the artists to have real-time feedback of what baked lighting will look like rather than doing offline rendering and waiting an hour. This will reduce costs and improve the visuals for everyone
Eventually yes.
 
I think at the very least, we can all agree that despite the criticism that NVidia has rightly earned over this release, if we want to continue to make good progress with real-time graphics, then raytracing is a technology that we desperately need to get out there in one form or another.

I'm pretty sure everyone agrees that ray tracing is the future of gaming. The main problem for a lot of people is, as you mentioned, the price. It's adding a premium for an unknown commodity. We know it's going to take time to be adopted, and on top of the question of Turing's ability, the vast majority of Turing Ti owners are enthusiasts who will be moving on when the next gen releases, so they've paid a huge premium for something that will see limited use in just a handful of titles. Yes, it's going to be used to take game graphics to the next level, yes, it's the future, but the same was said about DX12 and Vulkan, and look how long they're taking to get up and running.
 
QUOTE="D.P., post: 32203523, member: 1058"]yes, but in the future they wont have to bother with all the complex ricks and hacks to make things look just about OK, as everyone will have RTX capable hardware.
That's many years away so not really a valid reason for spending 12 to 15 hundred on an RTX card today.
.[/QUOTE]
It does if you want to see those effects today, or if you need more performance than a 1080Ti. Remember, these prices are often just a few days' work, and for a year's enjoyment the price is not really that much of an issue.

I bet it doesn't.
That has already been shown to be true; the graphics look amazing.

Eventually yes.

Actually, developers are already using Quadros and RTX cards for game development today. This is the one area that provides instant benefits to everyone.
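A rough sketch of why that feedback loop matters for content creation (all of these numbers are assumed, purely to show the shape of the argument):

```python
# How many lighting iterations can an artist fit into a working day
# if each look at the result costs a full offline bake versus a
# near-instant ray-traced preview? Numbers are illustrative only.

workday_min = 8 * 60   # assumed 8-hour day
tweak_min   = 10       # assumed time spent adjusting lights per iteration

for label, preview_min in [("offline bake", 60.0), ("RT preview", 0.5)]:
    iterations = workday_min / (tweak_min + preview_min)
    print(f"{label}: ~{iterations:.0f} iterations per day")
```

Even with generous assumptions, cutting an hour-long bake to a near-instant preview multiplies the number of iterations an artist can make in a day, which is where the cost and quality argument comes from.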
 
All the denials. It wrecks frame rates. Paying £1.2K to get your frame rate wrecked.

Well maybe it does, but go look at the Tomb Raider bench on here: the 1080Ti is only getting mid-80s frame rates at 1080p, so if they can get 60 FPS at 1080p with the RTX effects on, then it's only around a 30% drop for the better effects, which in my opinion isn't that bad.
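For what it's worth, the arithmetic behind that comparison looks like this (taking "mid-80s" as 85 FPS, and noting it compares a 1080Ti without RT to a hoped-for 2080Ti with RT, so it's indicative at best):

```python
# Frame-rate drop and frame-time cost of the hypothetical RTX-on figure
# versus the 1080p Tomb Raider bench mentioned above.

fps_no_rt = 85  # "mid-80s" on the 1080Ti, RT off
fps_rt    = 60  # hoped-for RTX-on figure

drop     = 1 - fps_rt / fps_no_rt
delta_ms = 1000 / fps_rt - 1000 / fps_no_rt

print(f"frame-rate drop: {drop:.0%}")         # ~29%
print(f"frame-time cost: {delta_ms:.1f} ms")  # ~4.9 ms per frame
```

Under those numbers the drop is closer to 30% than 20%: going from 85 to 60 costs about 4.9ms of frame time.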
 
An RTX 2080Ti for 1080p, have a word.
 
Three days with the Turing cards, but they worked on the implementation for months with Titan Vs.

You keep quoting this, but after you said it last time, the only thing I could find was one article stating they had Titan Vs but that the features were not unlocked?
 