Is the NVidia RTX performance even worse than previously reported?

Soldato
Joined
30 Nov 2011
Posts
11,376
So you want to talk strictly about ray tracing algorithms and ignore everything else; okay. Here is a 560 Ti performing full raytracing algorithms in 2013.

If we are strictly talking about algorithms and nothing else, your render farm analogy is still terrible, because we've been able to do this stuff on home PCs for years.

Stop trying to hype the 20xx series by comparing it to a render farm. It's not even close.

Yeah, that looks like it's realtime, so smooth.
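
(For anyone following along: the algorithm itself really is old and simple. A minimal illustrative sketch of the core loop every ray tracer shares, using a made-up single-sphere scene rather than anything from the linked demo:)

```cpp
// Minimal sketch of a "full raytracing algorithm": one ray per pixel
// against a single sphere, printed as ASCII. The point is that the
// maths is decades old -- the argument is about speed, not capability.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec    sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Distance along the ray to the sphere, or -1.0 on a miss.
static double hitSphere(Vec centre, double r, Vec orig, Vec dir) {
    Vec oc = sub(orig, centre);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - r*r;
    double disc = b*b - 4*a*c;
    return disc < 0 ? -1.0 : (-b - std::sqrt(disc)) / (2.0*a);
}

int main() {
    const int W = 80, H = 40;          // tiny ASCII "framebuffer"
    Vec centre = {0, 0, -3};           // one sphere, 3 units in front
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Camera ray from the origin through pixel (x, y).
            Vec dir = {(x - W/2) / double(W), (H/2 - y) / double(W), -1};
            std::putchar(hitSphere(centre, 1.0, {0, 0, 0}, dir) > 0 ? '#' : '.');
        }
        std::putchar('\n');
    }
}
```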
 
Soldato
Joined
30 Nov 2011
Posts
11,376
Please link to a render farm (or anything) doing full raytracing at 60FPS, 1080p.

I'll wait

The guy already admitted the render farm analogy wasn't great, but that isn't what Turing is, so banging on about anything that isn't realtime is an equally bad reference. The preview window in Blender isn't realtime, and it's not a game. Turing allows a level of realtime raytracing previously not available in games. That's it. If people don't want to recognise that as an improvement, they are more than welcome to sit in the past.

Ultimately rasterisation is reaching its limits, and raytracing is the only way we know how to improve on it. Every minor improvement in shadows and lighting we've had in the last 10 years has been a method for faking realism, at a large cost in resources that leads to all sorts of compromises, with most people actually turning those effects off or down to get back fps. Now we are talking about offloading those faked effects to separate cores and using raytracing to model them accurately, in realtime at playable frame rates.
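
To make that concrete, here is a rough sketch of such a hybrid frame. Every name below is an illustrative stand-in (not a real engine or DXR call); it only shows where the raytraced pass slots into an otherwise rasterised frame:

```cpp
// Hybrid frame: rasterise as usual, trace a few rays per pixel only for
// the effects that used to be faked, then denoise the sparse result.
struct GBuffer   {};  // rasterised depth/normals/albedo
struct RayResult {};  // sparse, noisy raytraced shadows/reflections
struct Frame     {};

GBuffer   rasteriseScene()                { return {}; }  // classic pipeline, unchanged
RayResult traceEffectRays(const GBuffer&) { return {}; }  // few rays/px on the RT cores
RayResult denoise(const RayResult&)       { return {}; }  // recovers quality from few samples
Frame     composite(const GBuffer&, const RayResult&) { return {}; }
void      present(const Frame&)           {}

void renderFrame() {
    GBuffer gb   = rasteriseScene();     // most of the image is still rasterised
    RayResult rt = traceEffectRays(gb);  // accurate shadows/reflections, not fakes
    rt = denoise(rt);                    // done on separate hardware in Turing's case
    present(composite(gb, rt));          // all inside a ~16 ms frame budget
}

int main() { renderFrame(); }
```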

Something that wasn't hybrid and takes upwards of 1 second per frame for a single model is nowhere near the same ballpark as what they are talking about with Turing and going forwards. I'm baffled as to how you think Blender on a 560 Ti is in any way comparable.
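
Some rough, assumed numbers to show the size of that gap:

```cpp
// Back-of-envelope comparison (illustrative figures, not measurements):
// ~1 s/frame for a single-model Blender preview vs a 60 fps game budget.
#include <cstdio>

int main() {
    const double previewFrameSec = 1.0;         // ~1 s per frame, one model, no game
    const double gameFrameSec    = 1.0 / 60.0;  // 16.7 ms for a full scene
    std::printf("Speedup needed: %.0fx\n", previewFrameSec / gameFrameSec);  // 60x
    // And those 16.7 ms also have to cover rasterisation, physics, AI...
}
```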

The first implementation isn't going to be render farm quality, of course, but focusing on hyperbole completely misses the point of where GPU graphics is heading, and you have to take that first step. If people don't want to be part of the first generation then that's fine, but trying to claim a 970 or 560 Ti can do the same thing as Turing, or better, completely misses the point and isn't at all accurate.

Equally, a lot of people and even companies staked their futures on saying that hardware T&L wasn't needed because it could be done in software...
 
Soldato
Joined
12 May 2014
Posts
5,237
If you are going to jump into a discussion halfway through, can you at least fully understand what has transpired prior to your post?

I feel like all of the points you made in your post have either been addressed already or are a misrepresentation of what has been said. Neither is worth addressing, in my opinion. If you disagree, feel free to point me to the parts of your post that you think I should address.
 
Soldato
Joined
30 Nov 2011
Posts
11,376
If you are going to jump into a discussion halfway through, can you at least fully understand what has transpired prior to your post?

I feel like all of the points you made in your post have either been addressed already or are a misrepresentation of what has been said. Neither is worth addressing, in my opinion. If you disagree, feel free to point me to the parts of your post that you think I should address.

I was pointing out that the entire render farm vs. Blender 560 Ti comparison is completely irrelevant to what Turing is trying to achieve, but if you want to keep having a completely pointless argument, you guys can go right ahead.

You seem to fundamentally not understand what Turing achieves that a 970 or 560 Ti can't, so if you are just going to respond that you still don't get it and don't want to engage, then there's nothing else to be said, really.

You seem to have really taken issue with the render farm analogy, and Silent has already admitted that it was at least in part hyperbole, so you are basically just having an argument with yourself at this point.
 
Soldato
Joined
12 May 2014
Posts
5,237
I was pointing out that the entire render farm vs. Blender 560 Ti comparison is completely irrelevant to what Turing is trying to achieve, but if you want to keep having a completely pointless argument, you guys can go right ahead.

You seem to fundamentally not understand what Turing achieves that a 970 or 560 Ti can't, so if you are just going to respond that you still don't get it and don't want to engage, then there's nothing else to be said, really.

You've misrepresented what I said and gone off on a tangent to address a claim that was never made.

You seem to have really taken issue with the render farm analogy, and Silent has already admitted that it was at least in part hyperbole, so you are basically just having an argument with yourself at this point.

Oh the irony.
 
Soldato
Joined
5 Sep 2011
Posts
12,820
Location
Surrey
So you want to talk strictly about ray tracing algorithms and ignore everything else; okay. Here is a 560 Ti performing full raytracing algorithms in 2013.

If we are strictly talking about algorithms and nothing else, your render farm analogy is still terrible, because we've been able to do this stuff on home PCs for years.

Stop trying to hype the 20xx series by comparing it to a render farm. It's not even close.

Is that a game, though? I'm not sure why you keep throwing in anything that's ray traced as if it's comparable. It's pretty simple: ray tracing in a game is orders of magnitude more difficult than rendering a single scene.

Not sure why you can't just appreciate that it *is*, in fact, fundamentally the same technique, just at a scale that requires incredible computational power. Realistically, what you're saying sits at the bottom end of that scale: "my 970 can do this".
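
To put rough numbers on "orders of magnitude" (every parameter below is an assumed, illustrative figure):

```cpp
// Minimum ray throughput for even a modest realtime hybrid at 1080p60.
#include <cstdio>

int main() {
    const long long pixels  = 1920LL * 1080;  // 1080p
    const int fps           = 60;
    const int samplesPerPx  = 2;   // realtime: very few samples + denoising
    const int bouncesPerRay = 2;   // primary hit + one secondary ray
    const long long raysPerSec = pixels * fps * samplesPerPx * bouncesPerRay;
    std::printf("%lld rays/s\n", raysPerSec);  // ~0.5 billion rays per second
    // An offline "single scene" render uses hundreds of samples per pixel
    // but is free to take minutes per frame -- a completely different budget.
}
```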
 
Caporegime
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Is that a game, though? I'm not sure why you keep throwing in anything that's ray traced as if it's comparable. It's pretty simple: ray tracing in a game is orders of magnitude more difficult than rendering a single scene.

Not sure why you can't just appreciate that it *is*, in fact, fundamentally the same technique, just at a scale that requires incredible computational power. Realistically, what you're saying sits at the bottom end of that scale: "my 970 can do this".
Pfffft, an Amiga 500 can do raytracing. Turing is clearly old tech lol :D
 
Associate
Joined
15 Oct 2013
Posts
55
Seems a new generation of PC gamers don't care about visual quality anymore, or they are completely clueless as to how RT is going to make games look much more immersive.

I thought PC was the platform that was supposed to bring innovation; this technology is a huge step, especially for developers.

People complaining about 1080p again: do yourselves a favour and buy a decent screen with a good scaler; 1080p on an LG 4K OLED still looks great.

Also, 60fps is more than adequate for a good gaming experience, especially for single-player games. It's not Nvidia's fault that a small portion of gamers suffer from ADHD and think more fps is the be-all and end-all.
 
Soldato
Joined
12 May 2014
Posts
5,237
You said Turing isn't impressive because a 970 or 560 Ti can do a Blender preview. What have I missed?
Well, for starters, I haven't stated my opinion on the Turing architecture. The 970 and 560 Ti stuff was in reference to the render farm analogy, which I believe we have concluded is a terrible analogy.

Is that a game, though? I'm not sure why you keep throwing in anything that's ray traced as if it's comparable. It's pretty simple: ray tracing in a game is orders of magnitude more difficult than rendering a single scene.

Not sure why you can't just appreciate that it *is*, in fact, fundamentally the same technique, just at a scale that requires incredible computational power. Realistically, what you're saying sits at the bottom end of that scale: "my 970 can do this".

See above and reread past posts.
 
Soldato
Joined
3 Jan 2006
Posts
24,955
Location
Chadderton, Oldham
I thought PC was the platform that was supposed to bring innovation; this technology is a huge step, especially for developers.

People complaining about 1080p again: do yourselves a favour and buy a decent screen with a good scaler; 1080p on an LG 4K OLED still looks great.

Huge steps... which way?
 
Soldato
Joined
30 Nov 2011
Posts
11,376
False equivalence. I've been able to raytrace in "realtime" on my single 970 since I got it brand new, however many years ago that was. To state that ray tracing can only be done on a render farm is not true.
And it's more than just sample rate that differentiates the stuff render farms handle from the RTX demos.

This was your opening gambit?
It's clear to me as a third party that Silent was being hyperbolic; he went on to clarify his position regarding realtime hybrid raytracing in games, and you keep responding with non-realtime Blender 970/560 Ti comments as if they have anything to do with Turing's gaming applications.
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Also, 60fps is more than adequate for a good gaming experience, especially for single-player games. It's not Nvidia's fault that a small portion of gamers suffer from ADHD and think more fps is the be-all and end-all.

Spot on. Just let me know when we can expect full-scene ray tracing at a stable 60fps at 1440p; I'll jump in then.
 
Soldato
Joined
5 Sep 2011
Posts
12,820
Location
Surrey
Well, for starters, I haven't stated my opinion on the Turing architecture. The 970 and 560 Ti stuff was in reference to the render farm analogy, which I believe we have concluded is a terrible analogy.


See above and reread past posts.

I don't need to re-read them. I've seen your single-object render examples; they mean nothing. You're still hung up on my render farm analogy, and frankly, nobody cares what your 970 can (or, more to the point, can't) do.
 
Soldato
Joined
6 Jan 2013
Posts
21,852
Location
Rollergirl
People complaining about 1080p again: do yourselves a favour and buy a decent screen with a good scaler; 1080p on an LG 4K OLED still looks great.

You can't sit with a keyboard and mouse in front of a 55" OLED; they are a niche settee solution for gaming until OLED is integrated into PC monitors, as opposed to large-screen TVs.

Also, 60fps is more than adequate for a good gaming experience, especially for single-player games. It's not Nvidia's fault that a small portion of gamers suffer from ADHD and think more fps is the be-all and end-all.

That's completely subjective. It's great that you're able to enjoy 60fps gaming, but why assume that others can't have a preference for higher frame rates? I would pass the Pepsi Challenge with it any day of the week; I definitely don't need a frame counter in the corner of the screen to see it.
 

Soldato
Joined
22 Nov 2005
Posts
3,028
Location
Devon
You can't sit with a keyboard and mouse in front of a 55" OLED; they are a niche settee solution for gaming until OLED is integrated into PC monitors, as opposed to large-screen TVs.

It is a bit impractical, yep. That’s why I use a DS4 controller instead.

Don’t play competitive MP games so it’s not a problem for me. It’s got a trackpad on it if I need a mouse anyway.

At work I use monitors, at home I use the TV. Far more comfortable than sitting inches away from a monitor.

The RTX reviews can't come soon enough.
 
Associate
OP
Joined
27 Jul 2015
Posts
1,470
Interesting comment that raytracing is not Nvidia's tech, it's Microsoft's, and Nvidia just happens to make the hardware it runs on. DXR is what makes it possible, and as long as other companies can produce the silicon, there is no reason why they couldn't offer a rival. The world's biggest graphics chip vendor is Intel, albeit not at the high end; if they see Nvidia pushing prices so high that the profits make market entry interesting, there is no reason why Intel could not offer high-end graphics cards too.
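
The vendor-neutral part is visible at the API level, too. Below is a minimal, untested sketch of the standard Direct3D 12 capability check for DXR; the function name supportsDXR is my own, but the types and enums are the real D3D12 ones:

```cpp
// DXR is a Direct3D 12 feature, not an Nvidia-specific one: any vendor
// whose driver reports a raytracing tier can run the same API.
#include <d3d12.h>

bool supportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))))
        return false;
    // Any GPU passing this check -- Nvidia, AMD, Intel -- can run DXR.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```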
 