
Is the Nvidia RTX performance even worse than previously reported?

Soldato
Joined
6 Jan 2013
Posts
21,861
Location
Rollergirl
I imagine you will be able to play at 4k, having most of the scene rendered at 4k using rasterization, while the RTX effects are rendered at 1080p then upscaled to 4k.
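To put that hybrid approach in perspective, here's some rough pixel-count arithmetic (my own numbers, not anything Nvidia has published):

```python
# Rough sketch: if the RT effects are traced at 1080p and composited into a
# 4K rasterized frame, the ray workload shrinks by the ratio of pixel counts.
full_4k = 3840 * 2160    # 8,294,400 pixels rasterized
rt_1080p = 1920 * 1080   # 2,073,600 pixels traced
print(full_4k / rt_1080p)  # 4.0 - tracing at 1080p casts a quarter of the rays
```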

This sounds like a big gamble by Nvidia, and it's very suspicious not to have any conventional performance data available from them.

Anyway, time will tell.
 
Soldato
Joined
31 Dec 2006
Posts
7,224
I wish people would think before knocking 1080p; it's actually superior to 1440p due to its versatility. :p

If you honestly think all those people who've splashed out on ultrawide and 4K monitors, and now with an eye on the 2080Ti, are going to be convinced to downgrade to 1080p then I have some magic beans to sell you! :p

The 20xx series had better deliver in raw performance (outside of RTX) or Nvidia are going to be in serious trouble. I can quite easily see a situation where gamers will accept paying through the nose and happily ignore RTX if the cards deliver high FPS in 1440p and 4K above and beyond what the 1080Ti offers... but if not, sales are going to take a massive dent and the backlash will be severe to say the least.
 
Permabanned
Joined
28 Nov 2006
Posts
5,750
Location
N Ireland
If you honestly think all those people who've splashed out on ultrawide and 4K monitors, and now with an eye on the 2080Ti, are going to be convinced to downgrade to 1080p then I have some magic beans to sell you! :p

The 20xx series had better deliver in raw performance (outside of RTX) or Nvidia are going to be in serious trouble. I can quite easily see a situation where gamers will accept paying through the nose and happily ignore RTX if the cards deliver high FPS in 1440p and 4K above and beyond what the 1080Ti offers... but if not, sales are going to take a massive dent and the backlash will be severe to say the least.

Heh, well it ain't a downgrade, bro. If you went from 4K to the latest 240Hz 1080p, you keep 4K if you downsample, but you get into frame rates not otherwise obtainable. How much is 4K 144Hz right now? I can have 1080p 240Hz for 450 pounds.

You simply trade how ultra-sharp real 4K is for an image not as sharp but with zero, and I mean zero, jaggies. It's quite surreal and highly addictive; there are no magic beans to be had. In fact, the beans are these ultra-blurry 4K displays that render a nice 4K 60Hz image until you move and it all blurs back to 1080p. This does not happen on mine...
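The "zero jaggies" effect comes from the averaging step in downsampling. A toy sketch (my own illustration, not any vendor's scaler) of what going from 4K back down to 1080p does to a hard edge:

```python
# Each 1080p pixel averages a 2x2 block of 4K pixels, so a hard black/white
# edge that straddles a block becomes an intermediate grey instead of a jaggy.
def downsample_2x2(img):
    """img: rows of grayscale values; returns an image half the size per axis."""
    out = []
    for y in range(0, len(img) - 1, 2):
        row = []
        for x in range(0, len(img[y]) - 1, 2):
            block = img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]
            row.append(block / 4)
        out.append(row)
    return out

# A vertical edge that cuts through a block averages out to grey (127.5).
edge = [[0, 255, 255, 255]] * 4
print(downsample_2x2(edge))  # [[127.5, 255.0], [127.5, 255.0]]
```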


:D
 
Soldato
Joined
31 Dec 2006
Posts
7,224
Heh, well it ain't a downgrade, bro. If you went from 4K to the latest 240Hz 1080p, you keep 4K if you downsample, but you get into frame rates not otherwise obtainable. How much is 4K 144Hz right now? I can have 1080p 240Hz for 450 pounds.

You simply trade how ultra-sharp real 4K is for an image not as sharp but with zero, and I mean zero, jaggies. It's quite surreal and highly addictive; there are no magic beans to be had. In fact, the beans are these ultra-blurry 4K displays that render a nice 4K 60Hz image until you move and it all blurs back to 1080p. This does not happen on mine...

:D

While I'm not dissing 1080p, it doesn't change the economic facts... no one is going to sell their expensive ultrawide or 4K monitor at a significant loss and then buy a 1080p one, no matter how good it may be. Nor will they drop to a lower resolution on their existing monitor, as this is never optimal. If RTX cards are to succeed, and if they can't deliver ray tracing at acceptable frame rates above 1080p (although personally I think with driver maturation and game optimisation they will), then they need to offer performance in other areas for all of those people buying them who own higher-resolution monitors... and let's face it, A LOT of people who are looking at the 2080Ti aren't gaming at 1080p anymore.

What 4K monitor did you have?
 
Soldato
Joined
18 Feb 2015
Posts
6,485
Ray tracing will, for the foreseeable future, be nothing more than a gimmick, and the reasons for this are evident in the history of Nvidia's other "magical" features. Simply put, these features have a very heavy performance impact in games, for questionable visual quality benefits. I say questionable because the implementation is still dev-dependent and therefore not guaranteed to have consistent quality from game to game (sadly a fate also shared by HDR). Indeed, if we look at its early outing in SoTR, we are left scratching our heads as to what the fuss is about in the first place, let alone at such a performance cost. Looking back at the other magic features I mentioned, like HairWorks et al (read this article for perspective), we can see that they often have a 20ish FPS impact on performance. This may not seem like much at 1080p, but if we look at the proper high end (i.e. 4K and equivalents), where you're fighting for every scrap of FPS you can get to reach that blessed 60 FPS, this is a huge deal. The more of these magical features you add to a game, the less you can expect that game to still run, even on the highest-end hardware. If you look at that article, you will see that it's still generally impossible for even older games to be truly maxed out when such Nvidia features are necessary for them to be "maxed out" at 4K, and sometimes even QHD, on a 1080Ti for example.
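For a sense of scale, here's the frame-time arithmetic behind that point (my own worked numbers, not figures from the linked article):

```python
def frame_time_ms(fps):
    """Per-frame budget in milliseconds at a given frame rate."""
    return 1000.0 / fps

# A flat "20 FPS hit" is not a fixed amount of work: at a 60 FPS baseline,
# dropping to 40 FPS adds ~8.3 ms per frame, while 144 -> 124 adds only
# ~1.1 ms, so the same feature costs far more where the budget is tight.
print(frame_time_ms(40) - frame_time_ms(60))    # ~8.33 ms extra per frame
print(frame_time_ms(124) - frame_time_ms(144))  # ~1.12 ms extra per frame
```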

So, given what we know of Nvidia's track record, and the fact that ray tracing will take years to develop and implement to a high standard (because software development and training take time, measured in months or years), it's very clear that the RT in the 2080Ti (to say nothing of the no doubt abysmal RT performance of the lower cards) is more of a selling point for bragging rights and screenshot hunters. No doubt it's going to be very popular with the games-as-benchmarks crowd, who love to benchmark games (and scratch their upgradeitis itch) but not actually play them.

DLSS shows a bit more promise, I would say, but I'm skeptical of how good it's going to be in practice vs native 4K, so until launch I can't say much about it. I'm hoping for the best, though, as this could actually end up being a great feature from Nvidia, and it might end up saving the disaster that the 2070 is looking to be.
 
Associate
Joined
19 Apr 2017
Posts
92
1080p for a gaming monitor in 2018 is dog ****/peasantry in my opinion. That's literally a ten-year-old display resolution. I'd never go back to 1080p from my 1440p ultrawide now.

I preordered a 2080 for better performance and couldn't give a rat's ass about the ray tracing features it comes with, since that tech is still 5 years away from being viable at high resolutions.

Bottom line: 1080p = CRAP. And I don't care how many Hz the 1080p has. A 100Hz ultrawide beats a single 200Hz 1080p all day when it comes to the more enjoyable gaming experience, and only sad CS:GO no-lifers would argue to the contrary.
 
Last edited:
Soldato
Joined
12 May 2014
Posts
5,239
I guess NVIDIA banked on their target audience understanding that doing something in realtime, in as little as 16 milliseconds, that has conventionally only been possible on render farms at a geological pace, is impressive. Even with a low sample rate, it's a massive step in innovation.

Gamers don't have a benchmark for this kind of performance, nor do they care.

False equivalence. I've been able to ray trace in "realtime" on my single 970 since I got it brand new, however many years ago that was. To state that ray tracing can only be done on a render farm is not true.
And it's more than just sample rate that differentiates the stuff render farms handle from the RTX demos.
 
Soldato
Joined
5 Sep 2011
Posts
12,827
Location
Surrey
False equivalence. I've been able to ray trace in "realtime" on my single 970 since I got it brand new, however many years ago that was. To state that ray tracing can only be done on a render farm is not true.
And it's more than just sample rate that differentiates the stuff render farms handle from the RTX demos.


It's not even 7:00AM and you're telling me you can render ray tracing on your 970 in "realtime", and the above is a false equivalence? So to answer clearly: could you render a scene at 60 frames per second with a budget of one to five samples per pixel? Or would it be closer to 10 seconds for a single frame?

You can colour it however you want: you can't. The render farm analogy rings true enough; it's the same algorithms at their core, used for higher fidelity. You can't really bend the definition of realtime when it comes to gaming. Also, just to add: yes, obviously recursion depth, amongst other things, has an impact, but we don't need to go into too much detail.
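Some rough ray-budget arithmetic on why those two regimes are so far apart (my own numbers, not anyone's benchmark):

```python
# Primary rays alone at 4K, 60 FPS, 1 sample per pixel - before any bounces,
# shadow rays or denoising:
realtime = 3840 * 2160 * 60 * 1   # ~498 million rays per second
# An offline render of the same frame, at 10 seconds per frame, needs only:
offline = 3840 * 2160 * 1 // 10   # ~0.8 million rays per second
print(realtime // offline)        # 600x - the gap the "realtime" label hides
```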

Go back to bed :p
 
Last edited:
Associate
Joined
19 Apr 2017
Posts
92
Really baffles me how people with good eyesight manage to watch much TV these days then, seeing as 1080p = crap.

Some talk as if it's unwatchable; the nonsense that some come out with...

Just for your information, my TV is a 4K OLED and I mostly watch HDR content ;)

P.S. The low PPI of 1080p is more noticeable the closer you are to the screen.
 
Soldato
Joined
10 Oct 2012
Posts
4,437
Location
Denmark
It's not even 7:00AM and you're telling me you can render ray tracing on your 970 in "realtime", and the above is a false equivalence? So to answer clearly: could you render a scene at 60 frames per second with a budget of one to five samples per pixel? Or would it be closer to 10 seconds for a single frame?

You can colour it however you want: you can't. The render farm analogy rings true enough; it's the same algorithms at their core, used for higher fidelity. You can't really bend the definition of realtime when it comes to gaming. Also, just to add: yes, obviously recursion depth, amongst other things, has an impact, but we don't need to go into too much detail.

Go back to bed :p

Well, PowerVR was able to do it in 2016 with a mobile-class GPU, so I cannot see why a 970 wouldn't be able to either. Here, have a read: https://www.imgtec.com/blog/gdc-2016-ray-tracing-graphics-mobile/?cn-reloaded=1
 
Soldato
Joined
30 Nov 2011
Posts
11,376
DLSS shows a bit more promise, I would say, but I'm skeptical of how good it's going to be in practice vs native 4K, so until launch I can't say much about it. I'm hoping for the best, though, as this could actually end up being a great feature from Nvidia, and it might end up saving the disaster that the 2070 is looking to be.

DLSS is still running the game at native resolution - it's a post-processing AA effect offloaded to the tensor cores, it's NOT upscaling; you can't set the game to run at 1080p and expect it to look good on a 4K monitor.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
Ray tracing will, for the foreseeable future, be nothing more than a gimmick, and the reasons for this are evident in the history of Nvidia's other "magical" features. Simply put, these features have a very heavy performance impact in games, for questionable visual quality benefits. I say questionable because the implementation is still dev-dependent and therefore not guaranteed to have consistent quality from game to game (sadly a fate also shared by HDR). Indeed, if we look at its early outing in SoTR, we are left scratching our heads as to what the fuss is about in the first place, let alone at such a performance cost. Looking back at the other magic features I mentioned, like HairWorks et al (read this article for perspective), we can see that they often have a 20ish FPS impact on performance. This may not seem like much at 1080p, but if we look at the proper high end (i.e. 4K and equivalents), where you're fighting for every scrap of FPS you can get to reach that blessed 60 FPS, this is a huge deal. The more of these magical features you add to a game, the less you can expect that game to still run, even on the highest-end hardware. If you look at that article, you will see that it's still generally impossible for even older games to be truly maxed out when such Nvidia features are necessary for them to be "maxed out" at 4K, and sometimes even QHD, on a 1080Ti for example.

So, given what we know of Nvidia's track record, and the fact that ray tracing will take years to develop and implement to a high standard (because software development and training take time, measured in months or years), it's very clear that the RT in the 2080Ti (to say nothing of the no doubt abysmal RT performance of the lower cards) is more of a selling point for bragging rights and screenshot hunters. No doubt it's going to be very popular with the games-as-benchmarks crowd, who love to benchmark games (and scratch their upgradeitis itch) but not actually play them.

DLSS shows a bit more promise, I would say, but I'm skeptical of how good it's going to be in practice vs native 4K, so until launch I can't say much about it. I'm hoping for the best, though, as this could actually end up being a great feature from Nvidia, and it might end up saving the disaster that the 2070 is looking to be.




You mean like hardware T&L in the original GeForce, which was so successful it led to the demise of 3dfx?
 