
Is the Nvidia RTX performance even worse than previously reported?

I heard Rogue One (the droid) was real-time.

Haha, yeah, according to this they used UE4, and what they generated on set was used in the final release:

https://www.google.co.uk/amp/s/www..../1/14777806/gdc-epic-rogue-one-star-wars-k2so

I also found this article about using the Nvidia VCA for ray tracing. Mind you, the machine they put together for this one cost about $1.4m, so the real question is... will it run Crysis?

http://www.cgsociety.org/index.php/...oups_gpu_accelerated_v_ray_rt_for_final_frame
 
No blur here sir... There was on my older 29-inch UW, but as far as I can tell right now my panel is gorgeous in motion

Without saying which panel, is it a VA? Look, I've used strobed displays with a base 4K res, so I know what really low blur is. You're just talking subjectively and baiting, because this is all you know and you think there's none. Unless you strobe them, the ultrawides are garbage. Even at 240Hz, the fastest panels currently still have some tiny amount of motion blur, so it's no good posting that you have no blur when you probably have loads in motion and can't capture how bad it looks in slow motion.

And using strobe mode loses you G-Sync, so it's a win for G-Sync 240Hz plus 4K DSR panels. This is why RT is garbage to me: I think you should get the base canvas right before smearing more rubbish on top. And you simply will not be using RT at 4K...
 
I heard Rogue One (the droid) was real-time.

"shot using Unreal Engine 4. The scenes feature lovable droid K-2SO, as played by Alan Tudyk, rendered out in realtime using the games technology before being composited traditionally into the film.
This cross-pollination between industries is becoming more and more prevalent and Tim Sweeney, CEO of Epic, promises that the more this happens, the better it will be for game developers. 'Film makers are pushing this stuff forward,' he says. 'We're listening to what they need, we're working with them collaboratively and closely and it's all going to benefit the games industry.'"
 
You can't sit with a keyboard and mouse in front of a 55" OLED; they're a niche settee solution for gaming until OLED is integrated into PC monitors, as opposed to large-screen TVs.

lol, it's funny reading this because I'm sitting in my armchair doing exactly that, just enjoying some The Division. I've been addicted to it ever since I switched to a TV a year ago and can't do monitors anymore. Try it, it's great.
 
lol, it's funny reading this because I'm sitting in my armchair doing exactly that, just enjoying some The Division. I've been addicted to it ever since I switched to a TV a year ago and can't do monitors anymore. Try it, it's great.

I have a 43" 4k TV in my office. I can't stand gaming on it due to lack of Gsync and I definitely couldn't use it for any productivity or web surfing (which is what I meant regarding keyboard & mouse). I'm not keen on 16:9 resolution now either. I haven't tested 1080p gaming on it because, well, why would I?

I'm sure screenshots look great on your OLED, but for me the actual gaming compromises would be too much. And this is the thing: people say "this looks great" or "that's not required", but in reality everyone has different tastes and expectations. I didn't start out with 21:9 1440p G-Sync, but it's where I want to stay as a minimum. If I could get the same technology on an OLED screen, that would be amazing.

There's no way I'm regressing back to lower resolution, lower frame rates and no G-Sync just so I can justify £1100 on a GPU. I really don't see any way to talk myself into this, as I'm just not interested in beta testing hardware for Nvidia, and that's what's coming for these early adopters.
 
Without saying which panel, is it a VA? Look, I've used strobed displays with a base 4K res, so I know what really low blur is. You're just talking subjectively and baiting, because this is all you know and you think there's none. Unless you strobe them, the ultrawides are garbage. Even at 240Hz, the fastest panels currently still have some tiny amount of motion blur, so it's no good posting that you have no blur when you probably have loads in motion and can't capture how bad it looks in slow motion.

And using strobe mode loses you G-Sync, so it's a win for G-Sync 240Hz plus 4K DSR panels. This is why RT is garbage to me: I think you should get the base canvas right before smearing more rubbish on top. And you simply will not be using RT at 4K...


Well, if you'd look at the sig you'd see the model and would be able to determine that. What's with this "look" business? Your objection doesn't override my own experience, and your opinion isn't any more valid than my own. Subjective and baiting? lmao... it's not subjective, it's my experience, and I'm telling you it's gorgeous in motion, and even if there is a slight amount of blur, I can tell you I don't notice it. 166Hz and it's gorgeous. Don't tell me what I experience unless you've specifically used my model. And I think you're getting me confused with someone else: I don't run 4K and never said I would be using RT at 4K, and calling RT garbage is laughable unless you've seen it uncompressed and live and used it. Another FACT magician.
 
it's not subjective, it's my experience, and I'm telling you it's gorgeous in motion, and even if there is a slight amount of blur, I can tell you I don't notice it

Haha, first it was "no blur here sir", now it's shifting. Like I said, it's there, and I don't even need to use your exact panel when I owned the fastest panel in production, had ULMB, and ran them side by side. I can gauge from that how the rest must look with lower specs.

Clearly you prove how NV get away with this kind of thing. RT is for silly people; it's clearly the intelligent approach to get to 4K at high FPS first, before RT, and anyone who says otherwise is just a gullible moron who is going to end up doing their eyes damage in the long run. Without RT we would be approaching the holy grail: games without blur or aliasing that don't give motion sickness.

RT has put that back two years, just to milk the market and slow production until AMD reappear, if they ever do.
 
I have a 43" 4k TV in my office. I can't stand gaming on it due to lack of Gsync and I definitely couldn't use it for any productivity or web surfing (which is what I meant regarding keyboard & mouse). I'm not keen on 16:9 resolution now either. I haven't tested 1080p gaming on it because, well, why would I?

I'm sure screenshots look great on your OLED, but for me the actual gaming compromises would be too much. And this is the thing: people say "this looks great" or "that's not required", but in reality everyone has different tastes and expectations. I didn't start out with 21:9 1440p G-Sync, but it's where I want to stay as a minimum. If I could get the same technology on an OLED screen, that would be amazing.

There's no way I'm regressing back to lower resolution, lower frame rates and no G-Sync just so I can justify £1100 on a GPU. I really don't see any way to talk myself into this, as I'm just not interested in beta testing hardware for Nvidia, and that's what's coming for these early adopters.

Samsung Q9FN, 1440p 120Hz & FreeSync. ;) Should've mentioned, I don't care much for OLED myself; it's overhyped by marketers and that's all it is (details lost in dark scenes, garbage bluish whites, burn-in, dimness, etc.).
 
Samsung Q9FN, 1440p 120Hz & FreeSync. ;) Should've mentioned

Yeah, you should've mentioned... particularly considering you were quoting this. :p

You can't sit with a keyboard and mouse in front of a 55" OLED; they're a niche settee solution for gaming until OLED is integrated into PC monitors, as opposed to large-screen TVs.
 
Haha, first it was "no blur here sir", now it's shifting. Like I said, it's there, and I don't even need to use your exact panel when I owned the fastest panel in production, had ULMB, and ran them side by side. I can gauge from that how the rest must look with lower specs.

Clearly you prove how NV get away with this kind of thing. RT is for silly people; it's clearly the intelligent approach to get to 4K at high FPS first, before RT, and anyone who says otherwise is just a gullible moron who is going to end up doing their eyes damage in the long run. Without RT we would be approaching the holy grail: games without blur or aliasing that don't give motion sickness.

RT has put that back two years, just to milk the market and slow production until AMD reappear, if they ever do.


If you read it, I still confirmed that I perceive there to be none. With the eyes in my head there is none, even if there may be for you. But again... it's people like you that make this a place where discussion is frowned upon. Cut the name-calling, mate; there's no place for it, we all do as we wish. Grow up.
 
Every monitor and TV tech is marketing hype. The picture quality of OLED isn't just hype though.

It 100% is, unless you consider cyan whites that merge with blue (especially in snowy or snow-and-sky scenes) good picture quality. Or losing details in dark scenes to the point that they're just black blobs. Or not having a bright picture. Or having burn-in. And on and on.

Listen, I'm not trying to convince anyone that what they think is wrong. If you look at your OLED and you don't see those things, more power to you; you're not wrong for that. You see what you see. When I look at it, though, all I think is: round of applause for the marketing department (including the outsourced marketing department, aka tech YouTubers). No one wanted the OLED hype to be real more than me, I value picture quality above everything, but all I see is the exact opposite, especially when it comes to games.
 
Samsung Q9FN, 1440p 120Hz & FreeSync. ;) Should've mentioned, I don't care much for OLED myself; it's overhyped by marketers and that's all it is (details lost in dark scenes, garbage bluish whites, burn-in, dimness, etc.).

Ah mate, you're opening another can of worms with that statement :D

OLED is not marketing hype; the picture quality is amazing, especially when watching UHD 4K Dolby Vision movies on it.

You can easily mitigate screen burn. I've had mine for over a year and a half and it has zero screen burn.

The only bad thing I can say is that current OLEDs suffer banding at 5% on dark shades, and that's a moot point as you never see it on the majority of content, and even then it can vary with the panel lottery.

I remember buying a Samsung KS8000 and dark scenes were garbage; the light bleed looked like a flashlight shining through, and it completely destroyed the immersion while playing RE7. I sent it back as faulty, and the technical engineer told me it was normal when they checked it out. I swapped it for the LG and the difference was night and day.

The only thing LED has going for it is peak brightness. The only TV I would consider with that tech is the Sony Z9 range, but I'm sure OLED will be even better by the time I upgrade my TV.
 