Watch Dogs

Can't verify if they're the same ones, but if it's got an Asian woman in it and some seafood, then chances are it's the same one.

I deleted it after watching it a few times, and that was last year (when I got this TV), to test out its potential.

You're saying you could see the difference watching a 4K test video on a 1080p screen... are you really saying that?
 
I don't particularly care about res when it comes to games; when it comes to TV and movies, then yes, I do care.

For TV and movies it has to be 720p at minimum, but I prefer 1080p when I can get it, and that's what I usually go for/search for.

Don't particularly care about FPS so long as it's not like watching a slideshow; as long as I can't identify/see individual frames, I don't care what the fps is.

Yet to play Tomb Raider, which averages around 50fps at 1080p but can dip to 30fps; when I do, I'll see whether I can notice it dipping or not. Movies are 24fps iirc, and they don't look like a slideshow, so I'm not entirely sure what fps you need to drop to before you can really notice it.

I think it's the transition people notice, as it can cause artifacts/tearing, rather than the actual drop itself.
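
Quick back-of-the-envelope on that, for what it's worth - a few frame times worked out in Python (purely illustrative numbers):

[code]
# Frame time = how long each frame stays on screen.
# A dip from 50fps to 30fps means each frame suddenly persists ~13ms
# longer; arguably it's that change in pacing that registers as a
# stutter, rather than the lower rate itself.
for fps in (24, 30, 50, 60):
    print(f"{fps}fps -> {1000 / fps:.1f}ms per frame")
# 24fps -> 41.7ms, 30fps -> 33.3ms, 50fps -> 20.0ms, 60fps -> 16.7ms
[/code]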
 
You're saying you could see the difference watching a 4K test video on a 1080p screen... are you really saying that?

Compared to 1080p, yes.

Maybe you should download one of those test videos and see for yourself; plenty of people on here, as well as on AVforums, can tell the difference too.

It's all to do with how it's been recorded, as well as the cameras used, etc. - the source material is just a lot better.

These 4K tech demos blow all 1080p sources out of the water.

Don't believe me? Try it for yourself, but be warned: it's like 4GB for a 1-3 minute file.
 
This tbh. With all the consoles I've ever owned (which is most of them), I never considered resolution or FPS. All I cared about was whether the game was fun to play. This is the first generation where it's been such a big deal (but also possibly the first generation where there has been such a measurable difference between the two main competing consoles).

I think developers and marketing place more emphasis on the specs now too. "WOW, LOOK! 1080p HD!!" (upscaled though, right?) :p. Plus you have the PC master race constantly reminding everyone that their £2k rig pushes 1440p.

It's funny you guys mention that point though. Whenever I bought any past console, whether it was the early Segas or the PS1 to PS3, I never considered the specs.
 
You're saying you could see the difference watching a 4K test video on a 1080p screen... are you really saying that?
Have a look into supersampling; if his TV has a decent scaler then yes, 4K downsampled to 1080p could look better than a native 1080p feed - Child of Light uses this exact technique...

EDIT: Actually I'm not sure how perceivable the end result would be, or whether it could just be attributed to confirmation bias/placebo effect - really depends on how clever the scaler is

ps3ud0 :cool:
 
It's not the graphics I want it for, it's the gameplay. Something different that still looks fun.
 
Have a look into supersampling; if his TV has a decent scaler then yes, 4K downsampled to 1080p could look better than a native 1080p feed - Child of Light uses this exact technique...

EDIT: Actually I'm not sure how perceivable the end result would be, or whether it could just be attributed to confirmation bias/placebo effect - really depends on how clever the scaler is

ps3ud0 :cool:

I can understand if the filming equipment is better, i.e. capturing better colours/contrast etc., but there's going to be no more pixels magically scaled onto the panel...
 
I can understand if the filming equipment is better, i.e. capturing better colours/contrast etc., but there's going to be no more pixels magically scaled onto the panel...
It's more the techniques used to decide how the 4 pixels at 4K are merged into 1 pixel at 1080p, and how that sits with the surrounding pixels. While you won't see more pixels, the scaler may choose a better complement for the screen than a standard 1080p feed could.

Not sure if I've made that clear - but supersampling itself is definitely a technique used by loads of PC gamers to achieve noticeable results. Can't see why it couldn't be used for a normal TV image, just with limited results due to working with a flat 2D image as opposed to one that also has 3D data. It's an interesting idea to mull over...
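
For anyone curious what that merging actually looks like, here's a minimal sketch in Python/NumPy - just a plain 2x2 box filter averaging four source pixels into one output pixel. Real TV scalers use much cleverer kernels; the function name here is made up for illustration:

[code]
import numpy as np

def downsample_2x2(frame):
    # frame: (H, W, 3) array, e.g. 2160x3840 (a 4K image).
    # Group pixels into 2x2 blocks and average each block into a single
    # output pixel - every 1080p pixel carries information from four 4K
    # pixels, which is why the downsampled image can look smoother than
    # a native 1080p feed.
    h, w, c = frame.shape
    blocks = frame.reshape(h // 2, 2, w // 2, 2, c).astype(np.float32)
    return blocks.mean(axis=(1, 3)).astype(frame.dtype)

# Example: a random stand-in for a 4K frame, taken down to 1080p
uhd = np.random.randint(0, 256, (2160, 3840, 3), dtype=np.uint8)
fhd = downsample_2x2(uhd)
print(fhd.shape)  # (1080, 1920, 3)
[/code]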

ps3ud0 :cool:
 
Guys, download a 4K tech demo and watch it; even on a CRT you could tell the difference, trust me - they are simply sublime.

It looked better than real life on my flagship Panasonic plasma. I think they may even apply some effects to the footage in a studio to enhance the picture. I mean, I was getting extremely hungry watching the seafood; it looked better than anything you would get in a Michelin-starred restaurant.

Maybe it was partly a combo of me having one of the best TVs ever made that made it look so good, but seriously, just try a 4K tech demo for yourself - the difference is definitely there.

Stay away from the crap on YouTube and make sure you use a proper tech demo; these are designed to show off a flagship 4K TV, so the quality is simply insane.
 
£40.40 from flubit, but I think it may be slightly different per person. That's delivered, for the standard version. My Luigi U arrived a day early from them, but I wouldn't presume that's always the case.
If anyone wants an invite, send me a trust message. Worth having an account for.
 
Has it always been like this? I only seem to have noticed talk about framerates and 1080p since the PS4 and Xbone.

The differences were often discussed briefly back in the PS3 vs 360 days, but I don't think most people cared to the degree they seem to now. Have to find something to be unhappy about, I suppose.
 