
AMD Fluid Motion "Enable 60fps Movies"

What people call 'Film Look' is just motion blur and judder in scenes.

If you won't accept 24fps in a game and bash consoles for only managing 30fps then how can you accept it in a movie? I know there is more involved with input latency and responsiveness but the issue is still the same.

It's not as simple as that. I've never watched a movie and thought 'This is too juddery'.

Sometimes they are edited appallingly, to the point you can't even see what's happening, but that's more a fault of the director/editor. The Quantum of Solace opening car chase is a prime example of hyper-edited crap, as are the Michael Bay Transformers movies.

I had high hopes for HFR when Peter Jackson said he would film in 48fps and explained the benefits. James Cameron seemed keen on making Avatar 2 and 3 at 60fps.

But honestly, the HFR version of The Hobbit just didn't work for me. And I'm someone who likes to game at a constant 60fps, hence Titan X SLI at 4K.
 
I think I'd prefer films in higher frame rates. When I watch YouTube videos in 60fps it's just so much better than watching in 30fps. So when I think about it, films at a higher FPS must be better too.
 
What people call 'Film Look' is just motion blur and judder in scenes.

If you won't accept 24fps in a game and bash consoles for only managing 30fps then how can you accept it in a movie? I know there is more involved with input latency and responsiveness but the issue is still the same.

Because films and games aren't the same thing?

I find that while a film can technically look 'better' in 60fps, like those Avatar 60fps clips put online a while back, it usually gives it the kind of look that makes it seem like you're watching rushes on set or a home movie on an HD cam, rather than a film.

Technically better doesn't always equal actually better.
 
Maybe I'm misunderstanding, but doesn't G-Sync / FreeSync operate similarly? (People mention that simply adding extra frames without adding detail is bad, yet those technologies don't magic up new and fresh detail either.)

I might be wrong, but I thought it worked in a similar way.
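For context, they aren't quite the same thing: adaptive sync (G-Sync/FreeSync) only times the display's refresh to whatever frames the GPU actually delivers, while interpolation synthesizes new in-between frames from existing ones. A minimal sketch of the crudest possible interpolator, just blending two frames (real motion-compensated interpolation, which Fluid Motion presumably uses, is far more sophisticated; the toy frames here are my own):

```python
import numpy as np

def blend_midframe(frame_a, frame_b):
    """Naive frame interpolation: average two frames.

    Real interpolators estimate per-block motion vectors instead
    of blending, but either way the new frame is derived purely
    from existing pixels -- no genuinely new detail is created.
    """
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    return np.round((a + b) / 2.0).astype(np.uint8)

# Two 2x2 "frames": a bright column moving one pixel to the right.
f0 = np.array([[255, 0], [255, 0]], dtype=np.uint8)
f1 = np.array([[0, 255], [0, 255]], dtype=np.uint8)
mid = blend_midframe(f0, f1)  # a ghosted double image, not real motion
```

The blended frame is a uniform grey ghost rather than the object at a halfway position, which is exactly why naive interpolation looks soapy and why practical implementations chase motion vectors instead.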
 
This is true, but there are still details you can gain at the high end of the scale.

According to colour graders and people in the industry (who have also been on this forum), they aren't allowed to grade colour beyond 235. This has been discussed on AVS Forum; content won't be approved if it does.

PC levels cause it to look more saturated than the correct 16-235 range.
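As an aside, that 'nothing beyond 235' constraint is simple to express: broadcast-legal 8-bit video keeps luma between 16 (black) and 235 (peak white). A toy sketch assuming NumPy and 8-bit values (the function name is mine, not an industry tool):

```python
import numpy as np

def clamp_to_video_levels(frame):
    """Clip 8-bit values to the broadcast-legal video range 16-235."""
    return np.clip(frame, 16, 235).astype(np.uint8)

# A graded frame with illegal values at both ends of the scale.
graded = np.array([0, 10, 16, 128, 235, 250, 255], dtype=np.uint8)
legal = clamp_to_video_levels(graded)
# values below 16 are raised to 16, values above 235 lowered to 235
```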
 
There is some serious love for 24fps films here. Frankly, I'm desperate to move away from that archaic standard. There is nothing worse to me than a panning shot where the fast-moving background just falls apart. No, I would embrace a smoother, faster framerate, and I felt that the action scenes in The Hobbit were greatly enhanced as I could actually see what the hell was happening. It might feel weird when you are used to something different, but that doesn't make it bad.

On topic, I'm not really a fan of interpolation so this tech doesn't really interest me. I've moved away from AMD so I can't test it, but I would if I could just to check it out.
 
Off topic, as this thread is about interpolation, but the problem with real high-frame-rate video, and why it looks weird, is that by allowing our eyes to extract more information from the scene it actually makes it more obvious that it is video, so it falls into the uncanny valley... At 24fps the image is just obviously enough 'not real' that we can switch off and ignore the fact it's fake.
 
Now I know I'm being trolled. I'm out!

Wow!

If your TV supports the full RGB range 0-255 and you calibrate the display with this in mind, you can gain more detail in high-contrast images...
While still keeping detailed shadows and the dark parts of a film.

16-235 fits well within the 0-255 range... If it doesn't for you, then:
1. Your TV is dated, or
2. Your calibration sucks.

If your TV doesn't support it then yes, that is very bad, because you will just crush your black level and white level.

Copied from AVS Forum!
If you are feeding your device to a monitor/TV configured for 0-255 levels then RGB Full would be the best choice.

If you feed a 0-255 signal (RGB Full) into a display configured for 16-235 (RGB Limited), then you will get crushed blacks and clipped whites (content below 16 and above 235 will not appear any different from content at 16 or 235), and the image will also appear artificially more saturated (richer colours).

If you feed a 16-235 signal (RGB Limited) into a display configured for 0-255 (RGB Full), then you will get grey-ish/milky blacks and dull whites, and the image will look washed out and de-saturated (less colourful), as the 16 black level of the source will be displayed as grey (it sits above the 0 black level of the display) and the 235 peak white of the source will not reach the 255 peak white of the display.
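Both failure cases quoted above fall straight out of the standard scaling between the two ranges: full = (limited - 16) * 255/219, and the inverse for going the other way. A rough sketch in Python with NumPy, showing the correct expansion and, in the comments, what happens when it is skipped:

```python
import numpy as np

def limited_to_full(x):
    """Expand 16-235 video levels to 0-255 PC levels."""
    y = (x.astype(np.float32) - 16.0) * 255.0 / 219.0
    return np.clip(np.round(y), 0, 255).astype(np.uint8)

def full_to_limited(x):
    """Compress 0-255 PC levels into the 16-235 video range."""
    y = x.astype(np.float32) * 219.0 / 255.0 + 16.0
    return np.clip(np.round(y), 0, 255).astype(np.uint8)

video = np.array([16, 126, 235], dtype=np.uint8)  # black, mid-grey, peak white
print(limited_to_full(video))  # -> [  0 128 255]: correct expansion
# If the display treats this limited signal as full range (no expansion),
# black stays at 16 (grey-ish) and white stops at 235 (dull) -- the
# washed-out case. Expanding an already-full signal does the opposite:
# everything below 16 crushes to 0 and everything above 235 clips to 255.
```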
 