The high definition generation and lack of standardisation

What the hell happened? Before, it was simply plug it into your TV and enjoy. Now it's a ******* mess, to put it bluntly.

Shall I get a 720p TV or a 1080p TV? Should I let the console do the upscaling or the TV? What about 1080p games running on a 720p TV? What about 720p games running on a 1080p TV?

If I get a 720p TV I won't be able to watch Blu-ray in 1080p, and I won't be able to play 1080p games at their true resolution. Why isn't the native resolution actually 720p? Instead it's some weird resolution. I will still have to slightly upscale games that are 720p or lower, causing quality loss, but it's better than upscaling to 1080p and quite close to the native resolution. I'll have to downscale 1080p games, causing quality loss, but not many games are 1080p.

If I get a 1080p TV I will have to massively upscale games which are 720p or lower, causing quality loss, and most games are 720p or lower, but I can play 1080p games in true 1080p and watch Blu-ray movies at their true resolution.
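For what it's worth, the scale factors behind all that upscaling and downscaling can be roughed out in a few lines of Python. The panel resolutions here are assumptions: many "720p" sets of this era actually used 1366x768 panels, which is where the "weird resolution" comes from.

```python
# Rough sketch (assumed panel resolutions) of the scale factors
# the upscaling/downscaling worries boil down to. A "720p" flat
# panel is modelled as 1366x768, not a true 1280x720.
SOURCES = {
    "SD (576i PAL)": (720, 576),
    "720p game": (1280, 720),
    "1080p game / Blu-ray": (1920, 1080),
}
PANELS = {
    '"720p" panel': (1366, 768),
    "1080p panel": (1920, 1080),
}

for panel_name, (pw, ph) in PANELS.items():
    print(f"{panel_name} ({pw}x{ph}):")
    for src_name, (sw, sh) in SOURCES.items():
        factor = ph / sh  # vertical scale factor
        note = ("native" if (sw, sh) == (pw, ph)
                else "upscaled" if factor > 1 else "downscaled")
        print(f"  {src_name}: x{factor:.2f} ({note})")
```

The x1.07 stretch (720p onto a 1366x768 panel) is the "slight upscale" mentioned above; x1.50 is the bigger jump onto a 1080p panel.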

What about the PS3 that only has 1080p upscaling support for some games? What if my TV doesn't have a good scaler, will it look bad?

I mean, Jesus, where is the standardisation? It's a joke: too many questions, too many complications.
 
It was never simple, with composite, scart and RGB scart as well as geometry issues on flat screen CRTs, you just seem to have noticed now.
 
What’s with the sigs? :D



Yep, it’s a minefield out there at the moment! You need to do loads of research into 720p & 1080p sets.
 
It's not really that big a deal as 720p games will still look good on a 1080p screen. Plus depending on the size of your 1080p screen you may be sitting too far away from it to even notice the improvement over 720p.

I do think that all games should be 720p minimum though; there seems to be a bit of fiddling with resolutions to make games play smoothly.

If I was buying a new TV now and it was over 32" then it would be 1080p. With 32" and under you won't notice it anyway.
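That viewing-distance point can be roughed out with the common rule of thumb that 20/20 vision resolves about one arcminute of detail. Everything below (the acuity figure, the 16:9 geometry) is an approximation for illustration, not a measurement:

```python
import math

# Rough acuity sketch: 20/20 vision resolves roughly one arcminute.
# Beyond the distance where a single pixel subtends less than that,
# the extra resolution is invisible to the viewer.
ARCMIN = math.radians(1 / 60)

def max_useful_distance_m(diagonal_in, vertical_px, aspect=(16, 9)):
    """Farthest distance (metres) at which the full resolution is
    still resolvable to a 20/20 eye, for a 16:9 panel by default."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)
    pixel_pitch_m = (height_in * 0.0254) / vertical_px
    return pixel_pitch_m / math.tan(ARCMIN)

for diag in (32, 42, 50):
    d1080 = max_useful_distance_m(diag, 1080)
    d720 = max_useful_distance_m(diag, 720)
    print(f'{diag}": 1080p detail visible inside ~{d1080:.1f} m, '
          f'720p detail inside ~{d720:.1f} m')
```

On this estimate a 32" 1080p panel only shows its full detail inside roughly 1.3 m, which is closer than most sofas, which lines up with the "32" and under" advice.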
 
It's not really that big a deal as 720p games will still look good on a 1080p screen. Plus depending on the size of your 1080p screen you may be sitting too far away from it to even notice the improvement over 720p.

I do think that all games should be 720p minimum though; there seems to be a bit of fiddling with resolutions to make games play smoothly.

If I was buying a new TV now and it was over 32" then it would be 1080p. With 32" and under you won't notice it anyway.


My 50" 720p set looks cracking. You need to do loads of research into the type of TV; look at various forums to get an idea, like AVForums, which is a massive resource with huge sections dedicated to these types of questions.


Strictly not an admirer, but a friend :)


Ah okay :D
 
I have a 26" 720p TV. I plug the Xbox 360 into it via the supplied component cable. I set the Xbox 360 to 720p in the setup menu. Everything looks great.

I fail to see where the complication lies.
 
It was never simple, with composite, scart and RGB scart as well as geometry issues on flat screen CRTs, you just seem to have noticed now.

It certainly was simpler in the good ol' days, when I were a lad...

Choice of AV connection was clear-cut: RGB Scart > Scart > Composite > RF/UHF. End of. Didn't really matter what make of TV you had.

We didn't have to choose resolutions, so we were happy with what we had.

For a long time we didn't have to worry about widescreen aspect ratio, and 4:3 TVs split rather nicely 4 ways...


Having said all that, there are plenty of people who don't read too much into the techno-babble of the new TVs and just enjoy their gaming... they're just not the kind of people you see hanging around on internet forums... :)
 
It's always been non-standard. Ico had one of the lowest resolutions of any PS2 game last generation, yet was a hell of a looker. Since the days of the Amiga, resolutions have often gone below the standard. You got all the hoopla with PAL/NTSC and framerates too - in the PS1 days, NTSC versions ran about 20% faster (60 Hz vs 50 Hz), while EU versions often had borders. Tekken 3 is a good example of this.

Nothing's ever changed. You could say this generation is the most uniform in fact. A game that works in the US works in the UK for me (PS3). Never was the case with my other, older consoles.
 
Do I know you?



This has always been the case though, no?


No, I wouldn’t say so personally. With a tube TV you knew the picture quality would be near enough the same (obviously the cheap brands were lower quality). Now you have so many variables to consider, like contrast ratio and ghosting on certain LCD sets, be it HD Ready or Full 1080p; there is so much to research & read into now it’s unreal :eek: It took me a week to pick my new TV, and I can’t recall ever taking that long over a conventional CRT.
 
It was never simple, with composite, scart and RGB scart as well as geometry issues on flat screen CRTs, you just seem to have noticed now.
I disagree.
With Composite, Scart and RGB it was just plug in and away you go.
No need to worry what setting the console was set to output to, no need to get tellys that supported specific resolutions.
Just plug it in, and it works.
Maybe not 100% simple, but still far less faffing than nowadays.
 
It certainly was simpler in the good ol' days, when I were a lad...

Choice of AV connection was clear-cut: RGB Scart > Scart > Composite > RF/UHF. End of. Didn't really matter what make of TV you had.

We didn't have to choose resolutions, so we were happy with what we had.

For a long time we didn't have to worry about widescreen aspect ratio, and 4:3 TVs split rather nicely 4 ways...


Having said all that, there are plenty of people who don't read too much into the techno-babble of the new TVs and just enjoy their gaming... they're just not the kind of people you see hanging around on internet forums... :)

No, to extract the very best out of a console (like the OP seems to be trying to do with the over-analysis of IQ and various cables) it has always been this uncertain. I remember all the rubbish about gold-plated connections and special power sockets that removed all voltage variance to smooth a CRT's image. It may have been simpler for you (as it was for me) but there have always been a hundred and one different ways of supposedly getting a superior image.

Just plug it in, and it works.

Plug in HDMI, select optimal, go.

Or plug in component, set the highest progressive output, go.

It's not complex unless you want it to be, just like in the CRT era.

Took me a week to pick my new TV I can’t recall ever taking that long over a conventional CRT.

Some people today will just buy an HDTV and plug it in like you did back with the CRT, and some people spent weeks deciding on a TV back in the CRT era too.
 