• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Why does AMD like to spoil video quality by default?

Could you let us know what settings you changed? It seems a little short on details. Are you talking about video or gaming quality?

Wasn't it obvious enough?

Why does AMD like to spoil video quality by default?

Is it really necessary to have so much post-processing on by default? They have more than some TVs do: edge enhancement, dynamic contrast, smooth this, smooth that, noise reduction, skin tones, colour vibrancy and so on...

I think it's pretty clear what I'm talking about, but if you have an AMD card it would only take a few clicks to see the actual settings yourself.

Without those details it just sounds a bit of FUD or red-team bashing. Thanks.

lol ok...
 
Tbf, neither NVIDIA nor AMD enables full dynamic range; by default they're both using 16-235, which is odd considering they turn on everything else most people don't want or need.
Since 16-235 is standard for video (including Blu-ray), this is sensible.

The dynamic range control in the drivers works the opposite way to how I believe you think it works.

Full range is enabled by default, and you can test this by enabling the dynamic range control and then setting it to 16-235 (limited). 16-235 should output video as it's meant to look, and on a full-range display it should look washed out - it does on mine. Full range on as default makes sense when you consider most displays connected to PCs will be monitors and not TVs :)
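To make the limited vs. full range point above concrete, here's a minimal sketch (illustrative helper names, not any driver's actual code) of the 16-235 to 0-255 mapping, and of why a limited-range signal looks washed out on a full-range monitor:

```python
def limited_to_full(v: int) -> int:
    """Expand a limited-range (16-235) value to full range (0-255)."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

def full_to_limited(v: int) -> int:
    """Compress a full-range (0-255) value into limited range (16-235)."""
    return round(16 + v * 219 / 255)

# Why limited-range output looks "washed out" on a full-range display:
# video black (16) is shown as dark grey and video white (235) as light
# grey unless the display (or driver) expands the range first.
print(limited_to_full(16))   # 0   -> true black after expansion
print(limited_to_full(235))  # 255 -> true white after expansion
print(full_to_limited(0))    # 16  -> raised black if shown unexpanded
```

In other words, the driver setting just decides which end of this mapping is applied; pick the wrong one for your display and blacks and whites are clipped or lifted.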
 
Set up another profile with all the post-processing switched off and compare the two.

I can't recall CCC having them all on by default but I'll check anyway. :p

Edit - lol clicked default and everything went back on, yeah I have it all switched off too.
 
Just to add, it's been well known for years that many of the extra features TVs have often ruin image quality; home cinema enthusiasts will tell you that. The same goes for many of these settings. A few at low levels may make a slight improvement to SD and YouTube at full screen, but in general they will spoil the quality.

Also, I just recently switched over from VLC to MPC-HC due to audio stuttering/sync issues, and it seems an all-around better media player when it comes to quality and performance.
 
MPC-HC is always my go-to media player.

VLC seems to have gone downhill imo.

Yeah, I don't know why I didn't switch sooner; I guess I was used to VLC, even though I'd tried MPC-HC in the past and thought it was decent. VLC has always had its issues when using S/PDIF, but I usually fixed them and kept it. However, if they still have these problems now it's version 2, why bother?
 
I don't have CCC installed - will I still have these things happening, or do I not need to bother?
 
I like to keep edge enhancement on at its default of 10, as it adds a little extra sharpness which looks nice on SD content (like when watching Freeview through Media Center). It doesn't have any notable effect on HD content at this low value.

De-noise I always disable, as even at a setting of 1 (default 64) it absolutely destroys the image, making it a blurry mess. It almost looks like the blur is following the image and trying to keep up.

Smoothvision doesn't seem to do a lot on my 4890 if it's on. It disables all the other options. It can make the screen juddery as well, so I leave this off.

Dynamic contrast I always disable. It annoys me seeing the screen vibrancy change every time a new scene appears in the video.

Deinterlacing I leave on Automatic, and it seems to work well enough watching Freeview SD content.
 
Image quality is not spoilt by your card; it's spoilt by the display only, i.e. a 42" TV is better for gaming than a 55" TV, as it is for normal HD and SD viewing. The larger the screen, the poorer the image... this will always be the case.
What a load of rubbish...

The picture quality on my 50" plasma is far, far better than on my 43" plasma, my 32" LCD and my 24" LCD.
 
My personal preference really isn't the problem here. I don't think you realise just how much AMD's settings mess with the image quality; you might like vivid colours and sharp edges, but I bet you don't like losing detail, as that's what's happening with all the post-processing going on.

I'm not a home cinema enthusiast, if it looks good enough then I'm happy.

As you posted in the graphics section and not the Home Cinema section, I didn't delve into dissecting image quality for video viewing, as it's not my thing.

Sorry but that's nonsense lol

No it's not, just posting my view from the way I saw it.

I simply pointed out that if you are used to one set of settings and then use another vendor's settings, most people are going to prefer what they use and watch day in, day out. Even going from one setup to another identical setup with different settings, the one you are used to and took the time to set up (or not) the way you like it will probably be the one you prefer.

As said above, imo the technicalities of the OP would be better suited and answered in the Home Cinema & Hi-Fi section anyway.
 
What a load of rubbish...

The picture quality on my 50" plasma is far, far better than on my 43" plasma, my 32" LCD and my 24" LCD.

This depends on the manufacturer, not on the size of the screen; i.e. some 50" plasmas are dreadful, some OK, some excellent. You have a Pioneer, so this is one of the best, if not the best depending on the model. Get realistic, OK!

A large screen stretches the pixels; PQ is generally better on a 42" screen than a 50". This is well known, but there are exceptions: the 60" screen I saw today was quite a lot superior to the 50" LG plasma or the 50" Panny plasma. I was quite shocked to see how much better it was. It blew all the other TVs around it to pieces, and this was on SD only... and these were 3D smart TVs!

I saw a 60" Sharp LED, the LC60LE536E.

In fact, the other edge-lit LEDs I saw were excellent too; the plasma was dull in comparison (brightness and colour), and the LCDs I saw at the sizes you have looked awful...

As for gaming via PC, you can only game in "PC mode", and in turn you have to turn off all video processing features on the TV; if not, you get dreadful lag of about half a second, i.e. you have to turn off noise reduction, edge enhancement, film modes etc. This is the same on my TV too.

I've just been on AV Forums, after returning from Currys.

http://www.avforums.com/forums/lcd-led-lcd-tvs/1554015-sharp-lc60le636e-9.html

I was after the Panny 50" or the LG... until I saw this. Plus I can get it for only 900 quid.
 
I'm not a home cinema enthusiast, if it looks good enough then I'm happy.

As you posted in the graphics section and not the Home Cinema section, I didn't delve into dissecting image quality for video viewing, as it's not my thing.

If you're happy with a slightly plastic look then that's fine, I guess, but just because we get used to something doesn't mean it's good. Now you're aware it's messing with the image quality, don't you wonder if perhaps it might be a good idea to get used to a nice natural image?

What the brain thinks is OK isn't necessarily true. Take something simple like whites: you have warm whites or cool whites. To most people, when flicking back and forth from warm to cool white, cool might appear whiter and so better, but in fact it's not; it's known as the Daz effect. Also consider how people believe 24fps film is better than 60fps, even though 60fps is a lot smoother. It's just what we get used to, not actually better if we're honest.
 
Mine seems to be already off by default in the video quality settings for both my 7950 and 6620G? Oh well, I turned it on to have a look, and it does seem having it off looks better.
 
What the brain thinks is OK isn't necessarily true. Take something simple like whites: you have warm whites or cool whites. To most people, when flicking back and forth from warm to cool white, cool might appear whiter and so better, but in fact it's not; it's known as the Daz effect. Also consider how people believe 24fps film is better than 60fps, even though 60fps is a lot smoother. It's just what we get used to, not actually better if we're honest.

Spyder 4 calibrates the whites to a cooler white. It looks more natural but it's not bluish.

I've had to turn most of the settings off. I didn't know all this stuff was on in CCC before. I've never liked settings artificially enhancing the picture quality; it's like the equivalent of somebody manually ramping the settings up. I prefer a picture with natural blacks/whites, natural vibrance and a natural, crisp, clear picture. None of this dynamic stuff.

I've done the best I can to match the video quality on my Mac, which looks as natural as possible since all its displays are hardware calibrated (it uses an NVIDIA GPU).

CCC settings:

Basic Video Color: Use Video Player Settings (VLC default settings from first install)

Advanced Video Color: Everything off

Video Quality: Everything off

Except Edge-enhancement: 10% (10% sharpness is what I use on the HDTV)

Apply current video quality settings to Internet video

Select the deinterlacing mode: Vector adaptive

Pulldown detection



Everything looks nice and natural. No washout, no silly contrast. Isn't the whole purpose of video quality to represent real life? Neutral colours most of the time; you'll always get scenes that are rather vibrant and contrasty, as you would on a very bright summer's day, and you'll also have scenes that are quite dull/overcast.

 
No it's not, just posting my view from the way I saw it.

Well, you know what they say about preconceptions....

I simply pointed out that if you are used to one set of settings and then use another vendor's settings, most people are going to prefer what they use and watch day in, day out. Even going from one setup to another identical setup with different settings, the one you are used to and took the time to set up (or not) the way you like it will probably be the one you prefer.

As said above, imo the technicalities of the OP would be better suited and answered in the Home Cinema & Hi-Fi section anyway.

Absolutely, some might and do think that way; however, it's pretty clear from the tone and content of the OP that they were more interested in what the video should look like, rather than the result of the default AMD driver settings.

This depends on the manufacturer, not on the size of the screen; i.e. some 50" plasmas are dreadful, some OK, some excellent. You have a Pioneer, so this is one of the best, if not the best depending on the model. Get realistic, OK!

A large screen stretches the pixels; PQ is generally better on a 42" screen than a 50". This is well known, but there are exceptions: the 60" screen I saw today was quite a lot superior to the 50" LG plasma or the 50" Panny plasma. I was quite shocked to see how much better it was. It blew all the other TVs around it to pieces, and this was on SD only... and these were 3D smart TVs!

I saw a 60" Sharp LED, the LC60LE536E.

In fact, the other edge-lit LEDs I saw were excellent too; the plasma was dull in comparison (brightness and colour), and the LCDs I saw at the sizes you have looked awful...

As for gaming via PC, you can only game in "PC mode", and in turn you have to turn off all video processing features on the TV; if not, you get dreadful lag of about half a second, i.e. you have to turn off noise reduction, edge enhancement, film modes etc. This is the same on my TV too.

I've just been on AV Forums, after returning from Currys.

http://www.avforums.com/forums/lcd-l...0le636e-9.html

I was after the Panny 50" or the LG... until I saw this. Plus I can get it for only 900 quid.

Given everything else is the same, even the distance you sit from the set, then yes, a bigger screen might look worse IF you were already sitting at the optimum distance or closer for the smaller screen. Sit at the optimum distance from both and I would expect them to look largely the same, if not identical. However, it should be noted that a LOT of people sit far too far away from their screens to get the full potential from them. I.e. you should really be sat around 7-8ft away from a 1080p 50" screen... how many people really do that?

Think of it like this: you say larger screens stretch the pixels, but if you were sat the correct distance from the set then, no matter the size of the screen, the pixels would appear to be exactly the same size. Then it only really comes down to manufacturing differences between the sets, and I can't think of any examples right now of sets where bigger versions are noticeably poorer than the smaller sets, with the exception of SD performance on my Panasonic TH50PZ80B, which was pretty much universally panned next to the smaller TH42PZ80.
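As a rough sketch of the pixel-size point (the distances and function names below are my own illustrative assumptions, not measurements), the angle a 1080p pixel subtends at the eye grows with panel size at a fixed distance, but comes out the same once you scale the viewing distance with the diagonal:

```python
import math

def pixel_pitch_mm(diagonal_in: float, w_px: int = 1920, h_px: int = 1080) -> float:
    """Physical pixel pitch of a 16:9 1080p panel, in millimetres."""
    diag_px = math.hypot(w_px, h_px)
    return diagonal_in * 25.4 / diag_px

def pixel_angle_arcmin(diagonal_in: float, distance_m: float) -> float:
    """Angle one pixel subtends at the eye, in arcminutes."""
    pitch_m = pixel_pitch_mm(diagonal_in) / 1000
    return math.degrees(2 * math.atan(pitch_m / (2 * distance_m))) * 60

# Same viewing distance: a 50" pixel subtends ~19% more angle than a 42" one.
print(pixel_angle_arcmin(42, 2.7))
print(pixel_angle_arcmin(50, 2.7))
# Scale the distance with the diagonal and the apparent pixel size matches.
print(pixel_angle_arcmin(50, 2.7 * 50 / 42))
```

Which is just the geometric version of the argument above: "stretched pixels" only look stretched if you keep sitting where you sat for the smaller set.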
 
I sit about 9ft away from a 55", and the 60" Sharp LED looked OK at the same range... but I don't like buying a TV without a good review first, and I can't find anything for the Sharp.

Although it looks excellent, better than a plasma, this could be because the plasmas were not set up correctly in Currys; but if so, the Sharp shouldn't have been either... all the LED TVs looked better.

BUT I would be far happier going for an LG 60" plasma that I know is good, rather than the Sharp that might be fooling my eyes. We all preferred the Sharp, but this might be down to it having a brighter image only, rather than contrast ratio, black levels, stability, evenness across the screen etc.

The first thing I always think when I see my Panny 42" is "that screen is too dark", yet it's on 75% brightness. The image is perfect, but it has no WOW factor.
 