*** Grand Theft Auto IV - The Official Thread ***

I think he means the TVs owned by the people playing the games.

That just doesn't make any sense lol. I don't know what the recommended setting for sharpness is on the Samsungs, but if it's anything other than 50 then I don't know why he's using sharpness for gaming only.

Is it? I don't have any issues with the game tbh - I think it's great and genuinely don't understand what all the fuss is about! :confused:

Yes, it is an issue lol. Nobody would willingly look through frosted glass when playing games, so why shouldn't people be unhappy about it? And yes, I know, you keep saying yours is fine. That's great. It looks terrible on mine.
 
It's strange, I find the framerate can be pretty bad most of the time on the PS3 version. A good example is any time you drive a boat or drive at speed along a bridge - it becomes a slideshow.
 
Yes, it is an issue lol. Nobody would willingly look through frosted glass when playing games, so why shouldn't people be unhappy about it?
Come on though, it's nothing like looking through frosted glass, is it? So the edges are soft - what does it matter? I thought Crackdown was really good and the edges on that had black lines around them, but that was called cel shading... I just see the out-of-focus look of the background in GTA as the style, and I actually like it. The blurriness you talk about up close I haven't noticed myself, but then maybe that's my monitor or maybe I'm not as sensitive to it as you are - who knows?

I guess I just think the game is awesome, and find things like the comparison shots and complaining about the blurriness to be nit-picking what is a superb game imo.
 
Come on though, it's nothing like looking through frosted glass, is it? So the edges are soft - what does it matter? I thought Crackdown was really good and the edges on that had black lines around them, but that was called cel shading... I just see the out-of-focus look of the background in GTA as the style, and I actually like it. The blurriness you talk about up close I haven't noticed myself, but then maybe that's my monitor or maybe I'm not as sensitive to it as you are - who knows?

I guess I just think the game is awesome, and find things like the comparison shots and complaining about the blurriness to be nit-picking what is a superb game imo.
See, you're doing it again. It is EXACTLY like looking through frosted glass, and I keep saying it's not just the background, it's everything. RE4 on the Wii had the same look, but that never bothered me because it was on the Wii. It's not acceptable on the PS3. If you aren't seeing what I'm seeing on my 40" screen then I'm happy for you. It's blindingly obvious on a big screen. You wonder why people keep saying the same thing when you aren't listening lol.

It does. Very much so.
I can't see how? He shouldn't need to use sharpness for games if he doesn't for film. If he's only using it for GTA4 then doesn't that prove my point? I already showed I could see he was using it straight away, because the HUD was sharper than it should be in his photo :)
 
I am extremely bothered by FPS drops and suchlike - I sold COD4 because of the shoddy framerate. I'm yet to notice any in GTA.

^I'm joking by the way.


But anyway, Joebob is right.

I guess I just think the game is awesome, and find things like the comparison shots and complaining about the blurriness to be nit-picking what is a superb game imo.
 
That just doesn't make any sense lol. I don't know what the recommended setting for sharpness is on the Samsungs, but if it's anything other than 50 then I don't know why he's using sharpness for gaming only.

The factory default setting for sharpness is 50 and I leave it at that for everything including movies.

It's calibrated to the D65 standard; the only D65-recommended setting I don't use is turning the sharpness down, because I think it looks awful in everything.

I don't buy a £1k high-definition TV to take away all the sharpness that's supposed to be its main benefit in the first place.
 
Gamers' televisions? The level of sharpness depends on the TV, yes, but there's no 'gamers' TV that requires a different sharpness setting lol.

What I meant was: show me a TV that ships with the sharpness at 0 by default. It's only the movie buffs who are anal about achieving a pixel-perfect image who will ever turn the sharpness off completely; the majority of gamers, including those on this forum, will likely have some level of sharpness enabled.

So while the PS3's output may be slightly softer, most TVs will compensate, and considering the Xbox 360's dithering problems, IQ will likely be just as good if not better.
 
What I meant was: show me a TV that ships with the sharpness at 0 by default.

Every TV will ship with settings that are classed as default, but it doesn't mean that those settings are the optimal settings for that set.

Just go through a forum like AVForums and you'll find a huge number of threads about TVs and their optimal settings.

Just because Samsung says that 50 is the default setting does not mean that it's the optimal setting for what you want to do with it.

EDIT:

That said, I don't call myself a movie buff at all, but if I buy a new TV I will always spend a good half hour or so (sometimes more) trying to get optimal (or at least good/better) image settings for each source I use. Why wouldn't you want to get a good picture by changing the settings? After all, that's what they're there for :confused:
 
[Image: cargtaiv.jpg]


My car is pimp, yo! :p:cool:
 
I don't buy a £1k High Definition TV to take away all the sharpness that's supposed to be its main benefit in the first place.

But it's not as literal as that; push it too far and you just get halos and noise around things as the processing goes OTT. It's not like a camera lens focusing on something to make it sharp!

Same as how 'brightness' and 'contrast' don't really describe what they change in the picture.
 
I thought a sharpness of 50 meant no convolution filter was applied to the image, while >50 meant a high-pass filter was applied to the resulting image and <50 meant a low-pass filter was applied.

Correct me if I'm wrong though.
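That mental model can be sketched in code. A minimal illustration (assuming a hypothetical 0-100 sharpness scale mapped onto 3x3 convolution kernels — real TV processing is proprietary and more elaborate than this):

```python
import numpy as np

def sharpness_kernel(setting):
    """Hypothetical mapping of a 0-100 sharpness setting to a 3x3 kernel.

    50 -> identity (no filtering); below 50 blends toward a box blur
    (low-pass); above 50 blends toward an edge-enhancing sharpen kernel
    (high-pass boost). All kernels sum to 1, so brightness is preserved.
    """
    identity = np.zeros((3, 3))
    identity[1, 1] = 1.0
    if setting == 50:
        return identity                        # neutral: pass-through
    if setting < 50:
        blur = np.full((3, 3), 1.0 / 9.0)      # low-pass: box blur
        t = (50 - setting) / 50.0
        return (1 - t) * identity + t * blur
    sharpen = np.array([[ 0, -1,  0],
                        [-1,  5, -1],
                        [ 0, -1,  0]], float)  # high-boost: enhances edges
    t = (setting - 50) / 50.0
    return (1 - t) * identity + t * sharpen
```

At 50 the kernel is a pure pass-through, which matches the idea that the middle of the scale applies no filtering at all.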
 
And that's a standard that all manufacturers adhere to? I imagine it's just a setting somewhere in the processing that deals with edges and adjacent-cell differentials; 50 is just roughly around the middle of that and will vary hugely between different models and makers. Your example would mean that deviating either side of 50 would apply a 'filter' whereas 50 has no filter - that makes zero sense. :confused:

In PC mode the sharpness adjustment is disabled on mine, so I assume that's no processing at all being applied, which helps with input lag.
 
And that's a standard that all manufacturers adhere to? I imagine it's just a setting somewhere in the processing that deals with edges and adjacent-cell differentials; 50 is just roughly around the middle of that and will vary hugely between different models and makers. Your example would mean that deviating either side of 50 would apply a 'filter' whereas 50 has no filter - that makes zero sense. :confused:

In PC mode the sharpness adjustment is disabled on mine, so I assume that's no processing at all being applied, which helps with input lag.

Most TV manufacturers, yes. I don't think PC monitors use that standard at all though.

It makes perfect sense for 50 to have no processing on a "sharpness scale", as you'd expect low sharpness to blur the image (low-pass filter), a higher sharpness to enhance edges (high-pass filter), and the middle neither to blur nor enhance. The middle is expected to be neutral, just like colour and contrast on TV sets.

Convolution masks can be implemented with fast Fourier transforms, so you would not be able to perceive any delay on a decent set when sharpness is adjusted. I presume sharpness is disabled in PC mode because the set expects a 1:1 mapped signal, where each source pixel maps directly to one pixel on the screen. In that case sharpening has absolutely no use and will just trash the image quality.
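The halo effect mentioned earlier can be shown numerically: run a high-boost kernel over a clean, already-sharp edge and the output overshoots the original range on both sides. A small 1-D sketch (the kernel is illustrative, not any set's actual processing):

```python
import numpy as np

# A hard, already-sharp edge at native resolution, values in 0-255.
row = np.array([0, 0, 0, 255, 255, 255], dtype=float)

# Simple 1-D high-boost (sharpen) kernel: centre 3, neighbours -1.
kernel = np.array([-1.0, 3.0, -1.0])

sharpened = np.convolve(row, kernel, mode="same")
# The result undershoots below 0 on the dark side of the edge and
# overshoots above 255 on the bright side - the dark/bright 'halo'
# rings you see around objects when sharpening is pushed too far.
```

This is why sharpening a pixel-perfect signal only hurts: the edge was already as sharp as the panel can display, so all the filter adds is overshoot.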
 