Let's look at one example, Crysis v Crysis: Warhead. The image quality was slightly worse with the sequel even though they both looked fabulous. The latter was significantly smoother as well. Image Quality should not always be a priority.
IQ is always a priority for me. I spend a lot of time and cash trying to get the best IQ from my games. If it weren't, I would be running a single-card setup with a mid-range card.
As I said in my previous post. If you're such an enthusiast of AV, tell me exactly what the differences between the two images are. Sharpness, colours, contrast? Better AA, AF, mipmapping?
Reading this thread and looking at the pictures, I think it's fair to say that some games will be affected and others won't. Personally I do not like any decrease in image quality at all. One of the main reasons I choose PC gaming over console gaming is so that I can have the very best image quality possible in games.
It matters as I own a few ATI cards and will own many over the years to come. It matters as I am a customer and have an opinion just the same as you do. You are, you aren't, what does it matter? You're discussing the subject without any first-hand experience. I've never implied that you are/were not an owner of an ATI/AMD graphics card.
As you are also discussing this topic, can you expand on your first-hand experience?
Where did I mention videos or anything like that? How do you know what I have read? What strikes me is that you complain about IQ based on the videos that some German website posted to prove that the IQ is indeed worse. If you had even bothered to read what is on that website, or Guru3D for that matter, you would know you have to look very carefully for those flickering textures/anisotropic filtering distortions, as they're not fully noticeable in a still image.
Your main point is that graphics card manufacturers should not optimise their drivers any further because there is a supposed "standard" for IQ.
Seriously, Krugga, you need to take a chill pill and stop being so defensive of ATI/AMD.
What I have said, and will repeat once again for you, is that I do not like any form of IQ reduction being set as the default. I hope this is now clearer to you.
You think it's a great idea to lower the IQ by default and I think it's a bad idea. I personally think that by doing this AMD/ATI have opened themselves up to being perceived as not providing as much quality. IMO a better move would have been for AMD/ATI to keep the default IQ and then release a new mode where you get 99% of the IQ for an 8% boost in frames. However, I suspect AMD/ATI didn't do this so that when a card gets reviewed on stock default settings they get more frames on the chart. Call it a hunch, but you've got to admit it's a strategic move in the GPU war. Could it potentially harm them in the long run? Time will tell.
