I've read people say so many times that they only need a 4850 or 4870 for a 22" or even a 24" monitor. I don't understand why, as you only have to look at real game benchmarks to see that an X2 is needed at these resolutions, unless of course you don't mind your games slowing down a lot.
Another thing that bothers me is that good midrange cards are usually just about as good as the high-end card of the same generation. It makes me wonder why you'd pay extra for the best card when it only has a small advantage instead of a much bigger one. Yeah, you tell me. Or is it simply greed?
I just want to see people's opinions on this.
Real game benchmarks, what exactly are you talking about though? Some review sites use their own timedemos, though not many, and most, for the sake of stressing performance, will use max settings and the most intensive scene in the game. Firstly, that doesn't mean all of those settings offer a tangible visual quality increase for the performance they cost. Secondly, a dip to 20fps for a split second in the most intensive part of a game isn't noticeable unless you run FRAPS constantly.
Take FEAR for instance: the included benchmark utility uses scene complexity you never see in the actual game, and I've never dropped as low in-game as in the benchmark. Likewise, the "soft shadows" option is a true performance killer, yet it's horribly implemented and the shadows actually look worse. So you turn them off and get a better picture AND better performance.
Reviews aren't the be-all and end-all.
Even saying that, most of the reviews I see show a single 4870 killing almost all games at 1920x1200. Sure, some show 30fps rather than 80fps, but those games will be something like Company of Heroes, an RTS, a very slow-paced game where you rarely move the screen fast. You simply don't notice the difference in an RTS above maybe 20-25fps, while in lots of FPS games the same framerate would be horrible. You don't need a 60fps minimum in every game, because many games don't need that performance, and RTSes in general have more units, more calculations, and lean more on the CPU than the GPU. For instance, I don't think anyone ever complained about the last 2/3 C&C games being slow, yet they're all capped at 30fps.
There are very few games that won't run great on a single gfx card in the midrange price bracket. You can spend twice as much for the sake of that one game you play for 15 hours that's slower than the rest, before you move on to another game that plays fine again, or you can spend half as much and turn down the settings marginally on that one game a year that's a pain to run.