I know all of that. I'm perfectly aware of the WSGF, and I'm perfectly aware of how to tweak the INI files to make games 'work'.
I.e. adjusting the settings in Fallout 3's INI to make parts of the HUD appear on the right parts of the screen. However, Fallout 3 and New Vegas both suffered when you entered VATS, and there was no fix for that.
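For anyone who hasn't been down that road, the kind of tweak I mean looks roughly like the snippet below. This is a sketch from memory, so treat the key names and values as assumptions rather than gospel; the WSGF page for each game has the definitive settings:

```ini
; Fallout.ini - example surround/widescreen tweaks (placeholder values)
[Display]
fDefaultWorldFOV=100.0      ; widen the world FOV for a triple-wide resolution
fDefault1stPersonFOV=75.0   ; first-person viewmodel FOV

[Interface]
fSafeZoneX=15.0             ; HUD safe-zone offsets - nudging these pulls the
fSafeZoneY=15.0             ; HUD elements back onto the centre screen
```

None of which, as I said, helps once you're in VATS.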
With all due respect, you're talking to me a bit like you think I didn't have the slightest idea what I was doing, which is patently wrong, mate.
If that's the case, fair enough, but it did come across like that.
The VATS thing, is that an nVidia issue or something? I've never had it myself. In fact, my old 6950 would run both Fallout 3 and New Vegas maxed out at 7680x1440 with very playable performance, so much so that I was actually very surprised by it.
I did, but like a lot of others I just didn't think it was worth all the aggro of piddling around. That's fair enough, yes?
Of course, it's not for everyone.
It was just another one of those things that seemed a bit pointless to me. All that faffing around, and it just didn't really add that much.
Which boils down to opinion, but an opinion shared by many, it seems.
Well, another thing I forgot to add is that, from my experience, a single larger monitor is much, much better than three smaller ones. I think you'd have a very different opinion if you'd used three larger monitors. For example, I've got 3x 27" monitors, and I'd personally prefer a single 27" monitor to 3x 20" ones, because 20" is just too small for me.
I liked RAGE, which did seem to work really well.
But Fallout 3 just didn't have large enough or crisp enough textures for it to work. Dirt 2 and 3 are only fun for so long before you complete them, and then you're left with the realisation that most of the games you play simply don't look right.
It's all about the peripheral vision, but I get your point that some don't like how the sides look.
But I do think the effect was definitely lost on you by using 20" monitors, because, for example, my 27" monitors have roughly twice the surface area of your 20" ones, which I think goes a long, long way towards the immersion factor.
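To put a rough number on that: for panels with the same aspect ratio, screen area scales with the square of the diagonal, so the comparison is a one-liner (a quick sanity check, nothing more):

```python
# For a fixed aspect ratio, width and height both scale linearly with the
# diagonal, so screen area scales with the diagonal squared.
ratio = (27 / 20) ** 2
print(f"A 27\" panel has about {ratio:.2f}x the area of a 20\" one")  # ~1.82x
```

Same factor across all three screens, of course, so the total canvas difference is just as big.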
Plus, at that time you absolutely had to run SLI if you were on Nvidia, which just made it even more annoying and frustrating.
Well of course, that's a fair enough point too, but it's not a valid point against multiple monitors, just against nVidia's implementation of them at the time, as you've been able to run multiple monitors on a single AMD card ever since they first brought "Eyefinity" out.
I'm not putting down the people who use it, just stating fact. These technologies are niche products, and because of that they will never be universally accepted. Which is a vicious circle, because due to that they will never be adopted or supported properly (as that ain't where the money's at! See also Quad SLI support from game developers), and so on.
Well, I never thought you were putting people down. I think your posts are dry sometimes (so are mine), and people tend to take them the wrong way and get uppity about it.
Now, not all devs will support it, that's a given, and some purposefully lock it out. However, the only general prerequisite for it working is that the game has to support hor+ scaling, which most do; there's a quick sketch of what hor+ means below. So for me it's easy enough to "faff" about and get games working.
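For anyone unsure what hor+ means: the vertical FOV stays fixed and the horizontal FOV is derived from the aspect ratio, so a wider display shows more to the sides instead of stretching the image. A minimal sketch of the standard maths (not any particular engine's code):

```python
import math

def horizontal_fov(vertical_fov_deg: float, width: int, height: int) -> float:
    """Derive the horizontal FOV from a fixed vertical FOV (hor+ scaling)."""
    vfov = math.radians(vertical_fov_deg)
    hfov = 2 * math.atan(math.tan(vfov / 2) * (width / height))
    return math.degrees(hfov)

# The vertical FOV is held at ~59 degrees in both cases; only the width changes.
print(horizontal_fov(59, 1920, 1080))  # ~90 degrees on a single 16:9 screen
print(horizontal_fov(59, 7680, 1440))  # ~143 degrees across a 48:9 triple-wide
```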
As you can see, though, I've disagreed with stuff you've said, and my solution to that is just to post a coherent reply that details what I disagree with and why, instead of getting a sore arse and moaning about aggression and negative nancies.