
GeForce Titan rumours.

It certainly is niche; however, I think you lack understanding of what it takes to support such features, in that it's actually very little.

I know all of that. I am perfectly aware of the WSGF and perfectly aware of how to tweak the INI files to make games 'work'.

i.e. adjusting the settings in Fallout 3's INI to make parts of the HUD appear in the right parts of the screen. However, Fallout 3 and New Vegas both suffered when you entered VATS, and there was no fix for that.
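For anyone who hasn't done this, the sort of INI tweak being described looks roughly like the following (a sketch only, from memory; the relevant keys live in FALLOUT.INI / FalloutPrefs.ini, sensible values depend on your setup, and the HUD placement itself needed further edits beyond this):

```ini
[Display]
; Force the triple-monitor resolution (here 3x 2560x1440)
iSize W=7680
iSize H=1440
; Widen the field of view so the extra width isn't just a zoomed-in crop
fDefaultWorldFOV=100.0
```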

With all due respect, you are talking to me slightly as if you think I didn't know what I was doing in the slightest, which is patently wrong, mate.

I did, but like a lot of others I just didn't think it was worth all of the aggro of piddling around. That's fair enough, yes?

It was just another one of those things that seemed a bit pointless to me. All that faffing around, and it just didn't really add that much.

Which boils down to opinion, but an opinion shared by many, it seems.

I liked RAGE, which did seem to work really well.

[Attached screenshot: Rage2011-11-2523-29-06-82.jpg]


But Fallout 3 just did not have large or crisp enough textures for it to work. Dirt 2 and 3 are only fun for so long before you complete them, and then you are left with the realisation that most of the games you play simply don't look right.

Plus, at that time you absolutely had to run SLI if you were on Nvidia, which just made it even more annoying and frustrating.

I'm not putting down people who use it, just stating a fact. These technologies are niche products and because of that will never, ever be universally accepted. It's a vicious circle: because of that, they will never be adopted or supported properly (as that's not where the money is; see also Quad SLI for game developers), and so on.
 
I know all of that. I am perfectly aware of the WSGF and perfectly aware of how to tweak the INI files to make games 'work'.

i.e. adjusting the settings in Fallout 3's INI to make parts of the HUD appear in the right parts of the screen. However, Fallout 3 and New Vegas both suffered when you entered VATS, and there was no fix for that.

With all due respect, you are talking to me slightly as if you think I didn't know what I was doing in the slightest, which is patently wrong, mate.

If that's the case, fair enough, but it did come across like that.

The VATS problem, is that an nVidia issue or something? I've never had it myself. In fact, my old 6950 would run both Fallout 3 and New Vegas maxed out at 7680x1440 with very playable performance, so much so that I was actually very surprised by it.

I did, but like a lot of others I just didn't think it was worth all of the aggro of piddling around. That's fair enough, yes?

Of course, it's not for everyone.

It was just another one of those things that seemed a bit pointless to me. All that faffing around, and it just didn't really add that much.

Which boils down to opinion, but an opinion shared by many, it seems.

Well, another thing I forgot to add is that in my experience a single larger monitor is much, much better than three smaller ones. I think you'd have a very different opinion if you used it on three larger monitors. For example, I've got 3x 27" monitors, and I would personally prefer a single 27" monitor to 3x 20" ones, because 20" is just too small for me.

I liked RAGE, which did seem to work really well.

[Attached screenshot: Rage2011-11-2523-29-06-82.jpg]


But Fallout 3 just did not have large or crisp enough textures for it to work. Dirt 2 and 3 are only fun for so long before you complete them, and then you are left with the realisation that most of the games you play simply don't look right.

It's all about the peripheral vision, but I get your point in that some don't like how the sides look.

But I do think the effect was definitely lost on you by using 20" monitors, because, for example, my 27" monitors have roughly twice the surface area of your 20" ones, which I think goes a long, long way towards the immersion factor.
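For what it's worth, the "roughly twice the surface area" claim checks out for same-aspect panels; a quick sketch (assuming both monitors are 16:9):

```python
import math

def screen_area_sq_in(diagonal_in, aspect_w=16, aspect_h=9):
    """Panel area from its diagonal: width and height follow from
    similar triangles on the aspect ratio."""
    d = math.hypot(aspect_w, aspect_h)      # diagonal of the aspect rectangle
    width = diagonal_in * aspect_w / d
    height = diagonal_in * aspect_h / d
    return width * height

# Area scales with the square of the diagonal, so 27" vs 20":
ratio = screen_area_sq_in(27) / screen_area_sq_in(20)
print(round(ratio, 2))  # -> 1.82
```

Since area scales with the diagonal squared, the ratio is just (27/20)^2 ≈ 1.82, regardless of the exact aspect ratio, as long as both panels share it.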



Plus, at that time you absolutely had to run SLI if you were on Nvidia, which just made it even more annoying and frustrating.

Well of course, that's a fair enough point too, but it's not a valid point against multiple monitors, just against nVidia's implementation of them at the time, as you've been able to do multiple monitors on a single AMD card since they first brought out "Eyefinity".

I'm not putting down people who use it, just stating a fact. These technologies are niche products and because of that will never, ever be universally accepted. It's a vicious circle: because of that, they will never be adopted or supported properly (as that's not where the money is; see also Quad SLI for game developers), and so on.

Well, I never thought you were putting people down. I think your posts are dry sometimes (so are mine) and people tend to take it the wrong way and get uppity about it.

Now, not all devs will support it, that's a given, and some devs purposely lock it out. However, the only real prerequisite for it to work is that the game supports hor+, which most do. So for me it's easy enough to "faff" to get games working.

As you can see, though, I've disagreed with stuff you've said, and my solution to that is just to post a coherent reply that details what I disagree with and why, instead of getting a sore arse and moaning about aggression and negative nancies.
 
As you can see, though, I've disagreed with stuff you've said, and my solution to that is just to post a coherent reply that details what I disagree with and why, instead of getting a sore arse and moaning about aggression and negative nancies.

Of course :)

We're all passionate about the subject at hand, which is good. The world would be incredibly boring if we all agreed on everything :)

Yeah, Surround (TM) was definitely a complete bodge and the only answer Nvidia had to Eyefinity at the time. I would imagine Eyefinity is far more mature, given that it wasn't just a sloppy afterthought :D

The issues with Fallout must certainly have been down to Surround, then. Basically, when you opened the Pip-Boy the text all overlapped, making the hacking almost impossible.

Which for me (a die-hard Fallout 3 fan; I'm *still* playing it) rendered Surround completely useless.

Sure, I like a game of Dirt whatever, but they're very arcadey, casual games to me and can't be taken too seriously.

The size did impress me, even on three 20" monitors, especially in Dirt 2 and 3, but the issues with pretty much all of the games I loved to play put me off.
 
Yeah, Surround (TM) was definitely a complete bodge and the only answer Nvidia had to Eyefinity at the time. I would imagine Eyefinity is far more mature, given that it wasn't just a sloppy afterthought :D

nVidia was doing multi-monitor gaming support (span mode) way before Eyefinity was even a glimmer of an idea, so nVidia's Surround is a far more mature technology. However, they took it out of the drivers circa 2001 and made it a Quadro-only feature, and it was only ATI/AMD introducing Eyefinity that forced them to put it back onto GeForce. It's far from a sloppy afterthought.
 
nVidia was doing multi-monitor gaming support (span mode) way before Eyefinity was even a glimmer of an idea, so nVidia's Surround is a far more mature technology. However, they took it out of the drivers circa 2001 and made it a Quadro-only feature, and it was only ATI/AMD introducing Eyefinity that forced them to put it back onto GeForce. It's far from a sloppy afterthought.

Never knew that!

I do remember gaming on two 17" CRTs on a Matrox, lol. Max Payne on dual monitors, with two huge cream bezels bang smack in the middle :D
 
nVidia was doing multi-monitor gaming support (span mode) way before Eyefinity was even a glimmer of an idea, so nVidia's Surround is a far more mature technology. However, they took it out of the drivers circa 2001 and made it a Quadro-only feature, and it was only ATI/AMD introducing Eyefinity that forced them to put it back onto GeForce. It's far from a sloppy afterthought.

This is certainly true; however, there is little to multi-monitor support, so there's very little to even "mature".
 
I like the idea of a temperature target/ceiling with GPU Boost 2.0. The 6+8-pin power connectors suggest reasonable power consumption, and the performance looks top notch. The only question is the price; I wonder what the cost to upgrade will be from a couple of 7970s :D
 
Before I get flamed: I have looked back a couple of pages to see if this has already been linked/posted.

http://www.xtremesystems.org/forums...onsumer-part&p=5171329&viewfull=1#post5171329

Conclusion
GeForce GTX Titan average increase over Radeon HD 7970 GHz Edition: (47 + 30 + 32 + 35 + 43 + 24) / 6 ≈ 35%
GeForce GTX Titan average increase over GeForce GTX 680: (54 + 49 + 24 + 54 + 37 + 37) / 6 = 42.5%
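The arithmetic in the quoted conclusion is easy to sanity-check (per-game percentage gains copied from the linked post):

```python
# Per-game % gains of the GTX Titan over each comparison card,
# as listed in the linked xtremesystems post.
titan_vs_7970ghz = [47, 30, 32, 35, 43, 24]
titan_vs_gtx680 = [54, 49, 24, 54, 37, 37]

def average(gains):
    """Simple arithmetic mean of the per-game percentage gains."""
    return sum(gains) / len(gains)

print(round(average(titan_vs_7970ghz)))  # -> 35 (35.17 unrounded)
print(average(titan_vs_gtx680))          # -> 42.5
```

Note this is an unweighted mean over six games, so a single outlier title moves the headline figure noticeably.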

That ties in with that Arab slide. It's not bad in my opinion, less than I expected, but 35% faster than the next fastest card? Can't grumble at that, unless it's a silly price.

But it also means it's not unreachable in the next round.

Be sure you know what you're spending your money on, and how much of it, is all I will say.
 
That ties in with that Arab slide. It's not bad in my opinion, less than I expected, but 35% faster than the next fastest card? Can't grumble at that, unless it's a silly price.

But it also means it's not unreachable in the next round.

Be sure you know what you're spending your money on, and how much of it, is all I will say.

I think those figures sound about right.

The important figure is how the Titan overclocks; there's not much point buying one if you have an overclocked HD 7970 breathing down your neck.
 
I think those figures sound about right.

The important figure is how the Titan overclocks; there's not much point buying one if you have an overclocked HD 7970 breathing down your neck.

I mentioned that, but I think it got lost in all the hate.

The 7970/7950 was/is a mediocre card till you ramp up the clocks. With the Titan being unlocked, I can imagine the same.

Add another 200MHz to the core and the same to the memory and you could have a beast in wolf's clothing (already a beast).

Hopefully some solid data today.

http://www.fudzilla.com/home/item/30500-nvidia-geforce-gtx-titan-leaked

Availability next week according to Fudzilla.

Another point... Does anybody know how it can make a 60Hz monitor run at 80Hz? This has me puzzled.
 
I mentioned that, but I think it got lost in all the hate.

The 7970/7950 was/is a mediocre card till you ramp up the clocks. With the Titan being unlocked, I can imagine the same.

Add another 200MHz to the core and the same to the memory and you could have a beast in wolf's clothing (already a beast).

Hopefully some solid data today.

Up early this morning, Greg. Hope to hear good news for you and others on the Titan at 2pm, as Gibbo can then speak.
 