
Do games REALLY need more power atm?

What, a game based on the rivalry of the two? Hmm, I bet it would be based on football -_-
 
You're missing the point and ranting about how good Nvidia is again! Nvidia is run poorly, deal with it.

DX10 software development is pointless; that has nothing to do with some programmers not getting their heads round DX9c :confused: But I was talking about how this has messed up hardware development. We have hardly moved on from DX9c, and I would call that a crawl.

I'm sure Nvidia pulled the plug on 10.1 for many reasons; I don't care why now, but we all know it wasn't to anyone's benefit. It just means we all need new cards, and DX10 has been a long, costly mess.

I'm sorry, I missed what point? Developers are barely putting DX9 capabilities to full use; even if there were a flawless DX11 implementation out right now with supporting hardware/API, we wouldn't see a huge shift from where we are now... so any point about the failures of DX10/10.1 holding back development is for the most part of no consequence.

And I was not ranting about how good nVidia is - either you're trolling or you have worse reading comprehension than even I do.

But as to pulling the plug on 10.1 not being to anyone's benefit... tell me what the point is of paying 30-50% extra, and of nVidia/ATI spending time on development and hardware support, for features that (a) aren't going to be used during the card's useful lifetime and (b) can't be run on that generation of cards with any usable performance figures... even the 200/4800 series cards struggle with SSAO and deferred shading/lighting features, etc.
 
The corporation I like is better than the corporation you like! :cool:

It's like a surreal dystopian future where people love corporations, could be a plot for a new game . . .

hehe, I dislike both - I just find nVidia the easier of the two evils to live with.
 
I will only get Nvidia when they learn bang for buck; ATI cards are not that far behind, if at all, in most situations.
 
But it's about the big picture. Sure, cards now may not be able to take proper advantage of all the features, but if nVidia had implemented them from the get-go then DX10.1 would be seeing support from developers much sooner than it is. As it stands, DX10.1 will be largely ignored as a result. nVidia has held back progress for this generation, and they're a bit slow with their DX11 parts as well. However, there are a lot more factors involved, including the hostility of many towards Vista (which is required for DX10).

DX10 has been largely wasted but hopefully Win7 and DX11 will be a breath of fresh air for the industry.
 
At the end of the day, can all games be maxed (a minimum of 60 fps at max settings at the highest possible resolution)?
As long as the answer to that question is no, then we still need more power.
 
A combination of the lack of DX10 support in XP, market focus on console compatibility, talent and innovation stifled in the name of developing commercially safe games, and a lack of real outside-the-box thinking has held back progress in this generation.
 
And Nvidia
 
You have to remember that NVIDIA / ATI are also in league with game developers...

For example:

NVIDIA says to EA: if you want your games to run well on OUR graphics cards, and want a little cash sum so we can put our logo on your product, you have to:

A. make the game run slightly better on our cards than on AMD's cards.
B. make the game run better on our latest cards and require the technology in our latest cards.

EA says yes, and accepts a very nice cash sum from NVIDIA, who also get to place their pretty logo just after EA's pretty logo.

This could be DX10, for example. Crysis didn't need DX10; it never did. All the demos/promotions/screenshots etc. were ALL on DX9, and DX10 Crysis only really came about closer to launch. Also, pretty much all of Crysis's DX10 features can be run under DX9, simply by changing the numbers from '3' to '4' in the game's cfg files. A perfect example of how software/hardware companies are in league. Crytek never had to use DX10, but both NVIDIA and Microsoft got to promote themselves in doing so, and Crytek earned some extra money.
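For anyone curious, the '3' to '4' trick refers to Crysis's sys_spec console variables, which run from 1 (Low) to 4 (Very High). An autoexec.cfg along these lines forces the "Very High" presets even in DX9 mode (this is a sketch from memory; the exact cvar names may differ between patches):

```
sys_spec_ObjectDetail = 4
sys_spec_Shading = 4
sys_spec_VolumetricEffects = 4
sys_spec_Shadows = 4
sys_spec_Water = 4
sys_spec_Texture = 4
sys_spec_Physics = 4
sys_spec_PostProcessing = 4
sys_spec_Particles = 4
sys_spec_GameEffects = 4
sys_spec_Sound = 4
```

Drop the file into the game's root folder and it's read at startup, which is exactly why the "DX10-only" label rang hollow to a lot of people.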

It's the name of the game, people :p

So in answer to your question: no, they don't 'REALLY' need any more power whatsoever. Look, if Crytek can make Crysis 2, which looks the same as, if not better than, Crysis 1, run at a steady frame rate on the PS3, which uses a 7800GTX-like GPU, then why the hell can't they make the older, less advanced game run at a steady frame rate on something that is about 10x as powerful?

Same answer as everything else in this world - business needs to make money ;)

So the result is: YES, you do need to buy the latest gear to run the latest games. It's up to you really; if you want to kiss NVIDIA's bottom in order to play eye-candy-filled games, then that's what you have to do. Unfortunately, we, the customers, can do sod all about it :p
 
I can't stand consoles and would pick PC gaming over them any day tbh.

That was true, but sadly I feel consoles have reeled the PC in and surpassed it for gaming in many ways.

I picked up a PlayStation 3 from Tesco for £199 and have a 1080p 100 Hz TV.

To build a PC to beat that setup would cost what... five to ten times more?
 