8800GTX: waste @ 1280 x 1024?

Associate · Joined 22 Aug 2003 · Posts 768 · Location: Bucks
Was considering getting myself an 8800GTS 640MB for Oblivion + Qarl's texture pack etc. Now work have gone and offered me an extra 200 smackers for a bit of overtime this month, and that's set me thinking about just going for a GTX instead (will be getting a DS3P & DDR2 RAM end of next month). Thing is, I can't see myself upgrading LCDs as well, so do you think it's gonna be wasted performance considering I game at 1280 x 1024 max?

Or should I just go for the EVGA 640MB card now and then upgrade the LCD monitor when I upgrade the EVGA card with the Step-Up programme?

The devil on my shoulder is saying just go for the willy waver card now :D
 
Get the GTX. It gets on my **** when ppl keep saying it will be a waste at that res: you can ramp everything to the max including AA/AF, and it will last you longer than it would at a higher res, saving you money in the long run as you won't have to keep splashing out on a new card every time the one you have starts to struggle. :)
 
make sure you have enough power in your system to actually wave that willy first tho, as they are quite thirsty cards.
 
mcc49 said:
make sure you have enough power in your system to actually wave that willy first tho, as they are quite thirsty cards.

Already got that covered: going for a 620W Corsair HX & gonna stick the lot into an Akasa Eclipse 62 :)
 
I felt like a mug at first for buying an 8800GTS to play games at 1440x900, a very similar size to 1280x1024. I ended up going to 1680x1050 and it's a little more taxed now, but still not at its limits except in poorly coded games.

If you think you will keep the GTX for years and years then go for it, but if you think you'll be replacing it any time soon then it will be a huge waste of money and you'll regret it.

Personally I am planning to keep my GTS through until at least this time next year and will probably get an even bigger monitor by then or something, so it might not actually last me.

Will probably end up caving and buying G90 or R700 when Crysis rolls around as the GTS will probably suck then.
 
It's not really a waste.

If you upgrade monitor later, it means you don't have to splash out on a new graphics card as well.
 
If you can afford the GTX, go for it; at that res it will be a stunner, and then if need be you can upgrade your monitor later. I currently use an 8800GTS 640 on a VX2025 widescreen; the only thing holding me back is my CPU and RAM.
 
I have an 8800GTX and a Samsung 930BF, native res 1280x1024. Am I happy with my card? I sure am. Is it overkill? No, that's a load of toss.

Gives me a lovely option of upgrading monitor- and res-wise. It's prolly the first top-of-the-range card I ever bought; I'd normally buy a GT or something, but I had the cash at the time and decided to get the best, i.e. the GTX, and I don't regret it. :)
 
Ulfhedjinn said:
Thank God I don't play horribly-coded console ports. :p

I just play horribly-coded PC games like S.T.A.L.K.E.R. ;)

I actually feel like screaming at Cyber-Mav. He brings up Rainbow Six in EVERY damn post to complain that it kills GPUs.

The only thing it kills is the programmers' pride in their work, really, when they realise what a crappy job they made of the port.
 
Dark_Angel said:
I actually feel like screaming at Cyber-Mav. He brings up Rainbow Six in EVERY damn post to complain that it kills GPUs.

The only thing it kills is the programmers' pride in their work, really, when they realise what a crappy job they made of the port.


You will be taking that back when Unreal Tournament 2007 is released and performs just as badly. So far all Unreal Engine 3 games on PC have performed like crap, and I can't see UT2007 being any better.
 
Dark_Angel said:
I actually feel like screaming at Cyber-Mav. He brings up Rainbow Six in EVERY damn post to complain that it kills GPUs.

The only thing it kills is the programmers' pride in their work, really, when they realise what a crappy job they made of the port.
I agree, it's a real shame that developers are putting so little effort into games now that it actually hurts something as powerful as G80, and it's not just console ports anymore. :(

I was hoping Stalker would be some insanely optimised cruising machine since it was five years in the making, but I was sorely disappointed. Hopefully Crysis will be well-optimised and show developers a thing or two. :)

Cyber-Mav said:
You will be taking that back when Unreal Tournament 2007 is released and performs just as badly. So far all Unreal Engine 3 games on PC have performed like crap, and I can't see UT2007 being any better.
That's like saying the Source engine is crap because Vampire: Bloodlines and Dark Messiah had serious performance issues. It is completely dependent on the developer, and UT3 will be made in-house on an in-house engine, so it might fly.

I know that Stalker was made in-house on an in-house engine, but it had inexperienced devs. The devs of the Unreal series are definitely not inexperienced. :)
 