Worth it now?

My CSS settings are (atm) 1024x768 resolution with everything set to minimum, like the AA and AF, both in-game and in the ATI CCC, apart from Cat AI of course. AFAIK CSS only uses two cores.

Yeah, 100 fps is what I experience as a 'normal' lower number. That may sound high to some, but when you're used to averaging 160 - 180, and quite often 200 - 300, 100 feels quite different. It's what you get used to. I would have killed for 100 fps when I had my X800 and 2.5GHz Athlon, lol, but that was never going to happen. :)

I was playing until recently (get this) at 640x480 with everything turned down or off apart from the obvious Cat AI, and I think my lowest fps reading was around 180 - 200. Plus, after cranking the CPU up to around 3.4GHz, the taunts and accusations of "hacker" began. :D One thing I've learnt, though, is that so much is server dependent.

I will add that in no way am I saying I'm a very good player; truth is, I'm not. I'm certainly above average, though, quite a bit some days. But I simply don't possess the skills and hand-eye coordination, so I doubt I'll ever be a very good player. :(
 
As someone who was very, very good at CSS: the more you focus on things like fps and resolution, the less you'll focus on the things that matter. If your fps is above crap, you're fine. I could play at 1024x768 or 1920x1200 (which is what I stuck to) with an easy, constant 100-150 fps with AA/AF, and it made NO difference, except that it looked nicer, was easier to aim for me, and made it easier to see what was going on.

On private servers and in quite a lot of clan matches, no one ever pwned me. The low-res thing is a myth; the massive refresh rate is a myth. The skill in these games is reading the player and working out where to shoot in front of him. If you're aiming for where they already are, you're losing. If you're aiming ahead of where they're going, it doesn't matter what your fps is, because you'll always be aiming ahead anyway.

Different fps values will feel slightly different, because your screen/card will be dropping different frames at different rates, and it WILL feel different, but it doesn't actually make a difference once you're used to it. That's what people need to learn. Say you've got a 60Hz screen, or a 100Hz CRT: if you're generating more frames than the screen can show, you're dropping frames. Sometimes you'll drop every 6th frame, maybe every 9th; that's the difference you're feeling, but either way you're seeing the same number of frames. If you lock the fps to one setting, you won't feel the constant change, and that's the best way to play, as you never have to readjust to how the game feels.

That's the key: it doesn't matter what your framerate is (beyond a certain fairly low point). It's just about finding a stable framerate, using settings that hold that framerate, and getting used to it. That's all.
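The frame-dropping arithmetic above can be sketched in a few lines. This is only an illustration of the argument, not a model of any real driver: it assumes the screen simply shows the newest completed frame at each refresh, with the 120fps-on-100Hz numbers chosen to match the "drop every 6th frame" case.

```python
# Sketch of the frame-dropping argument: render faster than the screen
# refreshes and some frames are never shown. Illustrative numbers only.

def displayed_frames(render_fps, refresh_hz, seconds=1):
    """Return indices of rendered frames that actually land on a refresh."""
    shown = []
    last_rendered = -1
    for r in range(int(refresh_hz * seconds)):
        refresh_time = r / refresh_hz
        # newest frame finished by the time this refresh happens
        newest = int(refresh_time * render_fps)
        if newest > last_rendered:
            shown.append(newest)
            last_rendered = newest
    return shown

shown = displayed_frames(render_fps=120, refresh_hz=100)
dropped = [i for i in range(120) if i not in shown]
print(len(shown), len(dropped))  # 100 shown, 20 dropped
print(dropped[:3])               # [5, 11, 17] -> roughly every 6th frame
```

Either way the screen shows 100 frames a second; the only thing that changes with the render rate is *which* frames get skipped, which is the pattern you feel.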
 
I do sometimes wonder if enabling full detail would let me see more, like CTs camping in the dark corners on Aztec, for example.

Cheers for the advice dude :)

edit: I think I must like high frames due to the smoothness they give me.
 
I do sometimes wonder if enabling full detail would let me see more, like CTs camping in the dark corners on Aztec, for example.

Cheers for the advice dude :)

edit: I think I must like high frames due to the smoothness they give me.

Trust me, it's not the framerate; you're simply used to it. Eyes are freaking weird, but VERY adaptable. When I first got an LCD, alongside my 120Hz-capable Iiyama, it wasn't fantastic in games, but it wasn't bad either. However, while I was switching between the monitors I had a horrible time: games on one, video on the other, then swap them around. I simply couldn't get used to it, and the reason was that, constantly switching between them, you can't get used to either.

I stopped using the CRT for a while and quickly got used to the TFT; it felt miles better than before at a much lower framerate. Started using them both again: horrible. Started using only the CRT: fantastic again.

Eventually I made the switch because of less tired eyes over prolonged periods at the monitor: no flicker, less eye strain, etc.

But the key is getting used to something. If most maps run at 200fps and just one runs at 100fps, that one will just feel wrong.

It's been ages since I played properly, but you can set a max framerate in CS, right? Whack it to 100 or something for a few days, or run vsync, keep playing, take some breaks, and see if it's really worse or if you just have to get used to it. Vsync, or a multiple thereof, might work best for you (60/120). One problem you can hit without vsync is that at specific framerates you get really bad tearing: at 120/180fps it will drop every 2nd frame, while at 100Hz it might be dropping them mid-frame, so you see half of one frame and half of the next.

It can vary from one screen and one card to the next, though frankly with TFTs I've never had a problem with vsync on in any game, meaning I run 60fps in CSS and I see 60fps, never more or less. There are the odd games here and there where the engine runs at different speeds based on the fps, but those are few and far between. Again, if you're relying on special uber jumps at a magic fps, you aren't good enough anyway :p
 
I think everyone has their own way of finding 'what works for them', a bit like some players I've known who have had (or say they have had) low-spec machines but play very well... on some servers. They use very standard, basic optical mice and lower-spec graphics cards, and do admirably. I'm not discounting what you're saying; you're clearly more knowledgeable than me about computers, full stop. I did try whacking up the settings earlier and, to be fair, I only played for a map, if that. Fps dropped to a constant 50 - 75 and I didn't like it. Guess I'm too used to being an fps junkie now. Besides... how else will I get my kills? ;)
A few posts back I mentioned that a lot is server dependent, and I think there's some firm truth here. I play on two servers atm. One has a reputation for 'bad reg'. I've never complained about reg while playing on a server (after all, I don't have to play on it if I don't wish to), but here I have to agree: something doesn't seem quite right, though the camaraderie and laughs more than make up for it. It also has friendly fire enabled, which I like, but it does break the flow of the game, so that makes the kills harder for me too.

The other server I go on, well, it's quite the reverse. You shoot, you make contact, and they drop. It won't be long before I'm in their top five players. I think it's called 100-tick. ;) Thank heavens for fps_max 999 :)
 
The basic Wheel Mouse Optical is actually the best gaming mouse by far :P (WMO 1.1a).

60fps with vsync has horrid input latency - any "great" player would notice it.
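The latency claim above can be roughed out with back-of-the-envelope arithmetic. The numbers below are assumptions (input read at the start of a frame, a typical one-frame pre-rendered queue, and full scanout time), not measurements of any particular setup:

```python
# Rough, illustrative worst-case input latency at 60Hz with vsync.
# Assumes: input sampled at frame start, a 1-frame render queue,
# and one full refresh to scan the frame out to the screen.

refresh_hz = 60
frame_ms = 1000 / refresh_hz          # ~16.7 ms per refresh

sample_to_present = frame_ms          # the frame the click was sampled in
queued_frame      = frame_ms          # one buffered frame ahead of it
scanout           = frame_ms          # drawing the frame to the screen

worst_case_ms = sample_to_present + queued_frame + scanout
print(round(worst_case_ms, 1))        # 50.0 ms
```

Around 50ms of click-to-screen delay in the worst case is the kind of thing a competitive player could plausibly feel, which is why uncapped or high-fps play tends to feel more direct.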
 
The basic Wheel Mouse Optical is actually the best gaming mouse by far :P (WMO 1.1a).

60fps with vsync has horrid input latency - any "great" player would notice it.

As well as disabling vsync, I also don't enable 'smooth out mouse movement'. Apparently it causes, or can cause, lag. Don't know if that's true though.
 
Screen first; grab a cheap quad later in the year IF games start to need it (which is unlikely, tbh). Though if it's doing 3.5GHz easily and higher isn't reliable, you won't need one for a long time. If 3.5GHz isn't reliable, run at 3.2GHz; that's more than enough for any game out there.

To be honest, a 4890 1GB is pretty damn close to a 285GTX, comparing heavily overclocked versions of both cards, at least at the overclocks they're sold at; the 4890 might even be ahead in several games (I don't think the 4890s go much further than 1GHz pre-overclocked; no idea whether 285s overclock much further or not). There really isn't any single-GPU card available that can match a two-GPU setup, be that two cards or two GPUs on one card. The 295GTX is FAR better value than a 285GTX because it's still less than twice the price for very big performance. But from a 4870X2, the only "upgrade" will be SLI of some sort, or 2x 4890s, or maybe a single card from ATI or NVIDIA's next-gen range.

Yeah, I can see that according to all the reviews: the X2 spanks everything else apart from the 295 in terms of frames, but maybe my issue is with multi-GPU gaming in general. As the X2 is the only multi-GPU solution I've seen thus far, I base my opinion on that. Even after spending a day A/B-ing my rig against my best mate's, we both agreed that his was smoother (same specs, as I built his, apart from him having a single 4870 1GB). Even though, according to FRAPS, I was getting higher FPS, his was so much more responsive because it didn't have MICRO-STUTTERING! Depends whether or not you consider it an issue; I personally will never touch multi-GPU solutions again.
 
The basic Wheel Mouse Optical is actually the best gaming mouse by far :P (WMO 1.1a).

60fps with vsync has horrid input latency - any "great" player would notice it.

Yeah, I could smooth out games with VSYNC enabled on the X2, but as soon as you do that you get input latency. By the sounds of it, I'm more hyper-sensitive to these issues than most, so disregard my opinion relating to the OP.

I really feel the whole multi-GPU thing is a total mess atm. Single-card solutions are where my money's going in the future.
 
I think the micro-stuttering is mainly due to the alternate frame rendering method they use. Apparently there are other methods, but they require huge amounts of communication between the two GPUs.
 
As well as disabling vsync, I also don't enable 'smooth out mouse movement'. Apparently it causes, or can cause, lag. Don't know if that's true though.

Mouse smoothing usually averages input across 2-3 frames (usually 2); generally it isn't too noticeable unless it uses 3 or more frames.

This is one reason I prefer the WMO to the DeathAdder (which otherwise has a far better sensor): the DeathAdder seems to smooth the mouse input in its drivers, causing a noticeable lag behind your input, especially when you first move it after a period of inactivity.
 
ATI have apparently fixed the micro-stuttering on the X2 in newer drivers, at a very minor performance cost. Last time I tried it, though, it still didn't feel as smooth to me as my 260GTX SLI, let alone as smooth as a single 260, but that's a few driver revisions back.

It's probably more than fine for casual gamers or single-player games, but I personally find it less than ideal for twitch shooters.
 
Mouse smoothing usually averages input across 2-3 frames (usually 2); generally it isn't too noticeable unless it uses 3 or more frames.

This is one reason I prefer the WMO to the DeathAdder (which otherwise has a far better sensor): the DeathAdder seems to smooth the mouse input in its drivers, causing a noticeable lag behind your input, especially when you first move it after a period of inactivity.

I'm sure I'm going to say "Ohh! :o", but what is a WMO?
 