CRT recommendation

That's the 256MB X1950 Pro; the 512MB version is nowhere in there, and it's just a few fps below the 512MB X1950 XT. There's about 10 fps between the X1950 Pro 512MB and the 8800 GTS 320MB at high res.
 
It's the 256MB version of the X1950 Pro like I just posted m8 lol. The 512MB card is a lot closer, because at higher res the 512MB keeps it closer to the 8800, which will begin to struggle at 1600x1200.
Anyway, this is about CRT monitors, not an ATI vs Nvidia debate.
 
It's the 256MB version of the X1950 Pro like I just posted m8 lol. The 512MB card is a lot closer, because at higher res the 512MB keeps it closer to the 8800, which will begin to struggle at 1600x1200.
Anyway, this is about CRT monitors, not an ATI vs Nvidia debate.
You have to get up to 1920*1200 with 4xAA before the 320MB starts running out of RAM, and even at that res the 320MB GTS still beats an X1950 XTX 512MB more often than not, according to those Tom's benchmarks. And we were talking about Bioshock at 1680*1050. I've already posted benchmarks showing the 320MB GTS is less than 2 fps slower than the 640MB GTS, even at a stupidly high res like 2560*1600, so clearly Bioshock does not need or benefit significantly from more than 320MB of RAM. If that's not a high res, I don't know what is. Take another look; compare the 640MB and 320MB GTS cards again -

http://www.firingsquad.com/hardware/bioshock_directx10_performance/page5.asp

Virtually no difference. Bioshock just doesn't need the extra RAM. Swapping an 8800 GTS 320MB for an X1950 Pro 512MB would give me a much lower framerate in Bioshock at 1680*1050. The X1950 just doesn't have the raw horsepower for Bioshock -

http://www.firingsquad.com/hardware/bioshock_directx10_performance/page6.asp

The X1950 Pro is a smidge slower than the X1800 XT I had previously, and my 8800 GTS is, in my experience, nearly twice as quick as the X1800 XT was in every game. We're talking 4440 in 3DMark06 with the X1800 XT, up to 7800 with the 8800 GTS.

You will not find a single benchmark of an X1950 Pro 512MB beating an 8800 GTS 320MB at any res up to and including 1600*1200.
 
I didn't say it beat it, I said it was up there with it, meaning not far behind. All I know is (and this is what started this debate) a guy on these forums said he's getting the occasional slowdown in Bioshock at high resolution. Whether that's the 8800 GTS bug, whereby the cache fills up and you have to alt-tab out and back in again, or the resolution, is anyone's guess.
 
Of course he's going to get occasional slowdowns though - it's an extremely demanding game on full settings, it's also quite buggy and IMO not particularly well optimized. And one person's slowdown is another person's liquid smooth. These are entirely subjective terms. Some people think 20 fps is fine, others can't stand anything less than 60 fps. You also can't read anything into how just one single system performs; there are a million different factors that could be adversely affecting performance. Unless you get hard numbers you can't reach any meaningful conclusions, and the hard numbers say 320MB isn't a limitation in Bioshock, and that the 8800 GTS 320MB is around 50% quicker than the X1950 Pro (which isn't close).

I do know about that memory bug, but I haven't run into it myself yet in the six weeks I've had my 8800, and that includes a 7hr SupCom game, several other 5hr+ games, and playing R6 Vegas all the way through in 3hr+ sessions. Not saying the bug doesn't exist, but I don't think it necessarily occurs that often.

Btw, I'm not denying an 8800 GTS 320MB will struggle a bit with Bioshock at 1680*1050 - I said as much in the post that started this debate - but the 640MB version would struggle equally, and an X1950 with any amount of VRAM would struggle significantly more. It's just a very demanding game shader-wise; it uses masses of pixel shader effects. An 8800 GTX/Ultra would struggle at 1920*1200.
 
The old school of thought was 256MB for up to 1600x1200 and 512MB for 1600x1200 and above. New games are more demanding now, but 320MB is a step up over 256MB, and 640MB over 512MB, in the same way.
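To put very rough numbers on that rule of thumb, here's a back-of-envelope sketch (Python, purely illustrative). It only counts render targets, assumes 4 bytes per colour sample and per depth/stencil sample, and ignores textures, shadow maps and driver overhead, so real-world usage is a good deal higher - the point is just how fast resolution and AA scale the baseline.

```python
# Back-of-envelope render-target memory vs resolution and MSAA.
# Assumptions (not measured): 4 bytes per colour sample, 4 bytes per
# depth/stencil sample, plus a resolved front + back buffer pair.
# Textures, shadow maps and driver overhead are ignored, so real
# usage is considerably higher.

def render_target_mb(width, height, msaa=1):
    pixels = width * height
    colour = pixels * 4 * msaa   # multisampled colour buffer
    depth = pixels * 4 * msaa    # multisampled depth/stencil buffer
    resolve = pixels * 4 * 2     # resolved front + back buffers
    return (colour + depth + resolve) / (1024 * 1024)

for w, h, aa in [(1280, 1024, 1), (1600, 1200, 1),
                 (1600, 1200, 4), (1920, 1200, 4)]:
    print(f"{w}x{h} {aa}xAA: ~{render_target_mb(w, h, aa):.0f} MB in render targets")
```

Even on these optimistic assumptions, 1920x1200 with 4xAA ties up roughly four times the render-target memory of 1600x1200 without AA, which is why the old res cutoffs tracked VRAM size.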

A GTX or Ultra will not struggle at 1920x1200. I play BioShock with all settings maxed @ 1920x1440 and get 70FPS min at any given time, and 80FPS min in the demo of MoH: Airborne, both in Vista64.
 
A GTX or Ultra will not struggle at 1920x1200. I play BioShock with all settings maxed @ 1920x1440 and get 70FPS min at any given time
So with a GTX @ stock, running 1920*1440 with all settings on full, you get 70 fps minimum throughout the whole of Bioshock, even in big fights? :eek: I guess the GTX is a lot quicker than I realized.

My GTS certainly can't run Bioshock acceptably at 1680*1050 on full settings (8xAF, 0xAA). I wouldn't even say the framerate was 'perfect' at 1280*800; I'm sure it's still dipping down to 30-40 fps at times (DX10, Vista32). My 3DMark06 score is just over 10K btw.
 
Yeah, I don't believe that either. I doubt it's a constant 70 fps with 16x FSAA at 1600, never mind 1920.
 
A GTX or Ultra will not struggle at 1920x1200. I play BioShock with all settings maxed @ 1920x1440 and get 70FPS min at any given time
FiringSquad with a single GTX got 63 fps average at 1920*1200 (17% fewer pixels than 1920*1440) in Medical Pavilion, which is a small area, and they used a save where they had already cleared the area of all enemies. That's the average, remember; they also included minimum fps, and on a single GTX at 1600*1200 they recorded a minimum of 41 fps on the same test.

So at 1920*1200 (or even the 1920*1440 you used), on a bigger map, in a serious firefight with several enemies, I wouldn't be surprised to see the fps dip down to, say, the 30-35 fps region. It's a game that tends to look fairly smooth even when it's running at 30-40 fps - I think if you ran Fraps you'd see numbers a lot lower than 70.
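For what it's worth, that 17% figure is just pixel-count arithmetic; a trivial sketch:

```python
# Where the 17% figure comes from: straight pixel-count arithmetic.
p_1200 = 1920 * 1200   # 2,304,000 pixels
p_1440 = 1920 * 1440   # 2,764,800 pixels

print(f"1920x1200 pushes {1 - p_1200 / p_1440:.1%} fewer pixels than 1920x1440")
# -> 16.7%, i.e. roughly the 17% quoted above
```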
 
The Samsung 1100MB CRT is pretty tempting; it's a shame it's impossible to find. To get a TFT that will do a similar res you have to fork out the best part of a grand. Currently using a Dell M119 CRT, which does 1600*1200 @ 70Hz or 1024*768 @ 100Hz.
 
Still using my 19" Hyundai ImageQuest Q995, still as good as when I bought it a few years ago, think it was from overclockers when they used to stock them actually. Doubt I've got the invoice still but I have a hazy memory of them being ridiculously cheap, like £70 or something.

Never heard of the 24" widescreen Sony GDM-FW900 before reading this thread though; would love to get hold of one of them ;)
 
I use the Dell P1110; it's a great CRT, can't fault it at all.

They are very cheap now.
 
FiringSquad with a single GTX got 63 fps average at 1920*1200 (17% fewer pixels than 1920*1440) in Medical Pavilion, which is a small area, and they used a save where they had already cleared the area of all enemies. That's the average, remember; they also included minimum fps, and on a single GTX at 1600*1200 they recorded a minimum of 41 fps on the same test.

So at 1920*1200 (or even the 1920*1440 you used), on a bigger map, in a serious firefight with several enemies, I wouldn't be surprised to see the fps dip down to, say, the 30-35 fps region. It's a game that tends to look fairly smooth even when it's running at 30-40 fps - I think if you ran Fraps you'd see numbers a lot lower than 70.


I don't really care what a site got m8, I only know what I got on my hardware. The GPU is only part of the equation.
 
Yeah, I don't believe that either. I doubt it's a constant 70 fps with 16x FSAA at 1600, never mind 1920.

Making me out to be a liar?

The game by default does not have AA, and the supersampling AA I have on permanently in the Nvidia driver won't take effect (sure, we now know you can force it on with 3rd party apps, but it ain't needed in this game IMO, or according to 2K Games), so you need to go back and do your homework on the game's settings.

I got 70FPS min and that's that, OK?

Neither of you above has an 8800 Ultra or the exact same spec as me, so you cannot comment first-hand. I would only be kidding myself if I made it up. Codemasters' DiRT, on the other hand, with everything at max in the driver and game menu, only gets 35FPS avg, but it's a buggy console port and shouldn't even be used in comparison.
 
Well, I could go and check, but I know it to be the case already, so I don't need to bother.
 
I don't really care what a site got m8, I only know what I got on my hardware. The GPU is only part of the equation.
FiringSquad also tested without AA, and you didn't say you had an Ultra; you said GTX or Ultra.

Next time you play it, set Fraps running with it set to save minimum fps benchmark data, and just play the game normally, including fighting enemies on a reasonably sized level (Neptune's Bounty, Arcadia etc), at 1920*1440 (8xAF or higher) on full detail in DX10, and report back your results. Obviously you would need to play for at least half an hour or so to get a reasonable sample of the different environments in the game. I'd be surprised if the minimum was 70 or above.
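If it helps, Fraps' benchmark mode can log per-second fps figures to a CSV; below is a minimal sketch for pulling the minimum and average back out of one. The exact file layout varies between Fraps versions and the filename here is made up, so check the log yourself before trusting the numbers.

```python
# Minimal sketch: minimum and average fps from a Fraps-style per-second
# fps log. Assumes one fps value per line after an optional header row;
# the exact CSV layout depends on the Fraps version, so verify it first.
import csv

def min_avg_fps(path):
    values = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            try:
                values.append(float(row[0]))
            except ValueError:
                continue  # skip header lines like "FPS"
    return min(values), sum(values) / len(values)

lo, avg = min_avg_fps("bioshock fps.csv")  # hypothetical filename
print(f"min {lo:.0f} fps, avg {avg:.0f} fps")
```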
 