VF900 + A X1900XT/X? Worth it?

How is 75c hot? My x800 used to run at 85c on the stock cooler. My 7800GT used to run at about 72c on the stock cooler.

And any x1900 will run Oblivion fine, unless you think ~35fps at 1680x1050 with 2xAA, 8xAF and HDR on a tweaked ini is not playable.

You have to remember this is not an FPS. In FEAR I consider anything under 45fps unplayable, but with Oblivion anything under 25fps is unplayable, because of the way the engine works.

As long as you follow the ini tweak guides you can get excellent framerates.
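Put in frame-time terms (a quick sketch, using only the thresholds quoted above), those fps figures work out to per-frame time budgets like so:

```python
# Convert an fps threshold into the per-frame time budget in milliseconds.
def frame_budget_ms(fps):
    return 1000.0 / fps

# Thresholds mentioned above: 45fps for FEAR, 25fps for Oblivion.
for fps in (45, 25):
    print(f"{fps}fps -> {frame_budget_ms(fps):.1f}ms per frame")
# 45fps -> 22.2ms per frame
# 25fps -> 40.0ms per frame
```

So Oblivion's engine tolerates frames taking nearly twice as long before it feels unplayable.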
 
Gonna wait for G80 or R600; I need to play Oblivion at 1600x1200, 8xAA, 16xAF, HDR and all details at max, with the minimum framerate dropping no lower than 60fps. Anything less and I'd be wasting my money on a gfx card.

hopefully my 6800nu can hold up till then.
 
Cyber-Mav said:
Gonna wait for G80 or R600; I need to play Oblivion at 1600x1200, 8xAA, 16xAF, HDR and all details at max, with the minimum framerate dropping no lower than 60fps. Anything less and I'd be wasting my money on a gfx card.

hopefully my 6800nu can hold up till then.

I think Oblivion will be a pretty much outdated game in a year's time...

That is, if the rumours of the G80 & R600 being released around the same time as Windows Vista, which is now on for a Feb 2007 release, are true... (if Microsoft can even keep to that release date).
 
Cyber-Mav said:
Gonna wait for G80 or R600; I need to play Oblivion at 1600x1200, 8xAA, 16xAF, HDR and all details at max, with the minimum framerate dropping no lower than 60fps. Anything less and I'd be wasting my money on a gfx card.

hopefully my 6800nu can hold up till then.

Min fps at 60? That's not the card, that's the CPU you want to be upgrading then. The reason the framerate drops so much is all the shadows/physics and AI.

Even if the X2000 or 8800GTX is twice as fast as an X1900, that's not gonna give you a 60fps minimum in Oblivion. It would probably get you an average of 60fps (comparable to two x1900s in Crossfire at 60fps), and the minimum will still be around the 30s mark.

I think you are expecting too much framerate-wise from Oblivion. Why on earth anyone would "need" 60fps as a minimum is beyond me. In this game the framerate is smooth above 30.

And why do you need 8xAA? I can't see any jaggies at 4x. Heck, even 2x is enough at 1680x1050.
 
Just an update: GFX idle is 33 degrees (was 38)
CPU idle 26 degrees
MB 23 degrees.

It does still go highish, but much lower than before. Is there a utility that will load the GFX card while staying in Windows? (That way I can keep CCC open to the side.) I think the highest I got with the stock cooler was 81, perhaps 85 degrees.

I haven't noticed any increase in case temps at all, and I thought I would! :) So as long as you have a powerful enough exhaust fan, the hot air around the mobo mosfets and CPU should be expelled quickly enough to stay cool. Most likely if I switch off the Thermaltake, temps will go up (the Coolermaster fans hardly shift any air, but they're silent and at least provide some air movement... which is better than relying on natural convection alone).

So a worthwhile purchase.
 
Can't wait till the end of the month for my paycheque; definitely going to invest in a VF900 for my 512MB X1800XT. 57 degrees at idle scares me tbh! :eek:
 
Generally I just play FEAR or COD 2 for a while; probably not the most reliable test, of course. But I leave CCC open, and as soon as I quit the game CCC is there with the temp reading. It is lower than before though; usually FEAR cooks the GPU for a bit on the stock cooler. Ideally there'd be a TSR that recorded room temp plus the lowest, highest and average GFX core temps, left running for a week with both the stock cooler and the Zalman.

Also I don't have a fixed room temp, so it could have been hotter on the day with the stock fan and freezing cold after fitting the Zalman (or vice versa). But average idle temps are down, and idle is easy enough to compare, since CCC will be open and an idle Windows desktop is the most common state. So if it's down at idle even at minimum fan speed, it's safe to presume it'll be lower during gaming with the Zalman at full speed. I never had the stock ATI fan at 100% during gaming, but it did spin up faster during gaming than at idle.

I guess for a proper comparison someone needs a proper testing method, ideally two identical systems in the same room with the same config. My Stacker with a lot of cooling will be different to another user's case with the stock fan, different again to one with a Zalman, and different again depending on ambient room temp.
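The kind of logger described above is easy to sketch. This is only an illustration: `read_gpu_temp()` is a hypothetical stand-in returning fake values, since in practice you'd pull the reading from whatever the driver or monitoring tool exposes:

```python
import random
import time

def read_gpu_temp():
    # Hypothetical sensor read; returns made-up values for illustration only.
    return 33 + random.random() * 25

def log_temps(samples=10, interval_s=0.0):
    """Sample the GPU core temp and report lowest, highest and average."""
    readings = []
    for _ in range(samples):
        readings.append(read_gpu_temp())
        time.sleep(interval_s)
    return min(readings), max(readings), sum(readings) / len(readings)

lo, hi, avg = log_temps()
print(f"low {lo:.1f}C  high {hi:.1f}C  avg {avg:.1f}C")
```

Left running for a week with each cooler (with a sensible `interval_s`), the min/max/avg triples would give a much fairer stock-vs-Zalman comparison than eyeballing CCC after a game.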
 