There was a chart somewhere, and it seemed like the X1900 ATi cards had only a couple of fps performance drop, but the 8800 GTS/GTX cards had quite a big performance drop.
Have you done the Oblivion.exe hack to get AA then?
He doesn't need to do a .exe hack; he just sets the AA level he wants in his driver control panel.
Weird really, because some GPUs cope better with rendering at these higher resolutions (where the GPU is the bottleneck) than at lower resolutions where there's some CPU limitation.
I was aware that with ATi you had to do the Oblivion.exe hack since AA did not apply; at least that was the case with the 2900s.
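If it helps anyone, here's a rough sketch of what the "rename the exe" approach boils down to. The install path and the new name OblivionAA.exe are placeholders I've made up, not the exact names people used; the idea is simply that a renamed copy no longer matches the driver's built-in Oblivion profile, so AA forced in the control panel actually sticks.

[CODE]
import shutil
from pathlib import Path

# Hypothetical install path -- change to wherever Oblivion actually lives.
GAME_DIR = Path(r"C:\Games\Oblivion")
ORIGINAL = GAME_DIR / "Oblivion.exe"
# Placeholder name: the renamed copy shouldn't match the driver's
# Oblivion profile, so forced AA from the control panel gets applied.
RENAMED = GAME_DIR / "OblivionAA.exe"

def make_renamed_copy() -> None:
    """Copy Oblivion.exe to a new name, leaving the original untouched."""
    if not ORIGINAL.exists():
        raise FileNotFoundError(f"Can't find {ORIGINAL}")
    shutil.copy2(ORIGINAL, RENAMED)
    print(f"Created {RENAMED}; launch this copy with AA forced in the driver.")

if __name__ == "__main__":
    make_renamed_copy()
[/CODE]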
Would slapping a new 8800GTX in my old(ish) rig be a good idea, or do I need to fork out a gazillion squid for a Core 2 Duo + board + RAM etc. to get all the benefit of a 'top-of-the-range' card? I want better/smoother graphics in games and use a 22" monitor at its native res. of 1680 x 1050 with my single-core 3700+ CPU running at 2.4GHz. Any comments welcome!

Considering you could probably buy a competent C2D setup for the same money as you would be spending on a new 8800GTX, I'm not entirely sure where you've got "gazillion squid" from...
The 3700+ will probably be a tad lower than the 3800+, or maybe about the same: http://www.firingsquad.com/hardware/geforce_8800_gtx_gts_amd_cpu_scaling/ If it is a tad lower, it will only be by about 1-2fps.
The 3700+ is only a single-core CPU.