
My mini review - GTX 280 vastly better than 9800GX2 for high res gaming

Would like to know that also, as some people say it does spin down and others say it doesn't. It was Tom's Hardware who pointed this out.

http://www.tomshardware.com/reviews/nvidia-gtx-280,1953-26.html
"During Windows startup, the GT200 fan was quiet (running at 516 rpm, or 30% of its maximum rate). Then, once a game was started, it suddenly turned into a washing machine, reaching a noise level that was frankly unbearable – especially the GTX 280......We should tell you, however, that our at-idle readings are taken after all our benchmarks have been run, after just a few minutes at idle. The problem is that the GTX 280 never really goes back to its minimum level......."


I assume all 280s are the same, made by Asus or whoever.
 
The worst thing about the jump from 512MB to 1GB is the cost involved, especially with GDDR5: as with anything new, it will be ridiculously expensive compared to the older stuff. The reason the increased cost is such a pain is that you don't necessarily need 1GB if you run out of memory with a 512MB card; you might only need 513MB, or more likely somewhat higher, but I doubt anything comes close to using a full gigabyte yet, so you essentially pay for wastage, which is a shame.

1GB, at this time, is overkill. I get that there are a few users with a 30" screen who it will help, but only in a small minority of games, and increasing the price of the final card for the 99.9999999999% of users who don't need 1GB is fairly silly. At the end of the day, GFX makers need to aim a range of products at the widest range of people and be affordable in each segment. If ATi went ahead with 1GB of GDDR5 on their 4870, no doubt it would increase the price significantly when only a very, very small portion of buyers would see it used. Nvidia playing on epeen again. It's always a shame to lose out because you got a better screen, but it's very silly to increase the cost so much. They probably should have stuck with the old 8800 GTX setup of 768MB of memory and a mid-range bus for a considerably cheaper card and the best of both worlds, but then, Nvidia have always been dumb.

I disagree, I'd like to see more 1GB cards. It's not just the 512MB texture limit that will affect high-res monitors; games also load quicker and generally have less random stuttering. It's a better, smoother overall experience. ATI and NV should at least do more 1GB versions of the higher-end cards, which after all are aimed at serious gamers who will likely have high-res monitors anyway and use AA/AF. A lot of people would be willing to pay extra for 1GB, but they should also still keep the 512MB versions, so you at least have options.
And when buying a card it's best to think ahead, not buy it just for what's currently available, and in that case 1GB would be more useful for future games, as the 512MB memory limit will only get worse.
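
To put some rough numbers behind the "512MB runs out at high res" argument, here is a back-of-envelope sketch. The storage model (two resolved buffers plus per-sample MSAA colour and depth) and the byte sizes are my own simplifying assumptions, not figures measured on any particular card:

```python
# Back-of-envelope frame buffer maths; textures, geometry and driver overhead
# all come on top of this. The storage model and byte sizes are simplifying
# assumptions, not figures measured on any particular card.

def framebuffer_bytes(width, height, msaa_samples, bytes_per_pixel=4, depth_bytes=4):
    """Colour + depth storage for an MSAA render target plus resolved front/back buffers."""
    resolved = width * height * bytes_per_pixel * 2               # front + back buffer
    msaa_colour = width * height * bytes_per_pixel * msaa_samples
    msaa_depth = width * height * depth_bytes * msaa_samples
    return resolved + msaa_colour + msaa_depth

for (w, h), aa in [((1920, 1200), 4), ((2560, 1600), 4), ((2560, 1600), 8)]:
    mb = framebuffer_bytes(w, h, aa) / (1024 ** 2)
    print(f"{w}x{h} with {aa}xAA: ~{mb:.0f} MB before any textures")
```

Under those assumptions, 2560x1600 with 8xAA works out to roughly 280MB for the frame buffer alone, before a single texture is loaded, which is why 512MB gets tight on a 30" screen long before it does at 1920x1200.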
 

I've tried the EVGA Precision tool to turn the fan speed down, but it does not work with this card.

What other software is there that can change fan speed? I don't think the noise issue is a big one; the card does not get hot at idle compared to most high-end GPUs, so all that's needed is software to change the fan speed. Or it could just be a problem with current drivers... the fan really doesn't need to be spinning this fast considering the temps.
 
Nice review mate, you must have to sit quite far back from a 30" screen lol, but it must feel uber realistic! :p
 
I reckon it's a driver problem; at least it was when the GX2s launched and had the opposite issue: fans would idle at 30%, speed up to 65% in gaming, but quickly drop back down while gaming. The developer who made RivaTuner posted on the EVGA forums that there was squat he could do unless Nvidia revamped the fan tables in the card via a BIOS update or fixed it with a driver. Nvidia opted for the latter. So I reckon a driver update is nigh.
 

I'd like a 1GB card too. Even though I game at 1920 and not uber-high 2560 res, I still like to use a lot of AA, and would definitely enjoy the smoother gameplay a 1GB/512-bit bus offers. Lots of people on the EVGA forums have posted about this as well, finding the GTX 280 much "smoother" in gaming compared to the GX2. It's only the PRICE and FAN issues that are putting me off it right now.
 
Holding off on the 280 purchase now because early indications show the 4870 to be a real performer at high res. The R700/4870X2 also looks like it might be worth waiting for, mainly because the CrossFire implementation is on a single PCB and the link between the two GPUs has much greater bandwidth than in the previous generation.
 
I've tried the EVGA Precision tool to turn the fan speed down, but it does not work with this card.

What other software is there that can change fan speed? ...

RivaTuner?

I use the auto fan control and it basically auto-adjusts the fan speed to meet any core temperature increases, resulting in a nice and quiet setting when browsing while keeping the temps under control whilst gaming.

My settings for card in sig:

Duty cycle min: 33%
Duty cycle max: 90%
T min: 45°C
T range: 10°C
T operating: 55°C
T low limit: 45°C
T high limit: 65°C

The GPU core never goes over 57°C whilst gaming after these changes.

You need to change RivaTuner \ Fan \ AutoFanSpeedControl to 3 on the Power User tab to enable these changes.
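
For anyone wondering what those numbers actually do, here is a small sketch of the kind of fan curve they describe. I'm assuming a simple linear ramp from the minimum to the maximum duty cycle as the core climbs from T min to T min + T range; RivaTuner's real low-level controller may well behave differently, so treat this as illustration only:

```python
# Sketch of the fan curve the settings above might describe. Assumption: a
# linear ramp from the minimum to the maximum duty cycle as the core climbs
# from "T min" to "T min + T range"; the real low-level controller in the
# card/RivaTuner may behave differently, so this is illustration only.

DUTY_CYCLE_MIN = 33   # %
DUTY_CYCLE_MAX = 90   # %
T_MIN = 45            # deg C, fan stays at minimum below this
T_RANGE = 10          # deg C, span over which the fan ramps up to maximum

def fan_duty_cycle(core_temp_c):
    """Return the fan duty cycle (%) for a given core temperature."""
    if core_temp_c <= T_MIN:
        return DUTY_CYCLE_MIN
    if core_temp_c >= T_MIN + T_RANGE:
        return DUTY_CYCLE_MAX
    fraction = (core_temp_c - T_MIN) / T_RANGE
    return DUTY_CYCLE_MIN + fraction * (DUTY_CYCLE_MAX - DUTY_CYCLE_MIN)

for temp in (40, 45, 50, 55, 57):
    print(f"{temp} C -> {fan_duty_cycle(temp):.0f}% fan")
```

Under that assumption the fan sits at 33% until the core passes 45°C and reaches the full 90% duty cycle by 55°C.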
 
By the time the 4870X2 is out I wouldn't be surprised if NV had shrunk the 280 core enough to do a GX2 280 or something, which would likely be faster.

And the 4870X2 would also need 2GB of memory so that games have 1GB of usable VRAM ... unless both cores share the same 1GB.
 
HardOCP also reported on micro-stuttering on the GX2; well, they didn't call it that, but they reported how in places COD4 slowed down due to a lack of memory/bandwidth.

This is worrying:
http://evga.com/forums/tm.asp?m=411571
The core is throttling while gaming, causing slowdown... one guy's core dropped down to 400MHz while playing Age of Conan. It looks like it's not a heat issue, and Nvidia's p-states will drop it to 300MHz or so on the desktop, so that's normal too. Maybe it's due to EVGA's Precision tool not playing well with the Nvidia control panel?

MR. B - I take it you have not experienced this weird slowdown effect in gaming?
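
One way to confirm that kind of throttling is simply to log the core clock while gaming. The sketch below polls nvidia-smi, which ships with current NVIDIA drivers (it wasn't an option on 2008-era GeForce drivers, where RivaTuner's hardware monitoring graphs did the same job); the 500MHz flag level is an arbitrary choice of mine for a card whose 3D core clock should sit around 602MHz:

```python
# Log the GPU core clock and temperature once a second to catch throttling.
# Assumes nvidia-smi is available and a single-GPU system; on 2008-era GeForce
# drivers you would watch RivaTuner's hardware monitoring graphs instead.
# The 500 MHz flag level is an arbitrary threshold for a card whose 3D core
# clock should sit around 602 MHz.

import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.gr,temperature.gpu",
    "--format=csv,noheader,nounits",
]

def sample():
    out = subprocess.check_output(QUERY, text=True).strip()
    clock_mhz, temp_c = (int(v) for v in out.split(","))
    return clock_mhz, temp_c

if __name__ == "__main__":
    while True:
        clock, temp = sample()
        flag = "  <-- throttled?" if clock < 500 else ""
        print(f"{time.strftime('%H:%M:%S')}  core {clock} MHz  {temp} C{flag}")
        time.sleep(1)
```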
 
Well I've just joined the GTX280 club. :)

Should get it Tuesday; got an Asus card in the end for just over 400 squid and managed to sell my Ultra for 220 on a "well known auction site".

I'm gonna do a few benchies of my own on the Ultra first and then on the GTX280, and will post them if anyone's interested. :)
 
The GT200b (55nm) is rumoured to arrive around 2009, while the 4870 X2 is coming in 6-7 weeks or so. That said, the 9800 GTX+ (55nm) draws slightly more power than the 9800 GTX (65nm) despite being a 16-ish% die shrink with only a 9.3% clock speed increase. Personally I'm not expecting a dual GTX 280 product until they get it down to 45nm, realistically.
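
For reference, the percentages in that comparison work out as below; the clock figures of 675MHz for the 9800 GTX and 738MHz for the 9800 GTX+ are my assumption, taken from the reference specs:

```python
# The percentages in the post above, worked through. The clock figures
# (675 MHz for the 9800 GTX, 738 MHz for the 9800 GTX+) are reference specs
# quoted from memory, so treat them as assumptions.

old_node, new_node = 65, 55          # nm
linear_shrink = 1 - new_node / old_node        # per-dimension shrink
area_shrink = 1 - (new_node / old_node) ** 2   # shrink by die area

old_clock, new_clock = 675, 738      # MHz
clock_increase = new_clock / old_clock - 1

print(f"linear shrink: {linear_shrink:.1%}, area shrink: {area_shrink:.1%}")
print(f"clock increase: {clock_increase:.1%}")
```

So the "16-ish%" figure is the per-dimension shrink; by die area, 65nm to 55nm is closer to a 28% reduction.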
 