Hi all
my specs are as follows:
i5 2500K @ 4.5GHz
Asus P8P67
Corsair 4GB DDR3 @ 1333MHz
standard HDD
EVGA SC GTX 580 @ 797/1594
Just got this card today in the mail from eBay. It was listed as a grade A card (minor box damage), which it was.
Anyway, first I removed my ATI drivers (before pulling the card) and ran Driver Sweeper to clean out the leftover registry entries, so basically nothing was left on there. Then I opened the case and put in the new card. On first boot I noticed my main screen (HDMI) didn't work, but my secondary smaller one (VGA) did. So I installed the latest 580 drivers straight from the site and rebooted as it told me to, and again on startup the HDMI monitor isn't working (its light stays orange) until I get to the Windows loading screen; then it comes on and everything is fine, resolutions are good, etc.
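If it helps diagnose anything, something like this quick Python sketch should show exactly what Windows thinks is installed. It just wraps the built-in wmic tool, so the output columns are whatever WMI reports; I'm assuming Python is on the machine:

```python
import subprocess

# List the video controllers Windows sees, with driver version and status,
# to confirm the old ATI entry is really gone and only the GTX 580 shows up.
output = subprocess.check_output(
    ["wmic", "path", "win32_videocontroller",
     "get", "name,driverversion,status"],
    universal_newlines=True,
)
print(output)  # expecting a single NVIDIA GeForce GTX 580 entry here
```

If a leftover ATI entry still showed up there, that would at least explain a driver conflict.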
Then it was time to test it out, which I have been doing for the past 6-7 hours. I ran Heaven Benchmark 2.1 and got a minimum of 5.3 fps, a maximum of 103.4, and an average of 25.6 fps. Yes, that's right: 25.6 fps. I was so confused that I asked a friend of mine who's running a GTX 560 Ti on air to run the exact same benchmark with the same settings; he got a 43.3 fps average.
There's obviously some huge problem here. I wasn't sure if something had gone wrong installing the drivers, so I wiped them with Driver Sweeper and reinstalled them again. I ran the benchmark: same very low fps. I checked the temperatures: around 59-61°C while running it (using EVGA Precision and MSI Afterburner to monitor), and 43°C idle with the fan at 85% (the max it will go to manually).
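To go beyond eyeballing Precision/Afterburner, I was thinking of logging the clocks during a benchmark run to see whether the core actually holds its 797MHz under load or sits stuck at idle speeds. Here's a rough Python sketch; I'm assuming the nvidia-smi tool that ships with the driver is on the PATH, and note that some GeForce cards report N/A for these fields:

```python
import subprocess
import time

# Poll nvidia-smi every 5 seconds for ~5 minutes and append timestamped
# clock/temperature readings to a log file, to run alongside Heaven.
with open("gpu_log.txt", "a") as log:
    for _ in range(60):
        out = subprocess.check_output(
            ["nvidia-smi", "-q", "-d", "CLOCK,TEMPERATURE"],
            universal_newlines=True,
        )
        log.write(time.strftime("%H:%M:%S\n") + out + "\n")
        time.sleep(5)
```

If the core clock stayed at 2D speeds all through Heaven, that would point at a power-state or driver problem rather than a bad card.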
So then I tested some games. In Modern Warfare 2 I was getting fps drops to 5-15 when looking at an "open area". For anyone who plays MW2: on Wasteland, for example, if I looked out towards the churches my fps would plummet, whereas in the tunnel in the middle it would be fine and smooth.
The settings in MW2 were: all textures on High (not even Extra), shadows off, and everything else in the options off except for bullet impacts (which shouldn't affect much). No AA, at 1680x1050.
So then I tried World of Warcraft. I'm getting 20-35 fps in the middle of a city with around 100 players actively walking around, which isn't too bad, but it's definitely not great. This is with shadows on Ultra.
In FurMark (version 1.9.1, which I believe is the latest), the 720p benchmark averaged 32 fps with a score of 1875 points, and the 1080p benchmark finished with an average of 17 fps.
Now my question is: is this normal? Because I really don't think it is. And if it seems as absurd to you as it does to me, how can I go about testing the card before concluding that it's just screwed?
Thanks in advance for any suggestions.