Does anyone else think this may be the case?
This is our current household scenario:
Laptop #1:
Was running XP; due to its age it couldn't really play much even on low settings, let alone high, and frame rates were still choppy on the lowest. Upgraded to Win7, and after a few days the graphics card died. Before that we noticed a performance increase of about 10-15 FPS.
Didn't think anything of it; the laptop is so old we just figured it was its time.
PC #1:
Was running the Win7 RC and handled a fair number of games on full settings. Upgraded to Win7 Retail; it still handled a lot of games on full, but with increased FPS. Then the graphics card died. Didn't think anything of it, since the 8800GTX is three years old now. Replaced it under warranty, and the same thing happened again a few hours after receiving the replacement.
PC #2:
Was running XP and struggled to play BioShock on full settings. Upgraded to Win7, and it now plays BioShock on full settings without complaint (until it gets pushed too hard and the PC reboots). The card hasn't died, but we think that's because, unlike the laptop and PC #1, it isn't overclocked by default.
Now, the only thing common to all three systems is the operating system: the laptop and PC #2 run ATI chipsets with AMD processors, while PC #1 has an nVidia chipset with an Intel processor.
Seeing a performance increase even over a fresh install of XP, we've come to the conclusion that Windows 7 and the drivers written for it are pushing the cards a lot harder. For normal cards this isn't an issue: not being overclocked already, they can take the extra load, and the worst that happens is the PC reboots. But the cards that are overclocked by default seem to be pushed well beyond their capabilities. (Asking for more power than the PSU or motherboard can supply?)
(It would be interesting to see whether RMA requests to the various manufacturers have increased since the release of Win7.)
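For what it's worth, one way to check the "drivers pushing the card harder" theory would be to log the GPU's clock speed, temperature and power draw under each OS and compare the traces while running the same game. Below is a minimal sketch for the nVidia box, assuming a version of nvidia-smi that supports the --query-gpu flags; old cards like the 8800GTX may simply report "N/A" for power draw, and GPU-Z's sensor logging would do a similar job on the ATI machines.

```python
# Minimal sketch: poll nvidia-smi once a second and append GPU clock,
# temperature and (where supported) power draw to a CSV, so readings
# taken under XP and Win7 drivers can be compared side by side.
# Assumes nvidia-smi is on the PATH and supports --query-gpu.
import csv
import subprocess
import time

QUERY = "clocks.gr,temperature.gpu,power.draw"

def read_gpu_stats():
    """Return one row of [graphics clock MHz, temp C, power W] as strings."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return [field.strip() for field in out.split(",")]

def main(logfile="gpu_log.csv", interval=1.0):
    with open(logfile, "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "clock_mhz", "temp_c", "power_w"])
        while True:
            writer.writerow([time.strftime("%H:%M:%S")] + read_gpu_stats())
            f.flush()
            time.sleep(interval)

if __name__ == "__main__":
    main()
```

Run it in the background while gaming on each OS; if Win7 really is holding higher clocks or drawing more power than XP did, it should show up in the logs.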