GeForce GTX 280 Advice Please?

A 280 as a PhysX card is a bit... overkill :eek: Why not sell it once you get a Fermi card and buy an 8800 for PhysX?

120 quid is a pretty good price for the 280, I think. I agree the drivers at this time are very buggy and not user friendly. If, as you say, you are going to upgrade to Fermi, ask yourself if you really need to upgrade at this particular moment. If yes, get the 280, but it's definitely overkill as a PhysX card.


Then again, maybe it's...

Mafia 2 Video Cards and resolution: APEX High settings
Minimum: NVIDIA GeForce GTX 470 (or better) and a dedicated NVIDIA 9800GTX (or better) for PhysX
Recommended: NVIDIA GeForce GTX 480 for graphics and a dedicated NVIDIA GTX 285 (or better) for PhysX
NVIDIA GPU driver: 197.13 or later.
NVIDIA PhysX driver: 10.04.02_9.10.0522. Included and automatically installed with the game.

...not powerful enough!
 
LoL, you can't compare the two from tests years apart on different systems. The usage I showed is just from the card, not from the whole system, and the 5850 is one of the most efficient cards ever in terms of performance per watt.

The GPU uses the same power no matter what system it's in or how old it is. Also, you posted the 285 :rolleyes: You're just flapping about power again, but the facts are above; this just shows new tech = more power usage, and that is a mid-range card.

Also, the OP was asking a question; where was your advice in this thread?
 
LoL, you don't know how wrong you are, and everyone on this forum who reads your posts knows how wrong you are too. Regarding this thread, when I see someone posting clear misinformation, I will correct them. Your sig is against the forum usage rules...

OT: a 280 for £120 is about right, but I wouldn't pay any more for one.
 
I'm tempted to call those graphs inaccurate tbh... I'm yet to find a GTX 260 (and I've a fair amount of experience with them) that idles at 30+ watts; the highest I've seen so far is 27 (most are 25-26).
 
LoL, go check out what he uses to measure the power usage; it's not cheap tat. God forbid a 4 watt discrepancy at idle. TBH Rroff, I would go with Wizard's power draw analysis over yours.

So you agree that the 280 uses the same power as a 5850...;p
 
I have no idea on the 5850 vs the 280; I just know from extensive experience with the GTX 260 that it's unlikely to draw 31 watts at idle, which to my mind calls the rest of his figures into question.
 
You don't need a sophisticated device; a simple adapter (and knowing where to measure) plus a DMM is more than accurate enough for GPU monitoring.
 
Well, actually you do need to use a sophisticated device, and you need to know how to use it. Your "more than accurate enough" device is probably very inaccurate TBH.
 
You don't need super accurate monitoring gear to measure the power draw from the PCI-E socket and additional connectors to well within a watt; even cheap off-the-shelf gear is probably good to within 1/100th of a watt, which is more than enough for this purpose.
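
For anyone who wants to check the arithmetic themselves, it really is just Ohm's law. Here's a rough sketch with made-up numbers (the 5 mOhm shunt value and the rail/shunt readings below are purely illustrative, not anyone's actual measurements):

    # Rough sketch only: totalling GPU board power from DMM readings taken
    # across current shunts in a riser/adapter. All values are assumed for
    # illustration, not real measurements.
    SHUNT_OHMS = 0.005  # assumed shunt resistance per supply path

    # (rail voltage in V, voltage drop across the shunt in V) per supply path
    readings = {
        "PCI-E slot 12V":  (12.05, 0.0221),
        "PCI-E slot 3.3V": (3.31,  0.0018),
        "6-pin PEG 12V":   (12.02, 0.0305),
    }

    total_w = 0.0
    for rail, (v_rail, v_shunt) in readings.items():
        current_a = v_shunt / SHUNT_OHMS  # Ohm's law: I = V / R
        power_w = v_rail * current_a      # P = V * I
        total_w += power_w
        print(f"{rail}: {current_a:.2f} A, {power_w:.1f} W")
    print(f"Total board power: {total_w:.1f} W")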
 
nm

EDIT: Anyhow, that's beside my point. My own measurements are properly done and not with a cheap device anyway.
 
? Even with a shunt, a DMM is more than accurate enough for this purpose.

You are expecting people to take your "testing methods" over one of the last unbiased review sites! A site that is solely aimed at GPU testing and has been publishing verifiable results for years.

Sorry Rroff, but what do you do for a living again?
 
I edited my post; I didn't use a cheap method in my own testing anyhow. I was just saying that even cheap equipment is more than capable of measurements accurate enough for GPU purposes.

To say TPU is "unbiased" is a joke.
 
TPU are not biased; recent Nvidia and ATI reviews look spot-on with not a hint of bias. Please direct me to some TPU bias toward either ATI or Nvidia.
 
OK Rroff, why don't you talk us through that one, rather than making out you are an electronics expert.

How old are you, Rroff?
 
You really don't want to go into it... it's a long, semi-complicated and not very interesting story.

Short version: if you really want to be pedantic about the expert bit, I have the qualifications on paper that say so (A grade double science GCSE with physics/electronics extra modules, A-level and higher qualifications in physics and electronics, etc.), but the truth is I'm not that good at it, being out of practice and never really having put what I learnt to use since leaving education almost 10 years ago. But my dad is technical manager for R&D at a well known UK company, so we have all the hardware here at home (yes, I'm living with my parents).
 
Last edited: