R600 insane 250W requirement!

This is unbelievable. I finally upgraded from a 400W PSU to a 535W one, and now I'd have to upgrade again if I choose to buy that bad boy. I don't understand: building on a much more efficient 80nm process should mean lower power draw, but that isn't the case. How come?
 
Because it has a massive number of transistors. Efficiency per transistor can increase while total consumption rises with overall die size. A straight die shrink on its own would indeed lower consumption.
 
Well, it's not really as simple as that. Better efficiency means less is wasted, but remember the GPU will be the most complex integrated circuit available to us home users.

Imagine two CPUs; we'll call them A1 and B1. Both are identical except that one (B1) is built on a smaller process than the other. All things being equal, B1 would draw less current, waste less and produce less heat. Now imagine doubling that CPU's complexity: it's now twice as fast with twice as many transistors; we'll call this one B2. B2 is still 'more efficient' than A1, but it's now drawing double the current it used to, and in the process drawing more than A1 does and putting out more heat...


...you get the idea. Building on ever smaller processes does improve the efficiency of the circuits, but there are always going to be stumbling blocks, such as power leakage. The smaller you make a transistor, the thinner you have to make its walls, and the thinner they are, the more current it 'leaks' and so the more current a CPU/GPU will waste.

Remember, they aren't in it to make things energy efficient. They are doing it to produce ever more complex and faster circuits on a single die.
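
To put rough numbers on the A1/B1/B2 example, here's a quick Python sketch using the standard dynamic power relation P ≈ N x C x V^2 x f. Every figure in it is made up purely for illustration; none of it is real chip data.

    # Back-of-the-envelope dynamic power: P = N * C * V^2 * f
    # (N transistors, switched capacitance C per transistor,
    #  core voltage V, clock frequency f). Illustrative numbers only.

    def dynamic_power(n, cap, volts, freq):
        """Approximate dynamic power draw in watts."""
        return n * cap * volts**2 * freq

    # A1: the older, larger process
    a1 = dynamic_power(100e6, 1.0e-15, 1.3, 500e6)   # ~85 W

    # B1: the same design shrunk; assume the shrink trims capacitance
    # and lets the voltage drop slightly, so B1 beats A1
    b1 = dynamic_power(100e6, 0.7e-15, 1.2, 500e6)   # ~50 W

    # B2: B1's process with double the transistors; per-transistor
    # efficiency is unchanged, yet total draw now overtakes A1
    b2 = dynamic_power(200e6, 0.7e-15, 1.2, 500e6)   # ~101 W

    print(f"A1 {a1:.0f} W, B1 {b1:.0f} W, B2 {b2:.0f} W")

Same relation, different inputs: the shrink wins per transistor, the doubled transistor count loses overall.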
 
LoadsaMoney said:
Where have ATi said it requires 250W? When did I miss this? Last I heard it was supposed to use less than G80. :confused:


What's new?

ATI always make hotter, louder and more power-hungry cards :p
 
LoadsaMoney said:
Where have ATi said it requires 250W? When did I miss this? Last I heard it was supposed to use less than G80. :confused:
No, the last reports still had more than one PCIe power connector, and that puts it over 150W. Two connectors tops out at 225W; I can't see it being any more than that, or they would have to wait for PCIe 2.0.
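
For reference, the arithmetic behind those figures comes straight from the PCIe power limits of the day: 75W from the x16 slot itself plus 75W per 6-pin auxiliary connector. A trivial sketch:

    # PCIe 1.x power budget limits
    SLOT = 75      # watts available from the x16 slot
    SIX_PIN = 75   # watts per 6-pin auxiliary connector

    print(SLOT + SIX_PIN)      # 150 W with one connector
    print(SLOT + 2 * SIX_PIN)  # 225 W with two connectors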
 
easyrider said:
What's new?

ATI always make hotter, louder and more power-hungry cards :p

[Image: Sparkle GeForce FX 5800 Ultra]
 
That's too old.
Leaked specs in January said it would have two 6-pin PCIe connectors, same as the 8800GTX, so it can't draw more than 225W.
 
They should give some consideration to the environment when they design 'better'-performing components that consume more power. At the end of the day it's not a necessity; it's a hobby for most, whereas in some countries generating that amount of energy is tough.
Ah well, the thing I'm most looking forward to is AMD's 'Fusion'.
:)
 
I take your point. I've a neighbour with 4 x 1000W lights around their house at night. Would it kill them to fit PIRs? No. Now that's irresponsible environmental pollution. In the scheme of things, the few hundred owners and the few extra watts won't add up to much. Personally I saved 1kW of lighting by changing to LED fittings, more than enough to offset a few hours of gaming.
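
Rough numbers on that offset, taking the 250W from the thread title as the card's worst-case extra draw: every hour those old 1kW fittings would have been on now saves 1kWh, while an hour of gaming at an extra 250W costs 0.25kWh, so each hour of lighting saved pays for about four hours of play.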
 