
Gibbo is getting an R600!!

nicolasb said:
In the video of AMD's recent press conference someone asks if it's true that R600 will use 270W. The reply is that in their demo setup each card is using roughly 200W. The demo setup was an R600 Crossfire rig doing GPGPU-type calculations, i.e. with all of the shaders working absolutely flat out to produce a total of 1 Teraflop.
Conflicting reports on that one. At another press conference AMD commented on the 270W figure directly. If you can survive the mind-numbing dreariness of it all, watch AMD Senior VP Rick Bergman talk about it around 8:10 in. A journalist asks if they're going to get the R600's 270W power-"guzzling" consumption down. The answer is that they didn't have all the "tools" to develop it at the time, and the next-gen R650 or "whatever" will use less power.

nicolasb said:
So, all in all, it looks as though the card probably won't ever pull more than 200W unless you're overclocking. If you do overclock then the consumption will get close enough to 225W that they make you use different connectors in order to maintain a sizeable margin for error.
The fact they used the 150W 8-pin plug at all means the card has the potential to pull more than 225W when overclocked. Interestingly enough, the new 8-pin PCIe plug doesn't actually use more power conductors than the older 6-pin. The 2 extra wires are commons, one of which is planned (not implemented) to be used as a handshake to the PSU to ensure adequate power levels. I see you've found the same interesting thread on the subject at JonnyGURU's forum. As the 2 extra wires are only commons, from a load point of view three PCIe 6-pins are not really needed to run the card. In fact 2x6P gives the same number/gauge of load conductors; all that is needed is a 6-to-8-pin adapter with 2 doubled-up ground wires. Personally I'd just make my own adapter.
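To put rough numbers on that, here's a quick back-of-envelope sketch using the usual spec figures (75W from the slot, 75W per 6-pin, 150W per 8-pin), not anything measured:

Code:
# PCIe power budget, spec figures only (not measured draw)
SLOT_W = 75         # PCIe x16 slot
SIX_PIN_W = 75      # 6-pin PCIe plug
EIGHT_PIN_W = 150   # 8-pin PCIe 2.0 plug

stock = SLOT_W + 2 * SIX_PIN_W                    # 2x 6-pin config
overclocked = SLOT_W + SIX_PIN_W + EIGHT_PIN_W    # 6-pin + 8-pin config

print("2x 6-pin budget:", stock, "W")             # 225 W
print("6-pin + 8-pin budget:", overclocked, "W")  # 300 W

Which is why fitting an 8-pin socket at all suggests headroom beyond 225W for overclocking.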

Jonny also comments on this in the Ultra X3 1000W review, which looks like a good Crossfire R600 option with a single 70A rail and all the right connectors.



nicolasb said:
The lack of 8-pin PCIe 2.0 connectors on most PSUs is very annoying, though. :mad: I'm hoping that the 850W Silverstone Olympia PSU (due out in early May) will have the right connectors and be of reasonably good quality. We'll see.
I was looking at the latest (May?) PCP&C 750W (single rail!!), which should have the 2x8P + 2x6P needed. The review also comments on the new PSU handshake. What I'd be more interested in is the fact that one R600 can get very close to the 240VA over-current trip limit of some 12V rails. Some older PSUs may have problems with R600 depending on the rail implementation, not so much the peak wattage.
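For anyone wondering why the 240VA limit matters more than the sticker wattage, a quick sketch (the ~200W card draw is the figure from the demo reports above, so treat it as an assumption):

Code:
# 240VA over-current limit on a single 12V rail vs an assumed ~200W card
VA_LIMIT = 240                 # ATX guideline: 240VA max per 12V rail
RAIL_V = 12.0

max_amps = VA_LIMIT / RAIL_V   # 20 A per rail
card_amps = 200 / RAIL_V       # ~16.7 A for a ~200W card

print("Rail limit %.0f A, card draw roughly %.1f A" % (max_amps, card_amps))
# Little headroom left on that one rail before OCP trips, regardless of total PSU wattage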
 
Not sure if this has been mentioned

R600XT wins in NFS Carbon



The new Radeon HD 2900 XT driver saves the day. Only with the new driver is the card really faster than the six-month-old Geforce 8800 GTX.


We only know a few examples: in Need for Speed Carbon the R600XT scores 10 to 17 percent better than the Geforce 8800 GTX. The previous 8.35 driver was not that good, and the R600XT was close to the G80 GTX but not better. With the new 8.361, which is supposed to be the launch driver, the scores are even better on ATI hardware.


Looks like ATI still has an ace up its sleeve.

http://www.fudzilla.com/index.php?option=com_content&task=view&id=525&Itemid=1

Also

R600 has 64 Unified Shaders

Partitioned in 4 blocks.


Everyone expects 128 unified shaders, but R600 doesn't have more than 64. It seems to be enough to give Nvidia's G80 GTX a run for its money. Nvidia has 128 unified shaders, but we are puzzled as to where the difference comes from.


The shaders are divided into four blocks, each with 16 units. We are still working to get the number of TMUs and ROPs. Well, we don't care as long as it runs faster. Stay tuned.


http://www.fudzilla.com/index.php?option=com_content&task=view&id=526&Itemid=1

tata NV.
 
What happened to the 320-shader thingy that someone else posted a few pages back?
G80 has 128 scalar ALUs. R600 has 64 ALUs, each of which is vec4+scalar. It's a bit like the difference between a conventional CPU instruction that takes a single piece of data and an SIMD instruction, if you know how that works.

In a best-case scenario each of ATI's 64 shaders can do 5 times as much per clock cycle as each of Nvidia's 128 shaders, so it's the equivalent of 320. Worst case is 2 times as much; but Nvidia's run at double the clock-speed. So, best case is ATI wins by 25%, worst case is Nvidia wins by 100% (if we assume that R600 core clock is half of Nvidia's shader clock - it will probably be slightly better than that).
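If it helps, here's the same best/worst case arithmetic written out, assuming (as above) that G80's shader clock is roughly twice R600's core clock:

Code:
# Best/worst case shader throughput, per R600 core clock
r600_alus, g80_alus = 64, 128

g80_ops = g80_alus * 1 * 2   # G80: 1 scalar op per ALU, 2 shader clocks -> 256

best = r600_alus * 5         # vec4+scalar fully used -> 320 (the "320 shader" figure)
worst = r600_alus * 2        # poorly vectorised code -> 128

print("best case  R600/G80 = %.2f" % (best / g80_ops))   # 1.25 -> ATI ahead by 25%
print("worst case R600/G80 = %.2f" % (worst / g80_ops))  # 0.50 -> Nvidia ahead by 100%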
 
Lolcb said:
Nonsense. If he said stuttering is due to driver, it doesn't matter what specs you have, cause performance will be as bad.

What is nonsense? I didn't make any statement for you to prove or disprove, I simply asked whether you have exactly the same specs as him. Do you?

If you want me to make a statement for you to contest then:

It is perfectly possible for drivers to be buggy in such a way that people with particular hardware experience huge problems while others with slightly different setups do not: read any thread about poor 8800 GTX drivers in Vista and you'll see that there are people who have no problems with them whatsoever, despite the fact that it is widely accepted that drivers for the 8800 series are very, very bad, particularly in Vista.

If we're still talking about the guy with 1GB RAM and a dated CPU, it's fairly likely that his performance issues are not related, or not related only, to poor drivers for his card; but how you can make a credible statement about the exact reason for his problems without having the same spec is beyond me.
 
I have to say that this 2900XT 512MB seems to be just what the market needs right now (the bang for buck and all that) and some stiff competition for NV, bringing prices down for us customers.
Looks like Nvidia's reign of terror could almost be over, especially if their new offering, the 8800 Ultra, is overpriced. So good news all round, I think.

Just make sure your PSU is up to it (small dig :rolleyes:)
 
fornowagain said:
I was looking at the latest (May?) PCP&C 750W (single rail!!), which should have the 2x8P + 2x6P needed.
Not easy to get hold of one of those in the UK. Even harder to get one at a less than insane price. :(

I wish Etasis would do a 2x8/2x6 version of their 750W and 850W models. That (or the Silverstone Zeus line, which is the same devices rebadged) would be a good deal.
 
Dutch Guy said:
If it is only slightly faster than a GTX why on earth does it need >225W when a 8800GTX needs 100W less while being on 90nm :confused:

To keep the Heatsink nice and toasty ;)

Seriously, I'm not even sure we have accurate power figures yet, just that it uses the new-spec power connectors.
 
Dutch Guy said:
If it is only slightly faster than a GTX why on earth does it need >225W when a 8800GTX needs 100W less while being on 90nm :confused:
Does the 8800GTX really only use 125W peak? I thought it was a bit higher than that. In any case, remember R600 will not use >225W except when significantly overclocked, so compare it with the power used by an equally highly overclocked G80, not a G80 at stock speeds.

I'm actually not that fussed by the power usage. There are two potential problems that using a lot of power can cause. One is that it can limit overclocking potential; the other is that it can require a very noisy cooling system (or possibly an inefficient cooling system that dumps heat inside the case). The evidence so far suggests that the R600 cooler is very efficient, shifts heat outside the case well, keeps the chip cooler in use than the G80, and is significantly quieter than the G80 equivalent; so that's not a worry. Overclocking potential is an unknown quantity, but I see no reason to panic about it yet; the 8800GTX doesn't overclock particularly well either, and R600 is (according to Gibbo) running cooler at stock speed.
 
All this moaning about power loads ...

Do you see monster truck or performance car enthusiasts moaning about fuel consumption? It will most likely be the fastest DX9/10 card on the planet, it's supposed to consume a lot of power. It should also make a lot of noise, and have some lights (preferably flashing) on it.

If it affects your neighbour's electricity supply when you turn it on, even better.
 