Enhanced 5970 out soon?

I wish you wouldn't post stuff like that as now I'm starting to consider it... :(. Do I really want to risk a £550 card for a bit more overclocking headroom? If I do then I'll probably wait until the next gen of DX11 cards are out (from both nVidia and ATi), so that if anything goes wrong I can use it as an excuse to upgrade... and hopefully prices will be cheaper by then too. Grrr.

You should get it, then give me your old one :D
 
Funny how when we were talking about nVidia everyone was very down on how exceeding 300 watts would cause such big problems, never be "allowed", not within the standard etc., and here we have a card that's clearly going to exceed it and everyone's all hyped about it... love the double standards.

Well for one this card is going to be produced in very small numbers and will most likely be bought by people that have had a frontal lobotomy.

It's not really an official ATI product; ATI pushed the power limits as much as they dared with the 5970. It's an extreme edition of a card that they've made by choice and nothing more. Nvidia, on the other hand, seems not to have been able to do anything about the power consumption of their forthcoming cards, and when it comes down to it we're still talking about a single Nvidia core whoring up 250-300 watts, as opposed to it taking two ATI cores to do that.
 
Looks like Arctic Cooling are claiming their 5970 cooler (the one on the 4GB Sapphire 5970 OC) will knock an incredible 44°C off the GPU temps versus the stock cooler. That is pretty extreme if true:

http://www.fudzilla.com/content/view/17936/1/

Sounds credible to me - looks very similar to the aftermarket cooler they released for the 4870X2 (albeit FAR too late - it was vapourware for most of the card's life!), and similarly dramatic temperature reductions were achieved there.
 
Funny how when we were talking about nVidia everyone was very down on how exceeding 300 watts would cause such big problems, never be "allowed", not within the standard etc., and here we have a card that's clearly going to exceed it and everyone's all hyped about it... love the double standards.

Sorry, but 2x GTX 280 is well over 400W and 2x 5870s is well over 300W.

Notice a pattern: two GPUs, twice the performance, more power. Who has an issue with that? One GPU, 300W, same performance... :( sad. 2x GTX 480s at around 600W on their own under load, a full system not far off 1kW under load, for not much more performance than a 5970 can give at maybe 350-400W. Let's also keep in mind we don't know for sure these are over 300W.
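
To make the arithmetic behind that explicit, here's a quick back-of-envelope sketch (Python). Every wattage and performance figure in it is an illustrative guess loosely based on the rough numbers in this post, normalised so one 5870 = 1.0 performance - none of it is a benchmark or an official TDP:

    # Back-of-envelope perf-per-watt using rough, illustrative figures only
    # (not benchmarks or official TDPs); performance normalised to one 5870 = 1.0.
    cards = {
        "5870, one GPU":           {"watts": 190, "perf": 1.0},
        "5970, two GPUs":          {"watts": 300, "perf": 1.8},
        "rumoured Fermi, one GPU": {"watts": 300, "perf": 1.8},  # the "one GPU, 300W, same performance" case
        "2x GTX 480, rumoured":    {"watts": 600, "perf": 2.2},  # "around 600W on their own under load"
    }

    for name, card in cards.items():
        ppw = card["perf"] / card["watts"] * 100
        print(f"{name:25s} ~{ppw:.2f} relative performance per 100 W")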

Because of the insane leakiness of the 40nm process, but also its extreme variability, there's quite a bit of scope for very heavy binning and finding some drastically lower-power cores. It's very possible that super-aggressive binning will mean a few cores a wafer, in fact maybe quite a few, can run at the same speed as a 5870 with lower voltage, dropping the wattage quite significantly. Transistor size variation and faulty vias are the biggest issues with the 40nm process; however, there have to be a few good dies per wafer, because how else would Nvidia get any working cores at all? They don't have backup vias and make no provision for variable-sized transistors, so some dies per wafer have to be next to perfect on both counts.
 
Well for one this card is going to be produced in very small numbers and will most likely be bought by people that have had a frontal lobotomy.

It's not really an official ATI product; ATI pushed the power limits as much as they dared with the 5970. It's an extreme edition of a card that they've made by choice and nothing more. Nvidia, on the other hand, seems not to have been able to do anything about the power consumption of their forthcoming cards, and when it comes down to it we're still talking about a single Nvidia core whoring up 250-300 watts, as opposed to it taking two ATI cores to do that.

Sorry, but 2x GTX 280 is well over 400W and 2x 5870s is well over 300W.

Notice a pattern: two GPUs, twice the performance, more power. Who has an issue with that? One GPU, 300W, same performance... :( sad. 2x GTX 480s at around 600W on their own under load, a full system not far off 1kW under load, for not much more performance than a 5970 can give at maybe 350-400W. Let's also keep in mind we don't know for sure these are over 400W.

Because of the insane leakiness of the 40nm process, but also its extreme variability, there's quite a bit of scope for very heavy binning and finding some drastically lower-power cores.

You're missing the point... everyone was up in arms about how the lawyers, standards, blah blah would mean there's no way a Fermi board exceeding 300 watts could possibly happen, and how much that fact was going to sink nVidia... no matter what was on the board, 1 core, 2 cores, 4 cores, whatever... everyone was saying that a PCI-e card exceeding 300 watts would absolutely break a standard that they'd obviously not bothered to even read*... strangely enough, when this card comes along none of the same voices are heard.

Double standards.



* The standard only sets out guidelines for what you can draw from the PCI-e slot and from the individual 6-pin and 8-pin connectors, plus an advisory on electrical and thermal management in a desktop system that needs to be adhered to for ratification purposes - nowhere does it forbid total board power draw above a set figure.
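
For anyone wanting to see where the oft-quoted 300W figure comes from, here's a minimal sketch (Python) that just adds up the per-source limits. The 75W slot / 75W 6-pin / 150W 8-pin values are the commonly quoted figures and are assumed here rather than lifted from the spec text:

    # Rough tally of the nominal PCI-e power budget described in the footnote above.
    # Per-source figures are the commonly quoted limits (assumed, not quoted from the spec):
    # 75 W from the x16 slot, 75 W per 6-pin connector, 150 W per 8-pin connector.
    SOURCE_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

    def nominal_budget(aux_connectors):
        """Slot power plus whatever auxiliary connectors the board carries."""
        return SOURCE_LIMITS_W["slot"] + sum(SOURCE_LIMITS_W[c] for c in aux_connectors)

    print(nominal_budget(["6-pin", "8-pin"]))  # 300 - a stock 5970 sits right on the figure everyone argues about
    print(nominal_budget(["8-pin", "8-pin"]))  # 375 - two 8-pins give headroom before exceeding any per-source limit

Exceeding that total doesn't break anything by itself - as the footnote says, it's the per-source limits and the thermal/electrical advisories that the spec actually concerns itself with.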
 
Purely because it's a 3rd party alteration on the reference design.

Nvidia's isn't.

Doesn't make any difference... if the standard were such a big deal and as strict as these people were making out, it would have been "thrown out" regardless.
 
Doesn't make any difference... if the standard were such a big deal and as strict as these people were making out, it would have been "thrown out" regardless.

Actually it does.

Huff and puff all you want. NV's card is "stock", if you will, and hasn't been released. This thing is based off a design already out. People did moan; if it was the 6970 then maybe. :p
 
Doesn't make any difference: if the standard were such an issue as people were making out when it suited them to slam nVidia, it would apply to anyone designing and producing PCI-e compatible add-in cards.
 
no matter what was on the board, 1 core, 2 cores, 4 cores, whatever....

Everything you say would be absolutely fine, Rroff, except for the fact that you have started to put this in there. No one would even blink if Nvidia produced a Fermi card with 4 cores that was over 300W; maybe if they couldn't produce a Fermi card with 2 cores that was under 300W there would be some griping and sarky "told you so"s from the more rabid ATI fans.

But what everyone was shocked about was >300W from a SINGLE-core card, which, although only outside some "guidelines", was still considered to be excessive.
 
Everything you say would be absolutely fine, Rroff, except for the fact that you have started to put this in there. No one would even blink if Nvidia produced a Fermi card with 4 cores that was over 300W; maybe if they couldn't produce a Fermi card with 2 cores that was under 300W there would be some griping and sarky "told you so"s from the more rabid ATI fans.

But what everyone was shocked about was >300W from a SINGLE-core card, which, although only outside some "guidelines", was still considered to be excessive.

+1
 
You're missing the point... everyone was up in arms about how the lawyers, standards, blah blah would mean there's no way a Fermi board exceeding 300 watts could possibly happen, and how much that fact was going to sink nVidia... no matter what was on the board, 1 core, 2 cores, 4 cores, whatever... everyone was saying that a PCI-e card exceeding 300 watts would absolutely break a standard that they'd obviously not bothered to even read*... strangely enough, when this card comes along none of the same voices are heard.

Double standards.




When it came down to it, most were saying it was ridiculous for a single-GPU card to draw that much given that ATI managed two cores within a 300 watt envelope; secondly, how would a dual-GPU card work and what would its power draw be? Even if they downclock the cores, a dual card is still going to be in the region of possibly 500 watts, which is absurd to say the least. You can get a whole decent system to run on 500 watts, never mind a single card.

Double standards nothing: this card is an extreme edition that isn't even an official ATI product. If it were ATI putting this card out themselves, I'd say people would be more vocal.
 
My wife was recently "enhanced", but it turns out she overclocks no better than the reference model. It still takes just as long for her to cook, clean and iron, but at least the aesthetics are more pleasing and the enhancement has provided a more solid build quality. Fortunately I do not hide her away inside a case (yet).

I find that "enhanced" products generally cost much more and do not really perform much better. Sure they can be good to look at (and play with) but if they spend most of the time hidden away what is the point? Most refererence parts will perform just as well given the right tools and encouragement.

LOL ;)

Interesting card, but I think I'll just stick with a 5870 seeing as I'm currently running a 1 screen setup...
 
Funny how when we were talking about nVidia everyone was very down on how exceeding 300 watts would cause such big problems, never be "allowed", not within the standard etc., and here we have a card that's clearly going to exceed it and everyone's all hyped about it... love the double standards.

Difference is, if this was a dual-Fermi GTX 480 it would be a 700-watt card. Different problem ;)
 
Very nice indeed, thanks for sharing.

My pleasure mate. Glad you liked it. Have you seen how thick the heatpipes on this thing are? It's hardcore!



http://www.vortez.co.uk/contentteller/articles_pages/cebit_2010_asus_hd5970_ares_preview,1.html
 