
Gibbo is getting an R600!!

speeduk said:
My brand new Corsair 620W PSU doesn't have an 8-pin thingy - am I buggered for overclocking?
Supposedly we'll be seeing 6-pin/8-pin adapters, so as long as the PSU is good enough then no.

Wish Gibbo would test it on a Corsair 520 and 620 to let us know. :p
 
IceShock said:
I thought an 8-pin thingy was included with the card, like a molex to 8-pin thingy? :confused:


I bet, though, that molex to 8-pin adapters will be on sale if they're not supplied.

I just figured that it would need more power than a molex could supply hehe.
 
You will be buggered, yes, but why not wait for the R650? The R600 is another X1800: it's going to have about the same lifespan (which was 2 months, btw) before they bring out its replacement, the 65nm R650. That's the one I'm waiting for. :D
 
speeduk said:
I just figured that it would need more power than a molex could supply hehe.
Er, yeah, a lot more than a molex. Six separate power conductors at least; on the 8-pin, the 2 extra wires just look like common returns. I'm wondering if one common is some sort of loop/report back to the PSU, as there is no extra power conductor. It just looks a bit odd from a power POV.

http://www.enermax.com.tw/english/upload/document/M20071199331439308.pdf



EDIT: So it looks like, in future, one of the commons does act as a sense wire.

JonnyGURU comments and here

Down the road, "300W PCI-e cards" are going to look for communication with the PWM via the +12V sense wire on the new 8-pin connector, and I think that's how the card's going to know whether an 8-pin is connected or not. I think the R600 cards are going to just look for that extra +12V or ground and that's it. Otherwise the adapters wouldn't work, because you can't just add a +12V sense wire to an existing connector.

So a "mod" without using two 6-pin connectors and/or an adapter would be easy. Just take a spare cable and cut the connector off the end. Trim the unused wires short and plug the +12V and ground into the remaining two pins of the video card. Done.

Of course, the adapters are going to work fine because with two 6-pin connectors, you can deliver almost twice the juice and only really need another 6.25A of juice to get to the card at the absolute, balls to the wall, maximum.

And even then, I don't think the cards are really going to need THAT MUCH juice. I think the big hold-up is going to be just in getting the card to recognise that you're providing juice and ground on all eight pins of the connector.
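For what it's worth, that 6.25A figure above falls straight out of the standard PCIe power budgets (75W from the x16 slot, 75W per 6-pin connector, 150W per 8-pin). A quick sketch of the arithmetic, assuming the nominal 12V rail:

```python
# Rough PCIe power-budget arithmetic behind the "6.25A" figure quoted above.
# Wattages are the nominal PCIe connector ratings, all on the 12V rail.

SLOT_W = 75.0        # power available through the PCIe x16 slot itself
SIX_PIN_W = 75.0     # 6-pin auxiliary connector
EIGHT_PIN_W = 150.0  # 8-pin auxiliary connector
RAIL_V = 12.0        # nominal rail voltage

# A "300W card" fed by slot + 6-pin + 8-pin:
full_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W

# With two 6-pin connectors instead (e.g. via an adapter):
two_six_pin = SLOT_W + 2 * SIX_PIN_W             # 225 W

# The shortfall, expressed as current on the 12V rail:
shortfall_w = full_budget - two_six_pin          # 75 W
shortfall_a = shortfall_w / RAIL_V               # 6.25 A

print(f"Shortfall: {shortfall_w:.0f} W = {shortfall_a:.2f} A at 12 V")
```

So two 6-pin connectors leave you 75W short of the absolute maximum, which is the "another 6.25A of juice" in the quote.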
 
Fusion said:
http://www.fx57.net/?p=576

Not precise scores, but somewhat of an indication.

Nope, that tells us next to nothing. There's no way to know how CPU-limited either card is in that situation. For instance, with a 4GHz Kentsfield the 8800GTX might hit something like 13k and the R600 might hit 18k (I really don't think it would get that high); or, for that matter, the 8800GTX might score higher while the R600 barely gains anything, putting them closer. 3DMark, in all its versions, is a total-system benchmark and, while useful up to a point, to see how just the cards compare you have to whack up the resolution, AA and AF so they get much lower scores, then compare the two cards there.

Also, some cards work better in certain AA or AF modes; maybe the R600 takes a bigger hit at maximum AA than the Nvidia card does, for all we know.
 
I really doubt ATI would be so silly as to not make adapters work. I have two 6-pin plugs on my 600W and they will both plug into the two slots on the card.

I'm sure it will be quite happy.
 
LoadsaMoney said:
You will be buggered, yes, but why not wait for the R650? The R600 is another X1800: it's going to have about the same lifespan (which was 2 months, btw) before they bring out its replacement, the 65nm R650. That's the one I'm waiting for. :D

Me too, looks like this one is going to be a tad juicy for my Shuttle.

Any precise dates on the 65nm?
 
LoadsaMoney said:
but why not wait for the R650 as the R600 is another x1800, its going to have about the same lifespan (which was 2 months btw), as they are going to bring out its replacement the R650, the 65nm one, thats the one im waiting for. :D
LOL... wait 6 months for the R600... then, when it comes out, want to wait longer for the R650...

Wait another year or two and there should be something like an R700, and then an R800...


But I hear it's best to wait for the R1000...
 
Thank you Gibbo for testing it and giving us what details you can. Appreciated ;).

I'm liking the look of this card. I am impressed with your careful comments made so far.

easyrider said:
8900 will be here by then....

oh dear :D

Anyone got any insect repellent handy? :rolleyes:

Some flies need a swattin'.

Seriously though, Easy, you've had the same outlook all through this thread. The thing I don't understand is why. Why have all these negative vibes about a company as great as Nvidia?

Yes, Nvidia has had the G80 out a lot longer than the R600, but with flaky support. What Nvidia owner who's having problems with Vista and their G80 isn't going to look at the quieter, slightly faster (maybe) card, with image quality improved beyond the X1900's already great IQ, and SUPPORT FOR VISTA (oh, and better priced)? If ATI have better support in Vista than Nvidia at launch, then it's a sad, sad day for Nvidia. Loadsamoney is a great example, as he owns both the X1800XT and the 8800GTS. If the R600 is competitively priced towards GTS prices, then there's one Nvidia changeover that'll be turning back again to ATI. Who wouldn't?

I was about to buy the 8800GTS last month but, luckily, I've been helping my mother and sister move house, so I'm still in the market to get one. ATI might have just snagged me back when I was about to go the GTS route, as I'm still mainly using XP.

Nvidia has had the last 6-7 months to shine and really give ATI/AMD a hard time, but has failed to do so in the world of Vista. If the R600 comes out at GTS prices, then it's "game on".

Oh, and Easy? Stop being childish about it all. It's a new card that should have been out a long time ago. We all know this (bad, bad AMD/ATI!!! - there, I've told them off :p). I never hear you crying about how badly Vista and the G80 go together :rolleyes:.
 