
G80 specs finally confirmed! NOT old news!

Gashman said:
I agree completely, NVIDIA's so-called G80 specs never showed GDDR4. The thing is, that card is stupid. I mean, have you seen the size of the thing? To have SLI you're gonna need a small room in your house just for the cards, and TWO 6-pin PCI-E connectors, which equates to some 200W+ or something. Another beef: look at the size of the heatsink, so I'm assuming this GPU does not run cool; in fact I'd suggest the exact opposite. I can only hope ATI don't make a complete hash of it like this, because I don't have a huge case; in fact I have trouble fitting SLI-ed 7600GTs in it without it becoming VERY crowded. NVIDIA are clearly not thinking here: enormous card, enormous power consumption, totally enormous heatsink. They're not going small. Hope the red camp does better :)
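As a rough sanity check on that "200W+" figure, here is a minimal sketch of the board power budget, assuming the commonly cited PCI Express limits of 75 W from the x16 slot and 75 W per 6-pin auxiliary connector (those limit figures are my assumption, not stated in the thread):

```python
# Rough upper bound on board power for a card with two 6-pin
# PCI-E connectors, using assumed PCIe spec limits:
#   - PCIe x16 slot: up to 75 W
#   - each 6-pin auxiliary connector: up to 75 W
SLOT_W = 75
SIX_PIN_W = 75
num_connectors = 2

max_board_power = SLOT_W + num_connectors * SIX_PIN_W
print(max_board_power)  # 225
```

So a two-connector card can in principle draw up to about 225 W, which is consistent with the "200W+" estimate above, though actual draw is typically below the connector maximums.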
And you think Ati's power consumption is going to be lower?
 
This thread is starting to reek of fanboys now.

A new card comes along and NV releases it and the ATI fanboys see it as the perfect opportunity to diss the card.

And the only thing that they can knock is the power requirements.

How apt!

ATI have clearly been behind in this area for quite some time so give it a rest please.

As you will only look silly when R600 is released and have to wipe the egg off your faces. :D
 
ernysmuntz said:
Other way round these days.
Last I checked, Nvidia built quieter graphics cards with much less heat output. How is it the other way around? Like I said, the Nvidia owners who have constantly berated ATI for such things will suddenly think it's not a big deal.

easyrider said:
A new card comes along and NV releases it and the ATI fanboys see it as the perfect opportunity to diss the card.

And the only thing that they can knock is the power requirements.
A little reading comprehension goes a long way. Nowhere have I been "knocking" the G80, apart from saying I will wait and see how the refresh is, as I am careful about new tech. What I did say is that the fanboys will change their tune now that Nvidia cards suddenly require a lot more power to run.

See the difference here? Not knocking the product, knocking the fanboys. I don't see you complaining about how much power this will need or how much heat it might generate, which is a first for you as it's usually the first two things you mention when an ATI card is being discussed.
 
Last edited:
Ulfhedjinn said:
A little reading comprehension goes a long way. Nowhere have I been "knocking" the G80, apart from saying I will wait and see how the refresh is, as I am careful about new tech. What I did say is that the fanboys will change their tune now that Nvidia cards suddenly require a lot more power to run.

Ulfhedjinn said:
ROFLMFAO! :D 10/10 for genius!

Hmm :rolleyes:

It's a new-gen card. Of course it's gonna need more power. If it offers the performance of two 7900 GTXs then who cares?


It can't be compared to the 7 series or X1900 series because, as well you know, on this product's release those cards will be old hat.

I'll be waiting eagerly to see if R600 uses less power and is faster than the 8800, though.

Somehow, going on ATI's designs, heat and power requirements are not something they are very good at.

Maybe they will pull their socks up with the release of the R600. We will see.
 
Last edited:
easyrider said:
So because I laughed at a picture that is obviously very funny, quite clever even, it means I am bashing the card? You are far too sensitive, you might want to seek counseling before someone actually does make fun of the card and sends you over the edge. :rolleyes:
 
Last edited:
Ulfhedjinn said:
So because I laughed at a picture that is obviously very funny, quite clever even, it means I am bashing the card? You are far too sensitive, you might want to seek counseling before someone actually does make fun of the card and sends you over the edge. :rolleyes:

LOL

Me sensitive?

I don't care or get precious about PC hardware. I don't keep hardware
that long to care about it. I move on.

If I remember correctly, it was you who was getting upset when I stated that the X1900 will be seen as old on DX10 cards' release. :D


My point is that too many people are too quick to judge, when the power requirements may not be that big in relation to the performance the card offers.
 
easyrider said:
Me sensitive?
I laughed at a clever parody, you accused me of "dissing" the card itself. I call that sensitive.

easyrider said:
If I remember correctly, it was you who was getting upset when I stated that the X1900 will be seen as old on DX10 cards' release. :D
It will be seen as an old card by you, not by anyone with any sort of grip on reality. Also, if I remember correctly you wouldn't confirm that by your own logic the 7950GX2 would also be "old" when DirectX 10 cards are released. ;)

Talk about double standards.

easyrider said:
My point is that too many people are too quick to judge, when the power requirements may not be that big in relation to the performance the card offers.
Yes, you are indeed quick to judge. I mean, goddess forbid I laugh at a clever image. :rolleyes:

The 8800GTX and 8800GTS interest me; there is no bias here. I would be interested in buying one if it weren't brand-new tech that hasn't had the waters tested, and if it didn't look like it might cost too much for my tastes, so I don't know how you can accuse me of "dissing" it at all.

I fully intend to look at the refresh (and the refresh of R600, not R600 itself though) as possible next purchases.
 
Last edited:
Adam Senior said:
erm... is this a computer forum or a get-ya-handbags-out forum?
You must be new here, but I doubt it with a 2003 join date.

Yes the graphics forum is the "get your handbags out" forum, A.K.A. the sewer of OcUK.
 
Ulfhedjinn said:
It will be seen as an old card by you, not by anyone with any sort of grip on reality. Also, if I remember correctly you wouldn't confirm that by your own logic the 7950GX2 would also be "old" when DirectX 10 cards are released.

Talk about double standards. ;)


I suggest you look back at my previous posts. I said that the GX2 will be old too, as the 8800 GTX will offer SLI performance as a single-card solution.

That's the reason I will be getting one.

When the 8800 GTX is released, my GX2 will be old. It will not have DX10 and will be slower in games.
:D
 
easyrider said:
Somehow, going on ATI's designs, heat and power requirements are not something they are very good at.
There's a laugh.
Don't get me wrong, I have no problem with picking out flaws in both companies' products.
But to claim that ATI's design teams can't design an efficient product is nothing more than ignorance. Considering the whole 5800 "incident", it's a little rich. :p
Neither team is better or worse than the other. They all have their phases. We've seen nVidia have badly designed, hot-running, loud cards, and we've seen the same from ATI as well. At the moment it's ATI with the hot-running cards. It'll flip-flop back over soon enough though.
 
BoomAM said:
But to claim that ATI's design teams can't design an efficient product is nothing more than ignorance. Considering the whole 5800 "incident", it's a little rich. :p
Or, in more recent memory, the whole 7900GT incident. :p

easyrider said:
Wrong post lol

;)
Really, so you're denying you said that? Bravo! Here is your pointy hat with a D on it, go and sit in the corner. :D
 
What the hell's with this forum? One minute you're all going on about how increased power consumption is bad and manufacturers are getting it wrong, going the wrong way, using more power, and so on, and now people are starting to say don't judge a card by how much power it consumes.

I'm not an NVIDIA or ATI fan; I have used NVIDIA cards almost exclusively to date. It's just that I'm shocked these cards are going to use so much power, when AMD and Intel are realising their designs need to be more power efficient. Also, the size of the thing is, I'm sorry, just utterly shocking: there's no need for a card to be that big. That's even bigger than an X1800 by the looks of things, and I have problems getting cards that big into my case easily.

You can't seriously tell me you're all overjoyed and incredibly happy that these cards need two connectors, meaning they soak up much more power than the generation you have now. The only reason I have to say "I hope R600 consumes less power" is that ATI's Xenos processor has a supposedly lower power consumption, and R600 is similar to it in more ways than not. So quit the "god I'm sick of fanboys saying this" and "god, typical fanboy, dissing a card" stuff and let's just stick to discussion.
 
Ulfhedjinn said:
So because you changed your tune after I pointed out the obvious flaw in your logic, that means you didn't say it at all? :rolleyes: See my previous post, the quote is right there with a link and time stamps do not lie.

No doubt you will continue to cover your arse, and I will sit mildly entertained. :)


I'm not covering anything.

And to be quite frank, I'm unsure what your point is.

Just to make it clear: when DX10 cards are released, DX9 cards will be old. This cannot be denied.
 