G80 specs finally confirmed! NOT old news!

Vista + high power consumption gfx card = problems. Since the Aero view will be running in 3D mode all the time, more heat from the gfx card means more heat in the case.

I can see lots of 7900 cards failing next year when Vista is hammering them 24/7 :D
 
Cyber-Mav said:
The 128-bit 256MB sounds like it could be a physics storage area, or something else such as a shader scratch area that games like Battlefield 2 benefit from.
Indeed it could; we can speculate as much as we like, but I'm betting on it not being something really cool like that.

Yup, no point releasing a DX10 card now, is there? :p

Have you seen the Vista prices? rofl.

Scratch what I said about £600+, it's gonna be £700+ :D
Indeed I have. I don't care about prices though; it's the best OS I've ever used (bar OSX with 2 gigs of RAM).

But money does grow on trees, so we can afford all this. :p :rolleyes:


*edit* Thought I might add this: with Vista using the GPU all the time, what's gonna happen to laptop battery life?

My M1710 already gets the award for the most pathetic battery life I've ever come across; with the GPU getting constant use it's going to be about 2.5 hours at the most :eek: (currently does 3.5hrs)
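
A rough back-of-the-envelope check on that estimate; the pack capacity and the extra GPU draw below are my own assumptions, not measured figures:

# Hypothetical laptop runtime estimate (Python); only the 3.5h figure comes from the post.
PACK_WH = 80.0                           # assumed ~80Wh battery for an M1710-class laptop
current_hours = 3.5                      # runtime quoted above
avg_draw_now = PACK_WH / current_hours   # ~23W average draw today
extra_gpu_draw = 9.0                     # assumed extra draw with Aero keeping the GPU busy
new_hours = PACK_WH / (avg_draw_now + extra_gpu_draw)
print(f"estimated runtime with constant GPU use: {new_hours:.1f} h")   # ~2.5 h

With those assumed numbers the maths lands right on the ~2.5 hour figure above.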
 
This is all really exciting stuff (not meaning to dampen people's spirits), but if you get a G80 you'll have to get Vista too... won't you? And that doesn't come out till December :(
 
AWBbox said:
This is all really exciting stuff (not meaning to dampen people's spirits), but if you get a G80 you'll have to get Vista too... won't you? And that doesn't come out till December :(

Vista is around Jan/Feb time, and you only need it for DX10; the G80 does DX9 fine under XP. :)
 
Confusion said:
Just found out you need an 850W PSU for SLI :eek:
The whole thing is, do you really need SLI? If this thing is as fast as 2 GX2s, why on earth would you want to SLI them? Once the cards go to 2nd gen, power should come down.
Now that Nvidia has opened up on specs, I would suggest ATI won't be too far down the road, if only to stop prospective buyers buying Nvidia rather than ATI.
Everyone needs to remember this is Nvidia's first go at unified shaders... ATI has done it before with the Xbox 360.
 
Gashman said:
bet you'd be envious if his **** was really that large
Nah, my G80 epenis would make up for it.

A 7950GX2 had 556 million transistors for 143W or so. Going on the PSU requirements, this thing pulls 190/200W. Assuming it's the same process with a similar efficiency as the G71 (similar clocks, memory etc.), a guesstimate of ~3.8 million transistors per watt says over 700 million. Maybe unlikely, but not impossible... Good for the water cooling manufacturers!!
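
Just to show the working behind that guesstimate (all the figures are the rough numbers from the post above, not confirmed specs):

# Back-of-the-envelope transistor estimate (Python) using the post's own rough figures.
gx2_transistors = 556e6     # 7950GX2 transistor count
gx2_power_w = 143           # rough 7950GX2 power draw (W)
transistors_per_watt = gx2_transistors / gx2_power_w    # ~3.9 million per watt
for g80_power_w in (190, 200):                          # guessed G80 draw from PSU requirements
    estimate = transistors_per_watt * g80_power_w
    print(f"{g80_power_w}W -> ~{estimate / 1e6:.0f} million transistors")
# prints roughly 739 and 778 million, i.e. "over 700 million"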
 
LoadsaMoney said:
Nope, I'm actually gonna buy Vista and a DX10 card, SHOCK HORROR, but when something's using it. :p

I'm not wasting all that for faster DX9. :D


But then us lot will be discussing GFX, shaders, and differences (if any) in the new tech.

By the time you buy, us lot will have 12 months' experience with the cards and can make valid judgements based on experience.


You will be a DX10 virgin and comment, not from experience, but from conjecture. :p
 
I think those 128 shaders are actually shader "processors", not physical pipelines, like with the X1900, which has 48 pixel processors and 16 pipelines (i.e. 3 per pipeline).

So it may be a case of 32 pipelines on the G80 with 4 shader processors per pipeline, or possibly a 2:1 config, but I doubt it. Think about it: since the GeForce 4 we've had 4 > 8 > 16 > 24, and now it's gonna be 128??? Even if it is a unified shader architecture, it's still a stupendous jump. When the X1900 came out I was baffled at the huge jump to 48 pipelines that many sites were claiming, but it turned out to be a load of misleading cack in the end. :rolleyes:
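
For the sake of argument, here's a quick sketch of how a headline 128-shader figure could split into pipelines x processors per pipeline; purely illustrative arithmetic, nothing here is a confirmed G80 layout:

# Possible pipeline/processor splits (Python) for a 128-shader total,
# in the style of the X1900's 48 processors / 16 pipelines = 3 per pipeline.
TOTAL_SHADERS = 128
for per_pipeline in (2, 3, 4, 8):                # ratios worth considering
    if TOTAL_SHADERS % per_pipeline == 0:        # 3 doesn't divide 128, so it drops out
        pipelines = TOTAL_SHADERS // per_pipeline
        print(f"{pipelines} pipelines x {per_pipeline} processors = {TOTAL_SHADERS}")
# gives 64x2, 32x4 and 16x8, i.e. the 2:1 and 4:1 configs mentioned above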
 