G80 specs finally confirmed! NOT old news!

The Asgard said:
Wait for the R600. Your GX2 will cut it till then.

I would probably go with the R600 over the G80; I've had some naff NVIDIA hardware over the years, the FX series :rolleyes:. Also, I assume you like the Stargate franchise? Have you seen the MMORPG Stargate Worlds? And series 10 is killer so far :D I'm a Stargate, and sci-fi in general, fan myself.
 
Gashman said:
I would probably go with the R600 over the G80; I've had some naff NVIDIA hardware over the years, the FX series :rolleyes:. Also, I assume you like the Stargate franchise? Have you seen the MMORPG Stargate Worlds? And series 10 is killer so far :D I'm a Stargate, and sci-fi in general, fan myself.

You assume correctly, m8. A big fan!
 
I was going to wait for the R600 but, tbh, I really haven't enjoyed my spell with ATi. I'd had nVidia cards all the way until I got my X1800XT, but what with the supposed "better image quality" and HDR+AA advantages, I thought I'd give them a shot.

But their drivers are just bloatware; why do they insist on making such a hog like CCC when they were doing fine with the old style? And although that's the only fault with them, I didn't have any faults like that with nVidia, so I will be buying the G80 when it is released.
 
The Asgard said:
Wait for the R600. Your GX2 will cut it till then.

I presold my GX2 to a mate on the strength of the G80. If it isn't pulling much more than my GX2 - I wonder if it is much faster in DX9. DX10 obviously does not concern me right now. But for 500 quid+ I would hope it would be a big improvement in DX9.

Also, slightly off topic, but every bench I see shows Crossfire beating SLI. Why do so many people prefer SLI, then? Is it more stable? Does it scale better at higher resolutions? Are there any issues with Crossfire?
 
Steedie said:
But their drivers are just bloatware; why do they insist on making such a hog like CCC when they were doing fine with the old style? And although that's the only fault with them, I didn't have any faults like that with nVidia, so I will be buying the G80 when it is released.
Don't install CCC then.
I've been using the normal display control panel for ages with the Cats.
But, IIRC, they've stopped updating the display control panel, probably in favour of getting people to use CCC. But you could probably source the old display control panel software and use that perfectly fine. Or if not, just use ATI Tools instead.
 
Flanno said:
Also, slightly off topic, but every bench I see shows Crossfire beating SLI. Why do so many people prefer SLI, then? Is it more stable? Does it scale better at higher resolutions? Are there any issues with Crossfire?

Well, for me it's control. You can alter or create new profiles for any games that are not supported. With Crossfire it either works or it doesn't.
 
Flanno said:
I presold my GX2 to a mate on the strength of the G80. If it isn't pulling much more than my GX2 - I wonder if it is much faster in DX9. DX10 obviously does not concern me right now. But for 500 quid+ I would hope it would be a big improvement in DX9.

Also, slightly off topic, but every bench I see shows Crossfire beating SLI. Why do so many people prefer SLI, then? Is it more stable? Does it scale better at higher resolutions? Are there any issues with Crossfire?
Crossfire is relatively new compared to SLI, so more people adopted SLI. But you're right in that an X1900 XT or X1950 XT Crossfire setup thumps SLI, even quad SLI, in most games; in fact I think FEAR is the only game where quad SLI will beat them.
 
BoomAM said:
Don't install CCC then.
I've been using the normal display control panel for ages with the Cats.
But, IIRC, they've stopped updating the display control panel, probably in favour of getting people to use CCC. But you could probably source the old display control panel software and use that perfectly fine. Or if not, just use ATI Tools instead.

I've tried, mate; I used ATI Tools and also ATi Tray Tools, but neither of them lets me overclock my card without producing artifacts. It's weird: only CCC lets me go up to XT PE speeds.
 
LoadsaMoney said:
That's some bloatware then: Nvidia drivers = 64MB, ATi drivers = 13.3MB. Jesus, I didn't know ATi drivers were that huge compared to Nvidia ones; that's it, the X1800's going, Nvidia here I come. :eek:

Yeah, but running the Nvidia control panel uses bugger all memory and has one service running, whereas currently with CCC running I have three instances of CLI.exe and two of atievxxx.exe running, and it always takes an age to load up.

So sorry that I hurt your little ATi ego, but it's the truth, and it's not just me who thinks it.

So :) sorry that you were wrong.
 
LoadsaMoney said:
That's some bloatware then: Nvidia drivers = 64MB, ATi drivers = 13.3MB. Jesus, I didn't know ATi drivers were that huge compared to Nvidia ones; that's it, the X1800's going, Nvidia here I come. :eek:


It's not the size of the drivers that matters.

It's how many processes they need to run.
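If you want to check that for yourself, here's a rough sketch that counts the driver processes by name, assuming Python with the third-party psutil package installed; the process names are just the ones mentioned in this thread, so adjust them for your own machine:

```python
# Count running instances of the driver processes named in this thread.
# Requires the third-party psutil package (pip install psutil).
import psutil
from collections import Counter

# Names taken from the posts above; adjust for your own system.
WATCHED = ["cli.exe", "atievxxx.exe"]

counts = Counter(
    (proc.info["name"] or "").lower()
    for proc in psutil.process_iter(["name"])
)

for name in WATCHED:
    print(f"{name}: {counts.get(name, 0)} instance(s)")
```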
 
Hmmm... my current rig gets around 9700 in 3DMark with my GX2 at 600/790 and my CPU at stock (2.93GHz). I have not overclocked my Bad Axe yet.

Compare that to a rig I borrowed from work with 2 x 3.0GHz Xeon 5160s (Woodcrests) at 1333FSB: the same video card at the same speed pulled over a thousand more 3DMarks.

Both results had almost identical benches for the video test. The CPU test on the quad core was twice as high as on the dual core, though, which explains the increase in 3DMark. So when The Inquirer says that the G80 gets 10.5k with a Core 2 Duo and almost 12k with a quad-core Kentsfield, I definitely believe them. Of course, if the game is not multi-threaded it will make very little difference.
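3DMark's real scoring formula is more involved, but a toy weighted composite (my own illustration, not Futuremark's maths, with made-up subscores) shows how a doubled CPU subscore can add roughly a thousand points while the graphics tests stay flat:

```python
# Toy illustration of a composite benchmark score: NOT Futuremark's
# actual formula, just a weighted sum to show the CPU-scaling effect.
def composite(gpu_score: float, cpu_score: float,
              gpu_weight: float = 0.85, cpu_weight: float = 0.15) -> float:
    """Blend GPU and CPU subscores into one overall number."""
    return gpu_weight * gpu_score + cpu_weight * cpu_score

# Hypothetical subscores: same GPU result on both rigs,
# CPU subscore doubles on the Woodcrest machine.
dual_core = composite(gpu_score=10000, cpu_score=7000)
quad_core = composite(gpu_score=10000, cpu_score=14000)

print(dual_core)               # 9550.0
print(quad_core)               # 10600.0
print(quad_core - dual_core)   # 1050.0 -- roughly the gap described above
```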
 
Some more of the same. I'll stick it all up here just in case it gets pulled.

On 8 November 2006 Nvidia will unveil its next graphics chip generation, the GeForce 8. PC WORLD has now been able to examine the first reference cards carrying the GeForce 8800 GTX and the GeForce 8800 GTS.

(Taipei, Taiwan) Nvidia has delivered the first reference cards with the DirectX 10 graphics chip GeForce 8800 GTX, as well as its little brother the GeForce 8800 GTS, to its Taiwanese partners. PC WORLD was on site to get a first impression of the new super-chips.

According to our sources, who have been able to test the GeForce 8 for two days, the GeForce 8800 GTX is approximately 30 per cent faster than ATI's current flagship, the Radeon X1950 XTX. The 3D performance of the GeForce 8800 GTS, meanwhile, is said to be on the level of ATI's top model. It can be assumed, however, that 3D performance will rise further as Nvidia optimises the ForceWare drivers for the GeForce 8 family over the coming months.

Moreover, those percentage figures refer to 3D applications under DirectX 9. The GeForce 8 family should offer even more 3D performance in games that support DirectX 10. One of the first DirectX 10 games, for example, is the first-person shooter Crysis, announced for the turn of the year.

----------------------------

Monster card: What immediately catches the eye on the top model GeForce 8800 GTX is the oversized PCB (printed circuit board), a full 260 millimetres long. Likewise remarkable is the bulky cooler housing, which accommodates two heat pipes and drives the card's weight to over 800 grams.

The GTX flagship also sports no fewer than two 6-pin power connectors. That is no luxury, because the graphics card consumes up to 200 watts under full load. For stable operation, Nvidia therefore prescribes at least a 450-watt power supply delivering at least 30 amps on the 12-volt rail (GeForce 8800 GTS: 400 watts/26 amps).
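As a back-of-the-envelope check on those figures (my own sums, assuming the card draws essentially all of its power from the 12-volt rail):

```python
# Rough check of the PSU recommendation; assumes the card pulls
# essentially all of its power from the 12 V rail.
CARD_WATTS = 200.0        # GTX draw under full load, per the article
RAIL_VOLTS = 12.0
RECOMMENDED_AMPS = 30.0   # minimum 12 V amperage Nvidia asks for

card_amps = CARD_WATTS / RAIL_VOLTS          # I = P / V  ->  ~16.7 A
rail_watts = RECOMMENDED_AMPS * RAIL_VOLTS   # P = I * V  ->  360 W

print(f"Card alone needs about {card_amps:.1f} A on the 12 V rail")
print(f"The recommended 30 A gives {rail_watts:.0f} W on that rail,")
print(f"leaving about {rail_watts - CARD_WATTS:.0f} W for the rest of the system")
```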

Nevertheless, the GeForce 8800 GTX reference card makes do with air cooling, and it works amazingly quietly. And despite the high power consumption, the top model does not run as hot as we would have expected from such a monster of a graphics card.

-------------------------------

The complexity of Nvidia's GeForce 8 generation exceeds that of every GPU (graphics processing unit) to reach the market so far: no fewer than 700 million transistors are at work in the graphics chip. For comparison, ATI's top model Radeon X1950 XTX makes do with 384 million, only around half as many.

This staggering complexity is rooted in the architecture, which Nvidia has developed completely from scratch. The GeForce 8 is Nvidia's first graphics chip with a so-called "Unified Shader Architecture". That is, there are no specialised pixel and vertex shaders; instead there are universally applicable shaders that dynamically take on whatever arithmetic work is needed. In total, a full 128 of these arithmetic units, which Nvidia has christened "streaming processors", are at work in the GeForce 8800 GTX.

The actual graphics chip clock of the GeForce 8800 GTX is only 575 MHz; the clock frequency of the flagship's streaming processors, however, is 1350 MHz. So-called threading units dynamically divide the incoming arithmetic work into small pieces and send them to the streaming processors. Nvidia calls this technology "GigaThread Technology".
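To illustrate the idea (a toy software model of my own, nothing like the real silicon): with a fixed split of vertex and pixel units, a lopsided workload leaves one side idle, while a unified pool simply takes whatever work is queued:

```python
# Toy model of fixed vs unified shader scheduling. An illustration of
# the concept only, not how the actual G80 hardware works.
import math

def cycles_fixed(tasks, vertex_units, pixel_units):
    """Fixed split: each unit type can only run its own task type."""
    vertex_jobs = sum(1 for t in tasks if t == "vertex")
    pixel_jobs = sum(1 for t in tasks if t == "pixel")
    # Each unit retires one task per cycle; the busier queue dominates.
    return max(math.ceil(vertex_jobs / vertex_units),
               math.ceil(pixel_jobs / pixel_units))

def cycles_unified(tasks, units):
    """Unified pool: any unit can run any task."""
    return math.ceil(len(tasks) / units)

# A vertex-heavy workload on 16 units total.
work = ["vertex"] * 24 + ["pixel"] * 8
print(cycles_fixed(work, vertex_units=8, pixel_units=8))  # 3 cycles
print(cycles_unified(work, units=16))                     # 2 cycles
```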

--------------------------

Also already on board the GeForce 8 graphics chip is a physics unit named the Quantum Effects physics processor, which is intended to take over physics computation. Unfortunately, we could not obtain any detailed information on the Quantum Effects physics processor.

-------------------------

What is certain, however, is that the GeForce 8, just like ATI, can now perform high dynamic range rendering (HDRR) with 128-bit precision without having to give up anti-aliasing, right up to 16x full-screen mode.

-------------------------

So that the graphics memory does not throttle the computing power of the 128 streaming processors, Nvidia has treated the GeForce 8800 GTX to a 384-bit-wide memory interface. On top of that, the top model gets access to a full 768 MB of GDDR3 memory clocked at 900 MHz. The theoretical maximum memory bandwidth thus rises to a hefty 86.4 GB/s.

On the little brother GeForce 8800 GTS, the memory interface is only 320 bits wide. The chip also only gets access to 640 MB of GDDR3 memory, running at a clock frequency of 800 MHz. The theoretical maximum memory bandwidth nevertheless still comes to 64 GB/s.
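Both bandwidth figures check out if you take GDDR3 as transferring data twice per clock; a quick sketch of the arithmetic:

```python
# Theoretical memory bandwidth = bus width (in bytes) * clock * 2,
# since GDDR3 is double data rate.
def bandwidth_gb_s(bus_bits: int, clock_mhz: float) -> float:
    bytes_per_transfer = bus_bits / 8
    transfers_per_second = clock_mhz * 1e6 * 2
    return bytes_per_transfer * transfers_per_second / 1e9

print(bandwidth_gb_s(384, 900))  # 8800 GTX: 86.4 GB/s
print(bandwidth_gb_s(320, 800))  # 8800 GTS: 64.0 GB/s
```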

--------------------------

The GeForce 8 supports DirectX 10 as well as OpenGL 2.0, and is SLI-capable besides. The cards have two dual-link DVI-I connectors and a video in/out socket. The graphics chip supports HDCP (High-bandwidth Digital Content Protection) and can therefore also display high-definition films stored on Blu-ray and HD DVD media.

According to our sources, GeForce 8 graphics cards should be available in sufficient quantities on the 8 November 2006 launch date. The prices, however, are hefty: graphics cards with the flagship GeForce 8800 GTX are expected to cost approximately 650 euros, while GTS models come in at approximately 500 euros.


* GTX approximately 30 per cent faster than the X1950 XTX.
* Length of 260mm, weight over 800 grams. In old money that's 10 1/4" and it's going to overhang your motherboard by 3/4".
* GTX uses 200W; recommended 450W PSU with 30A on the 12V rail.
* Quantum Effects physics processor.
* 128-bit HDR with 16xAA.
 
Steedie said:
Yeah, but running the Nvidia control panel uses bugger all memory and has one service running, whereas currently with CCC running I have three instances of CLI.exe and two of atievxxx.exe running, and it always takes an age to load up.

So sorry that I hurt your little ATi ego, but it's the truth, and it's not just me who thinks it.

So :) sorry that you were wrong.
You can disable all those processes and still have it function normally.
 
Johanson said:
2 SLI connections? Like the new Crossfire?

It also says one SLI connection for the GTS; I'm confused...
2 SLI connectors?? = SLI with three cards/PCI-E slots!!

Only kidding, but it does sound expensive, so I wouldn't put it past them. A physics card connector? Probably more to do with bandwidth.
 
That report mentions 650 euros for the GTX, which is about $820 - a long way off the original $650 being reported a couple of weeks ago. In sterling this is 435 quid, which wouldn't be so bad, and I would be willing to pay it.

But what's the betting e-tailers will charge 500 quid+ and put it down to supply/demand? Pure greed as far as I'm concerned. And at this price point I don't think I will be buying, considering the less than stellar performance being reported so far.
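For what it's worth, here are the exchange rates those conversions imply (approximate late-2006 figures, my own numbers for illustration only):

```python
# Convert the reported euro prices using the exchange rates implied
# by the figures above (approximate late-2006 rates, for illustration).
EUR_TO_USD = 1.26
EUR_TO_GBP = 0.67

for model, eur in (("8800 GTX", 650), ("8800 GTS", 500)):
    usd = eur * EUR_TO_USD
    gbp = eur * EUR_TO_GBP
    print(f"{model}: {eur} EUR ~= ${usd:.0f} / GBP {gbp:.0f}")
# 8800 GTX: 650 EUR ~= $819 / GBP 436
# 8800 GTS: 500 EUR ~= $630 / GBP 335
```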
 
fornowagain said:
That's £435 for a GTX and £335 for a GTS.

And from that 8800GTX = 2 x 7900GTX
The Asgard said:
And the GX2 is how many % faster than the X1950?
If it's true, it's not all that, to be honest, and you KNOW it's going to be £500. If I can step up for a few quid, fair enough; if not, I'll wait for the prices to drop or the R600 to come out.
 
fornowagain said:
If it's true, it's not all that, to be honest, and you KNOW it's going to be £500. If I can step up for a few quid, fair enough; if not, I'll wait for the prices to drop or the R600 to come out.


Yeah,

TBH I will be amazed if this step-up goes without a hitch.

And of course, there's the question of how much will be needed to step up.

If they take the normal cost of the G80, rather than the bloated retail prices there will be on release, then it will be worth it.

Maybe it's time to jump off the GFX card merry-go-round.

Stop the world, I want to get off! :eek: :p
 