
9800 GX2 CeBIT Pics.

Best of luck getting a contact for Nvidia from their site and then getting a reply.

I have the UK PR Manager's phone number, but he avoids me after a few calls back and forth between us about the 6800 Ultra and PureVideo farce years back lol.
 
Best of luck getting a contact for Nvidia from their site and then getting a reply.

I have the UK PR Manager's phone number, but he avoids me after a few calls back and forth between us about the 6800 Ultra and PureVideo farce years back lol.


Yep, bunch of lead-swinging monkeys.
But the question should be asked, as it's skirting false advertising.
 
I'm emailing NV and asking why they have 512-bit all over their product...

Edit: And 1 GIG of RAM too!

SLI does not use both cards' VRAM

Marketing really; it happened with the 7950GX2. On the boxes it had 1GB of VRAM and a 512-bit bus.

It's technically true, as it is 512-bit across the whole card, and the same for the memory. Just not per core.
 
It's certainly 100% bona fide not 2x 512-bit - the cards would be damn enormous (twice as many memory chips), much more so than they are now... Then there's the extra heat it'd output... Then the extra logic they'd have to add to the cores. It'd all get very hot and hairy...

Also, more factual proof:

Memory bus width can be worked out with the formula 64/2*c, where c is the number of memory chips, because GDDR3 works in 64-bit channels each comprised of two chips (32 bits per chip) ...

As you can see in this shot:

[image: 21587tf1.jpg - PCB shot showing the memory chips]


There would appear to be 8 chips, so: 64/2*8 = 256.

If anyone wants to correct my maths, feel free. :p
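The bus-width arithmetic above can be sketched in a few lines. This is just a restatement of the post's formula, assuming (as the post does) that each GDDR3 chip contributes 32 bits, i.e. two chips per 64-bit channel:

```python
def bus_width_bits(chip_count: int) -> int:
    """Total memory bus width for a GDDR3 setup.

    Two chips form one 64-bit channel, so each chip
    contributes 64/2 = 32 bits to the bus.
    """
    return (64 // 2) * chip_count

print(bus_width_bits(8))   # the 8 chips visible on the PCB -> 256-bit
print(bus_width_bits(16))  # a true 512-bit bus would need 16 chips per board
```

Which is exactly why a genuine 512-bit card would need twice the memory chips, as noted above.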

It is 2x 256-bit....

There are 2 cards on this thing... each card has a 256-bit bus; it's 2 cards in SLI.....
 
Ok, 2 cards: 1 is faster and bigger, and the other is slower & smaller but uses some fancy design :confused:

I know which I'm going for, it's a no brainer :D
 
Ok, 2 cards: 1 is faster and bigger, and the other is slower & smaller but uses some fancy design :confused:

I know which I'm going for, it's a no brainer :D

Faster? And where exactly did you get that information from?

It's two 8800GTS G92(?) cards in a single PCI-E slot package. As SLI is less efficient than CF (from what I've seen), the X2 should, in theory, win.

We'll see when reviewers get the cards and test with the latest Cats...
 
I'm emailing NV and asking why they have 512-bit all over their product...

Edit: And 1 GIG of RAM too!

SLI does not use both cards' VRAM

Technically it does use both cards' VRAM, it's just that both sets of memory must be loaded up with identical data. That's what I've read anyway!

A more efficient use of the RAM would be if the 2x 512MB of memory were combined and shared by both cores, but that would be one amazing technical feat for the dual-PCB design of the nvidia card; you'd need something like a 60GB/s interlink between the two cards.
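A toy sketch of the point being made here: under SLI's alternate-frame rendering the same textures and buffers are mirrored on every card, so the usable frame buffer is one card's worth, not the sum. The `pooled` mode below is purely hypothetical, representing the combined/shared-memory scheme the post imagines:

```python
def usable_vram_mb(per_card_mb: int, cards: int, pooled: bool = False) -> int:
    """Effective frame-buffer capacity for a multi-GPU card.

    Mirrored (real SLI): every card holds an identical copy of the data,
    so capacity doesn't scale with card count.
    Pooled (hypothetical shared memory): capacities would simply add up.
    """
    return per_card_mb * cards if pooled else per_card_mb

print(usable_vram_mb(512, 2))               # mirrored SLI -> 512 (MB)
print(usable_vram_mb(512, 2, pooled=True))  # hypothetical pooled -> 1024 (MB)
```

So "1GB" on the box is the physical total, while games effectively see 512MB.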

The 512-bit bus claim is BS too. I recall when ATi released the 2900XT, they made a huge fuss about its 512-bit bus, since it was pretty difficult to implement and required some major changes to the PCB, memory config, power requirements etc.

For nvidia to ignore their already more than adequate 384-bit bus (as on the GTX) and suddenly pull out a 512-bit bus for this card alone would be lunacy.
 
Technically it does use both cards' VRAM, it's just that both sets of memory must be loaded up with identical data. That's what I've read anyway!

A more efficient use of the RAM would be if the 2x 512MB of memory were combined and shared by both cores, but that would be one amazing technical feat for the dual-PCB design of the nvidia card; you'd need something like a 60GB/s interlink between the two cards.

The 512-bit bus claim is BS too. I recall when ATi released the 2900XT, they made a huge fuss about its 512-bit bus, since it was pretty difficult to implement and required some major changes to the PCB, memory config, power requirements etc.

For nvidia to ignore their already more than adequate 384-bit bus (as on the GTX) and suddenly pull out a 512-bit bus for this card alone would be lunacy.

The GTX is old tho; they are using 256-bit memory interfaces, as that's what the 8800GTS has.
 
Faster? And where exactly did you get that information from?

It's two 8800GTS G92(?) cards in a single PCI-E slot package. As SLI is less efficient than CF (from what I've seen), the X2 should, in theory, win.

We'll see when reviewers get the cards and test with the latest Cats...

I didn't get my info from anywhere; you said you would get an ATI X2 because it has a better design, but if the 9800GX2 is faster I would go for that regardless of its crap design ;)
 
I didn't get my info from anywhere; you said you would get an ATI X2 because it has a better design, but if the 9800GX2 is faster I would go for that regardless of its crap design ;)

You're also not taking price into consideration. If it's marginally faster for £100 more, then not a chance. Unless it's around 50% faster than the X2, it's a waste of time imo.
 
You're also not taking price into consideration. If it's marginally faster for £100 more, then not a chance. Unless it's around 50% faster than the X2, it's a waste of time imo.

I doubt this will come in at 250 lol

So you are right.

Considering the price of a single GTS
 