It's bound to be 2x 256-bit; it's two GTSes, which we all know are 256-bit.
Best of luck getting a contact for Nvidia from their site and then getting a reply.
I have the UK PR Manager's phone number, but he avoids me after a few calls back and forth between us about the 6800 Ultra and PureVideo farce years back lol.
I'm emailing NV and asking why they have 512bit all over their product...
Edit: And 1 GIG of ram too!
SLI does not use both cards' VRAM
It's certainly 100% bona fide not 2x 512-bit: the cards would be damn enormous (twice as many memory chips), much more so than they are now... Then there's the extra heat it'd output... Then the extra logic they'd have to add to the cores. It'd all get very hot and hairy...
Also, more factual proof:
Memory bus width can be worked out with the formula 64/2*c, where c is the number of memory chips, because GDDR3 works in 64-bit channels each made up of two chips (so 32 bits per chip) ...
As you can see in this shot:
[image: shot of the card's PCB showing the memory chips]
There would appear to be 8 chips, so: 64/2*8 = 256.
If anyone wants to correct my maths, feel free.
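If anyone does want to check it, here's a throwaway sketch of that sum (assuming, as above, that each GDDR3 chip contributes 32 bits, i.e. two chips per 64-bit channel):

```python
# Quick sanity check of the bus-width maths above.
# Assumption: each GDDR3 chip has a 32-bit interface (two chips per 64-bit channel).

def bus_width(num_chips, bits_per_chip=32):
    """Memory bus width in bits for a single PCB."""
    return num_chips * bits_per_chip

per_card = bus_width(8)  # 8 chips visible in the PCB shot
print(per_card)          # 256 -> each card gets a 256-bit bus, so the GX2 is 2x 256-bit
```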
It is 2x 256bit....
There are 2 cards on this thing... each card has a 256-bit bus; it's 2 cards in SLI.....
That doesn't make it 512 bit though.
I'd buy the ATi one if I was in the market for one.
That doesn't make it 512 bit though.
Did I say it is 512-bit? It's 2x 256-bit, one 256-bit interface on each card.
OK, two cards: one is faster and bigger, and the other is slower and smaller but uses some fancy design.
I know which I'm going for, it's a no-brainer
I'm emailing NV and asking why they have 512bit all over their product...
Edit: And 1 GIG of ram too!
SLI does not use both cards' VRAM
Technically it does use both cards' VRAM; it's just that both sets of memory must be loaded up with identical data. That's what I've read anyway!
A more efficient use of the RAM would be if the 2x 512MB of memory was combined and shared by both cores, but that would be one amazing technical feat for the dual-PCB design of the Nvidia card; you'd need something like a 60GB/s interlink between the two cards.
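To put that in concrete terms, here's a trivial illustration of the difference (the shared-pool case is purely hypothetical; nothing suggests the GX2 actually works that way):

```python
# Illustration only: usable VRAM with mirrored SLI memory vs a hypothetical shared pool.
# Figures are the 2x 512MB mentioned above.

CARD_VRAM_MB = 512
NUM_CARDS = 2

# SLI today: every resource is duplicated on each card, so the usable pool
# is only one card's worth of memory.
usable_mirrored = CARD_VRAM_MB

# Hypothetical shared pool: both cores address one combined block of memory,
# which would need a very fast link between the two PCBs.
usable_shared = CARD_VRAM_MB * NUM_CARDS

print(usable_mirrored, usable_shared)  # 512 1024
```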
The 512-bit bus is BS too. I can remember when ATi released the 2900XT they made a huge fuss about its 512-bit bus, since it was pretty difficult to implement and required some major changes to the PCB, memory config, power requirements etc.
For Nvidia to ignore their already more than adequate 384-bit bus (as on the GTX) and suddenly pull out a 512-bit bus for this card alone would be lunacy.
Faster? And where exactly did you get that information from?
It's two 8800GTS G92s(?) in a single PCI-E slot package. As SLI is less efficient than CF (from what I've seen), the X2 should, in theory, win.
We'll see when reviewers get the cards and test with the latest Cats...
I didn't get my info from anywhere; you said you would get an ATI X2 because it has a better design, but if the 9800GX2 is faster I would go for that regardless of its crap design
You're also not taking price into consideration. If it's marginally faster for £100 more, then not a chance. Unless it's around 50% faster than the X2, it's a waste of time imo.