Turns out we've had GT200 all along - the 9800GX2

Even if it's 2 cores on the same die... it's still going to need SLI to do multi-core rendering, short of a major architecture reworking...
 
If this is 2 cores on 1 die, like dual-core CPUs, then they will have solved all the major issues of multi-card configs. Memory will not be wasted - no longer will 2 x 512MB give only 512MB usable, as on dual-card configs - and with the added memory bandwidth I'm sure this card will pave the way for all future GPUs; multi-core GPUs will soon become the norm.
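To put rough numbers on that, a quick back-of-envelope sketch (this assumes AFR-style SLI where each card mirrors the whole working set - the figures are illustrative, not measured):

```python
# Sketch of the usable-memory argument. Assumes classic SLI/Crossfire where
# every texture and buffer is duplicated on both cards; illustrative only.
CARD_VRAM_MB = 512
NUM_GPUS = 2

# Dual-card config: data is mirrored, so the usable pool never exceeds
# one card's VRAM even though twice that is physically fitted.
sli_fitted = CARD_VRAM_MB * NUM_GPUS   # 1024 MB on the two cards
sli_usable = CARD_VRAM_MB              # 512 MB effectively usable

# Hypothetical unified dual-core card: one shared pool, nothing mirrored.
unified_usable = CARD_VRAM_MB * NUM_GPUS  # 1024 MB usable

print(f"SLI: {sli_fitted} MB fitted, {sli_usable} MB usable")
print(f"Unified: {unified_usable} MB usable")
```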

All graphics work is heavily threaded - e.g. having 16 ROPs means you can work on 16 pixels per pass, and with dual cores on 1 die you could work on 32 pixels per pass. In theory there would be no need for SLI, since it's not a software-based method of dual-card rendering.

Either way, it's going to get very interesting.
 
Forgive my ignorance, but exactly how does putting both the cores on 1 die solve the multi-card issue? Surely something still has to tell the 2 GPUs how to divide the work in the most efficient manner, be it the drivers or the actual software you run, i.e. the game itself.
 
Having an internal bridge between the two cores will be far more efficient than using PCIe, or whatever the dodgy external bridge they normally use is called. The new ATI cards are rumoured to have a similar setup, one that allows the card to pool the graphics RAM properly - no longer will 1GB SLI/Crossfire cards effectively have only 512MB of RAM.

That's the theory/rumour anyhow. Whether or not that's true, who knows.
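For what it's worth, the rough numbers do back the internal-bridge argument up. A quick sketch with ballpark period figures (approximate datasheet-style values, not measurements):

```python
# Why an on-card link beats shuffling data over the PCIe bus. Figures are
# rough, period-correct ballparks for illustration only.
pcie_1_x16_gbs = 4.0      # PCIe 1.x x16: ~4 GB/s per direction
local_vram_gbs = 86.4     # e.g. 8800 GTX: 384-bit GDDR3 at 1.8 GT/s

ratio = local_vram_gbs / pcie_1_x16_gbs
print(f"Local VRAM has roughly {ratio:.0f}x the bandwidth of the PCIe "
      "link two separate cards would have to share data over.")
```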
 
Ah, so it's not the same type of process as, say, a dual-core CPU, where 1 core will stay relatively idle unless instructed to do something.

If that's the case, then it sounds pretty interesting!
 
Is it not just one core? It sounds to me like they've just taken 2x G92 cores and made them into a single core.

Yep, that's what I meant. If they have done that then there will be no need for SLI, since the entire core is seen as a single unit, and even the RAM on the card will be just 1x 512MB or 1x 1GB - no need for 2 separate lots of RAM for each core, since in effect they have become unified.

And in such a case you can expect no less than a 99% performance boost over a single core.

Looks like Nvidia will give you your kebab and also let you eat it. :D
 
Yep, that's the main thing that peed me off about dual-card configs - not that you don't get a 100% speed increase over a single card, but that you also lose the RAM of the second card, since it has to hold a duplicate of the first card's data.

Dual-card configs are too wasteful for me. This new approach by Nvidia should really get the ball rolling for multi-core cards in the future.

And remember, all graphics work is pretty much multi-threaded as it is. You're working with pixels, and the number of pipelines/ROPs you have determines how many pixels you can work on at the same time. The 8800 GTX has 24 ROPs so it can work on 24 pixels at once; the 8800 GT has 16 ROPs so it can work on 16 at once.
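As a quick sanity check, peak theoretical fill rate is just ROPs x core clock. A rough sketch - the clocks are the stock reference values as I remember them, and the rumoured card's clock is pure guesswork:

```python
# Peak theoretical pixel fill rate = ROPs x core clock.
# Clocks are approximate stock values; the rumoured card's clock is a guess.
cards = {
    "8800 GTX":       {"rops": 24, "core_mhz": 575},
    "8800 GT":        {"rops": 16, "core_mhz": 600},
    "rumoured GT200": {"rops": 32, "core_mhz": 600},  # hypothetical clock
}

for name, c in cards.items():
    gpix_per_s = c["rops"] * c["core_mhz"] / 1000.0  # Gpixels/s
    print(f"{name}: {c['rops']} ROPs x {c['core_mhz']} MHz "
          f"= {gpix_per_s:.1f} Gpixels/s")
```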

And this new card, they're saying, will have 32 ROPs, with no need for games or software to be modified to process more pixels per clock - the good thing about graphics is that it's already a fairly efficient rendering process, working on multiple pixels simultaneously.
 
Yep, PC software was mostly single-threaded in the past, so when dual-core CPUs came out the software needed to be programmed in a different way so that it could be split up into multiple threads. Only some software, such as rendering and other CAD/specialist packages, has been multi-threaded going back 10 years.

But due to the nature of how graphics are processed, they are naturally multi-threaded, since in theory the rendering of each pixel can be put onto its own thread.
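To see what "each pixel on its own thread" means, here's a toy software-renderer sketch. It's purely illustrative (a real GPU obviously doesn't run Python, and the worker count here is arbitrary) - the point is that the pixels never need to talk to each other:

```python
# Toy demonstration that shading is embarrassingly parallel: each pixel's
# colour depends only on its own coordinates, so the work can be split
# across any number of workers without changing the result.
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 320, 240

def shade(pixel):
    """Compute one pixel's colour from its coordinates alone."""
    x, y = pixel
    return (x * 255 // WIDTH, y * 255 // HEIGHT, 128)  # simple gradient

if __name__ == "__main__":
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    # 1 worker or 16 workers produces the same image; only the wall-clock
    # time changes. That is the sense in which graphics is "naturally threaded".
    with ProcessPoolExecutor(max_workers=4) as pool:
        framebuffer = list(pool.map(shade, pixels, chunksize=WIDTH))
    print(framebuffer[0], framebuffer[-1])  # (0, 0, 128) and (254, 253, 128)
```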
 
I did read that the GX2 was going to be a true dual-core GPU, and considering the timescale this might be what the GX2 was meant to be. Still, if this is true it will be one hot, power-hungry card.
 
If they are giving it a die shrink to 55nm and the new thermal/power management features, then heat will so not be an issue.

As for the number of stream processors on it: if it were a dual-core G92 then it would have 112 or 128 x 2 SPs, but since it's got 192 SPs it looks to be an even more cut-down version of G92.

Or this core may actually be G90. :eek:
 
They might do, but electromigration could be a problem at 55nm unless it's quite a major overhaul with a new substrate. They have had a lot of R&D time to play with it, though, so maybe. I still think the draw will be around the 1.6-1.8 volt mark.
 
http://xtreview.com/addcomment-id-5121-view-GT200-predicted-speed.html is the source. I HOPE they're kidding... else Loadsa's going to have a field day

http://forums.vr-zone.com/showthread.php?t=271801

55nm TSMC process
Single chip with "dual G92b like" cores
330-350mm2 die size
900M+ transistors
512-bit memory interface
32 ROP
192SP (24X8)
6+8 Pin
550~600W PSU min
9900GTX SLI runs Crysis 2560x1600 VH 4XAA smoothly

Originally from here I think. Some talk of a 65nm version to start with.
 
I reckon the Crysis performance is BS though. If this is just a GX2 on a single chip, then how come the GX2 isn't running Crysis that comfortably already? Surely the GX2 would run it exactly the same if it's the same card, just like 2x 3870s in Crossfire run the same as a 3870 X2, don't they? It sounds like that's sort of what they're saying this is, but they're going one better than ATI: instead of sticking 2 cores on the one PCB like ATI (3870 X2), they've stuck the 2 cores together and then put that on one PCB.
 
Looks like Nvidia will give you your kebab and also let you eat it. :D

I don't want any of those kebabs, especially if they're using meat from old kebabs in them. :D
 
I reckon the Crysis performance is BS though. If this is just a GX2 on a single chip, then how come the GX2 isn't running Crysis that comfortably already? Surely the GX2 would run it exactly the same if it's the same card, just like 2x 3870s in Crossfire run the same as a 3870 X2, don't they?

But the 3870 X2 has two separate GPU dies communicating via el bog-standard Crossfire tech over a PCIe bridge. Obvious limitations there. The 3870 X2, regardless of being on a single PCB, is still Crossfire on a stick... Same story with the 9800 GX2. I am by no means saying that the claims are true, though. It seems a bit ridiculous to claim that it can 'play' it 'comfortably' without any kind of benchmarks; tbh that bit just sounded like a load of speculative tripe.
 
I reckon the Crysis performance is BS though. If this is just a GX2 on a single chip, then how come the GX2 isn't running Crysis that comfortably already?

512MB isn't enough to run Crysis at 2560x1600 Very High with 4xAA. If you look at the benchmark sites, when they test quad-SLI setups they usually test at 2560x1600 on all games except Crysis, which they leave at 1920x1200.
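A rough back-of-envelope shows why: just the framebuffer surfaces at that resolution with 4xAA eat a big chunk of 512MB before a single texture is loaded (this assumes the standard 32-bit colour and depth/stencil formats - illustrative arithmetic, not a measurement):

```python
# Framebuffer memory at 2560x1600 with 4x multisampling, assuming 32-bit
# colour and 32-bit depth/stencil surfaces. Illustrative arithmetic only.
W, H, AA = 2560, 1600, 4
MiB = 1024 * 1024

colour_msaa = W * H * 4 * AA / MiB  # multisampled colour buffer: ~62.5 MiB
depth_msaa  = W * H * 4 * AA / MiB  # multisampled depth/stencil: ~62.5 MiB
resolve     = W * H * 4 / MiB       # resolved back buffer:       ~15.6 MiB
front       = W * H * 4 / MiB       # front buffer:               ~15.6 MiB

total = colour_msaa + depth_msaa + resolve + front
print(f"~{total:.0f} MiB gone before textures, render targets or geometry")
```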
 