
Nvidia 9000 series news

The 3870X2 will be two GPUs on a single PCB. They did something similar not long ago with their X1600 GPU on a card called 'Gemini', and it wasn't too bad really. AMD will have to price the 3870X2 lower than the 8800GTX/Ultra to be competitive, though I'd imagine it would beat both the GTX and the Ultra one on one. Plus, 3870s overclock like mad and come with unreal memory speeds. Should be a winner for AMD :)
 
I agree with the people saying it's all ATI's fault :p

Personally I blame nVidia.

Instead of putting some effort in and pushing themselves further ahead while they have the upper hand, they've taken the easy option of rehashing their old stuff and cashing in on the consumer.
 
This is the option that shows the largest immediate returns. As a large corporation you can't really blame them for taking this path. It gives them time to develop their new high-end tech without the need for lots of expensive rushed prototyping, whilst still reaping most of the rewards of a new 'tech level'. So yes, I blame ATI for falling behind - it leaves nvidia with the easy option, which they will inevitably take.

I'm more irritated by this move than most, though - I need native 64-bit (double precision) floating point capability to use the GPUs for scientific computing. I don't care if they put 32 GPUs on a single board that can do 200fps in Crysis - if it's 32-bit, it's no use to me.
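To illustrate what that precision gap actually costs, here's a quick NumPy sketch run on the CPU - nothing vendor-specific, and the numbers are picked purely for demonstration:

```python
import numpy as np

# 1) float32 carries only ~7 significant digits, so adding 1.0 to 1e8 is lost
#    entirely: the gap between neighbouring float32 values at that magnitude is 8.
print(np.float32(1e8) + np.float32(1.0) == np.float32(1e8))   # True  - the +1 vanishes
print(np.float64(1e8) + np.float64(1.0) == np.float64(1e8))   # False - double keeps it

# 2) Long running sums drift. A million additions of 0.0001 should give 100.
steps32 = np.full(1_000_000, 1e-4, dtype=np.float32)
steps64 = np.full(1_000_000, 1e-4, dtype=np.float64)
print(np.cumsum(steps32)[-1])   # visibly off 100.0 (representation + accumulated rounding)
print(np.cumsum(steps64)[-1])   # off by only ~1e-10 - effectively exact in practice
```

Scale that up to millions of timesteps in a simulation and single precision simply isn't an option.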

Bring on the 'real' next generation. ATI please hurry up - and no delays this time around please! Nvidia have no impetus to release any new tech until shortly before ATI release theirs.
 
I know why they did it, but it doesn't mean they had to.

nVidia could have struck while the iron was hot, brought out another killer product and secured themselves another year's worth of dominance early on, while simultaneously putting the proverbial boot into ATi while they were down.

R700 probably won't hit until Q3 at the earliest, meaning that even if it is a monster, nVidia would still have six months to bring out a refresh of whatever they could have released in March...

No doubt they have working G100s in the labs right now, just sitting there until ATi bring something out that challenges them.

Sucks :(
 
See, what is the 9800GX2? It's basically two 8800GTS cards stuck together. What will the 3870X2 be? Probably two 3870s on one PCB. It's currently unknown whether they'll use Crossfire, but what we do know is that Crossfire scales better than SLI, and for people who play Crysis this is a good thing - it's a good thing all around, really.

The way I see it, the 3870X2 is the future: more than one GPU on one PCB, and if it doesn't use Crossfire then I think it's another step in the right direction. I'm sure the 3870X2 will be cheaper to make too, since it only uses one PCB - for every 50 GX2s, NV needs 100 PCBs, so the 3870X2 is bound to be cheaper to make.

Hey, if it doesn't use Crossfire, I'm in - that's definitely the future, for both camps. If they can integrate it well enough that a game just sees straight through it, it'll be an amazing card at a very nice price. But until then I'm sticking with single-card designs, with no driver issues and better power usage - which my dad has been getting cranky about lately :P

All I'm saying is that we should wait another couple of weeks until the situation is clarified. I still think it would be an extraordinarily stupid move by Nvidia not to provide a very, very good 8800GTX refresh with a 512-bit bus and 1GB of RAM. The dual card would still be quicker than that, since the refresh wouldn't need more shaders, and even if it did, it wouldn't have the 256 shaders the dual card would have combined.
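For anyone wondering why the bus width matters, the back-of-an-envelope sum is just bytes per transfer times transfers per second. A rough sketch below - the 8800GTX figures are the real ones, while the 512-bit refresh is purely hypothetical and assumed to keep the same memory clock:

```python
def peak_bandwidth_gb_s(bus_width_bits: int, effective_mt_s: float) -> float:
    """Theoretical peak memory bandwidth: (bus width in bytes) x (transfers per second)."""
    return (bus_width_bits / 8) * effective_mt_s / 1000  # MB/s -> GB/s

print(peak_bandwidth_gb_s(384, 1800))  # 8800GTX today: 384-bit, 900MHz GDDR3 -> 86.4 GB/s
print(peak_bandwidth_gb_s(512, 1800))  # hypothetical 512-bit refresh, same clock -> 115.2 GB/s
```

That's a third more bandwidth before the memory clock even moves.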
 
Personally I blame nVidia.

Instead of putting some effort in and pushing themselves further ahead while they have the upper hand, they've taken the easy option of rehashing their old stuff and cashing in on the consumer.

Well of course we do live in a world where if they started completely owning the market they would be sued :(
 
The 9800GTX won't have a 512-bit bus - that would need an entire GPU redesign, and from what everyone is saying it's going to be a G92-based part, probably with higher core and memory clocks than a GTS and that's about it. 1GB of memory I can see, though.
 
Alright - I just don't see any technological reason for it not to be a G90 part, but you're probably right; it makes it a lot easier for them to own the market right now. I just hope they get the 9th generation done quickly instead of launching a G90 in 7-8 months and refreshing from there - I'd rather have a new supercard sooner.
 
I'm sticking my neck out and saying this will give ATI the break they need...

AMD were clever in focusing on nailing the finer points of the 38xx cards (power usage/thermals). This puts them in a good position for their next refresh.

I used to like Nvidia but have not purchased any of their cards since the FX 5xxx farce.

Nvidia's driver support across VGA and mobo chipsets over the last year has been disappointing, to say the least... C'MON AMD.
 
The GeForce 6 series and obviously the GeForce 8 series wiped the floor with what ATI were offering at the time, and quite frankly ATI's driver support, in the past at least, was much worse than Nvidia's.
 
Has anyone seen ATI? Oh, there he is, hiding under that rock.

OCUK 3d Mark Scores

1....21422 | simonmaltby | intel qx9650 @ 4400mhz | ati 3870xt 512mb (crossfire) @ 875/1175
2....20404 | marscay | intel q6600 @ 4050mhz | ati 3870xt 512mb (crossfire) @ 850/1261
3....19998 | J1nxy | intel q6600 @ 3915mhz | ati 2900pro 512mb (crossfire) @ 743/828
4....19621 | matt77 | intel q6600 @ 3995mhz | ati 2900xt 512mb (crossfire) @ 792/879
5....19603 | Predator | intel q6600 @ 3870mhz | ati 2900pro 1024mb (crossfire) @ 850/1000
6....19302 | Super | intel qx6850 @ 3840mhz | nvidia 8800gtx 768mb (sli) @ 665/1020
7....18600 | 4Qman | intel q6600 @ 4347mhz | nvidia 8800gts 512mb @ 913/1035
8....18144 | Bomag | intel qx6700 @ 3550mhz | nvidia 8800gtx 768mb (sli) @ 600/1000
9....17879 | paradigm | intel q6600 @ 4000mhz | ati 3870xt 512mb (crossfire) @ 855/1350
10...17579 | Fornowagain | intel q6600 @ 4022mhz | nvidia 8800gts 512mb @ 840/2230
 
Add another overclocked 8800GTS to one of the GTS systems and it'll fly.

Again though, look at the price difference compared to how much you'd gain over the Crossfire systems.

2 x 3870 = £280.

2 x 8800GTS = £420.

You could buy an entire extra 3870 for the price difference, and if you so desired you could run 3 x 3870s on one of the upcoming CrossFireX boards for the same price as SLI 8800GTSs.
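Spelling the sums out, using the prices quoted above (which will obviously move about):

```python
# Prices as quoted above, in GBP, assuming a straight per-card split.
price_3870 = 280 / 2      # 140 per HD 3870
price_gts512 = 420 / 2    # 210 per 8800GTS 512MB

print(2 * price_gts512 - 2 * price_3870)    # 140.0 - exactly the price of a third 3870
print(3 * price_3870 == 2 * price_gts512)   # True  - triple-3870 CrossFireX for SLI-GTS money
```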
 
GeForce 6 series and obviously the GeForce 8 series wiped the floor with what ATI were offering at the time

Noob.

How the hell was that relevant to the comment I made?

I assume it was relating to my comment, as you quoted it - or was I mistaken?

While we're at it, you say Nvidia "wiped the floor" - with what, and why?

In terms of performance per £ (which I believe is what a lot of people in this thread are talking about), ATI were equal to, and sometimes better than, the competition.
 
Sure, ATI has good performance/value right now :) And until Nvidia opens up SLI to Intel they're gonna have a lead on that front for quite a while :)
 