*** Official Nvidia 9xxx Series Thread ***

What has a GTX/Ultra with a core speed of 700MHz+ got to do with them not being old tech? Nothing, that's what.

I've just told you why it's old tech, but I'll say it again, and see if you can grasp it this time. Maybe I need to fill in the blanks, so I'll do that. :D

It is old. The G92s and the upcoming 9 series are what? 8800s. And when were those cards released? Yup, 2006, wasn't it.

Is that better now? Can you see why an 8800 released in 2006 is old tech, now that we are in 2008? :)
 
Last edited:
What has a GTX/Ultra at 700MHz got to do with not being old tech? Nothing, that's what.

I've just told you why it's old tech, but I'll say it again, and see if you can grasp it this time. Maybe I need to fill in the blanks, so I'll do that. :D

It is old. The G92s and the upcoming 9 series are what? 8800s. And when were those cards released? Yup, 2006, wasn't it.

Is that better now? Can you see why an 8800 released in 2006 is old tech? :)

It's got to do with the core and shaders and you know it. Does the "old tech" HD3870 show any greater difference in core and memory overclocks? Now that is a perfect example of what you are trying to say about the G92, which is completely wrong.

Right, if that were the case then the die shrink would only make the core more efficient, just like the 3870's. Wouldn't it? What about the shader clock speed and its overclockability? Can any GTX or Ultra come near 2000MHz there? No, you know it can't, so why try to argue against it?

I admit I was a bit wrong before when I argued that the HD3870 wasn't just a re-badged 2900, as they achieve the same clock speeds and just about the same FPS results in games; it's just a more efficient core with better heat and power output. I will stick my hand up to being wrong about something, although I see efficiency and heat as a big factor when choosing a card, and that has clearly changed, so I don't see it as old tech when the older tech ran hotter, drew more power and was a lot noisier to boot. Can you stick your hands up in regards to the G92s? :D

The G92 is nothing like the HD3870 in comparison. Nothing like it at all, but you will continue to put the G92 in the same league as what AMD/ATI have done going from the 2900 to the HD3870, and you can clearly see that it doesn't belong there, so again, please try and answer without repeating yourself ;).

Edit: It doesn't matter how bold you make it. In fact, it just makes it easier for people to see that you're getting your wires crossed between the G92 and HD3 series cards. Cheers ;).
 
Last edited:
Are the 8800s old tech?

I say yes, seeing as we are in 2008 and they came out in 2006. :)

Now, what about everyone else? Yes/no.

I'm not getting my wires crossed. Both the G92 and the 3870 are old tech: the G92s are 8800s, the 3870s are 2900s, one came out in November 2006, the other in May 2007. Is my calendar wrong? Is the date not Jan 28th 2008? Have I been abducted from 2006 by aliens and transported to 2008? :confused:
 
Last edited:
Are the 8800s old tech?

I say yes, seeing as we are in 2008 and they came out in 2006. :)

Now, what about everyone else? Yes/no.

Yes and no.

Yes, they're based on an old card.

No, as the process shrink helps a LOT with noise, heat, and power over the 8800s.

Who knows, with that die shrink they may squeeze in some new features as well.
 
I would say yes to the G92 (8800 GT/GTS 512), as to me they tweaked the 8800 rather than making a new design, just like the 1800 to 1900 and the 7800 to 7900 were, from my point of view.
 
I'll have to say no. If Loadsa were correct then the HD3870 and the 8800GTS would both behave the same. The HD3870 is more old tech to me, as the core didn't receive a boost or more overclocking headroom. It did, however, gain efficiency and a lower power draw; if it were just old tech it would run the same, and it does not, so that is new tech to me.

Now the G92, as I've explained above, can achieve core and shader clocks that the GTX can't even think about, so I'm confused how you can say the exact same thing about both cards, Loadsa, when what the two manufacturers have done is far too different to place them in the exact same boat, as you've done time after time after time...

Stick a bigger memory interface on the GTS and GT and watch it surpass the GTX and Ultra without breaking a sweat. Just the same as if you had the shaders and the core of the G92 on the G80 board; it would also fly.
 
Yes and no.

Yes, they're based on an old card.

No, as the process shrink helps a LOT with noise, heat, and power over the 8800s.

Who knows, with that die shrink they may squeeze in some new features as well.

Which is what I said earlier: the process is new, yes, but that new process still has old 2006 tech slapped on it no matter how you look at it. It could be the size of a pinhead or the size of Mars, it doesn't matter; it's still an old 2006 8800. :)

New tech is like going from a 7900 to an 8800; the 8800 is not the same 7900 on a smaller die, is it.
 
Last edited:
I'll have to say no. If Loadsa were correct then the HD3870 and the 8800GTS would both behave the same. The HD3870 is more old tech to me, as the core didn't receive a boost or more overclocking headroom. It did, however, gain efficiency and a lower power draw; if it were just old tech it would run the same, and it does not, so that is new tech to me.

Now the G92, as I've explained above, can achieve core and shader clocks that the GTX can't even think about, so I'm confused how you can say the exact same thing about both cards, Loadsa, when what the two manufacturers have done is far too different to place them in the exact same boat, as you've done time after time after time...

Maybe, but it's been out half the time the 8800 has been out; the GTX was out seven months or so before the 2900.
 
Which is what I said earlier: the process is new, yes, but that new process still has old 2006 tech slapped on it no matter how you look at it. It could be the size of a pinhead or the size of Mars, it doesn't matter; it's still an old 2006 8800. :)

That is not good enough reasoning and you know it. Please tell me how these new cards from the two companies can be so different yet you regard them both as doing the exact same thing? It just doesn't make sense in the slightest, no matter which way you look at it :confused:.
 
Maybe, but it's been out half the time the 8800 has been out; the GTX was out seven months or so before the 2900.

The timing is irrelevant, and I'm not trying to be blunt here. I just don't want to get onto timing issues, as this is about the HD3870 and the 8800GT and GTS.

I can't see the point Loadsa is trying to make here. Both companies have supposedly done the exact same thing, but if you look at the benchmarks, core speeds and shader speeds of the GT and GTS compared to the differences between the 2900 and the 3870, then you can't deny that there are differences between what the two manufacturers have done.

Oh, so they are both riding on old tech just the same, but differently. It's insane to say that. I'll just go and get my inflatable dartboard out. (Not aimed at you, queamin ;))
 
Which is what I said earlier: the process is new, yes, but that new process still has old 2006 tech slapped on it no matter how you look at it. It could be the size of a pinhead or the size of Mars, it doesn't matter; it's still an old 2006 8800. :)

New tech is like going from a 7900 to an 8800; the 8800 is not the same 7900 on a smaller die, is it.

Yeah, but 55nm isn't old tech, and it is something the 9800s would feature. So it is at least some "old" tech but also some new, although it's certainly more "old" tech than new.
 
The smaller process allows for the higher speeds. Yes, the G92s are different from the 3870s, as the 3870s are exactly like the 2900s: same ROPs, same shaders, etc. The G92s, on the other hand, have extra texture units and more shaders than the G80 GTS, but they are the same 2006-tech 8800s; it's that same old tech, just with units enabled/disabled.

Yeah, but 55nm isn't old tech, and it is something the 9800s would feature. So it is at least some "old" tech but also some new, although it's certainly more "old" tech than new.


No, you're right, the process is new; the tech on that process is old, which is what I've been saying. ATi have stuck their old May 2007 tech on a new process. The process shouldn't matter as much as the tech that's on it, because what does the process do? It just reduces power and reduces heat. What good is that to us? If we had all wanted 8800s/2900s using less power and giving out less heat, we would not have bought them back in 2006, or in May 2007 when they were released; we would have waited until now, 2008, and then got ourselves the cards we were going to buy in 2006. :)
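To put rough figures on the "it just reduces power and heat" point: switching power scales roughly with capacitance x voltage squared x clock, so a shrink that lowers capacitance and voltage cuts power even with the exact same design on the die. A quick sketch with made-up, purely illustrative numbers (not real G80/G92 or 2900/3870 figures):

[code]
# Rough illustration of why a die shrink cuts power without changing the design:
# dynamic (switching) power ~ capacitance * voltage^2 * frequency.
def dynamic_power(capacitance, voltage, freq_hz):
    return capacitance * voltage ** 2 * freq_hz

# Illustrative figures only -- not real chip numbers.
old = dynamic_power(capacitance=1.0, voltage=1.30, freq_hz=600e6)  # older, larger process
new = dynamic_power(capacitance=0.7, voltage=1.15, freq_hz=600e6)  # shrunk die, lower voltage, same clock
print(f"shrunk part draws roughly {new / old:.0%} of the original power")  # ~55%
[/code]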
 
Last edited:
The timing is irrelevant, and I'm not trying to be blunt here. I just don't want to get onto timing issues, as this is about the HD3870 and the 8800GT and GTS.

I can't see the point Loadsa is trying to make here. Both companies have supposedly done the exact same thing, but if you look at the benchmarks, core speeds and shader speeds of the GT and GTS compared to the differences between the 2900 and the 3870, then you can't deny that there are differences between what the two manufacturers have done.

Oh, so they are both riding on old tech just the same, but differently. It's insane to say that. I'll just go and get my inflatable dartboard out. (Not aimed at you, queamin ;))

ATi had problems with their 2900; maybe that is why they did the 3870 so early in the 2900's lifespan, to get the power, noise and price down, as they couldn't match Nvidia's performance.
The GT got pushed forward to when the 3870 was going to be released.
 
Remember when you could unlock something on the old 6800 GTs (which I forget now :p) that turned them into 6800 Ultras? Were they classed as new tech then? No, so how can the new GT/GTS be classed as new tech when they are the exact same, just with some things enabled/disabled? :)
 
Last edited:
Remember when you could unlock something on the old 6800 GTs (which I forget now :p) that turned them into 6800 Ultras? Were they classed as new tech then? No, so how can the new GT/GTS be classed as new tech when they are the exact same, just with some things enabled/disabled? :)

The only difference between the 6800 GT and the 6800 Ultra was clock speed.
 
ATi had problems with their 2900; maybe that is why they did the 3870 so early in the 2900's lifespan, to get the power, noise and price down, as they couldn't match Nvidia's performance.
The GT got pushed forward to when the 3870 was going to be released.

They needed to; otherwise it wasn't going to be good for ATI. The GT got pushed out at the right time (the wrong time for me, as I bought a 2900 Pro a few weeks before the GT :().

Remember when you could unlock something on the old 6800 GTs (which I forget now :p) that turned them into 6800 Ultras? Were they classed as new tech then? No, so how can the new GT/GTS be classed as new tech when they are the exact same, just with some things enabled/disabled? :)

The post above mine explains your "same tech 6800" argument, which has more holes in it than a tea bag.

The G92 core is new tech, and a lot more so than I regard ATI's 3870 die shrink. Yes, there were changes made to the G92, whereas it was just a case of "Honey, I Shrunk the Die" with the HD38xx series, but on both levels the chip has been redesigned to work more efficiently. I can see why you'd think the ATI card isn't new tech, but I see new tech in the efficiency, power draw and heat of the card, whereas the G92 sees a whole lot more change than ATI's does.

The G92 core had to have faster core and shader clocks to make up for the narrower bus, but it still takes a bigger hit in HD gaming with AA/AF enabled. If you had the memory interface of the GTX on the G92 card, then you'd have the card that should be Nvidia's next card. Yes, it would be a combination of older tech with newer tech, but as we all know the 256-bit bus limits the card a little, and you know that G92 core and shader speeds on a 384-bit interface would absolutely fly. This is what I hope Nvidia have been working on; if they haven't, and they try to introduce another 256-bit bus when it's clear that something a bit wider than 256-bit with that very overclockable core would turn out to be a fantastic card, then it's the wrong move IMO.
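To put rough numbers on the bus-width point, here's a quick back-of-envelope sketch of peak memory bandwidth (the ~1800MHz effective memory clock is just an illustrative figure, not an exact spec for either card):

[code]
# Peak memory bandwidth ~ bus width (bits) / 8 * effective memory clock (MHz), in MB/s.
def bandwidth_gb_s(bus_width_bits, effective_mem_clock_mhz):
    return bus_width_bits / 8 * effective_mem_clock_mhz / 1000  # GB/s

print(bandwidth_gb_s(256, 1800))  # ~57.6 GB/s -- 256-bit card (e.g. 8800GT-class) at ~1800MHz effective
print(bandwidth_gb_s(384, 1800))  # ~86.4 GB/s -- 384-bit card (e.g. GTX-class) at the same memory clock
[/code]

Same memory clock, roughly 50% more bandwidth just from the wider bus, which is why AA/AF at high resolutions hits the 256-bit cards harder.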

Redesigning from the ground up isn't an easy task, and a combination of the new tech (G92) with the old 384-bit interface of the GTX is the way I see forward for a single card faster than the current GTX and Ultra. They don't need to design a whole new card from the ground up to get better performance. Well, you saw what happened with the FX series, and with the 7 series, with its poor IQ and the inability to do both AA and HDR, where ATI had great IQ and could easily do AA and HDR. Do you think Nvidia want another round of being a little bit down and out when they can take their new G92 core and simply (erm, well, it should be simple for them :p) add a bigger bus? They've struck it rich with the G80 and G92 cores, and one mixed between the two would give them another winner, in my opinion, from what I know and have seen of the GTX, GT and GTS.
 
Last edited: