GTX 380 & GTX 360 Pictured + Specs


Those numbers look totally off the wall compared to all but the most optimistic figures. More BS imo - a pretty obvious attempt to get folks to hold off until after Xmas, as post-Xmas is usually a dead time, and if they can get folks to wait until after Xmas they'll wait till March/April. I expect to see a lot more FUD from them in the lead-up to Xmas.

rroff's estimates seem about right - maybe even a little optimistic if the leakage problems/yields are even slightly worse than projected, particularly for the 360.
 
Last edited:
rroff's estimates seem about right - maybe even a little optimistic if the leakage problems/yields are even slightly worse than projected, particularly for the 360.

That will be the crux of just how well it does or doesn't do... potentially the shader performance could put it well out ahead... but I doubt that power will be doing much for general rendering.

I'm still a little at odds over the SP shader performance figures and what I've heard elsewhere tho... which would have put the 380GTX at a theoretical 3.5-3.8TF when compared against the 285GTX - but looking at the 1.74TF SP figure for the 380GTX I'm starting to think they were talking about 2x 380GTX in multi-GPU.
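
Quick back-of-envelope on that - rough sketch only; the 512-shader count comes from the leaked specs, while the ~1700MHz hot clock and 2 FLOPs/shader/clock are just my assumptions to make the 1.74TF figure work:

```python
# Rough sanity check on the rumoured SP throughput figures.
# Assumed (not confirmed specs): 512 shaders, ~1700 MHz hot clock,
# 2 FLOPs per shader per clock.
shaders = 512
hot_clock_ghz = 1.7
flops_per_clock = 2

single_gpu_tf = shaders * hot_clock_ghz * flops_per_clock / 1000  # ~1.74 TF
dual_gpu_tf = 2 * single_gpu_tf                                   # ~3.48 TF

print(f"single: {single_gpu_tf:.2f} TF, dual: {dual_gpu_tf:.2f} TF")
```

The dual-GPU figure lands right at the bottom of that 3.5-3.8TF range, which is what makes me think the number I'd heard was for two cards.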
 
Last edited:
As for Assassin's Creed and DX10.1.

The reason they had .1 nerfed in AC was not because nVidia can't do it - nVidia can in fact do most .1 features anyway - it was because nVidia were already running at optimal performance in DX10, while ATI, due to the way their architecture works, needed .1 to get up to full speed... not a very nice move by nVidia, but it keeps ATI cards just behind them performance-wise in the game instead of just ahead... nVidia cards would see little to no benefit from DX10.1 in that game specifically.

What? This is getting ridiculous. Ever since they patched DX10.1 out of the game, everyone on earth has known it's because Nvidia can't do it and ATi can, and it gives them a significant performance advantage in the game.

You go on to say they didn't do it because they can't do it... then, much fluff later, that they did it because ATi can do it.

So Nvidia can do DX10.1 just fine, but they'd see no benefit, while ATI can do it and gets a massive benefit.

I seriously give up. Also, if we're going down the 'I know more than you, I just can't say because of an NDA' route, well, I'm also under an NDA which says the 380GTX will be 948% slower than a 5870. Yes, we can all say silly things. The simple and very basic fact is you do not know how fast it is. How do I know that? Because Nvidia DO NOT HAVE FINAL SILICON. The head engineer doesn't know how fast it will be, Mr Spin CEO doesn't know how fast it is. YOU DO NOT KNOW HOW FAST IT IS. Seriously, stop pretending you do.

Nvidia are MORE than happy to leak numbers to screw another card's launch, and they still haven't done so, because they still don't know the final specs of the card, let alone the clock speeds. They can't possibly know how much of a clock speed boost A3 will get them until it's back, and that will affect the yields and specs of the cards. Yes, the base specs are known.

But what if it turns out they can't get a single 512-shader part back at over 300MHz, but they can get 30 cores if they drop one shader cluster, and even then only at 600MHz? If they go for 10% fewer shaders they can eke out 750MHz. We don't know, because they don't know.

10% fewer shaders and 10% lower clock speeds and you could be looking at 20-25% less performance. In all likelihood they will indeed launch a 380GTX at full specs, hopefully at top clocks, but considering the lack of clocks/yields on A2 and the lack of improvement in TSMC's process, it's hardly likely to sell in large quantities.
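
Just to show the compounding - a quick back-of-envelope only, with the 10% figures above as the hypothetical cuts:

```python
# How a 10% shader cut and a 10% clock cut compound (hypothetical figures).
shader_factor = 0.90  # 10% fewer shaders
clock_factor = 0.90   # 10% lower clocks

remaining = shader_factor * clock_factor  # 0.81 of original throughput
print(f"theoretical throughput lost: {(1 - remaining) * 100:.0f}%")  # ~19%
# The 20-25% quoted above allows a bit of margin on top of this theoretical
# ~19%, since game performance rarely tracks raw shader throughput exactly.
```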

The question is where the 360GTX part will fall. On this process they will obviously need a part cut down enough to provide as many salvaged parts as possible, which means it's the 360GTX that's in most danger of being cut down or clocked down. I wouldn't be surprised to see Nvidia launch with 3 versions for a change: a 375GTX straight away and a really badly cut-down 360GTX to bump that effective yield as much as humanly possible.

But again, Nvidia cannot possibly know where those numbers will end up in clocks/shaders, maybe even the bus being cut down. As for whoever is telling you information or leaking you a guess at final performance: well, if the CEO is spinning his ass off all over the place, what makes you think everyone internally isn't doing the same to the people they work with, so those guys don't decide to move towards AMD?

Just because you're under an NDA does NOT mean you're getting accurate info, not least because there isn't any.

This, by the way, is coming from someone who used to do hardware reviewing and has signed NDAs with Nvidia, ATi, Intel, AMD and a few others, so I know full well that what I'm told under NDA isn't always accurate or final data, be that from Nvidia or AMD. PR is PR, and my guess would be that if you're in contact with anyone at Nvidia, it will be their developers helping with a project your company is working on. The developers know the LEAST of anyone in a company like Nvidia/AMD; they are given the best-case scenario and told to tell everyone how well everything is going.
 
Last edited:

Hello, Charlie?

Figures don't miraculously jump by orders of magnitude between spins... they've been refining down to a target level for a while... this isn't 12+ months ago; they know roughly what to expect... they can't know exactly, no - but they aren't just winging it either.

I've never pretended to be under any nVidia-specific NDA - I'm quite aware that some of the people I know who are may not be "leaking" the most accurate info - or may even be intentionally spinning - but I'm capable of reading between the lines to a certain extent.


As to "everyone on earth knows its because nvidia can't do it and ATi can " - have you looked at what features are exposed through nvapi, etc.? most of the useful stuff from the DX10.1 specs are possible on nvidia cards unofficially... it wasn't to their benefit to expose that tho.
 
and my guess would be that if you're in contact with anyone at Nvidia, it will be their developers helping with a project your company is working on. The developers know the LEAST of anyone in a company like Nvidia/AMD; they are given the best-case scenario and told to tell everyone how well everything is going.

I'll pull you up on that... the people you're describing would be public relations and sales... developer liaisons at nVidia generally seem to know their stuff and often have the hands-on experience needed to do their job.
 
[image: 2z9eerc.png]


:p
 
developer liaisons at nVidia generally seem to know their stuff and often have the hands-on experience needed to do their job.

I used to go drinking on a regular basis with a senior NV dev rel guy; I wouldn't agree with any of that. :p

(free cards were always nice though)
 
Tbh the main thing for me is Eyefinity.

If Nvidia bring something out like this and their card is better, I will get that. If they don't support something like that, I will probably go for a 5970 when they drop in price.
 
As for Assassin's Creed and DX10.1.

The reason they had .1 nerfed in AC was not because nVidia can't do it - nVidia can in fact do most .1 features anyway - it was because nVidia were already running at optimal performance in DX10, while ATI, due to the way their architecture works, needed .1 to get up to full speed... not a very nice move by nVidia, but it keeps ATI cards just behind them performance-wise in the game instead of just ahead... nVidia cards would see little to no benefit from DX10.1 in that game specifically.

Funny, not so long ago I'm pretty sure you were denying nVidia had anything to do with the DX10.1 removal due to a lack of evidence?
 
I think I said there wasn't evidence that they had paid off (literally with financial incentives) the developers to remove DX10.1 support... as far as I remember I've said all along that they had it removed to prevent ATI from getting a performance advantage.
 
I buy nVidia because they're better, the drivers are better, and there are things like nHancer - fantastic programs that don't work on ATI cards. I have never owned and never will put an ATI card in my rig... they SUCK BIGTIME...

Nvidia "The way its meant to be played":D

The reason I get a reaction, mate, is because this forum is mostly ATI users and I'm very outspoken. I've been playing games for over 28 years, longer than most of the "Padawans" on this forum, so I will say what I want. Some people on here also don't like it when you have a bit of cash to play with and a nice big powerful rig... I sense a hint of jealousy..... IN DA AIR! LoL

Over at Guru3D the forum is mostly nVidia users; if somebody posts over there about having ATI they don't get flamed for it like they do over here...

Translated:

I will spend money that I don't even need to spend, for possibly a lot less performance, to prove some sort of point, though I'm still not quite sure what that point is.

Nvidia "The way you're meant to be played" *bends over*

I troll for reactions by posting deluded nonsense about another faceless corporation that also wants my money.

Neh, neh! People who have common sense and who are more sensible with money are just jealouse because I are can spend the more monies then yuor can.

Age/amount of years playing games x amount of games played = the right to say whatever I want, even if that's trolling and/or deluded nonsense.

I don't actually know the difference between fanboys and users of a certain brand of hardware; if someone has an ATi card, then they're a horrible fanboy and a troll, and anything they say is invalid because it doesn't match what I think. What I think is absolute, and no, my neck isn't brown and smelly.
 
So are you or are you not an ATI fanboy, kylew?

Are you not an nVidia fanboy? You know very well that I DISLIKE nVidia; that has absolutely nothing to do with ATi, though. How hard is that to understand?

I can disagree with what nVidia does without having any sort of affinity for ATi, now can't I?

I don't support what nVidia do as a company, so whilst they continue to be dishonest and underhanded, I hold my stance of "it's wrong" by not supporting them with my money.

On the other hand, if nVidia offered noticeably better performance for a similar price, I wouldn't "punish" myself and buy what would obviously be a lacking ATi card just because I dislike them.

I dislike nVidia, but I'm not stupid.

Say there were 6 players in the graphics card "world", and I still disliked nVidia, who would you call me a fanboy of?

Am I a Samsung fanboy for disliking Maxtor hard drives just because I've got about 8 Samsung HDDs?
 
Last edited: