
GTX 380 & GTX 360 Pictured + Specs

A "proper case" is not a huge case as far as im concerned. Its a decent quality, well designed case.
 
[Image: Xclio2.jpg]


Well, I have this case :D No problem with space, plus it's got a jet of a fan, so I have to chain it down so it doesn't fly away :eek:
And happily enough I'm getting my GeForce GTX 260 216 Sonic graphics card.
 
They could just as easily be A2-silicon versions running at those clocks, undervolted and running very cool; those clocks might even be well under the expected final clocks. Unfortunately it's all guesswork. All we can say with reasonable accuracy is that they do seem to have working versions.


Not really. It would be a rare breed of idiot that had fully working A2 silicon clocking better than expected, then spent millions and delayed production by a further eight weeks by sending a new revision of the silicon off to be produced. Now, I know Nvidia have done some truly idiotic things, but I think even they wouldn't turn down the opportunity to release a fully working product performing as expected or better, only to spend millions more on a new revision, because that quite literally makes no sense.


Yes, shift the blame, because ATI can do no wrong.

Well, firstly, TSMC said they had broken equipment. Secondly, it cost TSMC in the region of hundreds of millions of pounds in lost production and wasted silicon, and publicly tarnished their name, again. Thirdly, the poor production is largely responsible for Nvidia's manufacturing problems and the need for extra tweaked revisions of their silicon and reduced clock speeds. So you can pretend it's AMD fans making an excuse purely for AMD, or you can realise the manufacturing issues are affecting both companies, and that the same manufacturing issues have barely affected AMD in the three years since the 2900 XT, while Nvidia have had the heavily delayed GT200 shrink to 55nm, the 40nm shrink cancelled at a large cost, the massive delay to the low-end GT200b parts, the cancelled mid-range GT200b parts and the late Fermi.

So it's not really an AMD excuse when it has actually affected Nvidia far worse, and it's something Nvidia can point to as a reason for the lateness, while AMD, if you noticed, were on time and will turn out to be six months ahead of Fermi with a full mid/low-end range of the new architecture. Maybe, just maybe, think about who has actually been affected over the last two years, and who is still being affected, before you decide it's an AMD excuse.


That's all ignoring the fact that AMD cards were quite clearly being produced without problems at launch; yields DO NOT change overnight without any change to the core. Literally the ONLY possibility for a sudden, massive change in availability is the manufacturing plant screwing up, and considering they've said they did, I'm not sure how you can argue it's not TSMC's fault. Fud says it's TSMC, TSMC say it's TSMC, AMD say it's TSMC, Nvidia say it's TSMC, the entire world says it's TSMC. It has affected Nvidia more than AMD, yet the only people saying AMD screwed up are Nvidia guys on forums.


Well, to be honest, all I really care about is a price war, as even last-generation cards are capable of running any game today, barring DX11 features.


It won't, unfortunately. If the HD 5970 spanks Fermi and costs the same, I'm not sure where the price war comes into it. If the current rumours about the GTX 360's specs are to be believed, it will still be a huge and expensive core to make, but it won't be ahead of a 5870.

Our best hope for a price war is a product competitive with the 5770. IF Nvidia can release a lower-end Fermi with a reduced memory bus, fewer shaders and fewer PCB layers, and it actually offers decent performance at £120-150, then AMD might have to drop the 5770 to £100, and they'd HAVE to either introduce something in between the 5770 and the 5850 or drop the price on the 5850. Either way it's good for us.
 

From the specs, and being conservative with the estimate, the 360 _should_ be 10-15% faster than the 5870, and possibly 20% faster with very good drivers. The 380GTX doesn't look like besting the 5970 though... how well that pans out will depend on the price difference.
 
The GTX380 will be taking on the 5870, and the GTX360 will be going up against the 5850. The only problem is ATI have a six-month refresh plan, and reworked versions of the 5 series will probably be in production.

It's unfair for the GTX380 to go up against a CrossFire card, and impossible to say how they will compare. A beefed-up, cheap 5870 is what the GTX380 will have to contend with.
 
The 380GTX should do well against the 5970, though I don't think it's realistically likely to beat it... except in titles with poor multi-GPU support. The 360GTX _should_, as long as nVidia don't mess it up, be quite a bit faster than the 5850 on specs alone.
 
From the specs, and being conservative with the estimate, the 360 _should_ be 10-15% faster than the 5870, and possibly 20% faster with very good drivers. The 380GTX doesn't look like besting the 5970 though... how well that pans out will depend on the price difference.

I'd be surprised. From those specifications, and assuming the same clock speeds, the 5870 beats the 360 in every area except memory bandwidth (where the difference is about 9%) and memory amount.
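
For anyone wanting to sanity-check that 9% figure, here's a rough back-of-envelope sketch in Python. The HD 5870 numbers are its actual retail memory specs; the GTX 360 bus width and memory clock are only stand-ins for the rumoured specs, not confirmed values.

```python
# Theoretical memory bandwidth from bus width and effective memory transfer rate.
# HD 5870 figures are the shipping specs; the "GTX 360" figures are placeholders
# for the rumoured 320-bit GDDR5 setup, not confirmed numbers.

def bandwidth_gb_s(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Bandwidth in GB/s = (bus width in bits / 8) * effective rate in GT/s."""
    return bus_width_bits / 8 * effective_rate_gtps

hd5870 = bandwidth_gb_s(256, 4.8)   # 153.6 GB/s (256-bit, 4.8 GT/s GDDR5)
gtx360 = bandwidth_gb_s(320, 4.2)   # 168.0 GB/s (illustrative memory clock)

print(f"HD 5870: {hd5870:.1f} GB/s")
print(f"GTX 360 (rumoured): {gtx360:.1f} GB/s")
print(f"Advantage: {100 * (gtx360 / hd5870 - 1):.1f}%")   # ~9%
```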
 
The 5870 has slightly faster pixel and texture fillrate than the 360GTX, but nVidia cards have traditionally run somewhat more optimally in this regard, so with the figures this close they are likely to edge ahead by around 5-10% in real-world performance. Then you have the extra memory bandwidth, better efficiency in some parts of the shader pipeline, etc., which all together should be good for another 5-10%. As long as nVidia keep on the ball it should just about edge out the 5870. The 380GTX, based on the specs, is roughly 40% faster than that, maybe 45% pushing it, which would tuck it in behind the 5970, though it probably won't be able to touch it in games where multi-GPU is good for 80%+ gains.
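
Just to make the arithmetic explicit, here's how those guesses stack up if you take the 5870 as a 1.00 baseline. The individual factors are the estimates above, not measured results, so treat the output as illustrative only.

```python
# Stack the rough relative-performance factors quoted above, with the HD 5870
# normalised to 1.00. These factors are forum guesses, not benchmark data.

def stacked(*factors: float) -> float:
    """Multiply a chain of relative-performance factors against a 1.00 baseline."""
    result = 1.00
    for factor in factors:
        result *= factor
    return result

# 360GTX: ~5-10% real-world fillrate edge, plus ~5-10% from bandwidth and
# shader-pipeline efficiency.
gtx360_low, gtx360_high = stacked(1.05, 1.05), stacked(1.10, 1.10)

# 380GTX: "roughly 40% faster than that, maybe 45% pushing it".
gtx380_low, gtx380_high = gtx360_low * 1.40, gtx360_high * 1.45

print(f"360GTX estimate: {gtx360_low:.2f}x - {gtx360_high:.2f}x a 5870")   # ~1.10x - 1.21x
print(f"380GTX estimate: {gtx380_low:.2f}x - {gtx380_high:.2f}x a 5870")   # ~1.54x - 1.75x
```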
 

I have seen absolutely no evidence of Nvidia 'traditionally being more optimal' with regard to the fillrates of the cards (since when was 'tradition' a factor in technology?). In fact, in the previous generation of RV770 vs. GT200, the RV770s were competitive in raster-bound workloads such as antialiasing (and significantly, it could theoretically calculate twice as many z-buffer operations per unit as the GT200, if my memory serves) despite having half the units. I've always thought that it was mostly just Nvidia's habit of putting in significantly more of those types of units; at least that has been true for every generation since G80.
 
I think Nvidia's unreleased, unbenchmarked theoretical card might possibly beat something that is available today from AMD.

Or you could just buy an AMD card today and enjoy something which kicks everything Nvidia ACTUALLY produces right in the nuts.

Just a suggestion.
 
Also, you can hardly call the spec of the new card traditional. Nvidia are taking a huge gamble on this card with its more CPU-like design.

The other problem is Nvidia don't usually respond well to having to beat ATI. They usually get themselves in a right old tizzy and struggle to accept reality for a while.
 
The ROP and TMU setup is similar to the old cards on both designs though. In that respect ATI always did slightly better in synthetic tests but dragged behind slightly in actual game benchmarks, especially with lots of alpha (i.e. smoke), where you could compare fillrate. Unfortunately I can't find my idtech3 "shader" benchmark results offhand. (These aren't pixel shaders, but old-style rasterization.)
 
But how exactly do you know this won't perform similarly to an HD 5970? Even if it's a little slower, it has the advantage of being a single-GPU card.

Have you seen the size of the HD 5970? It's a joke.

There are uATX cases that can take the 5970; IMO any full-size "ATX" case that can't take ATX cards is a joke.
 
I think Nvidia's unreleased, unbenchmarked theoretical card might possibly beat something that is available today from AMD.

Or you could just buy an AMD card today and enjoy something which kicks everything Nvidia ACTUALLY produces right in the nuts.

Just a suggestion.

Or he could wait until nVidia release their next-gen cards and then make a proper judgement on what to buy. That might be a better suggestion.
 
There are uATX cases that can take the 5970; IMO any full-size "ATX" case that can't take ATX cards is a joke.

"ATX cards" - I didn't realise the ATX specification actually covered the length of the card...

My case won't fit a 5970 unless I rip the HDD cage out; it's a Zalman GT900 and certainly no joke. Calling the size of the 5970 a joke is pretty pointless IMO, and so is calling any case that can't hold one a joke.
 