GTX 380 & GTX 360 Pictured + Specs

Assassin's Creed isn't a game; at what point do you have any fun?! (Apart from turning it off and uninstalling it.)

And lol at the bickering; I thought PC people were supposed to be above the level of PS3 vs Xbox childishness.

Just buy what you want; it's your own money, for goodness' sake!

There are some good posts in these "bickering" threads; drunkenmaster usually posts something resembling sense and is relatively well informed. Some of us here genuinely want to debate the capability of a card we don't have performance figures for, and while some opinions are little more than guesswork and interpolation (*cough* Roff's *cough*), the process of debating them does inform me (or pushes me to get more informed) about the cards we're spending hundreds of pounds on.

What amazes me is the time some members spend furiously scouring the internet for benchmarks of different cards; if they dedicated as much time to understanding how the cards actually work, they'd learn something :p
 
It's not entirely guesswork... sure, most of my sources are under NDA, and I'm mostly working from snippets they let slip, mixed with some semi-educated opinion on the matter.
 
As for Assassin's Creed and DX10.1:

The reason DX10.1 support was nerfed in AC was not because nVidia can't do it - nVidia can in fact do most 10.1 features anyway - it was because nVidia were already running at optimal performance in DX10, while ATI, due to the way their architecture works, needed 10.1 to get up to full speed... Not a very nice move by nVidia, but it keeps ATI cards just behind them performance-wise in that game instead of just ahead... nVidia cards would see little to no benefit from DX10.1 in that game specifically.
 
Or maybe it's because nVidia are run by slimy bar-stewards who have no problem screwing over the PC industry to hide their own shortcomings.
 
If NV could do most 10.1 features, then why did it take them so long to get a 10.1 card out of the door?
Originally the game supported both 10.1 and 10, afaik, and the 10.1 path was removed because it performed better on ATi than plain 10 did on NV, while on plain 10 ATi was a bit weaker (something that has since been fixed via drivers, afaik).
 
After seeing the Fermi presentation by Jen-Hsun, I'm left disappointed. All that mumbo jumbo about rendering some weird liquid-looking thing, the amount of computing this thing will pull off, etc.
He kept going on and on about computing power, floating-point calculations and precision; most of this stuff is hardly relevant to us gamers. Maya lovers will probably appreciate it, but come on, the main customers are always going to be gamers. Unless some games are actually going to use this, but after all the console ports we've grown accustomed to, I really doubt that.
Overall, Fermi is definitely hit or miss, and I'm inclining towards a big miss, GeForce FX 5800 Ultra style.
 
I am aware of that, but the fact is that Fermi is still being flaunted as a GPU computing platform that will be multi-programmable, blah blah, with no real promise of actual gaming performance. Someone has already made this point in one of the articles I read about it, and I tend to agree.
 
Yay, I won't have to go get an ATI :D I'm sure ATI will control the market for a little while longer, but I'm also sure NV will be right back at the top soon, so hopefully I won't have to get an ATI.
 
If NV could do most 10.1 features, then why did it take them so long to get a 10.1 card out of the door?
Originally the game supported both 10.1 and 10, afaik, and the 10.1 path was removed because it performed better on ATi than plain 10 did on NV, while on plain 10 ATi was a bit weaker (something that has since been fixed via drivers, afaik).

nVidia support most of the DX10.1 spec through NVAPI/non-standard paths... afaik the only thing that's entirely unsupported is the increase in the number of lights you can handle in a single pass in a deferred shading pipeline. Much of the reason nVidia didn't support it is that nVidia don't see the performance loss from using DX10 instead of DX10.1 that ATI do, due to the differences in architecture; by sitting it out they force developers' hands not to make 10.1 standard, and so nerf ATI's performance potential. It's a nasty move, but it's business.
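For anyone curious how a renderer actually decides between the two paths: below is a minimal sketch, assuming the standard d3d10_1.h header, of the feature-level probe a game would run at startup (the CreateBestDevice helper name is my own, and error handling is trimmed). The 10.1 path in AC reportedly saved a whole rendering pass with AA enabled by reading the multisampled depth buffer directly, which is exactly the sort of thing a renderer gates behind a check like this.

```cpp
// Minimal sketch: probe for a DX10.1 hardware device, falling back to DX10.0.
// A game enables its 10.1-only rendering path only if the first attempt works.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

ID3D10Device1* CreateBestDevice()  // hypothetical helper, not from any SDK
{
    // Try the highest feature level first, then fall back.
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,  // ATI HD 3000/4000 series at the time
        D3D10_FEATURE_LEVEL_10_0,  // nVidia G80/G92/GT200 and everything else
    };

    for (D3D10_FEATURE_LEVEL1 level : levels)
    {
        ID3D10Device1* device = nullptr;
        HRESULT hr = D3D10CreateDevice1(
            nullptr,                     // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,  // real GPU, no reference rasteriser
            nullptr,                     // no software module
            0,                           // no creation flags
            level,
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr))
            return device;  // caller can confirm via device->GetFeatureLevel()
    }
    return nullptr;  // no DX10-class hardware at all
}
```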

After seeing the Fermi presentation by Jen-Hsun, I'm left disappointed. All that mumbo jumbo about rendering some weird liquid-looking thing, the amount of computing this thing will pull off, etc.
He kept going on and on about computing power, floating-point calculations and precision; most of this stuff is hardly relevant to us gamers. Maya lovers will probably appreciate it, but come on, the main customers are always going to be gamers. Unless some games are actually going to use this, but after all the console ports we've grown accustomed to, I really doubt that.
Overall, Fermi is definitely hit or miss, and I'm inclining towards a big miss, GeForce FX 5800 Ultra style.

I don't think it's going to be a miss as such... but it's a bit late to the table for what it is. It's looking like the 360 GTX is going to be price- and performance-comparable to the 5870, but it won't convincingly beat it on either, and the 380 GTX, which does look like convincingly beating it, sits price-wise in the 5970/dual-58xx arena, where it looks to be barely clinging on by the skin of its teeth performance-wise... so unless they really beef up the shader performance, it's going to be a less than spectacular launch.
 
These were posted in another thread here, and are supposedly the work of a troll.

The numbers are more or less correct for the specifications listed, but the specs listed are around 20% higher than the supposed retail target.
 