• Competitor rules

    Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.

Nvidia to showcase Fermi/GTX 300

Well isn't that a surprise :rolleyes:

Answer the question instead of dodging it.

What do you expect, that I won't challenge your comments while others have been doing so quite a lot lately? I've seen it in many threads I didn't bother to get involved with, which shows that what is aimed at you is purely of your own making.
 
:rolleyes: and you expect me to waste my time trying to convince someone who doesn't want to be convinced... you've either gotta meet me half way on this or pipe down.

Nothing I say will be taken on its own merit around here.
 

Everything you say is taken on its own merit as few here if any know you personally.

Now answer the simple question.

You could have just answered the question instead of making that silly comment aimed at me, as if somehow I don't have a right to comment on your posts, which has now led to three comments of nothing.
Talk about missing an opportunity.
 
4870x2 will be hotter than a 5970 will ever be.

We've heard that ATI is working on an X2 version of Radeon HD 5870 card but the biggest obstacle is the power and how to cool such a card.

We’ve learned that the current TDP for the X2 is 376W and that the company is working on this issue, as apparently they will have to slow the GPUs down by quite a lot to get rid of the heat.

Even if they use downclocked Radeon 5850 cores that run at 725MHz, the power goes down by only 36W (2x170W) to 340W. The hottest card from ATI so far was the HD 4870 X2, which had a TDP of 286W. To release a Radeon HD 5870 X2 card ATI would need to get down to at least 300W, especially due to thermal issues, but who knows, maybe ATI will launch a 300W+ card.

We might be looking at the dawn of graphics cards with three or even four power connectors, as two might not be enough this time.

http://www.fudzilla.com/content/view/15499/65/
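The article's figures line up with the standard PCIe power limits (75W from the slot, 75W per 6-pin connector, 150W per 8-pin connector), which is where the 300W ceiling comes from. A minimal sketch of that arithmetic, assuming those spec limits (`board_budget` is an illustrative helper, not a real tool):

```python
# Sanity check of the dual-GPU power-budget arithmetic quoted above.
# TDP figures come from the article; connector limits from the PCIe spec.

PCIE_SLOT_W = 75    # power available from the PCIe x16 slot itself
SIX_PIN_W = 75      # per 6-pin PCIe power connector
EIGHT_PIN_W = 150   # per 8-pin PCIe power connector

def board_budget(six_pins: int, eight_pins: int) -> int:
    """Maximum board power deliverable with a given connector loadout."""
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

full_speed_x2 = 376       # two 5870-class cores at full clocks (claimed TDP)
downclocked_x2 = 2 * 170  # two downclocked 725MHz (5850-class) cores

print(downclocked_x2)      # 340 - still above the usual ceiling
print(board_budget(1, 1))  # 300 - the standard 6-pin + 8-pin limit
print(full_speed_x2 <= board_budget(2, 1))  # False: even 6+6+8-pin (375W) falls 1W short of 376W
```

This is why the article floats three or even four power connectors: 340-376W simply doesn't fit inside the conventional slot-plus-two-connector budget.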
 

Seeing as you're not prepared to meet me halfway... how about we turn this around... you show what's not true about his comment and I'll come back with counterpoints if appropriate.
 

I have not refused to meet you halfway; you have not said anything, in regards to the question, to meet you on.

The only thing you said was:
Well isn't that a surprise
Which is of no use to anyone.

If you had the answer you would have said it instead of rolling your eyes.
 
Back on topic... wonder how long until games start making use of DirectCompute, etc. for AI pathfinding and stuff like that... should make for some interesting advances in RTS games.
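For a sense of why pathfinding suits GPU compute: the classic "wavefront" (flood-fill) distance map updates every cell independently each pass, which maps directly onto one compute-shader thread per cell. A hedged CPU-side sketch of that pattern (the grid and goal here are made-up illustration data, not from any game):

```python
# CPU sketch of the data-parallel "wavefront" pathfinding pattern that maps
# naturally onto a compute shader: every cell is updated independently each
# pass (one GPU thread per cell), and passes repeat until nothing changes.

INF = float("inf")

def wavefront(grid, goal):
    """grid: list of strings, '#' = wall. Returns per-cell distance to goal."""
    h, w = len(grid), len(grid[0])
    dist = [[INF] * w for _ in range(h)]
    dist[goal[0]][goal[1]] = 0
    changed = True
    while changed:
        changed = False
        # Each cell reads only its neighbours' previous values, so all
        # cells in one pass could be computed in parallel on the GPU.
        new = [row[:] for row in dist]
        for y in range(h):
            for x in range(w):
                if grid[y][x] == "#":
                    continue  # walls stay unreachable
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        new[y][x] = min(new[y][x], dist[ny][nx] + 1)
                if new[y][x] != dist[y][x]:
                    changed = True
        dist = new
    return dist

grid = ["....",
        ".##.",
        "...."]
d = wavefront(grid, (0, 0))
print(d[2][3])  # prints 5: shortest route around the wall to the goal
```

Units then just walk downhill along the resulting distance field, so one GPU-built map serves every unit heading to the same goal, which is exactly the kind of win that matters for RTS-scale crowds.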
 
Say goodbye to a GPU maker (Nvidia) that pushes the limits, works along and funds games developers, promotes innovation and new ideas etc, and welcome the new overlords (ATI) that don't care about software development, promote mediocrity, supply cheap unreliable parts etc. The future of PC gaming looks even darker.


OK, so let's break this down then, shall we?


"a gpu maker (nvidia) that pushes the limits" Well, yes they do; they push the limits of what a graphics card can do, but then so does ATI. They also push the limits of what the general public will put up with, but that's a whole different discussion. :p

"works along (I'm sure that's supposed to say alongside) and funds games developers" Well, I don't think anyone can argue against this, hence the whole batmangate :eek: issue; needless to say, Nvidia definitely works alongside developers and funds game development.

"promotes innovation and new ideas etc" Well, they have brought us 3D Vision and PhysX on their mainstream cards; I think that counts as new and innovative. ;)

"welcome the new overlords(ati)" Well, they certainly have rolled in, flattening Nvidia's card lineup, and they are just about to claim the top slot of fastest overall card as well.

"that dont care about the software development" Well, there are those that say this is correct and those that don't. All I will say is that from my experience there are more post-release issues with ATI cards and drivers than with Nvidia ones, mainly needing a patch or a hotfix driver. Bottom line: I wouldn't go as far as to say they don't care, but maybe they're not quite as good at it as Nvidia.

"promote mediocrity" Er... er... well, Roff did only say the comment isn't without some truth... so we can ignore this bit. :)

"supply cheap unreliable parts etc" Well, all the ATI lovers are constantly banging on about how ATI's cards are cheaper than Nvidia's. As for reliability, there are a lot of threads about the 5800 series, but I wouldn't label them unreliable yet.

So there we have it: on the whole the comment was not totally untrue, so Roff's response was pretty much right.
 
Back on topic... wonder how long until games start making use of DirectCompute, etc. for AI pathfinding and stuff like that... should make for some interesting advances in RTS games.

I totally agree; the possibilities are very exciting with the things these new cards from both camps are going to be capable of. We are just going to have to wait and see how the computing power is utilised.
 

I think you pretty much summed it up.
 
There's nothing unreliable about ATi cards; with the launch of any new item comes failures, because some items arrive broken. CPUs are the most strictly controlled items in terms of manufacture (largely because AMD and Intel don't rely on cheap crap like TSMC to make things), so they rarely fail, but almost every new launch is followed by multiple threads on mobos which don't support the new CPU without flashing first. I remember many, many posts on mobos that needed flashing with a new BIOS, using an old P4, before they'd work with a C2D. Does that make Intel unreliable, or simply new?

As for "promotes new ideas", well, Nvidia didn't start PhysX; they bought an almost unused API and PAID lots of companies to use it, of which all but a VERY small handful are useless implementations. There's very little innovative about it: better cloth effects, better windows breaking; that's been a gradual improvement since the dawn of games, not some new idea Nvidia, nor Ageia, had. 3D screens? Nvidia certainly didn't do that first either. DX9, 10, 11, 10.1? Nope, ATI there first (original DX10; Nvidia just flat out never provided a real DX10 card). Funds game developers? Not really; they pay them to sabotage the competition. Fund entire games, make games possible where without their funding the games wouldn't exist? Again, no. They pay for advertising, and they do appear to pay to sabotage, but they don't actually fund and promote fair gaming.

People seem to believe that ATi aren't interested because they won't spend millions a year buying up game devs and having them screw their opposition. In reality there are plenty of people at ATi who work with multiple game devs; in fact even those devs Nvidia pay for special attention still get ATi support.

ATi almost always have an official hotfix driver with specific improvements for most new games that need some kind of fix. In fact they, and Nvidia, often have a new driver set for games before release; ATi often have these for TWIMTBP games, and what's more, they've had working drivers on release of TWIMTBP games when Nvidia haven't.

So to break it down, the only original thing they've brought us in the past 5-10 years is... anti-competitive practices? Personally I'm happy ATi haven't led the way on that front.

As for card costs of 5870s: the recent increase is directly down to TSMC's reduced output, and that will only not affect Fermi because it will be fixed soon after Xmas, you know, months before Fermi is even out. The prices will have dropped again before Fermi becomes available.

Also, Loadsa, you might want to read Fudzilla more closely and realise he's never gotten a single thing right; even most recently he's had a go at AMD's Bulldozer being two quads stitched together, because he wasn't able to comprehend he was looking at a single core with 8 integer pipelines, not two quad cores together. He's ignorant, stupid, and so anti ATi/AMD it's hilarious.

The 5970 will no doubt be a high-power card, but it's unlikely to be hotter, and the lower clocks are unlikely to be due to temps; it's just a power limit. The card won't be far off twice as fast as a 4870X2; that it will use a little more power isn't surprising, and there's still a limit to the amount of power you can safely supply to a card.

As for Fermi being 50% faster than a 295GTX: Nvidia have never beaten the last-gen cards in crossfire/SLI. A few percent here and there in some games, sure; even one of the biggest gains overall, the move to the 8800GTX, was slower than X1950XTs in crossfire in most of the big titles of the time (BF2, Ep 2, etc.). Nvidia will likely come in ahead of a 4870X2/295GTX with a single GPU, but be behind 2x4890s/2x285GTXs in crossfire.
 

/thread
 

What are you talking about CPUs for?
"Theres nothing unreliable about ATi cards" Did I say there was?


Oh, what's the point.
 
The 5970 is slated to have lower clocks; could that insinuate that they're failed 5870 cores that couldn't quite make the 850/1200 standard?

So that'd still be a TSMC fail, lol.

Or it is an ATI fail because of a design flaw preventing enough chips from hitting the predicted speed.

Or it is simply a heat/power issue.


It is certainly nothing to do with TSMC.
 