
DX10.1 vs DX10 - performance video

I disagree - nvidia had a working, stable and feature-rich hardware physics implementation up and running in a matter of weeks... and it is something that game developers are interested in. I wouldn't call DX10.1 unimportant either - some of the stuff like SSAO has huge potential - but outside of the few game studios being pressured to use it by ATI, there's very little real usage of or interest in it at this time. By the time most developers need these features to implement things in a game, we should have DX11 available, and GPUs that can use these features to best effect rather than in a limited, highly specific fashion.

I think what some people are trying to get at is that, at the time, DX10 was meant to be 10.1, but nvidia went moaning to MS and it got changed. It really seems to me that was a bit of a kick in the teeth for ATI, if that is true.
 
Not really - use the search function and you'll see how many times I have commended or recommended ATI for other reasons... I've often pointed to the 4870 as a worthy alternative to the 260 in many of the threads where people have asked for advice...

I'm mostly skeptical of their ability to deliver what game developers actually need right now, which nvidia tends to get right more often - hence my preference in that direction at the current time.

I am most certainly less biased towards nvidia than people like rafster and lightnix are towards ATI.

Maybe, but you seem to really push certain things to do with nvidia.
 
LOL! The big bang drivers come out... ATI took over a month to respond with 8.12, which almost caught up again performance-wise... at which point nvidia brought out another driver set which gave 7-10% gains across the board... and guess who took almost a month to come out with a driver that caught up again? And again. And now we see the 182.xx and 185.xx drivers out with more big gains - in some cases more than 40% performance gains... ATI will no doubt respond in their next driver release... but when was the last time you saw them push out drivers with a performance increase that wasn't in reaction to a performance increase from nVidia?

PhysX useless? Under-used, maybe... only someone with bias or of limited intelligence would call it useless.

Limited intelligence? You're worse than helmutcheese, mate! Are you his son or brother?? All you do is push what frankly are useless features, call anyone who doesn't agree with you "limited intelligence" and put down ATi on every front. If you're happy spending double the amount on some of the cards that came down, then..... this round is the consumer round - there is so much choice, and at a great price.

PS I have owned more Nv cards than ATi!
 
So you admit your bias then?

Not biased - this round for me is ATi. If 280s were available for £200 I'd swap out my X2 for one, based on the thread I opened (I don't need the brute force of an X2).

The reason I am so vocal is that this forum is full of crap about the drivers, performance etc. of ATi cards...

The difference between you and me is that I would have no problem getting an NV card; you will probably never get an ATi card because of your love for their "tech"...
 
I disagree - nvidia had a working, stable and feature-rich hardware physics implementation up and running in a matter of weeks... and it is something that game developers are interested in. I wouldn't call DX10.1 unimportant either - some of the stuff like SSAO has huge potential - but outside of the few game studios being pressured to use it by ATI, there's very little real usage of or interest in it at this time. By the time most developers need these features to implement things in a game, we should have DX11 available, and GPUs that can use these features to best effect rather than in a limited, highly specific fashion.

PhysX useless? Under-used, maybe... only someone with bias or of limited intelligence would call it useless.

OK, maybe not useless, but at this current time I would not choose one card over another just because it supports PhysX or has DX10.1.

The reason I am so vocal is that this forum is full of crap about the drivers, performance etc. of ATi cards...


It's not just this forum, it's the Internet - everywhere, people are choosing Nvidia over ATI because of what a few fanboys say about ATI and their drivers.
 
OK, maybe not useless, but at this current time I would not choose one card over another just because it supports PhysX or has DX10.1.

Yeah true enough.

Besides, as long as you have a spare PCI-E port running at x4 or faster, and avoid Vista (and there's plenty of reasons to do that), you can slot in a cheap nvidia card and get full PhysX acceleration - shame they don't promote this more. Although personally I'd rather see a platform-independent solution.
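For anyone wondering what that looks like on the software side, here's a rough sketch of how a game requests hardware simulation through the PhysX 2.x SDK. This is from memory of that SDK's API, so treat it as illustrative rather than definitive:

```cpp
#include <NxPhysics.h> // PhysX 2.x SDK

// Minimal sketch: create the SDK and request a hardware-simulated scene.
// The API doesn't care whether the accelerator is your main card or a
// cheap second card sitting in a spare PCI-E slot.
int main()
{
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return 1;

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = NX_SIMULATION_HW; // ask for hardware acceleration

    NxScene* scene = sdk->createScene(sceneDesc);
    if (!scene)
    {
        // No PhysX-capable hardware found - fall back to CPU simulation.
        sceneDesc.simType = NX_SIMULATION_SW;
        scene = sdk->createScene(sceneDesc);
    }

    // ... create actors, call scene->simulate(1.0f / 60.0f) each frame ...

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
    return 0;
}
```

The point being that whether the simulation lands on a dedicated second card or falls back to the CPU is a runtime decision, not something the game has to be rewritten for.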
 
I wouldn't say PhysX is a useless technology - it could do with a bit of spit and polish, sure, and it's not perfect, but what is? The problem I have with it is that Nvidia just use it as a marketing tool - they don't want to enhance gaming with it, they just want to use it to sell video cards, but only their own video cards. It's a lot like how MS use DirectX to sell their operating system: at least DirectX allows a reasonably fair market as far as video cards are concerned, but it has pretty much kept the gaming market exclusive to Windows. I could see exactly the same happening with video cards if PhysX were the only dominant accelerated physics API.

I mean, power to Nvidia if they do get it running on all GPU platforms without too many political issues - good for them. That said, I'm not sad to see the demos and such go after selling my 8800 GTS 320MB, just IMO. The only PhysX game I played that really had impressive-looking, game-changing physics was Warmonger, but the game itself was terrible.

And yeah, it does annoy me no end when people talk about ATi cards inherently having low minimum framerate problems and driver issues when typically that's not the case - the same as it would annoy me if people started saying AMD CPUs crash more, or something ridiculous like that.
 


I repeat, when the 4800 series were launched, they were aimed at the 9800 GTX and 8800 series. Only R700 was meant to compete with any of the GT200 cards at that point - they were priced accordingly, until Nvidia took the axe to their pricing.

On the note of driver performance, when the cards were first launched, things were reasonably similar to what we see now. The 180 'big bang' drivers came out, which gave Nvidia a reasonably large advantage in a few games; then there was a short period of time before ATi came back with the 8.12/hotfix drivers, which pretty much levelled the playing field again.

That said, at the moment, GTX 260-216 vs. 4870 1GB? I'd get the 260 because it's cheaper. No question. Not because of CUDA, PhysX or people believing the drivers are better (falsely, IMO) - just because they're similar cards at different prices.

Yeah, the 4800s caught Nvidia completely off guard - the 4870 ended up rivalling the much more expensive 280, and even beating it at high resolutions with 8xAA. Since then it's Nvidia who have been on the back foot: they canned their first lot of GT200s within months of their release, they canned the original 260 for a new one with extra stream processors (the 216), then not long after the 280 got canned too.

ATi have won this round, and look like doing more damage, as they are about to release their 40nm 4770s in May - and what is Nvidia's response to those? That's right, yet more renaming of their years-old 8800s. :)
 
I honestly don't get all the e-Drama when it comes to the current competing ATI and NVidia cards. I think we can all agree that ATI gave NVidia 'suprise buttsecks' with the 4800 cards. Whether or not the tech behind them is better than the 200 series is open for debate, but ATI have been clever to bring in GDDR5 early.

At least we can all rest safe in the knowledge that NVidia have been shaken into action; they will NOT be resting on their laurels with the 300 series - they will be working like madmen to give us the '8800GTX of 2010', as I like to think of it.
Of course, ATI will not be idle either. The RV870 is sure to smash faces as well.
Good healthy competition from now on. :D
 
The main point is that this round is the consumer round - not PhysX, DX10.1 or whatever. Both are technologies, not product differentiators. If ATi hadn't hit as hard as they did, you'd still be paying way more for your GPUs and seeing little driver development (do you really think NV would have had an incentive to release a 'big bang', or ATi an incentive to bring the recent improvements?). Too much opinion here is theoretical and development-focused, which is of little value to reality and the consumer. TBH all these extra things like PhysX and DX10.1 are great marketing bandwagons but nothing else... whether you buy a 48xx/49xx or a 2xx card, you're buying something good that will run games well.

I hope the GPU industry continues to be competitive in the future, and I am glad there is a shift of focus away from monolithic GPUs towards great-value mid/high-end cards (somewhat ironic, as I own one!)
 
While your point has merit, if we take your thinking too far... and just stick with what's best for consumers... well, mostly that's about bang for buck, and people wanting as much as possible for as little money as possible... we'd still be stuck on fixed-function pipeline cards, 256MB of VRAM would be top of the line, and everyone would still be playing at 1280x resolutions. Half-Life 2 would just be coming out.
 
While your point has merit, if we take your thinking too far... and just stick with what's best for consumers... well, mostly that's about bang for buck, and people wanting as much as possible for as little money as possible... we'd still be stuck on fixed-function pipeline cards, 256MB of VRAM would be top of the line, and everyone would still be playing at 1280x resolutions. Half-Life 2 would just be coming out.


Err, right. May I suggest you read about competition and what happens when firms compete. One aspect is offering a product which is "better" than the other's, either primarily through a faster chip or secondarily via extra features.

The scenario you talk about is more likely if only one company existed - you'd pay more for a somewhat lesser product, and there would be less incentive for the company to innovate... there are many, many examples across industries.

What the consumer needs

- Bang for buck / value for money
- Choice
i.e. a competitive market

What they don't need

- A single major GPU player
- Little choice / brand-specific technologies

We were in danger of that during the 8xxx series, when ATi's 2xxx/3xxx series paled in comparison...

I'd much rather the companies outdo each other with new, faster, more exciting cards than push pointless tech....
 
As I've said before... what the consumer wants or "needs" and what's required to make newer and better games (with the very features the consumer is often asking for) don't exactly go hand in hand. If we continued that line of thinking we wouldn't have made half the progress we have now, and would just about be playing games like Half-Life 2 right now. If companies are focused purely on value for money, no one is willing to put up the money for the breakthroughs needed to move things forward.
 
As I've said before... what the consumer wants or "needs" and what's required to make newer and better games (with the very features the consumer is often asking for) don't exactly go hand in hand. If we continued that line of thinking we wouldn't have made half the progress we have now, and would just about be playing games like Half-Life 2 right now. If companies are focused purely on value for money, no one is willing to put up the money for the breakthroughs needed to move things forward.

We're not talking about better games or whatnot - that's the job of the developers (to stop porting loads of games and polish their games more) and of MS (to make DX better); we're talking about better GPUs. ATi upped their game to match Nvidia and Nvidia obviously feel pushed now, so we'll see some fantastic cards next round again. Hardware physics is part of that, of course, but I firmly believe the industry standard will not be PhysX but Havok and OpenCL, purely on the basis that Intel is one company Nvidia cannot defeat. ATi are at fault here for, as you say, not supporting developers as well, but I don't necessarily agree with the NV way (in any other industry, it would be construed as monopolistic behaviour...)
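On the OpenCL point: the appeal is precisely that it's vendor-neutral - the same host code enumerates whatever GPUs are installed, whoever made them, which is exactly what a cross-vendor physics standard would need. A minimal sketch using only standard OpenCL 1.x API calls (nothing vendor-specific assumed):

```cpp
#include <CL/cl.h>
#include <cstdio>

// Minimal sketch: list every OpenCL platform (nvidia, ATi/AMD, Intel...)
// and count the GPU devices each one exposes. A physics engine built on
// OpenCL could run on any of them, with no vendor lock-in.
int main()
{
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, NULL, &numPlatforms);

    cl_platform_id platforms[16];
    if (numPlatforms > 16) numPlatforms = 16;
    clGetPlatformIDs(numPlatforms, platforms, NULL);

    for (cl_uint i = 0; i < numPlatforms; ++i)
    {
        char name[256] = {0};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                          sizeof(name), name, NULL);

        cl_uint numDevices = 0;
        clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU,
                       0, NULL, &numDevices);

        printf("%s: %u GPU device(s)\n", name, numDevices);
    }
    return 0;
}
```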

I can't really say there have been many games in recent times that blow me away, and it's not because of the graphics - it's because of the poor gameplay. What's the point in having fancy effects if the game is crap?

And by the way, HL2 is miles and miles better than some of the games at the moment....:D
 
I can't really say there have been many games in recent times that blow me away, and it's not because of the graphics - it's because of the poor gameplay. What's the point in having fancy effects if the game is crap?

And by the way, HL2 is miles and miles better than some of the games at the moment....:D

Well there I do agree.
 
As I've said before... what the consumer wants or "needs" and what's required to make newer and better games (with the very features the consumer is often asking for) don't exactly go hand in hand. If we continued that line of thinking we wouldn't have made half the progress we have now, and would just about be playing games like Half-Life 2 right now. If companies are focused purely on value for money, no one is willing to put up the money for the breakthroughs needed to move things forward.
Developers are only prepared to take advantage of new hardware features and raise their performance requirements on the basis that the majority of their user base will be able to meet those requirements.

And given the (very small) size of the high-end GPU market (which incidentally has fallen off a cliff since the global downturn), what counts in this regard is midrange penetration. Developers could only target 9800 Pro class performance once the 6600 GT had arrived and made that level of performance mainstream, and the same thing holds for the 7600 GT and the 6800/X800 generation.

Arguably, then, what holds back developers is not a lack of high-end parts to support, but a lack of suitably equipped low- and mid-range cards, which is what the majority of their customers will be running their games on. So in fact the best way to move games forward is to put the most powerful hardware possible into those price points, ensuring that even someone paying £100 can get 8800 GTX level performance, so that this becomes the new baseline. That's what ATI's strategy has done by forcing prices down massively. ATI has helped advance, not retard, game development.

Does this strategy slow down the pace of hardware development, as you claim? Well, think about it. Do graphics card companies want you to stick with the same hardware you've always had? Do they want you to ignore their latest products? Does that make good business sense? No.

They want you to upgrade as much and as often as possible and the way they do that is by increasing performance in every segment as much as they can, each generation. Even if ATI aren't serving the high-end GPU market, they're still going to be concerned with making the fastest mid-range card they can on the process they're working with. So even without a high-end market mid-range cards will still double in performance every generation, and games will still continue to get better because the mid-range segment is what developers target anyway.

If the HD 5870 is double the performance of the 4870, and the HD 5850 is much faster than a 4870 for $200, are you really going to complain that performance isn't advancing?
 