
The [[Official]] ATI vs NVIDIA thread

I am pretty sure they can be as bad as each other; nVidia has just received an awful lot of bad press lately, not that they don't deserve it.

I will always go for the best price/performance at the time. I know I have a 5870 instead of a 5850, but heck, I spoilt myself this time and picked it up for 280 euros before the prices hiked, with the intention of popping another 5870 in when prices drop and if I feel my single GPU is lagging performance-wise (the Rage engine will be the point where I pop another in, I think).

My 8800 GTX is still going strong in a friend's computer; that was a truly brilliant card for its time.
 
I think the same too. I hear some say they've had horrible experiences all along with drivers for a particular brand, and it's usually the problem cards. Each company has released some cards that are bad, and it's usually those cards that give them a bad rep and never seem to get the drivers right. Other cards work just fine; people need to look for the better-supported cards, not the company (as both get it wrong at times).
 
Price and performance play a bigger part in my choice usually, although I will admit Nvidia's shenanigans of the last two years pushed me a little more towards ATI this time, as well as them having a good card out and for sale. I have no doubt ATI do dubious things, but at least they seem to have the sense not to get caught out as easily as Nvidia.

I want a good card from Nvidia for the competition, but as a company I wouldn't be too worried if they vanished tomorrow, taking their management with them. I have had and liked a lot of Nvidia hardware in the past, although I have to say I found drivers more trouble with Nvidia than I have with ATI.

Horses for courses really, and I will never be totally loyal to either one of them, as I think that's a bit sad to be honest. People need to realise both companies would screw us all over if it was in their best interests; they have no loyalty to us, and we should show none back. If they want my money they can earn it by making good hardware, and whoever has the best performance for the best price when I'm upgrading will get my money, regardless of which company it is.
 
Only from a moral perspective relating to the poor behaviour of Nvidia. The world would be a better place if more people shared that trait.

Hear hear! In my line of work I'm starting to believe that many people honestly don't care about morals or ethics. They really don't, as long as it's profitable.
 
All that chart shows is that Nvidia sell a lot more units than anyone else, so they will naturally have a higher crash count than ATI. How many of these crashes were due to operator error, or poorly set-up PC hardware/software?

That's not all it shows. If Nvidia has 65% market share and ATI has 30% (Steam survey), then if their drivers were equally reliable, Nvidia would only cause slightly over twice as many crashes as ATI. According to that chart it's far higher. And operator error would affect ATI and Nvidia the same, surely?
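That expectation can be sanity-checked with a quick calculation (using the share figures as quoted from the Steam survey; if drivers were equally reliable, crashes would simply scale with the installed base):

```python
# If both vendors' drivers were equally reliable, crashes would be
# proportional to the installed base, so the expected crash ratio
# is just the ratio of the market shares.
nvidia_share = 0.65  # Steam survey figure quoted above
ati_share = 0.30

expected_ratio = nvidia_share / ati_share
print(f"Expected Nvidia:ATI crash ratio if equally reliable: {expected_ratio:.2f}x")
# → about 2.17x; a far higher observed ratio points at the drivers, not the share.
```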

A lot of people on this forum have reported problems with ATI's drivers on the 5xxx-series cards. I don't have one of them so I can't speak to that, but I don't get crashes using the same drivers on my 4890 in Vista or W7. My old reliable 8800 GTX, on the other hand, regularly crashed in Vista with the equally "reliable" nvlddmkm.sys problem; there used to be a huge thread here about it.

Also, at least ATI try to get drivers out every month; when I got my GTX, Nvidia didn't release any WHQL drivers for about six months!
 
I don't care which chip my card has... ATI or NVIDIA, I don't care. Whoever has the fastest card for ~£200 wins. The last two cards I had were Nvidia (GF7 and GF8); my current card is ATI.

My only prejudice is against buying cards manufactured by "budget" companies: Inno3D, Club 3D, overclockers' value ranges.

In my experience they don't overclock well and are much more likely to break compared to cards from quality manufacturers. I'd rather have a cheaper model made by a decent manufacturer than one of these cards.
 
But seriously, I think the colours are what separate them, like two football (or soccer, as it is called in Canada) teams. One team is red and the other is blue, and if you like the blue team, you will grow to hate the red team after a while. It is the same mentality for ATI vs Nvidia, I bet, in the minds of a lot of people. Not me though, just an observation.

And yeah, that guy made me so angry. I wanted to pound him, lol. But then he would probably have gotten security to escort me to jail or whatever. That would not help me to play Quake 4, so I let it go, and he got what was coming to him anyway :p
 
I prefer Nvidia just because of the drivers; IIRC it's more hassle to force AA in Catalyst (perhaps it has changed) in games like Mass Effect or Bioshock (you need to rename the .exe) than in the Nvidia drivers, where you just turn AA on in the panel.

And for PhysX, even though it's barely noticeable, I like to have all the features of a game enabled and working nippily. And in the early days of GTA IV, it ran better on cards like the 8800 GT than on faster cards like the HD 4870 or HD 4890.
Basically, in the games that run/ran better on Nvidia (GTA IV, FSX, etc.) I needed all the performance I could get, while in games that ran better on ATI cards (Source, or generally at high AA levels; for me 4x AA is usually enough) I already had more than enough FPS. That's why I chose a GTX 260 216SP (EVGA SSC version, so overclocked over stock) over a 4890 at the time...

I have nothing against ATI though; I recommended ATI cards to my mates, as they cared more about bang for buck. I wanted performance in specific games that happened to run better on Nvidia cards (mainly GTA IV), and mostly I don't want any hassle forcing AA in games. I don't want to rename any .exes, as that messes up stuff like Xfire (the app); I just want to use my drivers to enable all the AA I want in games where possible, without a third-party tool.
 
Not really; the open-source drivers are not and will not be competitive. At best, we can hope the Linux community ensures that at least basic functionality exists stably between Linux revisions.
I'm glad you can tell the future :p
In a year they went from zero support to every card up to the 4xxx series supported, with basic power management, OpenGL etc. They have already achieved what you are "hoping" for. In the next two years Gallium3D will have time to mature (modern, fully up-to-date OpenGL 3/3.1/3.2 support). It is also set to provide features the binary drivers lack, such as exposing the GPU shaders for general use. One use they are working on is shader-based video decoding, so even old graphics cards will be able to accelerate H.264 decoding (if you have the codec installed). This is going to be a lot more than "basic" support.

However, with 12 years of Linux use I've experienced 12 years of ATI nightmares, and I will never, ever touch one with a barge pole in a machine which will run Linux. Ever.
I never used them back then, as I knew they were bad. They're alright now, aside from slow support for new X.org releases.

OpenCL what? No one has heard of OpenCL; it is dead in the water. CUDA is the de facto industry standard and the word on everyone's lips in the scientific community. Science is being done here and now using the CUDA platform on Tesla hardware. Just like Hoover is synonymous with vacuum cleaner, CUDA is with GPGPU. Everyone talks about CUDA; OpenCL only appears in ATI publicity.
This is both:
1) Wrong (Hello from the physics community - using OpenCL :)) and
2) Sounds like a marketing slide....

GPU computing is very important and exciting to us, but as CUDA has yet to take off in the scientific community (mostly individual use, no large-scale deployments), it looks like it'll be all OpenCL. GPU computing is extremely useful, no, nigh-on essential for climate and weather prediction... but as there are no supercomputers using CUDA, it simply isn't used for this.

If you were referring to the Tesla based supercomputer that was planned - it was cancelled due to concerns over the heat and cost.

I would prefer an entirely open standard, and with time OpenCL may come to bear fruit. But it's already too late; CUDA is now the standard.
Only for games, which will now switch (aside from TWIMTBP) because both parties will support it. Developers want to sell to more people; they don't care whose hardware you buy.

That's not all it shows. If Nvidia has 65% market share and ATI has 30% (Steam survey), then if their drivers were equally reliable, Nvidia would only cause slightly over twice as many crashes as ATI. According to that chart it's far higher. And operator error would affect ATI and Nvidia the same, surely?

Steam is completely wrong. Market share based on actual sales:

          Q4 2008   Q3 2009   Q4 2009
Intel      47.7%     53.6%     55.2%
Nvidia     30.6%     25.3%     24.3%
AMD        19.3%     20.1%     19.9%
http://techreport.com/discussions.x/18366
 
But does onboard video really matter, since it is so crappy? If you want to play games on your PC, you have to buy either Nvidia or ATI.

That wasn't what I was saying. Look at the difference: 19.9% vs 24.3%. Nvidia has a little over a fifth more users. This is not 65% vs 30%; Steam includes onboard too. Of discrete sales it's closer to 55% Nvidia vs 45% ATI.

By those figures, the Steam statistics are out by about 10 points on Nvidia and 15 on ATI.
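As a quick recomputation from the Q4 2009 figures in the table above (treating Nvidia plus AMD as the discrete market, since Intel only sells integrated graphics):

```python
# Discrete-card share excluding Intel's integrated graphics,
# using the Q4 2009 numbers from the TechReport table above.
nvidia, amd = 24.3, 19.9

total = nvidia + amd
print(f"Nvidia:  {nvidia / total:.0%} of discrete sales")  # → 55%
print(f"ATI/AMD: {amd / total:.0%} of discrete sales")     # → 45%
# Steam reports roughly 65% vs 30%, so it overstates Nvidia's lead.
```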
 
This is both:
1) Wrong (Hello from the physics community - using OpenCL :)) and
2) Sounds like a marketing slide....

OpenCL is used by ATI, and yes, NV use it a bit as well (not much).
But the BIG boys use CUDA. Even I use CUDA in Adobe Premiere for HD films, whereas OpenCL can't do it, and many apps now use CUDA for everyday use.


GPU computing is very important and exciting to us, but as CUDA has yet to take off in the scientific community (mostly individual use, no large-scale deployments), it looks like it'll be OpenCL. It would be extremely useful, no, nigh-on essential (for the future) for climate and weather prediction... but as there are no supercomputers using it, this is not possible.

A lot of universities are using CUDA now for ray tracing/RayScale/scientific computing,
and don't forget that general-purpose programs in C can run on the GPU using CUDA.

If you were referring to the Tesla based supercomputer that was planned - it was cancelled due to concerns over the heat and cost.

I think you should tell Microsoft that, as their new server software is being run by CUDA and the GPUs :)

I think you have gamers that want the fastest card just for games.
Then you have the rest of us who use the GPU for more than games.
 
OpenCL is used by ATI, and yes, NV use it a bit as well (not much).
But the BIG boys use CUDA. Even I use CUDA in Adobe Premiere for HD films, whereas OpenCL can't do it, and many apps now use CUDA for everyday use.
That isn't CUDA; it is OpenGL / DirectX.

Source - Adobe: http://kb2.adobe.com/cps/405/kb405445.html

A lot of universities are using CUDA now for ray tracing/RayScale/scientific computing
I know of individuals working at universities doing so, but no large deployments. I'd genuinely be interested to see one, though, if you know of any in particular?

and don't forget that general-purpose programs in C can run on the GPU using CUDA
Yes but it requires a great deal of work. Fermi should change this, though.

I think you should tell Microsoft that as their new server software is being run by Cuda and the GPUs :)
Is this what you meant?

"Microsoft Research, which has just installed a parallel x64 server cluster running Windows and a whole bunch of Tesla GPUs. (The exact configuration has not been divulged by Microsoft Research."

Source: http://www.theregister.co.uk/2009/09/28/microsoft_nvidia_collaboration/

This is what I mean. One cluster hidden away in a Microsoft research center, hardly taking over the world.

I think you have gamers that want the fastest card just for games.
Then you have the rest of us who use the GPU for more then games.
Unfortunately most of the use of CUDA has been for games, though (PhysX).
 
Unfortunately most of the use of CUDA has been for games, though (Physx).

That's rubbish. CUDA adoption in industry and education isn't insignificant and is fast-growing; there are hundreds of projects (of all sizes) that are publicised, and lots more that aren't. CUDA is preferred over OpenCL due to:

Better support
More mature and stable feature set
Only having to debug against one platform
Better documentation
Better performance (currently) for many commonly used features
Larger availability of (quality) framework and blank slate projects
The level of integration with Visual Studio

I wouldn't say OpenCL is dead in the water; many people I talk to would prefer to use it if some of the above points weren't true. But currently CUDA is fast becoming the industry standard, and OpenCL is losing popularity in commercial usage. I would say your viewpoint is skewed by being too close to the scientific community, where OpenCL took up ground faster, before CUDA was really anything to talk about.
 