
The [Official] ATI vs NVIDIA thread

NVIDIA have CUDA, which seriously speeds up programs that can use it.

Examples: Adobe Premiere and After Effects.

Folding@home seems to be absent from ATI's current cards, even though they had it first.
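To give a flavour of what CUDA acceleration actually means: the GPU's hundreds of shader cores each work on a tiny slice of the data in parallel. A minimal sketch, assuming Numba's CUDA bindings and an NVIDIA card (the "brighten" kernel and all figures here are illustrative, not taken from any real app):

    # Minimal sketch of the data-parallel work CUDA accelerates.
    # Assumes the numba package and an NVIDIA GPU -- which is exactly
    # why CUDA-accelerated apps are NVIDIA-only.
    import numpy as np
    from numba import cuda

    @cuda.jit
    def brighten(pixels, gain):
        i = cuda.grid(1)           # this thread's global index
        if i < pixels.size:        # guard threads past the end of the array
            pixels[i] *= gain      # one pixel per thread, all in parallel

    pixels = np.full(1_000_000, 0.5, dtype=np.float32)  # a fake 1 MP frame
    d_pixels = cuda.to_device(pixels)                   # copy to GPU memory

    threads = 256
    blocks = (pixels.size + threads - 1) // threads
    brighten[blocks, threads](d_pixels, np.float32(1.2))

    print(d_pixels.copy_to_host()[:3])   # -> [0.6 0.6 0.6]

A video filter in Premiere is doing essentially this across every pixel of every frame, which is where the speed-up comes from.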


This
Also, I lost faith in ATI after the VIVO 19XX cards balls-up.
They just forgot to tell people that the VIVO chip would only be on SOME cards; the others would use software VIVO. I took mine back ASAP.
 
Lol graphics card warz. Always brings a smile to me face.

 
Just out of interest, what do you need that nvidia does that AMD doesn't?

In most cases this is going to be PhysX/CUDA - Folding@Home, for example. CUDA also works with several Adobe programs, which helps a lot, since that software tends to get bloated over years of upgrades and new versions.

Lol graphics card warz. Always brings a smile to me face.

This isn't a war, it's a proper discussion. As the title suggests, it's a discussion thread, not a place for fanboy rants or people with no knowledge of both sides. Also, please only join in if you have something worthwhile to bring to the table.
 
The point is that the "best card for the job" is entirely down to a mix of what the job is, personal preference, and cost. Just saying "this is the fastest card in the world" doesn't mean anything. If you're Folding, you don't use an ATi card, even if it is the fastest card.

I will back down now, since you've added a comedic magazine front. I lol'd at the warning.
 
The point is that the "best card for the job" is entirely down to a mix of what the job is, personal preference, and cost. Just saying "this is the fastest card in the world" doesn't mean anything. If you're Folding, you don't use an ATi card, even if it is the fastest card.

I will back down now, since you've added a comedic magazine front. I lol'd at the warning.

Yeah, I'm not trolling or anything. It's just that the whole fanboy issue that so often crops up in this and other hardware forums is so unnecessary. All people have to do is read some reviews from a variety of sources and decide on a budget, and hey presto - new card drops through the letterbox. It doesn't help that Nvidia especially likes to spend more money on marketing than on actually producing cards - e.g. the constant rebrands and extraneous features they like to add.
 
I will agree - the renaming of nVidia cards is annoying. But on the plus side, it means you can tell which cards are better now, rather than having the old problems that used to arise...

The GTS 250 is beaten by the GTX 260, which is beaten by the GTX 275, etc. - number goes up, card gets better. It also didn't help that nVidia had just hit a card named 9800... the same number as an old Radeon card.

In the old days, the 7800 was more powerful than an 8600, and an 8800 was more powerful than a 9600... number goes up, card gets worse. I much prefer the new way of doing it. AMD/ATi, though, seem to have taken up the old convention, which means you actually have to look at the specs to figure out which card is better.
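To make the old problem concrete, a toy sketch (the speed values are invented placeholders, not benchmark results):

    # Toy illustration of the old naming problem -- the speed values
    # are invented placeholders, not benchmark results.
    old_cards = {
        "GeForce 7800": 3,   # older generation, high-end tier
        "GeForce 8600": 2,   # newer generation, mid-range: bigger number, slower card
        "GeForce 8800": 4,
        "GeForce 9600": 3,   # also slower than the 8800 despite the bigger number
    }
    by_number = sorted(old_cards)                     # what the names suggest
    by_speed = sorted(old_cards, key=old_cards.get)   # what the specs say
    print(by_number == by_speed)                      # -> False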
 
I was going to pick an nVidia card, simply because most people have nVidia, looking at the Steam stats. But as none were in stock here, and those that were went for stupid prices, I went for ATI. I chose the 4890 for £150 - gutted that it went down £10 on Halloween, the day after I bought it, with no warning of this "Halloween sale" - yet I am very pleased with the card.
 
I don't care either way and am very surprised seemingly intelligent people can be reduced to squabbling playground kids over something so arbitrary.
 
I was going to pick an nVidia card, simply because most people have nVidia, looking at the Steam stats. But as none were in stock here, and those that were went for stupid prices, I went for ATI. I chose the 4890 for £150 - gutted that it went down £10 on Halloween, the day after I bought it, with no warning of this "Halloween sale" - yet I am very pleased with the card.

Why would they warn you :O
 
It doesn't pay to get too picky about businesses and ethics... look too closely at 99% of businesses and you wouldn't want to deal with them either...

I really can't understand that; it DOES pay to be picky, even if it means you end up not using many, many companies.

If everyone were as picky, "evil" companies (not talking GPU makers here, but seriously evil companies that dump toxic waste, test drugs on people they shouldn't, etc.) wouldn't get much business, and taking the low road as a business would be less common, which would benefit us all.

Personally, I try not to use any companies I deem to be immoral, cheating, etc. Again, I'm not really talking about computing here - your post wasn't really aimed there either - but about business in general.

For instance, I personally don't care that Tesco puts smaller shops out of business; that's society moving on, not Tesco being evil. It's sad, but every industry has to move on, and every industry loses jobs to increases in efficiency and new production methods. Just as mining jobs have been lost for decades, small grocers simply aren't required. Tough, harsh maybe, but it's not evil.

Whereas a toy company that puts toxic chemicals in kids' toys to save a buck is evil and should be left to die when no one uses it.

Nvidia certainly aren't "evil", but I really don't like their push for closed standards and anti-competitive practices, because it harms us all. Let's say Nvidia got their grubby mitts on every game, and every game ran 90% faster on their hardware due to dodgy PhysX and optimised, paid-for code. Once ATi cards are no longer a choice, that leaves us all screwed on the price of the only competitive cards.

What I want is clean, open competition, with help from both sides to make games run faster - but NEVER at the expense of the other - and every standard open, with both companies focusing on getting their drivers running as well as possible, and letting game devs make their games as they see fit, cutting design time with easily available open physics/AI/weather engines.

Imagine if, instead of 800 games all coming up with a new weather engine, there were 5 constantly updated engines they could just pick from and optimise as they saw fit. You'd be cutting down the time to produce games or, more importantly, freeing up man-hours to make the rest of the game better. If PhysX were open and freely usable, its developers could just focus on improving it, and both GPU companies could spend whatever time they wished optimising PhysX code to run on their GPUs - rather than one company modifying PhysX code not for quality or improvements, but to run better on one type of hardware than another.

I couldn't care less who's on top in general, and have never cared a jot about flat-out performance; it's ALWAYS been price/performance for me. Which meant a couple of 6800s in SLI were superb (if loud) cards when I got a great deal, and a cheap 8800GTX at a third off got my money - £200 when they were normally £300 everywhere ;) (even if the 64-bit drivers were rubbish when Vista was new).

My honest analysis, though, is that Nvidia can't survive long term, not least if they insist on building cores that aren't designed for the process they'll be made on. AMD has made HUGE gains simply by switching from "let's design the best core we can think of, then think about producing it", which caused huge problems, to "let's make the best core we can ON THIS SPECIFIC PROCESS, with all its limitations". Since they did that, it has meant huge profits, huge sales and fantastic prices for us.

The fact that the 5XXX series is out on time and performing well on the worst process out of TSMC yet is pretty much proof of this.

Even if Nvidia switch to a better plan, their long-term future is still heavily in doubt, which is a shame. I'm hoping Intel's next, or more likely next-but-one, version of Larrabee gets competitive, because AMD being the only real option will lead to problems. Not on purpose - competition breeds innovation, and when there's no pressure you simply aren't pushed into coming up with brilliant ideas.
 
I'm price/performance driven and refuse to spend anything over £200 on a graphics card. Before this I used to buy £350+ cards, until I bought an 8800GTX which died after 14 months (2 months out of warranty).

I don't see it mentioned, but I think it's *wrong* how nvidia renumber old stock, repackage it and sell it as "new" cards - using 2-3 year old tech.

So yes, I'm a little biased due to bad experiences with nvidia cards. ATI have had driver issues in the past - I had some fun with an old Radeon 9800 Pro back in the day - although my current 4870 is working away happily.

I was a little concerned about the heat aspect: with the fan on auto in the Catalyst Control Centre, temps were hitting 70-80°C at idle - a little too high for my liking. I moved to a fixed 40% fan speed; yes, it's a little louder, but as we speak the temp is 47°C. Scary difference.
 
I'm lucky where I am right now. My PC has been running amazingly well at everything I've thrown at it since I upgraded to where I am now (about 8 months ago?). I've even got some fresh releases on there, and every game runs smoothly.

I'm lucky because I know that Fermi is just around the corner. If it works and is amazing, AMD will bring out a new card to combat it, and, as mentioned, competition breeds innovation (nothing does this better than a war... look at how much tech advanced during WW2!). By the time I need a new PC, I'd guess there will be at least one card from each designer in my price range that is better than my current setup.

I know there at least used to be an arrangement between AMD and Intel where they'd share some tech, which helps each get better stuff, makes them more competitive, and pushes each to improve... why don't nVidia and AMD do this?

Wouldn't it be so much more competitive and awesome if AMD shared what it's doing with its cores - the huge number of shaders and raw power - and nVidia shared CUDA and PhysX? I would love for that to happen. Then both companies would just be working on driver optimisation and new cards whenever tech hits new levels.
 
ATI have the most horrific Linux support imaginable, while Nvidia's is more or less flawless, given the constraints.

Nvidia's CUDA is an amazing technology that is really well supported and rapidly being taken up as the industry standard.
 
I am going to have to stick with nVidia, mainly because ATI's Linux drivers are so bad that they now put an "ATI unsupported hardware" watermark in the bottom corner - which is what I got when I ran my 5770 on SUSE.

I sold the cards and ran back to my old GTX260, where the Linux drivers are awesome.

It's such a shame as it's the only thing holding me back from buying ATI and sticking with them.

I want to buy a 5870 or a 6000 series card, but can't with the state of their Linux drivers.
 
IMAO - "The [Official] ATI vs NVIDIA thread" - he knew what he was starting, and that it would bring out the fanboys on either side; these threads have been up before!!!!

Off with his head, or ban him for a month lol - cast your vote.

I've had both makes of cards; I just go for the best one at the time - price + performance.
 
I use my graphics card for 2 simple purposes: games and hooking up to my TV.

As such, I've always opted for a mid-to-high-end card with the best bang for buck, and 8 out of 9 times this has been ATI (see the quick sketch below).

9000 pro --> 9800 pro --> X800XT PE --> X1800XT --> 2900XT --> 8800GT --> 4850 --> 4870 --> 5770
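Since the recurring theme here is bang for buck, the sum is trivial to run yourself - a quick sketch (the fps and price figures are invented placeholders; plug in numbers from whatever reviews you trust):

    # Rank cards by frames-per-pound. The fps and price figures are
    # invented placeholders -- substitute figures from real reviews.
    cards = [
        # (name, average fps in your game of choice, price in GBP)
        ("HD 4890", 100, 150.0),
        ("GTX 260",  90, 140.0),
        ("HD 5770",  85, 130.0),
    ]
    for name, fps, price in sorted(cards, key=lambda c: c[1] / c[2], reverse=True):
        print(f"{name}: {fps / price:.3f} fps per pound")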
 