
AMD To Launch RV770 On June 18th

I get the same BSOD from the Nvidia driver. If only ATI's cards were up to par with Nvidia's speed-wise, or just a tiny percentage slower, I'd have no problem going with ATI.

Especially since I like a lot of Source engine games.

Then again, I guess that's why they're cheaper?
 
You won't notice 10fps when you're above 80 anyway.

Well, that depends: if it's 10fps on top of 40-50, then you would.

You have 8800GTs in SLI... if you had just cheap 3850 512MB cards in CrossFire you probably wouldn't notice the difference at all. Also, you can easily pencil mod them, and they will do well over 900MHz on stock cooling. At those speeds the setup would be quite fast.

so why do so many people buy nvidia cards then? :p
 
so why do so many people buy nvidia cards then? :p

Even when ATi were obviously ahead, nVidia sold more, not because they had better tech but because they are better at marketing. NV are very good at viral marketing as well, and it seems very likely that a lot of the FUD about ATi originates from that.
 
The thing is, even a 3850 at stock plays pretty much everything (apart from Crysis) at 1680x1050, 2xAA, 16xAF, max details, above 60fps no problem. I average about 60-65fps in CoD4 multiplayer at these settings, and the same in GRID... and in some games like UT3 you of course get at least 100fps.

There's no doubt the Nvidia cards are faster, but for example an 8800GT might get you a 70fps average in CoD4 at the same settings... and it used to cost MUCH more. At the moment the clear choice is the 8800GT, since the price has dropped so much.

Furthermore, the ATI cards can be easily pencil modded, which in my book is a huge plus... if you have aftermarket cooling most 3850s will hit 950MHz, or 900MHz on stock cooling. I always just look at bang/buck and overclockability, and even though I loved my 8800GTS 640MB and voltmodded 8800GTS 512MB... the ATI cards I've owned have beaten them in terms of price/satisfaction ratio.

My point is that a lot of the time people only look at benchmarks and don't realise they are paying 50% more for a 10% increase in performance.
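The bang/buck point above can be made concrete with a few lines of arithmetic. Note the prices and framerates below are illustrative placeholders, not figures from any real benchmark:

```python
# Illustrative price/performance comparison; the prices and fps
# figures are made-up examples, not real benchmark results.
def fps_per_unit_cost(avg_fps: float, price: float) -> float:
    """Average framerate delivered per unit of money spent."""
    return avg_fps / price

budget = fps_per_unit_cost(avg_fps=60, price=100)   # e.g. a 3850-class card
premium = fps_per_unit_cost(avg_fps=70, price=150)  # e.g. an 8800GT-class card

# 50% more money buys only ~17% more fps, so the cheaper card
# delivers more frames per unit of cost.
print(budget > premium)  # True
```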
 
Well, I usually go for the best performance in game benchmarks. The last and only ATI card I've had was a 9500.

You've kind of contradicted yourself here. The 9800 Pro was the king of the hill in its time. The X800XT PE was the king of its time, with SM3 still in its infancy. The X1900 series was a much better card than any 7 series, which had terrible IQ, couldn't do AA and HDR simultaneously, didn't scale as well as the ATI card (oh, and had a bad failure rate). Although somehow the last card you've had was the 9500 :confused:.
 

I believe the difference between the 9500 and the 9700 was an extra four pixel pipelines that were locked; the 9500 could be exploited to unlock them, turning it into a 9700 Pro, which could then be overclocked...
 
so why do so many people buy nvidia cards then? :p

Umm, better nVidia marketing, and the fact that most people (enthusiasts) look at benchmarks and then go with that, instead of taking the time to pencil mod and alter their cards... in other words, it's all about raw FPS and e-peen, and because ATi isn't able to come out on top in those areas, they get ignored.

Anyway, +1 on staying with ATi. Though I like my 8800GT, it has too many driver issues, can't overclock well at all (contrary to most), and I don't like the company's attitude and, IMO, lack of innovation.

Intel has kept me happy so far, but I won't hesitate to go back to AMD if I can...
 
I had two bad experiences with ATI more or less in a row, and at the time I was replacing cards, nVidia's were the fastest. I've since bought newer cards that were also from nVidia, mainly because 1) I liked my older card and 2) they were again the faster brand when it came time to upgrade.

Would really like a nice, cool and fast 55nm ATI GPU, though, to replace this power-hungry 90nm GTX beast :D
 
Nvidia's driver release programme isn't anything to be proud of, but let's not put ATI up on a pedestal here either: when the HD 2xxx series was released, the driver support was shocking, especially for AA, and took several months to start improving.
 
The X800XT PE was the king of its time, with SM3 still in its infancy. The X1900 series was a much better card than any 7 series, which had terrible IQ, couldn't do AA and HDR simultaneously, didn't scale as well as the ATI card (oh, and had a bad failure rate). Although somehow the last card you've had was the 9500 :confused:.

The X800XT and 6800U exchanged the performance lead on an application-specific basis, so it's unfair to call it "the king of its time".

As for the X1900, yes, it was arguably a better card than the GF7 series (though that was still application-specific). However, it was released on January 24th 2006, over six months after the 7800GTX. Very few people who bought a 7-series card will have considered the upgrade to the X1900 worthwhile in performance terms.
 

Aye, fair enough in that regard. I kinda overstepped the mark on that one, as it's tit for tat, but I didn't want a hole in my debate :p.

A 7800GTX to X1900XT/XTX upgrade maybe wouldn't be considered by everyone, but a few will have: capable HDR + AA, far better IQ, and an appeal to gamers on higher-resolution screens. Plus, if the 7800GTX failed (which a few actually did), then the X1900 would look more desirable.

I did read an article not too long ago suggesting that Nvidia's hardware works best for the generation of games designed around that card at the time, whereas ATI's hardware was meant to give better performance in newer titles. For the X800 this is debatable, as it can't run SM3. However, in games that both cards could run, the newer titles favoured ATI (from what I read; there were a few benchmarks, I think, though I'm not sure).

It's fairly useless information for the majority of people reading this, but it would be useful for those who tend to hang on to their hardware longer than we do. I just searched for the article and came up with nothing :(. I also don't think the G80 is part of this trend; I think it only applied to the 6 series and back, maybe even the 7 series. I can't remember :o.
 
Loved my 7800GTX. Those were the days of Oblivion, and if I recall I was playing Hitman: Blood Money; just as I was gonna strangle a copper, my 7800GTX started to burn :)

I let too much dust build up in the fan, and I guess the chip just decided to melt :)

Skipped the 7900 series and went for the 8800GTX :)
 
I think I'm going to buy the RV770.

I was on BF2 last night and a cluster bomb landed next to me at the same time as artillery, and both exploded at once. I got that terrible multicoloured stuff on the screen, where your GFX card runs out of RAM / can't load as many textures as it's supposed to, you know, the multicoloured ribbon:
screen1.JPG


Which I think is because I don't have enough video memory (256MB) to handle 1680x1050 plus another monitor at 1280x1024 at the same time.
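As a rough sanity check on that theory (assuming 32-bit colour and double buffering, and counting only the colour buffers, not depth/stencil or textures):

```python
# Back-of-the-envelope framebuffer maths; assumes 32-bit colour (4 bytes
# per pixel) and double buffering, and ignores depth/stencil and textures.
def framebuffer_bytes(width: int, height: int,
                      bytes_per_pixel: int = 4, buffers: int = 2) -> int:
    return width * height * bytes_per_pixel * buffers

primary = framebuffer_bytes(1680, 1050)    # main gaming monitor
secondary = framebuffer_bytes(1280, 1024)  # second desktop monitor

total_mb = (primary + secondary) / 1024 ** 2
print(round(total_mb, 1))  # ~23.5 MB
```

So the displays themselves only account for roughly 24MB; it's the game's textures on top of that which exhaust a 256MB card at these settings.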
 