
DX10 ATI vs NVidia gaming benchmarks

Well, I've been talking to some people with GTXs, and they can run the beta smoothly at 1680x1050 with an overclocked GTX; that's with everything on high and possibly 4xAA.

AA is disabled for the beta.
Shaders on high don't cripple my GTS. I just get an odd pause when walking between indoors and out. It's enough to annoy me, but it isn't as bad as people are saying. I play everything on high except textures and shaders, which I keep on medium. They run fine on high, but I prefer it smooth 100% of the time!
 
I play everything on high except textures and shaders, which I keep on medium. They run fine on high, but I prefer it smooth 100% of the time!

That's because it's only a BETA! :p It's not fully optimised and isn't running at its best. I bet it runs fine on high in the final version, or even the demo!
 
AA is disabled for the beta.
Shaders on high don't cripple my GTS. I just get an odd pause when walking between indoors and out. It's enough to annoy me, but it isn't as bad as people are saying. I play everything on high except textures and shaders, which I keep on medium. They run fine on high, but I prefer it smooth 100% of the time!

So why, when I enabled 4xAA or 8xAA, did it look a lot less jaggy? Something is happening :confused:

So what sort of FPS do you get? With everything on high I get around 19-40fps, and even 30-40fps is unplayable in Crysis.

Either way, something is producing exactly the same effect as AA when you enable AA, so it's as good as AA to me. Shame it cripples performance even more when I enable 4x. I'd be surprised if the full game is much better than this; a 20% increase in performance would be good, but I doubt it, tbh.
 
Erm, I will say what I think, and if I ever see the 8600GTS beating the 2900XT I will think that's a disgrace. No, the card isn't a lemon, it's just crap, and anyone who thinks it's better than an 8800 series card is wrong.

Stop trying to defend the card; you don't own one.

You're a tool sometimes. Do you realise that by what you said you indicated the 8800GTSs are crap? Because if you take the blinkers off, in some of the tests the 8800GTSs are beaten by, yes, the 8600GTs. But that's of no concern to you and your anti-ATi spiel, is it?

It's also worth pointing out that in every game (excluding Lost Planet, because anyone who claims it's playable at high res in DX10 is just wrong; everything performs like crap), at 1600x1200 and 1920x1200 both 8800GTSs get beaten by the 2900XT 512, and in MOST of the tests it wins at 1280x1024 as well. Remember, there are OVERCLOCKED 8800GTSs in there, which tend to tip the scales, but the 2900XTs overclock great and get a pretty big boost from it. Stock to stock, the 2900XT is the better card in most things except Lost Planet (utter tosh code) and World in Conflict. What would be very interesting is this: ATi and nvidia often have performance glitches with brand-new games, and the first driver release afterwards can often massively improve things (not always). World in Conflict is the only game here where the 2900XT is not better than the 8800GTSs.

In all real situations, basically playing above 1024x768, the 2900XT IS faster than the 8800GTSs. WIC is the only exception; it's the newest game and hasn't had a driver update since release. If the 7.10s, or possibly the 7.11s, improve performance a lot, then the 2900XT beats the 8800GTS in another game.

I think (quick check) the only game where the 8600GTs can beat the 2900XT is WIC, and in that benchmark the 8600GT can beat an 8800GTS.
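The "wins in more than half the benchmarks" argument above is just a tally over per-game, per-resolution results. A minimal sketch of that tally, using made-up game names and winners (none of these figures come from the review being discussed):

```python
# Toy tally of benchmark wins per card. The games, resolutions and winners
# below are placeholders to show the counting, not real benchmark data.
from collections import Counter

results = {
    ("game_a", "1600x1200"): "2900XT",
    ("game_a", "1920x1200"): "2900XT",
    ("game_b", "1600x1200"): "2900XT",
    ("game_b", "1280x1024"): "8800GTS",
    ("game_c", "1920x1200"): "8800GTS",
}

wins = Counter(results.values())
print(wins.most_common())  # [('2900XT', 3), ('8800GTS', 2)]
```

Filtering the `results` keys by resolution before counting is how you'd reproduce the "take out 1024x768 and the XT is ahead" version of the argument.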
 
Apologies to the guy who asked for a link about nvidia having the DX10 spec changed. I had a two-minute look as I was going to bed but couldn't find the article. It was something about virtualisation, I seem to remember; it might have been on Beyond3D, but I can't be sure as it was a fair few months ago.

I hope you're not remembering that fine piece of 'journalism' by Charlie over at the Inq. The opportunity to have a rant at both nvidia and Microsoft in the same article was obviously too good for him to miss.

The Beyond3D forum seems to take a somewhat less sensationalist view of the subject.
 
You're a tool sometimes. Do you realise that by what you said you indicated the 8800GTSs are crap? Because if you take the blinkers off, in some of the tests the 8800GTSs are beaten by, yes, the 8600GTs. But that's of no concern to you and your anti-ATi spiel, is it?

It's also worth pointing out that in every game (excluding Lost Planet, because anyone who claims it's playable at high res in DX10 is just wrong; everything performs like crap), at 1600x1200 and 1920x1200 both 8800GTSs get beaten by the 2900XT 512, and in MOST of the tests it wins at 1280x1024 as well. Remember, there are OVERCLOCKED 8800GTSs in there, which tend to tip the scales, but the 2900XTs overclock great and get a pretty big boost from it. Stock to stock, the 2900XT is the better card in most things except Lost Planet (utter tosh code) and World in Conflict. What would be very interesting is this: ATi and nvidia often have performance glitches with brand-new games, and the first driver release afterwards can often massively improve things (not always). World in Conflict is the only game here where the 2900XT is not better than the 8800GTSs.

In all real situations, basically playing above 1024x768, the 2900XT IS faster than the 8800GTSs. WIC is the only exception; it's the newest game and hasn't had a driver update since release. If the 7.10s, or possibly the 7.11s, improve performance a lot, then the 2900XT beats the 8800GTS in another game.

I think (quick check) the only game where the 8600GTs can beat the 2900XT is WIC, and in that benchmark the 8600GT can beat an 8800GTS.

Have you had both cards?

I'm not anti-ATi; I thought the last ATi card I had was totally awesome...

I don't think someone is wrong if they claim to play Lost Planet on high at 1680x1050 with a heavily overclocked GTS. I guess their system must be wrong, then, for giving them that much FPS.
 
Have you had both cards?

I'm not anti-ATi; I thought the last ATi card I had was totally awesome...

I don't think someone is wrong if they claim to play Lost Planet on high at 1680x1050 with a heavily overclocked GTS. I guess their system must be wrong, then, for giving them that much FPS.

I think he owns a GTX/GTS and a 2900XT. Whichever card he hasn't had, I'm sure I've seen him post that he's had all three, but I might be wrong.
 
Have you had both cards?

I'm not anti-ATi; I thought the last ATi card I had was totally awesome...

I don't think someone is wrong if they claim to play Lost Planet on high at 1680x1050 with a heavily overclocked GTS. I guess their system must be wrong, then, for giving them that much FPS.

I have an unsold 8800GTX sitting here in this room, while my 2900XT is in my Vista 64 rig. I had the GTX in there for 3-4 months before the 2900XT came out; when I got the new card the random bluescreens, the CTDs with the nvidia driver-reset popup, and other random irritations suddenly stopped. I've also had a GTS, an X1900 (I actually have an X1900XT in my second rig), and an X1950 Pro I used between the 8800GTX and the 2900XT (to play LOTRO, as the GTX bluescreened after five minutes without fail in late beta / early release), etc., etc.

The simple fact is that in that review the 2900XT BEATS both 8800GTSs in MORE than half the benchmarks at anything except 1024x768, which is a resolution I wouldn't drop to even with an X800. So, based on a brand-new game (which would be most susceptible to driver improvements) and on Lost Planet (a widely crap game in both DX9 and DX10 on BOTH cards), you say the 2900 is useless. The fact is the 8800GTS 320 lost to the 8600GT 512 in the same Lost Planet benchmark, but again, this was used as a reason the 2900 was the much, much worse card.

You cannot pick and choose. Overall it's VERY close between the GTS and the XT in everything; take out 1024x768 (be honest, do you think anyone uses that res?) and the XT is ahead, marginally. Yes, it doesn't beat the overclocked GTSs, which take the lead back, marginally again, but you can overclock the XTs too, so it all ends up the same.

Frankly, either card gives a very similar level of performance.

nvidia screwed up because their drivers made Vista generally useless, with crashing and other issues, for months.


Hardware AA is going the way of the dodo. Again, ATi did something architecturally great, and something nvidia WILL do, a generation or two early. D'oh.

The fact is AA has been dodgy in the huge majority of big releases since (and including) STALKER; this is not a short-term phase but the long-term intention of game designers. The 16 pipes aren't "hugely" a problem. What doesn't help in benchmarks is that when there's little to do on screen, you need raw power to push the framerate up: in low-load scenes the GTX has a lot more raw pixel-pushing power to drive the framerate to 300fps when nothing's going on, and unfortunately 16 pipes is limiting when it comes to maximum framerates, which sucks, but that's life.

But when those 16 or 24 pipes are waiting on pixel processing to happen, they become a lot less loaded (think CPU cores waiting on memory access, up to a point). They aren't fully loaded, so minimum framerates need fewer pipes.
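The trade-off being described can be sketched as a toy throughput model: the achievable frame rate is capped by whichever unit saturates first, the ROPs ("pipes") or the shader array. All the throughput numbers below are illustrative assumptions, not real card specs:

```python
def max_fps(resolution_pixels, shader_ops_per_pixel,
            rop_pixels_per_sec, shader_ops_per_sec):
    """Toy model: fps is limited by the slower of ROP fillrate
    and shader throughput for the given per-pixel shader load."""
    rop_limit = rop_pixels_per_sec / resolution_pixels
    shader_limit = shader_ops_per_sec / (resolution_pixels * shader_ops_per_pixel)
    return min(rop_limit, shader_limit)

pixels = 1280 * 1024  # 1,310,720 pixels per frame

# Light scene: few shader ops per pixel, so the pipes (ROPs) are the cap
# and determine the very high peak framerates.
light = max_fps(pixels, shader_ops_per_pixel=2,
                rop_pixels_per_sec=10e9, shader_ops_per_sec=100e9)

# Heavy scene: many shader ops per pixel, so shader throughput is the cap
# and extra pipes would sit partly idle.
heavy = max_fps(pixels, shader_ops_per_pixel=50,
                rop_pixels_per_sec=10e9, shader_ops_per_sec=100e9)
```

With these numbers the light scene is ROP-bound (~7,600fps ceiling) and the heavy scene is shader-bound (~1,500fps ceiling), which mirrors the argument: raw pipe count dominates peak framerates in empty scenes, while shader power dominates the minimums.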

The ATi architecture is much harder to leverage all the power out of, which is a shame. I'm not sure if what was dropped from DX10 support meant game makers could put less effort into shader AA or anything; afaik it was a virtual-memory-usage type thing that nvidia dropped support for and MS then let drop out of the DX10 spec entirely.


The simple fact is that anyone honest who has used a 2900/8800 will tell you that at mid/high res they are very, VERY equal to each other. There are a bunch of games where they perform almost identically, there are lots of games where ATi pulls ahead (and a couple where it has a ridiculous lead), and likewise there are nvidia games that let it pull ahead (and a couple with a ridiculous lead).

Obviously this means a few people playing a very particular set of games can have a massively better experience on one card than the other. With a large, varied array of games you get a similar experience on both cards.

There are some very big older-game benchmarks where high AA levels cause the 2900XT to drop behind, but they are old games; for most of them we are talking 150fps instead of 200fps, and nothing last-gen can't handle them. In most (moving towards all) new games you will not be using much in the way of AA settings on either ATi or nvidia, so that isn't largely relevant anymore.
 
I think he owns a GTX/GTS and a 2900XT. Whichever card he hasn't had, I'm sure I've seen him post that he's had all three, but I might be wrong.

Only just noticed that one; I'm being followed.......... help :p

Yeah, I'm a bit of a hardware nut. I benchmark to the point of checking everything's working, but that means one run of the latest benchmark on install and maybe again with new drivers. I just like building computers, and I find it costs the same to sell at the right time and buy the next-gen card/board/CPU as it does to completely replace everything in one lump sum every few years.

I think the only cards I haven't owned in the past five years were a couple of Ultras and the GTX 512, and I'm not sure I had a 7900 of any kind. I play a lot of different games, mostly looking for something good to play, finding crap games and moving on ;)

As I said before, you can have a noticeably poor experience if you happen to play five games and they are all crap on the opposing hardware, but there are games that run crap on either hardware at the moment. The thing is, I buy the newest hardware to run the games that come out in the days, weeks and months after I get the card, and there are very few old games that don't run great on both old and new cards. When you buy a card now, you don't know, for instance, whether UT3 will run better on ATi or nvidia hardware. It's easy to end up playing five games long-term and buying the card that does better in those five games; it's impossible to buy a card you know will run the NEXT five games better, but by the law of averages some will run better on ATi and some better on nvidia. For five years the competing cards in each generation have got a lot of praise, but there was no real end-user difference between a 7800 GTX 256 and an X1800XT, or a 7900 GTX and an X1900XT, or an X800 and a 6800. OK, the 9800 was probably quite a bit better than the 5900, but then the GF3/4 spanked the 8500.
 
Raw fillrate determines high-resolution power.

Do the maths: a resolution of 1280x1024 requires 1,310,720 pixels on screen to compose each frame. Multiply that figure by 60 for 60 frames per second and you can see that fillrate is a massive demand.

It's stupid to have 320 shaders and only 16 pipes; the 16 pipes are a huge bottleneck.

The 2900 is not ideal for high-res work, and when AA is turned on the fillrate demands increase even further.
I explained this over six months ago, but no one listens; they just cry after they've bought the 2900.
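The arithmetic in the post above can be sketched in a few lines. This is a lower bound, assuming one pixel write per screen pixel per frame; real workloads add overdraw, and AA multiplies the samples that have to be resolved:

```python
def pixels_per_second(width, height, fps):
    """Pixels that must be written each second just to compose the frames."""
    return width * height * fps

# 1280x1024 is 1,310,720 pixels per frame; at 60fps that is
# 78,643,200 pixel writes per second before any overdraw.
demand = pixels_per_second(1280, 1024, 60)
print(demand)  # 78643200

# With 4xAA there are four samples per pixel to fill and resolve,
# which is why enabling AA raises the fillrate demand so sharply.
print(pixels_per_second(1280, 1024, 60) * 4)  # 314572800
```

Dividing a card's rated fillrate by this figure gives a rough ceiling on framerate at a given resolution, which is the argument being made about the 16-pipe design.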
 
Will and Mav would make a great couple; they could even team up and be the official forum clowns. ;)

O_o

Sorry, but I can't be flamed for saying my 8800GTS WAS faster than my 2900XT, which is working perfectly.

Raven works for ATi? That explains it all :rolleyes:
 
Very nice of someone with 24 posts to insult others. :rolleyes:

Actually, my original account "RavenUK" had its password hacked and I didn't have the original email to sort it out. You're a fine one to give lectures on insulting others; how many bans have you had now?
 
Actually, my original account "RavenUK" had its password hacked and I didn't have the original email to sort it out.

A likely story. More like you went on a rampage and posted loads of crap when you were drunk, and now you're being sly with the return of the Raven on a second account. ;)
 
Actually, my original account "RavenUK" had its password hacked and I didn't have the original email to sort it out. You're a fine one to give lectures on insulting others; how many bans have you had now?

What's that got to do with you? Go and look at the ban and the simple reason for it (quoting swearwords from a past post and leaving one letter unstarred in each word).

It was not a big deal and I can live with it; even you commented that it was harsh :rolleyes:
 