Get a grip yourself, m8, and never tell me what to do or to "grow up". You don't have any say in either matter.
I never said you said I was stupid; I asked the question.
WTF "Scottish thing"? I am Scottish, so what did you imagine I said in a thread?
Also, looking at a RIVAL's architecture WILL help: you can see its strong and weak points, learn from it, and better it.
That's basically just wrong. Looking at someone else's hardware is, firstly, fairly difficult; secondly, copying parts doesn't work; thirdly, R600's problems were production issues and staff changes more than anything. The first silicon designs for the core were likely completed 1-2 years ago, with single cores made at massive cost and improvements/tweaks added over that time. You cannot have an architecture basically finished, look at another card, and change your design to match; they work quite differently. You also can't just cut out a single part and drop in another: the whole pipeline, from the first stage to the last, is designed to work in sequence, and you cannot change one part of it just because you want to. You can't.
From what I've seen, World in Conflict shows bigger drops in performance going from DX9 on XP to DX9 on Vista, and a similar or slightly worse drop when switching to DX10.
ATI made semi the right call, but at the wrong time. AA hardware is changing massively within the core: almost ALL games coming out now, all the DX10 games, are having issues with AA simply because they do not use the dedicated AA hardware; they use other methods to improve image quality. ATI knew this, and Nvidia has complained about it. Next gen, or maybe the one after, Nvidia will be adopting a ring-bus-type memory interface; again, ATI was just a touch early. The R600 is a case of insanely good tech, just misused and introduced too early. However, by all accounts it has worked fantastically in their lower-end units, which are cheaper and much better suited to the segments they are aimed at. They are winning contracts with big OEMs over Nvidia almost daily right now; ATI has in fact seemingly won this round. For instance, Dell switching from mostly Nvidia to largely ATI is a MASSIVE, I mean a massive, win for ATI, and there are a bunch of other OEMs switching to ATI at the low end. If Nvidia's profits are 50% low end, 40% mid-range, and 10% high end, and they are losing the low end and the high end, they've failed massively. This is strongly indicated by their big push to get a cheaper mid-range card out to win that segment back, and lower-end cards on a smaller process.
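To make the profit-split arithmetic concrete, here's a quick sketch; note the 50/40/10 split is the hypothetical from this post, not real Nvidia financials:

```python
# Hypothetical profit split by segment (illustrative numbers from the post,
# NOT real financials).
profit_share = {"low": 0.50, "mid": 0.40, "high": 0.10}

# If the low-end and high-end segments are lost to the competitor,
# only the mid-range share is retained.
lost = profit_share["low"] + profit_share["high"]
retained = profit_share["mid"]

print(f"lost: {lost:.0%}, retained: {retained:.0%}")  # lost: 60%, retained: 40%
```

So under these toy numbers, losing the low and high end means losing more than half of the profit, which is why the mid-range push matters so much.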
There are actually dozens of small things in World in Conflict that look miles better in DX10. There are really only two games, I think, that run badly in DX10, and Company of Heroes is the main one: it looks marginally better, if at all, for a massive performance cost. The main reason for its performance drop is in fact CPU overhead. Say you are limited to 100 particles because the per-particle CPU overhead is large, in an already CPU-intensive style of game. Then you add DX10, and the per-particle overhead is reduced; left at that, performance WOULD increase. But they added hundreds of extra particles, adding overhead back onto the CPU and leaving more for the GPU to draw. It was just a terrible update. BioShock runs at a very similar speed in DX9 and DX10, and doesn't look worse.
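The overhead argument above can be sketched with a toy frame-time model. Every constant here is invented for illustration; these are not measurements from Company of Heroes or any real API:

```python
# Toy frame-time model: per-particle CPU submission overhead vs GPU draw cost.
# All constants are made-up illustrative numbers, not real measurements.

def frame_time_ms(particles, cpu_cost_per_particle, gpu_cost_per_particle=0.01):
    cpu = particles * cpu_cost_per_particle   # API/driver submission overhead
    gpu = particles * gpu_cost_per_particle   # actual rendering work
    # CPU and GPU work overlap; the slower side limits the frame.
    return max(cpu, gpu)

dx9 = frame_time_ms(100, cpu_cost_per_particle=0.10)   # high per-call overhead
# The newer API lowers per-particle CPU overhead, so the SAME scene runs faster...
dx10_same_scene = frame_time_ms(100, cpu_cost_per_particle=0.03)
# ...but the patch also adds many more particles, eating the savings and more.
dx10_more_fx = frame_time_ms(500, cpu_cost_per_particle=0.03)

print(dx9, dx10_same_scene, dx10_more_fx)
```

Under these toy numbers the reduced overhead alone is a clear win, but quintupling the particle count makes the "improved" path slower than the original, which is exactly the pattern described above.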
There were arguments all over the net when SM3.0 was introduced: it didn't really look any better, hardware support was limited, and it seemed pointless at first. A year later, games could do effects with it that looked good. Coders need time and practice to make things look better; this is the first try with DX10, and most of these games were largely designed and coded against DX9 hardware and DX9 experience. DX10 will take time to mature, get over it. We have Q6600s all over the place here, yet they are barely being used. We had DX10 hardware for months before the first DX10 games, and we had DX9 hardware months before DX9 games. Guess what: we'll have DX10.1 hardware before we see DX10.1 games, and the first of those will look no different. It will always be that way.