
Where did DX10.1 from Vantage go? Nvidia knows

I have to say that I agree with Finality here...

AMD have been more technologically advanced than Intel for ages, with proper native dual & quad cores, while Intel have almost been cheating in a way... If you took away the cache, raw speed & overclockability of a C2D, the AMD is a much better chip.

As far as ATI vs nVidia are concerned, most of the advances I've read about regarding core architecture have been made by ATI. They also push the boundaries considerably further than NV do in terms of core clock speeds, and always seem to use high-end memory. The new 4870s, for example, are supposed to be GDDR5, where NV are still on GDDR3, & I've had a bad recent experience with an XFX 8800GT with really CRAP Qimonda RAM that failed stability tests with a 15MHz OC...

Like with Intel vs AMD... if you took the shader processors off something like an 8800GTS 512MB, I'm sure a 3870XT would spank it, having better RAM and a core that runs 250MHz faster at stock ;)

Plus I would say that ATI's image quality is also better than NV's

But Hey-ho... Just IMO :p
 
Attack? You say it as if I've called you names and pulled your hair lol. Deal with it, you can't have your cake and eat it. I have absolutely NO PROBLEM with reasonable, informed opinions. What I don't care much for is an undeserved bias against or for a company.

That's your opinion & not a fact of undeserved bias against a company, & it does not give you the right to be the law-maker on how it's discussed, or attention will be turned to the poster personally.
Not at all? Hypocrisy is campaigning against the conditions of battery chickens and then eating out at KFC. Hypocrisy is demonstrating against fox hunting and then betting on a dog fight. Hypocrisy is complaining about a manufacturer's brute-force approach and then doing the same yourself.
A continued brute-force approach to making technology instead of innovating & adopting known better methods.

I don't make technology, nor do you know of a better approach to my needs & what I use my system for to get the job done.
Once again you're telling me that I can only make comments about companies if you think I'm adhering to the same standard myself, which you have no right to do. In that case no one should comment, because no one is an angel.
No, you do not have to be in the same business lol. But you go ahead and make me look like the bad guy if that makes you feel better. True colours and all that :)

You did that yourself by making it a personal thing against me, because of my views on companies that have a far greater impact on people than what I do with my own system.
 
Who cares if it's GDDR3 or 4 or 5? ATI always pumped up core speeds on their GPUs, same as Intel did on their P4 CPUs, but that's not much good if your memory is clocked lower (there needs to be a balance).

The Nvidia cards seem to have higher memory speeds, be it GDDR3 or otherwise.
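
To put some rough numbers on that balance point, here's a minimal sketch of how theoretical peak memory bandwidth falls out of memory clock, transfers per clock and bus width. The clocks and bus widths are illustrative assumptions in the right ballpark for cards of this era, not the quoted specs of any particular card:

# Rough theoretical peak memory bandwidth; example figures only.
def bandwidth_gb_s(mem_clock_mhz, transfers_per_clock, bus_width_bits):
    # effective rate (MT/s) x bus width (bytes), scaled to GB/s
    return mem_clock_mhz * transfers_per_clock * (bus_width_bits / 8) / 1000

# GDDR3 moves 2 transfers per clock; GDDR5 effectively moves 4.
gddr3 = bandwidth_gb_s(1000, 2, 256)  # assumed 1000MHz GDDR3 on a 256-bit bus
gddr5 = bandwidth_gb_s(900, 4, 256)   # assumed 900MHz GDDR5 on a 256-bit bus

print(f"GDDR3 example: {gddr3:.0f} GB/s")  # ~64 GB/s
print(f"GDDR5 example: {gddr5:.0f} GB/s")  # ~115 GB/s

On those example figures GDDR5 ends up with nearly double the bandwidth of GDDR3 despite a lower memory clock, which is why the memory type can matter as much as the core speed.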
 
You did that yourself by making it a personal thing against me, because of my views on companies that have a far greater impact on people than what I do with my own system.


Did what exactly? lol You're just complaining and TBH I don't care any more :) The amount of impact you have on a person has nothing to do with it lol.


Monstermunch - I love the HyperTransport idea, and God knows why Intel have taken this long to implement it - it's brilliant. However, outside of that, what have AMD brought to the table first? It wasn't their support of DDR2. I'm pretty sure Intel were the first with DDR boards; they tried their luck with RDRAM (and failed). Were they the first with PCI-E? I'm not sure about that one. DDR3? Intel. SSE2 and SSE3? Intel. 3DNow is AMD's baby. Native quad core goes to AMD. What else is there?

AMD have been more technologically advanced than Intel for ages, with proper native dual & quad cores, while Intel have almost been cheating in a way... If you took away the cache, raw speed & overclockability of a C2D, the AMD is a much better chip.

But they are there though. You can say 'if you cripple a CPU then it's only as good as a slower one'. Well of course it is, but why would you do that?
 
I have to say that I agree with Finality here...

AMD have been more technologically advanced than Intel for ages, with proper native dual & quad cores, while Intel have almost been cheating in a way... If you took away the cache, raw speed & overclockability of a C2D, the AMD is a much better chip.

As far as ATI vs nVidia are concerned, most of the advances I've read about regarding core architecture have been made by ATI. They also push the boundaries considerably further than NV do in terms of core clock speeds, and always seem to use high-end memory. The new 4870s, for example, are supposed to be GDDR5, where NV are still on GDDR3, & I've had a bad recent experience with an XFX 8800GT with really CRAP Qimonda RAM that failed stability tests with a 15MHz OC...

Like with Intel vs AMD... if you took the shader processors off something like an 8800GTS 512MB, I'm sure a 3870XT would spank it, having better RAM and a core that runs 250MHz faster at stock ;)

But Hey-ho... Just IMO :p

While I'm an AMD fan more so than Intel, and split on ATI/Nvidia depending what mood I'm in and what card I've got, I'm not sure it really matters how something does what it does. To most of us it's how fast it is; like the C2D, it doesn't really matter how it does it, it's just faster. Same with the OCing of C2Ds and quads. AMD just need to get their 45nm chips out soon, and with a bit of luck they will overclock well/better.
A few of us moaned about the G80 and the renaming, but really, until ATI step up and bring out their 4XXX cards, Nvidia will sit back; and even if it is old tech without the latest features, it is still the fastest, and that's what we all want.
 
Who cares if it's GDDR3 or 4 or 5? ATI always pumped up core speeds on their GPUs, same as Intel did on their P4 CPUs, but that's not much good if your memory is clocked lower (there needs to be a balance).

The Nvidia cards seem to have higher memory speeds, be it GDDR3 or otherwise.

Sorry, but that's not entirely true... TBH I would say RAM speed-wise there is nothing between the two, except for the fact that ATI don't release cards with massively OC'd RAM for their stock speeds like NV do with Qimonda.
 
You did that yourself by making it a personal thing against me, because of my views on companies that have a far greater impact on people than what I do with my own system.


Did what exactly? lol You're just complaining and TBH I don't care any more lol. How much you impact people has nothing at all to do with it.


Monstermunch - I love the HyperTransport idea, and God knows why Intel have taken this long to implement it - it's brilliant. However, outside of that, what have AMD brought to the table first? It wasn't their support of DDR2. I'm pretty sure Intel were the first with DDR boards; they tried their luck with RDRAM (and failed). Were they the first with PCI-E? I'm not sure about that one. DDR3? Intel. SSE2 and SSE3? Intel. 3DNow is AMD's baby. Hyper-Threading? Intel. Native quad core goes to AMD. What else is there? As far as CPUs go, Intel got there first with the P2 and P3, AMD won out with their Thunderbirds, then Intel released the woeful NetBurst-tech 1.5GHz P4s (Willamette?) that gave AMD a free run to improve on their Thunderbird and turn it into... a Palomino XP. Intel migrated to DDR memory and a 130nm process with their Northwood P4s and pulled away from the XPs. AMD never truly outpaced the P4s, though they were the clear winners for the price...

...the Athlon 64 changed everything. Through each revision AMD led and things got very competitive. Intel revamped the P4 with the Prescott release, but it had major current leakage and subsequently heat problems. Prescott was deemed a failure and they abandoned it, and the NetBurst arch it was based on, completely. AMD made small revisions to the A64 in that time, none of them really building on the A64 design significantly, though heat output and power usage gradually declined over this period (I say gradually; there were a few initial problems with some new core revisions).

Intel readied their Core 2 tech, which ironically isn't far removed from the P3 they abandoned when they went with NetBurst, and due to the advances made in fabrication since, Intel seemed to find it very easy to build on the processing units of the P3 and then ramp the speed up... and it worked. AMD again took the back seat and have done ever since. Now we are pretty much even again, so here, right now, is where it'll start to get interesting!


AMD have been more technologically advanced than Intel for ages, with proper native dual & quad cores, while Intel have almost been cheating in a way... If you took away the cache, raw speed & overclockability of a C2D, the AMD is a much better chip.

But they are there though. You can say 'if you cripple a CPU then it's only as good as a slower one'. Well of course it is, but why would you do that?

Like with Intel vs AMD... if you took the shader processors off something like an 8800GTS 512MB, I'm sure a 3870XT would spank it, having better RAM and a core that runs 250MHz faster at stock

Again, just why would you do that lol. Besides, take the shaders away and the card wouldn't do anything at all, since the card is built around unified shaders. That's a pretty useless statement, it has to be said.

Plus I would say that ATI's image quality is also better than NV's

But Hey-ho... Just IMO

Oh, these days they are neck and neck. It took ATI a while to catch up with the G80s but they did it, near enough :)
 
BTW, AMD were the first with DDR outside a graphics card, not Intel.

The AMD Athlon Thunderbird went from a 1.2GHz model to a 1.2GHz DDR model.

The first DDR-based mobos had AMD 760 chipsets on them, till later ALi made a chipset (AMD stated they were not wanting to make chipsets after the initial release).

The Memory I had was Samsung PC2100 (DDR).

All this was around Jan 2001 (that's when I bought the rig; not sure when the parts were released).
 
Monstermunch - I love the HyperTransport idea, and God knows why Intel have taken this long to implement it - it's brilliant. However, outside of that, what have AMD brought to the table first?
You must be joking, right... Try 64-bit architecture, dual core... they set the standard...


But they are there though. You can say 'if you cripple a CPU then it's only as good as a slower one'. Well of course it is, but why would you do that?
I'm not saying anyone would... I'm saying that if AMD had the resources etc. that Intel have for things like cache memory, they would be just as good if not better than Intel. Because at the end of the day, the AMD technology is more advanced... FACT

It's just a shame that they can't keep up with Intel speed-wise & cache-wise.


Again, just why would you do that lol. Besides, take the shaders away and the card wouldn't do anything at all, since the card is built around unified shaders. That's a pretty useless statement, it has to be said.
No, it's not a pointless statement; I was just trying to say that without the separate shader processors ATI would be faster.


Oh, these days they are neck and neck. It took ATI a while to catch up with the G80s but they did it, near enough :)
I agree that there is little between them now, but ATI's image quality is generally sharper & colours are more vivid... NV tend to add some kind of blurring technology to fill the gaps where ATI cards show more detail... IMO anyway
 
You must be joking, right... Try 64-bit architecture, dual core... they set the standard...

Well, AMD announced dual core first, but they were both launched very close to April 2005, weren't they? As for 64-bit arch, be careful lol. AMD were the first x86-based CPUs with 64-bit support, but they were not the first 64-bit CPUs from either company. That goes to Intel's Itanium, which was released two years previous to AMD's first 64-bit Opteron.

http://www.pcper.com/article.php?aid=131&type=expert - so the EE 840 Smithfield was the first dual-core CPU out.

I'm not saying anyone would... I'm saying that if AMD had the resources etc. that Intel have for things like cache memory, they would be just as good if not better than Intel. Because at the end of the day, the AMD technology is more advanced... FACT

Well, that's daft then. How is it more advanced? I'm asking you, what's more advanced about it?

It's just a shame that they can't keep up with Intel speed-wise & cache-wise.

agreed

No, it's not a pointless statement; I was just trying to say that without the separate shader processors ATI would be faster.

Without the unified shaders the GPU would not work at all, so it is a useless statement! You might as well say that if a Ferrari didn't have an engine, then a Reliant Robin would be faster. Equally as daft.

I agree that there is little between them now, but ATI's image quality is generally sharper & colours are more vivid... NV tend to add some kind of blurring technology to fill the gaps where ATI cards show more detail... IMO anyway
Since you made that statement, I'd like you to prove it. If you have an opinion you obviously have some first-hand experience, or at least have seen some comparisons to back this up, so let's see them then.
 
It's a little hard to say that this isn't a flame war against Nvidia, but so be it.
If you are looking to play games then buy a games console. If you have a PC, video is most probably your priority. I know there are ardent game players, and they usually have loads of money to waste on a PC and graphics card outlay.
I say if you are happy with your current setup and installed OS, then why be dictated to by some stupid benchmark utility, which does nothing but undermine your system anyway and makes a PC you may have spent something like £2000 on look like an excited snail. I think trying to "keep up with the Joneses" is a common addiction amongst PC users. What is missing is focus. It is no good trying to keep up with technology that may be out in six weeks' or six months' time. If you are happy using XP, stick with it. If you like Vista and your GFX card works with it, why worry?
You know it's no good complaining to the GFX card companies and MS, because they are companies who want to sell you more of what you don't necessarily need.
I only upgraded my sound card this year because I was able to use my onboard sound and connect it via my S/PDIF out to my Soundblaster Extigy. Software and driver support for it went out with ME, and the other software didn't work with XP SP1. So I resorted to using it via the onboard sound terminal.
Graphics have been slightly kinder. Most OSes and games work with any GFX card with at least 128MB of memory. I still think some of the high-end games have a low threshold of around 256MB of GFX memory. If I wanted to play the latest game I'd buy it for either my PS3 or Xbox 360. That way I know I am getting dedicated graphics and a dedicated workload.
I may have a PC with an 8800GTX GFX card, a Q6600 and 4GB of RAM in it, but I use it for many things other than games. And if I do play a game... I would probably fire up TOCA 3 for an hour. I have better things to do with my time than keeping track of who is stuffing who in the industry.
Life is too short, and I don't think I am ready to be certified yet...
 
I agree that there is little between them now, but ATI's image quality is generally sharper & colours are more vivid... NV tend to add some kind of blurring technology to fill the gaps where ATI cards show more detail... IMO anyway

From that comment alone it's clear you have no idea what you're talking about, but please try to prove this "blurring technology" exists? :rolleyes:
 
Ok there are a few arguments going on at the moment so I'm gonna try and clear up a few things.

There was an argument about IQ about a year ago between NV and ATi, and we all ended up "agreeing" that they are pretty much the same! Give or take a bit of what you consider to be sharp. I've owned the most recent series of both cards... NV 7-series and 8-series, and ATi's X19xx, 2xxx and 3xxx series! Now, ATi used to have the best IQ when you're talking the older generation, as quite frankly the 7-series had awful IQ and everyone knew that. However, since the 8-series I have to say NV have been on a par with ATi and finally matched them in that department. In certain games both NV and ATi will differ, but to be quite honest there isn't much of an argument to be had anymore, and the IQ that you get on any of the cards is pretty cracking :)

There is no point in arguing about the architecture of the ATi cards; this was done deliberately and they thought it would all work out. There was something about their AA process, and because no-one adopted the way it would work for them it ended up becoming software-based... correct me if I'm wrong. ATi then saw massive dips in performance due to this. Basically they chose the wrong architecture this time and performance suffered because of it. It obviously wasn't changed in the 38xx series, because the card is essentially a better-engineered version of the 2xxx series cards, so they could reduce price and power etc., yet added DX10.1 as a new tech to make the card different.

Nvidia helping prevent DX10.1 is a BIG deal. People might call it fanboyism, but I've been on both sides of the fence various times, and it really does suck that when we finally get to see some new tech it gets taken away from us, "for the time being". Look, we wouldn't be so bitter if DX10.1 did absolutely nothing; the problem is it does! It helps ATi bridge the gap performance-wise and was a massive boost for them. People know it works, and that will irritate and provoke a reaction. Nothing has happened in the last year and a half in the market and we have had nothing new to shout about. It's pretty much just refreshes of the same cards and it's getting annoying. As soon as Nvidia get it together and get some DX10.1 cards out the better, because then we will finally see games using the tech and the improvements ATi users unfortunately have to wait a little longer on.
 
People know it works

They think they know. The developers said the performance increase was due to a bug, so you really should save comments like this until a game has working DX10.1 support and also shows increases, because right now we don't know if the performance increase is really due to DX10.1 or a bug.
 
They think they know. The developers said the performance increase was due to a bug, so you really should save comments like this until a game has working DX10.1 support and also shows increases, because right now we don't know if the performance increase is really due to DX10.1 or a bug.

Bit hard, if Nvidia is getting the only game with it ATM to remove DX10.1.
 
Bit hard, if Nvidia is getting the only game with it ATM to remove DX10.1.

There's no proof of that, and the developers said they're removing it due to a bug, which they said is also responsible for the performance increase. But I guess you're another conspiracy-theory subscriber. :)
 