
Anyone explain ATI & NVidia fanboys?

I left nV because they were ripping us off with refreshes of the 88xx series. nV had a major lead in the graphics industry after making the 88xx series but decided to sit on its laurels. They really should have wowed us with the 98xx series but didn't.

Yeah, that makes sense: when a company has a superior product range, charge less and release a much faster product when you don't need to, which would directly compete with sales of the other products you'd quite like to sell.

Ignore on here is a bit useless really...

Naa, it works great. Quite funny coming into threads like this and seeing the "user on ignore list" message over and over. :D

The funny thing is I've owned HD4850 Crossfire and an HD4870X2, and I'm planning on buying an HD5870 next month... Hilarious.
 
Final8y said:
10.1 was important before but its time has passed; even NV thought it was important enough to have it taken out of a game because it gave ATI more performance.

OMG, you actually believe that? "sigh"

Don't think I'll waste time replying to you in that case.
 
If we are gonna talk about DX10.1 and nVidia gimping DX10...

I'm glad to see you'd all be happy to pay 3x as much for GPUs sporting a ton of features that they didn't have a chance of running with usable performance, and which wouldn't have been used by anything during the cards' lifespan anyway...
 
If we are gonna talk about DX10.1 and nVidia gimping DX10...

I'm glad to see you'd all be happy to pay 3x as much for GPUs sporting a ton of features that they didn't have a chance of running with usable performance, and which wouldn't have been used by anything during the cards' lifespan anyway...

Anyone who buys a gaming performance card for DirectX support needs their head tested.
 
If we are gonna talk about DX10.1 and nVidia gimping DX10...

I'm glad to see you'd all be happy to pay 3x as much for GPUs sporting a ton of features that they didn't have a chance of running with usable performance, and which wouldn't have been used by anything during the cards' lifespan anyway...

A lot of people are happy with the 10.1 performance in the few games that use it, & I don't know where the 3x as much comes into it, as 10.1 costs nothing.
 
If we are gonna talk about DX10.1 and nVidia gimping DX10...

I'm glad to see you'd all be happy to pay 3x as much for GPUs sporting a ton of features that they didn't have a chance of running with usable performance, and which wouldn't have been used by anything during the cards' lifespan anyway...

The sad thing is PhysX is used a lot more than DX10.1. I don't rate PhysX either, but at least more games make use of it.
 
The sad thing is PhysX is used a lot more than DX10.1. I don't rate PhysX either, but at least more games make use of it.

Aside from the ability to render 8 lights in the same pass, most of the other features are only useful in bringing ATI cards up to speed. The increase in precision in some things might possibly be useful at a later date, but at the moment it's not a consideration for game developers.
 
Aside from the ability to render 8 lights in the same pass, most of the other features are only useful in bringing ATI cards up to speed. The increase in precision in some things might possibly be useful at a later date, but at the moment it's not a consideration for game developers.

So you're saying that 10.1 would not have made NV cards any faster if NV cards had supported 10.1.
 
I'm an ATI fanboy, have been for years, but that doesn't stop me buying Nvidia cards; I owned an 8800GTX, one of the best cards I've ever owned.

Had the GTX 295, an excellent card, but SLI mode sucks in the original EverQuest: the bandwidth cripples the sound (i.e. it goes all static), so you have to run in single-GPU mode, which is a waste of time. Single-GPU cards are fine with it.

The current drivers suck, losing 18fps in Crysis with any 190.xx set of drivers, so I sold it. My Vapor-X will last me a good 4 months until the 5xxx Vapor-X is released, hopefully.

If Nvidia drivers get consistent, i.e. they don't break something with a new release, & the price vs performance is like AMD/ATI's, then I might buy an Nvidia card again.
 
Still, where does the 3x more come into it? NV had the first DX10 cards out with the 8800.

If the 2900 and 8800 cards had supported the original feature set they would have cost significantly more to develop... 3x more is an approximation at best.

So you're saying that 10.1 would not have made NV cards any faster if NV cards had supported 10.1.

There would have been some performance increases in some areas, such as the lighting pass in deferred shaders and the nature of deferred shaders as well, but in general there wouldn't have been as major a performance increase on the nVidia side as there would be on the ATI side, as they have less of a penalty for doing things the "old" way... and most of the performance increases (that nVidia cards would see) come from features that we aren't likely to see in games for another generation anyhow.
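A rough illustration of that lighting-pass point (my own sketch, not taken from any post above): one commonly cited DX10.1 benefit for deferred renderers is being able to read the multisampled depth buffer directly during the lighting pass, where a DX10.0-style path typically spends an extra pass producing a readable copy of depth first. All type and helper names below are hypothetical stand-ins, not a real engine or Direct3D API.

```cpp
#include <cstdio>

// Hypothetical stand-in types for illustration only.
struct Texture { int id; };

struct Renderer {
    bool hasDx10_1;        // assumed capability flag
    Texture depthCopy{0};

    void BindDepthBufferAsShaderResource()     { std::puts("bind MSAA depth as SRV"); }
    void RenderDepthToTexture(Texture&)        { std::puts("extra pass: copy depth"); }
    void BindTextureAsShaderResource(Texture&) { std::puts("bind depth copy"); }
    void DrawFullScreenLightingQuad()          { std::puts("lighting pass"); }
};

// The 10.1-style path skips the extra depth-copy pass that the 10.0-style
// path needs before the deferred lighting pass can read scene depth.
void LightingPass(Renderer& r)
{
    if (r.hasDx10_1) {
        r.BindDepthBufferAsShaderResource();      // depth readable directly
    } else {
        r.RenderDepthToTexture(r.depthCopy);      // the extra pass
        r.BindTextureAsShaderResource(r.depthCopy);
    }
    r.DrawFullScreenLightingQuad();
}

int main() {
    Renderer r{true};
    LightingPass(r);        // 10.1-style: no depth copy
    r.hasDx10_1 = false;
    LightingPass(r);        // 10.0-style: one extra pass
}
```

How much that saved pass actually matters depends on the renderer, which is roughly the disagreement in this thread.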
 
If the 2900 and 8800 cards had supported the original feature set they would have cost significantly more to develop... 3x more is an approximation at best.



There would have been some performance increases in some areas, such as the lighting pass in deferred shaders and the nature of deferred shaders as well, but in general there wouldn't have been as major a performance increase on the nVidia side as there would be on the ATI side, as they have less of a penalty for doing things the "old" way... and most of the performance increases (that nVidia cards would see) come from features that we aren't likely to see in games for another generation anyhow.

I think you're making up figures with that 3x as much, & if ATI was willing to cough up for the original DX10/DX10.1 spec then NV should have had the pockets to do so.
It all looks like NV's design was simply not right for DX10, & that's why they got it dumbed down to meet the design they had & deemed the rest useless, & they have been holding out for so long on DX10.1 because of the amount of redesign needed. That's why they have 10.1 parts on the way even though they say it's unimportant, even with DX11 just around the corner, as to have DX11 you must support the DX10.1 subset.

I would like to see the links related to the cost of DX10, & where NV said that was the main reason.
 
I think you're making up figures with that 3x as much, & if ATI was willing to cough up for the original DX10/DX10.1 spec then NV should have had the pockets to do so.
It all looks like NV's design was simply not right for DX10, & that's why they got it dumbed down to meet the design they had & deemed the rest useless, & they have been holding out for so long on DX10.1 because of the amount of redesign needed. That's why they have 10.1 parts on the way even though they say it's unimportant, even with DX11 just around the corner, as to have DX11 you must support the DX10.1 subset.

I would like to see the links related to the cost of DX10, & where NV said that was the main reason.

But what is the point of supporting a feature set which most developers will have no interest in using? Nvidia work closely with developers, a lot more so than ATI, and were probably in a better position to gauge demand for DX10.1 from developers, and they chose early on in development not to include DX10.1.

I would say they were proven right; the cost and time of including this in their new line-up obviously did affect their choice.

In fact here's a link for you...

"We spoke to Ben Berraondo, Product PR at Nvidia, who was more than happy to clarify its position on DX10.1. Since Microsoft's announcement of DX10.1, it had been in touch with developers through its 'The Way it's Meant to be Played' setup about the possibility of adding the new API to its hardware.

Apparently, most of the developers were hardly even planning on implementing DX10 due to the vast array of cross platform titles, let alone DX10.1. He used the example of Race Driver: GRID, a newish title that looks and plays amazingly and yet is still thoroughly DX9.

He did admit however that some developers asked for parts of the DX 10.1 API to be included. It has then implemented some parts of the newest DirectX in some of its new hardware, although not the full listing needed to enable the cards to be certified as DX10.1 parts. So while there is a certain demand for what DX 10.1 can offer, Nvidia is obviously confident that it has the important bits covered"

Source
 
Because that's how technology progresses? Whether nVidia decided to support it or not, we wouldn't be on the cusp of an 11th version of DirectX without both nVidia and ATI supporting feature sets that, in some cases, weren't going to be used for a very long time.

The hardware needs to support it first, then game support slowly follows after.
 
But what is the point of supporting a feature set which most developers will have no interest in using? Nvidia work closely with developers, a lot more so than ATI, and were probably in a better position to gauge demand for DX10.1 from developers, and they chose early on in development not to include DX10.1.

I would say they were proven right; the cost and time of including this in their new line-up obviously did affect their choice.

In fact here's a link for you...

"We spoke to Ben Berraondo, Product PR at Nvidia, who was more than happy to clarify its position on DX10.1. Since Microsoft's announcement of DX10.1, it had been in touch with developers through its 'The Way it's Meant to be Played' setup about the possibility of adding the new API to its hardware.

Apparently, most of the developers were hardly even planning on implementing DX10 due to the vast array of cross platform titles, let alone DX10.1. He used the example of Race Driver: GRID, a newish title that looks and plays amazingly and yet is still thoroughly DX9.

He did admit however that some developers asked for parts of the DX 10.1 API to be included. It has then implemented some parts of the newest DirectX in some of its new hardware, although not the full listing needed to enable the cards to be certified as DX10.1 parts. So while there is a certain demand for what DX 10.1 can offer, Nvidia is obviously confident that it has the important bits covered"

Source

Wrong!
The context was the original DX10 spec that NV did not adhere to; there would have been no need for DX10.1.
So your links are not relevant, as it's about the DX10 spec costing 3 times more before the dumbing down.

MS had Vista ready to go & had to go with what NV could run. DX10.1 was brought out later, as that would have given NV time to implement what should have been in 10.0.

And even still, it's nothing to do with the costs involved.

Whether developers are going to use the latest DX is another matter, but the hardware has to be there to be able to use it in the first place; it should be the developer's choice to develop software for a DX standard, & not the hardware maker's.
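On the "hardware first, developer's choice" point, the runtime side of that choice is cheap: a game can ask for a 10.1 device and quietly fall back to 10.0 if the hardware can't do it. A minimal sketch of that probe using the real D3D10.1 entry point (Windows-only; error handling and swap-chain setup omitted, and whether any given game shipped code exactly like this is an assumption on my part):

```cpp
#include <d3d10_1.h>   // Direct3D 10.1 headers; link against d3d10_1.lib

// Try to create a DX10.1 device first; if the hardware or driver only
// supports DX10.0, fall back to that feature level instead.
ID3D10Device1* CreateBestDevice()
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // preferred: full 10.1 feature set
        D3D10_FEATURE_LEVEL_10_0,   // fallback: plain DX10 hardware
    };

    for (D3D10_FEATURE_LEVEL1 level : levels) {
        ID3D10Device1* device = nullptr;
        HRESULT hr = D3D10CreateDevice1(
            nullptr,                      // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,   // hardware rendering
            nullptr,                      // no software rasterizer module
            0,                            // no creation flags
            level,                        // feature level being probed
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr))
            return device;   // caller can branch on device->GetFeatureLevel()
    }
    return nullptr;          // no DX10-class hardware available
}
```

The 10.1-only rendering paths are then enabled only when GetFeatureLevel() reports D3D10_FEATURE_LEVEL_10_1, so supporting the newer level doesn't stop the game running on older cards.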
 