Anyone explain ATI & NVidia fanboys?

Regardless of what it equals in your eyes, it doesn't take away the fact that you made up figures out of thin air, based on thin air, and that ATI has had DX10.1 on the very processes you say are too costly: everything before 40nm.
And all of this to suit an excuse for NV that never existed.

My god your obnoxious.
 
To be fair, you came out with sweeping statements, like it would cost 3x as much, without any proof...

I wasn't expecting anyone to take my figure of speech as a serious statement of fact and get so hung up over it... next time I'll put 'approximately' in just to be sure...

Doesn't change the fact that implementing the original DX10 spec on an 80nm process would have hugely pushed up prices to the end user, from both the extra hardware cost and the software dev.
 
I wasn't expecting anyone to take my figure of speech as a serious statement of fact and get so hung up over it... next time I'll put 'approximately' in just to be sure...

Doesn't change the fact that implementing the original DX10 spec on an 80nm process would have hugely pushed up prices to the end user, from both the extra hardware cost and the software dev.

Even 2x the cost would still be plainly wrong, as if a £200 card would have jumped to £400, and as if Microsoft would have designed an API that doubled the cost without knowing what they were doing (see the rough sketch below).
If it was not to be taken seriously you would have said so long before now, but you have to come up with a way to save face.
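For what it's worth, that arithmetic is easy to sketch. With made-up illustrative numbers (the die's share of a card's bill of materials is an assumption, not vendor data), even doubling the cost of the GPU die itself raises the retail price by far less than 2x:

```python
# Rough illustration with assumed numbers: the GPU die is only one
# part of a card's bill of materials, so doubling the die cost does
# not double the retail price.

retail_price = 200.0   # hypothetical card price in GBP
die_share = 0.30       # assumption: die is ~30% of the retail price
die_cost = retail_price * die_share
other_costs = retail_price - die_cost  # PCB, memory, cooler, margins...

new_price = other_costs + 2 * die_cost  # die cost doubled
print(f"old: £{retail_price:.0f}, new: £{new_price:.0f}")
# old: £200, new: £260 -- a 30% rise, not 100%
```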

And again, in the second part of your post you talk about facts, with a total inability to back them up when requested, because you have none in the first place. There was no mention of the costs involved at 80nm from NV at the time, or since.
NV still has not moved to 10.1 on 65nm or 55nm.

You have one fact right: no one should take anything you say seriously.
 
Even 2x the cost would still be plainly wrong, as if a £200 card would have jumped to £400, and as if Microsoft would have designed an API that doubled the cost without knowing what they were doing.
If it was not to be taken seriously you would have said so long before now, but you have to come up with a way to save face.

And again, in the second part of your post you talk about facts, with a total inability to back them up when requested, because you have none in the first place. There was no mention of the costs involved at 80nm from NV at the time, or since.
NV still has not moved to 10.1 on 65nm or 55nm.

In my immediate reply to the first query about the figure I said it was approximate... get over it already...

Implementing the original DX10 specification on the 80/90nm processes of the time would have hugely pushed up costs: it would have required quite a bit of extra hardware development, pushed up the die size with the extra core functionality to support the features, required more SPs, etc. to run those features at a usable performance level, and so on (see the die-size sketch below)... then you come to the software development to support all these new features... all in all, if they had developed cards based on the original specification and given them enough hardware to hit a usable performance level, we'd be looking at a big increase in costs to the end user...

Implementing DX10.1 on 65 or 55nm processes is another ball game.
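The die-size half of that argument can be made concrete with a back-of-the-envelope sketch. All numbers below are assumptions for illustration (die areas, wafer cost); nothing here comes from NV or ATI. A bigger die means fewer candidate dies per wafer, so each die carries a larger share of the fixed wafer cost:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Crude upper bound: usable wafer area divided by die area.
    Real layouts lose edge dies and scribe lines; this is only a sketch."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

WAFER_COST = 5000.0  # assumed fixed cost per 300mm wafer, in USD

# 300 mm^2 baseline die vs. a hypothetical 450 mm^2 die carrying the
# extra hardware the original DX10 spec would have demanded
for die_area in (300, 450):
    n = dies_per_wafer(300, die_area)
    print(f"{die_area} mm^2: ~{n} dies/wafer, ~${WAFER_COST / n:.0f}/die")
# 300 mm^2: ~235 dies/wafer, ~$21/die
# 450 mm^2: ~157 dies/wafer, ~$32/die
```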
 
In my immediate reply to the first query about the figure I said it was approximate... get over it already...

Implementing the original DX10 specification on the 80/90nm processes of the time would have hugely pushed up costs: it would have required quite a bit of extra hardware development, pushed up the die size with the extra core functionality to support the features, required more SPs, etc. to run those features at a usable performance level, and so on... then you come to the software development to support all these new features... all in all, if they had developed cards based on the original specification and given them enough hardware to hit a usable performance level, we'd be looking at a big increase in costs to the end user...

Implementing DX10.1 on 65 or 55nm processes is another ball game.

Your approximation is way off.
I don't think it would have hugely pushed up the cost at all, and that's not the point anyway; the point is that you claim cost was the reason for NV not going with it, which NV would have been more than capable of saying themselves.

The cost of software development has nothing to do with it; NV is a hardware company.

If the cost of supporting new features were the problem then we would not have come this far already.
 
Your approximation is way off.
I don't think it would have hugely pushed up the cost at all, and that's not the point anyway; the point is that you claim cost was the reason for NV not going with it, which NV would have been more than capable of saying themselves.

I didn't claim anything of the sort... now you're just making things up.

The cost of software development has nothing to do with it; NV is a hardware company.

*cough* drivers *cough*
 
This thread has now gone from fisherman to boxing:D

[image: shadowboxing gif]
 
I didn't claim anything of the sort... now you're just making things up.



*cough* drivers *cough*

If we are gonna talk about DX10.1 and nVidia gimping DX10...

I'm glad to see you'd all be happy to pay 3x as much for GPUs sporting a ton of features that they didn't have a chance of running at usable performance, and which wouldn't have been used by anything during the card's lifespan anyway...
No matter how you want to slice it, you have nothing to back up anything that you have said.

Drivers are par for the course, but if it's such a problem for NV then I don't hold out much hope for them, because development will move forward without them: they'll stick with DX10 at the driver level because anything more is too costly, and they'll end up like 3DFX if they can't keep up.
 
No matter how you want to slice it, you have nothing to back up anything that you have said.

It's called having an opinion... there's very little I've pushed forward as fact... and I haven't seen you putting any facts forward either to dispute what I've said, just your opinion, correct or otherwise.

Drivers are par for the course, but if it's such a problem for NV then I don't hold out much hope for them, because development will move forward without them: they'll stick with DX10 at the driver level because anything more is too costly, and they'll end up like 3DFX if they can't keep up.


No one said anything about it being a problem... but extra features = more development time, more testing, etc. = higher costs... guess who the costs get passed on to.
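The testing part of that equation is easy to illustrate. Assuming, purely for the sketch, that each new feature can independently be toggled on or off (the feature names below are hypothetical), the number of configurations to cross-test doubles with every feature added:

```python
# Sketch: with n independently-toggleable features, the cross-test
# matrix has 2**n configurations.
from itertools import product

features = ["geometry_shaders", "stream_out", "msaa_readback"]  # hypothetical
configs = list(product((False, True), repeat=len(features)))
print(len(configs))  # 8 for 3 features; 2**n in general
```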
 
It's called having an opinion... there's very little I've pushed forward as fact... and I haven't seen you putting any facts forward either to dispute what I've said, just your opinion, correct or otherwise.




No one said anything about it being a problem... but extra features = more development time, more testing, etc. = higher costs... guess who the costs get passed on to.

It's very easy to make clear that something is only an opinion right from the start, but you did not, and instead started talking about facts.

The extra cost in development is normal and all part of progress, or we would still be on the first Glide3D.
 
No, my surprise was at you saying 'his obnoxious', or 'your obnoxious'.

I was just asking: what about his obnoxious?

Do you have anything constructive to add to this thread? (coz it doesn't look like anyone else, including myself, has)
 
It's called having an opinion... there's very little I've pushed forward as fact... and I haven't seen you putting any facts forward either to dispute what I've said, just your opinion, correct or otherwise.
You're making a factual claim about the costs involved in implementing DirectX 10.1. When you call that claim an 'opinion', all you're really saying is "this is something I believe".

It doesn't absolve you from the need to present evidence to justify your belief. And that's all people are asking in this thread: where is the cost of implementing DirectX 10.1 documented, i.e. where is the source from which you obtained the cost estimate? You did call it an "approximation" after all, rather than "a total stab in the dark".

I wasn't expecting anyone to take my figure of speech as a serious statement of fact and get so hung up over it... next time I'll put 'approximately' in just to be sure...
If you don't know, then it's not an approximation; it's a total guess. Why isn't the cost 15% or 20% or 35% or 75% or 500%? Can you give a reason for preferring any one of these figures over the others? If you can't, then you don't know any more than anyone else here.
 
You're making a factual claim about the costs involved in implementing DirectX 10.1. When you call that claim an 'opinion', all you're really saying is "this is something I believe".

It doesn't absolve you from the need to present evidence to justify your belief. And that's all people are asking in this thread: where is the cost of implementing DirectX 10.1 documented, i.e. where is the source from which you obtained the cost estimate? You did call it an "approximation" after all, rather than "a total stab in the dark".

I wasn't talking about implementing DX10.1 - I was talking about the original proposed DX10 specification.

If you don't know, then it's not an approximation; it's a total guess. Why isn't the cost 15% or 20% or 35% or 75% or 500%? Can you give a reason for preferring any one of these figures over the others? If you can't, then you don't know any more than anyone else here.

OK, a guess then, whatever - I have a reasonable idea of the costs involved: cost increases as you increase the complexity of your core, it increases as your yield per wafer decreases, it increases as you have to spend a LOT more time developing your drivers, and there's an even bigger increase in the amount of cross-testing you have to do... your software development and testing workload increases exponentially with added features.
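The yield point can be made concrete with the classic Poisson defect model (a textbook sketch; the defect density below is an assumed figure, not vendor data): yield falls exponentially with die area, so the cost per good die climbs faster than the area itself.

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Classic Poisson yield model: probability of zero defects on a die."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

D0 = 0.002  # assumed defect density (defects per mm^2), illustrative only

for area in (300, 450):
    print(f"{area} mm^2: yield ~{poisson_yield(area, D0):.0%}")
# 300 mm^2: yield ~55%
# 450 mm^2: yield ~41%
# A 1.5x larger die drops yield from ~55% to ~41%, so the effective
# cost per *good* die rises by roughly 2x, more than the area alone.
```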
 