Let Battle Commence

You're trying to defend ATI with the likes of HydraVision, TruForm and Avivo when we're talking about CUDA, which is redefining HPC. Please, it's just hilarious :p

What I'm talking about is that CUDA is not the first use of the GPU to do things other than just display gfx.

CUDA is redefining NV, not HPC, as it will not become the standard unless it runs on other hardware.

So, fact: ATI TruForm is now part of the standard in a more evolved form, as DX11 tessellation.

Windows 7 already does transcoding on the GPU.
 
What I'm talking about is that CUDA is not the first use of the GPU to do things other than just display gfx.

CUDA is redefining NV, not HPC, as it will not become the standard unless it runs on other hardware.

CUDA is the first C API to a GPU, it's the first API to be used extensively outside gaming, and it's the first API to demonstrate <insert 100's of applications here> being accelerated by GPUs.

Like it or not, it makes something like TruForm look about as important as a spot on the face of a tramp.
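
For anyone wondering what "a C API to a GPU" actually looks like, here is a minimal sketch of a CUDA C program (a made-up vector-add example, assuming only the standard CUDA runtime API; it's an illustration, not one of the applications being argued about):

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

/* Each GPU thread adds one element of a[] and b[] into c[]. */
__global__ void vecAdd(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    /* Ordinary C on the host side. */
    float *h_a = (float *)malloc(bytes);
    float *h_b = (float *)malloc(bytes);
    float *h_c = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { h_a[i] = 1.0f; h_b[i] = 2.0f; }

    /* Allocate device memory and copy the inputs over. */
    float *d_a, *d_b, *d_c;
    cudaMalloc(&d_a, bytes);
    cudaMalloc(&d_b, bytes);
    cudaMalloc(&d_c, bytes);
    cudaMemcpy(d_a, h_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_b, h_b, bytes, cudaMemcpyHostToDevice);

    /* Launch enough 256-thread blocks to cover all n elements. */
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(d_a, d_b, d_c, n);

    cudaMemcpy(h_c, d_c, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", h_c[0]); /* expect 3.0 */

    cudaFree(d_a); cudaFree(d_b); cudaFree(d_c);
    free(h_a); free(h_b); free(h_c);
    return 0;
}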
 
CUDA is the first C API to a GPU, it's the first API to be used extensively outside gaming, and it's the first API to demonstrate <insert 100's of applications here> being accelerated by GPUs.

Like it or not, it makes something like TruForm look about as important as a spot on the face of a tramp.

Where are those hundreds?
Besides, you have to pay for the NV transcoder.

One thing that you're forgetting that is really important is that it only runs on NV gfx cards.
 
At the end of the day CUDA is more successful by far than anything ATI has brought to the table in actual real-world usage... however, as far as gaming goes that's not very relevant :S It does, however, go to show that CUDA is not just the useless gimmick some people call it, and could be bringing massive benefits to gaming if everyone would just play nice.
 
If you had followed the thread you would know that I don't care what industry & specialised fields are getting out of it. Neither do I care about servers, FireGL or NV Quadro, because the majority don't use them.

Who in sweet Jesus is talking about FireGL or Quadro cards :confused: Are you just making stuff up now?

"industry & specialised" - yeah because a sort function is ooohhhh so specialised. No one sorts data on computers that'd just be far to technical :p
 
I think CUDA's a great thing for the professional space, where the processing that needs to be done will only realistically need to run on a single set of hardware. In the gaming space, however, anything that gets supported by only one of the hardware vendors is going to damage the industry by further segregating the games that each user has access to. I mean, the PC platform already seems to have stagnated enough; now we're lucky to get much that isn't a console port. We do not need developers supporting things that are only applicable to one side of the fence, or making people who want to buy a PC for gaming worried that their hardware might not support upcoming games because of proprietary features.
 
At the end of the day CUDA is more successful by far than anything ATI has brought to the table in actual real-world usage... however, as far as gaming goes that's not very relevant :S It does, however, go to show that CUDA is not just the useless gimmick some people call it, and could be bringing massive benefits to gaming if everyone would just play nice.

Indeed, CUDA is more successful by far than anything ATI has brought to the table in the GPGPU field, but that does not matter, as CUDA will not be competing with ATI's GPGPU offerings.
 
Who in sweet Jesus is talking about FireGL or Quadro cards :confused: Are you just making stuff up now?

"industry & specialised" - yeah, because a sort function is ooohhhh so specialised. No one sorts data on computers, that'd just be far too technical :p

I have been talking about what use it is to the average user in this thread from the beginning, & like Rroff you bring up specialised uses.
 
Depends... if nVidia had any sense, they'd open PhysX back up for any rendering front end and push developers hard to use CUDA and PhysX... selling shed loads of cheap G92-based cards in the process... I mean, look how many people ask about using an old 8x00 card for PhysX in the forums - and that's generally just the tip of the iceberg... sadly, instead they will probably just cut off their nose to spite their face.
 
Depends... if nVidia had any sense, they'd open PhysX back up for any rendering front end and push developers hard to use CUDA and PhysX... selling shed loads of cheap G92-based cards in the process... I mean, look how many people ask about using an old 8x00 card for PhysX in the forums - and that's generally just the tip of the iceberg... sadly, instead they will probably just cut off their nose to spite their face.

I would have no problems with that.
Before you had to use an Ageia physics card, now an NV physics card... same difference.

I would personally not bother buying one until I see more worthwhile uses that impress me.
 
I have been talking about what use it is to the average user in this thread from the beginning, & like Rroff you bring up specialised uses.

I think you might have lost the plot a little; as I've posted in this thread numerous times, you can accelerate pretty much anything with CUDA code. It just lends itself *best* to certain applications, and it happens that the first to jump on the bandwagon are HPC users (as it massively cuts their costs).

Like I say, if you honestly think a sort is specialised you're mistaken.
 
I think you might have lost the plot a little; as I've posted in this thread numerous times, you can accelerate pretty much anything with CUDA code. It just lends itself *best* to certain applications, and it happens that the first to jump on the bandwagon are HPC users (as it massively cuts their costs).

Like I say, if you honestly think a sort is specialised you're mistaken.

You basically said that before & I replied with:
Understanding & caring are two different things, because on this forum more people spend time playing games than doing productivity work.

You're looking at it from your own perspective & uses that others may not share.

Give the average user here access to the world's fastest supercomputer & the first thing they will do is install Crysis.

More people will care when it has uses for what they like to do on a daily basis & not just in rare cases.

So at that stage you know where I placed importance.
And until it's used in more general software, it will never be important to the average consumer.
You did not show me hundreds of pieces of general software, so there is no compelling reason for me or many others to get an NV card for CUDA, as I will not be using anything that you have shown me thus far.
 
You basically said that before & I replied with:


So at that stage you know where I placed importance.
And until it's used in more general software, it will never be important to the average consumer.
You did not show me hundreds of pieces of general software, so there is no compelling reason for me or many others to get an NV card for CUDA, as I will not be using anything that you have shown me thus far.

So you will not be using any sort routines on your computer. Excellent, good luck with that, Final8y :p
 