
More bad news for nvidia if true.

I must admit that after reading the SemiAccurate site, I'm left worried and confused about the GPU market. Having placed a pre-order for a GTX 470, I'm now debating whether I made the wrong call and should have stuck with ATI and got a 5850, as it is tried and tested hardware.

Or do I keep my pre-order and get a waterblock for it?

Honestly? Unless you have a reason to prefer NV cards, be it drivers, software, Linux support, CUDA, whatever, I would go for a 5850 every time over a 470 and just overclock it. If bang for buck is your bag, that is the clear winner.

I'm getting a watercooled 480 though; for me the performance is there and I don't mind paying the extra, as I have a preference for NV software/drivers/support.
 
Well, maybe it's missing Metro 2033.
But look at these awesome screenies. Each circled bit (link) takes you back and forth between these pages. Feel free to study them.





Most games use PhysX with the work done on the CPU, hence not needing an Nvidia GPU. I don't understand how you don't understand.
There are only a few titles that actually require hardware-accelerated physics.


But you can barely tell the difference.

I run with 4x AA max; I don't need any more.


Honestly, for crying out loud, PhysX and its SDK does not all need to be done at GPU level. For any PhysX feature to be implemented in-game, whether GPU (hardware) or CPU (software), you have to have an nVidia graphics card with CUDA. It can be implemented at the software level, but you won't see anything at any level with ATI. CPU implementation still requires the PhysX SDK, which still requires an nVidia card to understand what's written at the software level. I don't know how much clearer I can be with you.
 
It's a closed standard. Being a developer myself, I am wary of it and think in the long term it could do more harm than good in regards to hardware/software physics. I just fear we have another IE6 on our hands...
 
Honestly, for crying out loud, PhysX and its SDK does not all need to be done at GPU level. For any PhysX feature to be implemented in-game, whether GPU (hardware) or CPU (software), you have to have an nVidia graphics card with CUDA. It can be implemented at the software level, but you won't see anything at any level with ATI. CPU implementation still requires the PhysX SDK, which still requires an nVidia card to understand what's written at the software level. I don't know how much clearer I can be with you.

This is wrong.

PhysX in software doesn't need an nVidia card to run.

This is correct.
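The distinction being argued here can be sketched in a few lines. This is a purely hypothetical illustration of the dispatch idea, not the real PhysX SDK API: the runtime checks for a CUDA-capable GPU and, when none is present, runs the solver in software on the CPU — which is why software PhysX works on systems with ATI cards.

```python
# Hypothetical sketch of PhysX-style CPU/GPU dispatch (not the actual SDK).

def detect_cuda_gpu():
    # Stand-in for the SDK's device query; assume an ATI system here,
    # i.e. no CUDA-capable NVIDIA GPU is found.
    return False

def step_cpu(positions, velocities, dt, gravity=-9.81):
    # Software solver: plain Euler integration on the CPU.
    # Needs no GPU at all, let alone an NVIDIA one.
    velocities = [v + gravity * dt for v in velocities]
    positions = [p + v * dt for p, v in zip(positions, velocities)]
    return positions, velocities

def simulate_step(positions, velocities, dt):
    # Hardware path only when a CUDA GPU exists; otherwise fall back to CPU.
    if detect_cuda_gpu():
        raise NotImplementedError("hardware path: would dispatch to CUDA")
    return step_cpu(positions, velocities, dt)
```

Hardware-accelerated effects (the "advanced PhysX" toggles in a few titles) take the CUDA path; everything else runs through the software path regardless of GPU vendor.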
 
Honestly, for crying out loud, PhysX and its SDK does not all need to be done at GPU level. For any PhysX feature to be implemented in-game, whether GPU (hardware) or CPU (software), you have to have an nVidia graphics card with CUDA. It can be implemented at the software level, but you won't see anything at any level with ATI. CPU implementation still requires the PhysX SDK, which still requires an nVidia card to understand what's written at the software level. I don't know how much clearer I can be with you.

How come I can turn on advanced PhysX Features in Mirror's Edge despite not having an nVidia GPU?
 
FizzX = False. Just leave him to get on with it.

What I don't understand is why Nvidia didn't just paper-launch a whole range of Fermi-based cards at the same time. It would at least let the retail end of Nvidia's business make some money.
 
I don't think Nvidia really know what they have in the Fermi range. Just making it up as they go along!
 
I don't think Nvidia really know what they have in the Fermi range. Just making it up as they go along!
Looks like it. Maybe they have a whole pile of GPUs that cannot be made into GTX 470/480 cards due to broken stream processors, cache, or another part of the die, but which work fine with half the GPU disabled?
 