Seems like it did ok?
9700 Pro was quite a big leap in performance at the time but come on, changed the industry??
It came from nowhere with astonishing performance and the first fully programmable pixel and vertex shaders ever seen in consumer-level hardware. I would say it changed the industry.
Voodoo1 for sure. My first 3D card ever. Quake 2 in OpenGL was amazing!
It didn't change the industry just because it was fast, and as for its shaders, they were really just the next evolutionary step from what the GF3 introduced and what we were all expecting from NV30. Incidentally, NV30 actually had FP32 support whereas the 9700 Pro was only FP24, so if anything NV30 changed the industry more.
no
GF3 was integer-based, which was pretty useless and just not flexible enough; it was not until the R300 that we got fully floating-point shaders, which made the big difference. NV30 was a big bag of fail: fast on old stuff, pathetically slow on newer games, and released six months after the ATI effort. Only a die-hard NV fanboy would try to claim otherwise.
Wiki said:
Shaders
With regards to the much-touted Direct3D 9.0 Shader Model 2.0a capabilities of the NV3x series and the related marketing claim of "cinematic effects" capabilities, the actual performance was quite poor.[5] A combination of factors hampered how well NV3x could perform these calculations.
Firstly, the chips were designed for use with a mixed precision programming methodology.[4] An "FP16" mode (16 bits per component, 64 bits across a four-component pixel) would be used for situations where high-precision math was seen as unnecessary to maintain image quality. In other cases, where mathematical accuracy was more important, a 128-bit "FP32" mode would be utilized. The ATI R300-based cards did not benefit from partial precision because they always operated at Shader Model 2's required minimum of 96-bit FP24 for full precision. For a game title to use FP16, the programmer had to specify which effects used the lower precision using "hints" within the code. Because ATI didn't benefit from the lower precision and the R300 performed far better on shaders overall, and because it took more effort to optimize shader code for the lower precision, the NV3x hardware was usually crippled to running full precision full-time.
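Not GPU code, but a quick sketch of the precision gap being argued about here: plain Python/NumPy comparing how much a value drifts at FP16 (10-bit mantissa) versus FP32 (23-bit mantissa). NumPy has no FP24 type, so R300's FP24 full precision would sit between the two. Purely illustrative of why the lower-precision "hints" had to be applied selectively rather than everywhere.

```python
# Illustration only (not shader code): error introduced by storing a value
# at half precision (FP16) versus single precision (FP32).
import numpy as np

# A value of the sort a pixel shader might accumulate during lighting math
x = 0.123456789

f32 = np.float32(x)
f16 = np.float16(x)

print(f"float32: {float(f32):.9f}  (error {abs(float(f32) - x):.2e})")
print(f"float16: {float(f16):.9f}  (error {abs(float(f16) - x):.2e})")

# FP16 keeps only a 10-bit mantissa vs 23 bits for FP32, so repeated math in
# FP16 loses visible precision much sooner -- hence developers were expected
# to mark only the "safe" calculations with partial-precision hints.
```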
The NV3x chips also used a processor architecture that relied heavily on the effectiveness of the video card driver's shader compiler.[4] Proper instruction ordering and instruction composition of shader code could dramatically boost the chip's computational efficiency. Compiler development is a long and difficult task, and this was a major challenge that Nvidia tried to overcome during most of NV3x's lifetime. Nvidia released several guidelines for creating GeForce FX-optimized code and worked with Microsoft to create a special shader model called "Shader Model 2.0a". This model leveraged the design of NV30 in order to extract greater performance and flexibility. Nvidia would also controversially rewrite game shader code and force the game to use its shader code instead of what the developer had written. However, such code would often result in lower final image quality.