Nvidia 182.06 WHQL driver released

To get HD decoding in hardware with these drivers you need CoreAVC 1.9, which has CUDA support. Anything else won't work - just an FYI!
 
Out of interest, how does CUDA decoding differ from the existing video decode engines in the GeForce GTX and ATI 4xx0 cards?

(genuine question)
 
Well, the current method doesn't use pure hardware decoding unless the software allows for it, which is why watching 1080p content sees ~10% CPU usage.

The only noticeable difference is CPU time; visual quality is no different to standard decoding through the video driver.

I suppose at the end of the day, if you have a decent CPU (a quad-core etc.) then it's not going to matter to you one bit, but for HTPC or laptop users this will be a good thing to have.
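
If you want to check the CPU usage claim on your own machine, here's a rough Python sketch (my addition, not from anyone in this thread). It assumes you have the psutil package installed and that your player process is called mpc-hc.exe - change the name to match whatever player you use:

    # Rough sketch: sample a media player's CPU usage during playback.
    # Assumes the psutil package is installed; "mpc-hc.exe" is an
    # assumed process name - adjust it for your player.
    import psutil

    PLAYER = "mpc-hc.exe"  # hypothetical process name

    def sample_cpu(samples=10):
        # Find the player process by name.
        procs = [p for p in psutil.process_iter(["name"])
                 if (p.info["name"] or "").lower() == PLAYER]
        if not procs:
            print("Player not running")
            return
        proc = procs[0]
        for _ in range(samples):
            # cpu_percent(interval=1.0) blocks for one second and returns
            # CPU usage over that interval (100% = one core fully busy).
            print(f"{proc.cpu_percent(interval=1.0):.1f}% CPU")

    if __name__ == "__main__":
        sample_cpu()

Play a 1080p clip, run the script, and you should see roughly the ~10% figure with software decoding and much less with hardware decoding.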
 
To get HD decoding in hardware with these drivers you need CoreAVC 1.9, which has CUDA support. Anything else won't work - just an FYI!


mrk, do you know of an HD codec for the 8800 GTX cards, as CoreAVC doesn't support them?

thanks
 
Hm, not really. ffdshow or DivX 7 will work perfectly fine, however - they don't use CUDA, so no pure hardware decoding, but if you have a half-decent CPU it won't be a problem.
 
Nvidia GeForce Driver 182.06: Performance Express-Test

http://www.xbitlabs.com/articles/video/display/geforce-driver-182-06.html

One way or another, it makes perfect sense to upgrade to the new GeForce 182.06 driver, especially for those who play any of the games we checked out today. However, you may also get additional speed in games not mentioned by Nvidia. It looks like the company focuses not on optimizing the drivers for selected games, but on improving the performance of their solutions in general, which we can clearly see from the results obtained with the GeForce 182.06 driver in Enemy Territory: Quake Wars and Far Cry 2.


Unlike ATI Catalyst 9.1, which mainly benefits owners of the dual-chip Radeon HD X2 cards, the new Nvidia GeForce driver version will work well not only for those who have the latest-generation GeForce GTX 295 and discrete SLI systems, but also for users of common single-chip cards such as the GTX 285/280/260, and maybe even the GeForce 9800/9600.
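
For what it's worth, the percentage gains quoted in reviews like this are just (new - old) / old. A trivial sketch, with made-up placeholder numbers rather than xbitlabs' actual results:

    # Toy calculation of driver-to-driver FPS gains. The numbers below
    # are placeholders, NOT xbitlabs' measured results.
    results = {
        # game: (fps on old driver, fps on 182.06)
        "Left 4 Dead": (80.0, 100.0),
        "Far Cry 2": (50.0, 54.0),
    }

    for game, (old, new) in results.items():
        gain = (new - old) / old * 100
        print(f"{game}: {old:.0f} -> {new:.0f} fps ({gain:+.1f}%)")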
 
25% in L4D alone - that's impressive, as I game at 19x12 with 4xFSAA, so I'll defo try these out tonight :p

But first, GTA4!
 
CUDA isn't a requirement for accelerated HD decoding; it's just the method CoreAVC uses.

For example, MPC-HC will use accelerated HD decoding on suitable hardware with whatever driver version you're running.
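
If you want a quick way to sanity-check whether hardware decode actually kicks in on your setup, a rough sketch is to decode a clip with and without DXVA2 acceleration and compare how long it takes. This uses ffmpeg as a test harness (my suggestion, not something anyone in the thread mentioned; it assumes an ffmpeg build with dxva2 support on your PATH, and "sample_1080p.mkv" is a placeholder filename):

    # Rough sketch: decode a clip with and without DXVA2 hardware
    # acceleration and compare wall-clock time. Assumes ffmpeg with
    # dxva2 support is on the PATH; the filename is a placeholder.
    import subprocess
    import time

    CLIP = "sample_1080p.mkv"  # placeholder; point at a real file

    def decode(hwaccel=None):
        cmd = ["ffmpeg", "-v", "error"]
        if hwaccel:
            cmd += ["-hwaccel", hwaccel]
        # Decode only; discard the output frames.
        cmd += ["-i", CLIP, "-f", "null", "-"]
        start = time.time()
        subprocess.run(cmd, check=True)
        return time.time() - start

    if __name__ == "__main__":
        print(f"software decode: {decode():.1f}s")
        print(f"dxva2 decode:    {decode('dxva2'):.1f}s")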
 
Content, not codec - I'm asking which H.264 codec you're using: ffdshow, DivX HD, CoreAVC, or something else?
 
You have it set up incorrectly then, or MPC-HC doesn't support your graphics card. I can play 1080p with virtually no CPU usage in MPC-HC.

What format is this 1080p video, and what codec is it using? I've never seen anything play 1080p with almost no CPU usage. No graphics card will completely decode video; they just get CPU usage down. At least a little CPU usage will always be used.
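
A quick way to answer "what codec is this file using" is ffprobe, which ships with ffmpeg (again my addition, not something mentioned in the thread; "movie_1080p.mkv" is a placeholder filename):

    # Rough sketch: print the video codec of a file using ffprobe.
    # Assumes ffprobe is on the PATH; the filename is a placeholder.
    import subprocess

    def video_codec(path):
        out = subprocess.run(
            ["ffprobe", "-v", "error",
             "-select_streams", "v:0",              # first video stream
             "-show_entries", "stream=codec_name",  # just the codec name
             "-of", "default=noprint_wrappers=1:nokey=1",
             path],
            capture_output=True, text=True, check=True,
        )
        return out.stdout.strip()

    if __name__ == "__main__":
        print(video_codec("movie_1080p.mkv"))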
 
I'm on Vista x64, and here's how it is with MPC-HC out of the box (no codecs being used other than MPC's built-in ones) vs MPC-HC configured with my own settings.

Out of the box:
[screenshot: mpc_default.jpg]


My settings with external codecs:
[screenshot: mpc_dcodecenabled.jpg]


As you can see from my findings, the DivX HD decoder uses less CPU than the one built into MPC-HC when DXVA is not enabled or supported, but...


...curiously, I checked the MPC-HC output screen to see why DXVA was not enabled/working, and found that I had to enable EVR mode in the DirectShow output options to get DXVA working - the MPC video decoder screen (as in your screenshot) showed DXVA as not supported until I did this. The out-of-the-box EVR output option wasn't enabled either; it was set to "Default", which uses overlays instead of EVR.
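
As an aside, EVR is only available on Vista and newer (the runtime lives in evr.dll), so if you're unsure whether your machine has it at all, a trivial check (my addition, not from the thread) is:

    # Trivial check that the Enhanced Video Renderer runtime (evr.dll,
    # present on Windows Vista and newer) exists on this machine.
    import os

    def has_evr():
        system32 = os.path.join(
            os.environ.get("SystemRoot", r"C:\Windows"), "System32")
        return os.path.exists(os.path.join(system32, "evr.dll"))

    if __name__ == "__main__":
        print("EVR runtime found" if has_evr() else "EVR runtime not found")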

Here's the updated findings:

MPC-HC built-in codec:
[screenshot: mpchc_dxva.jpg]


DivX HD (external codec; the same applies to ffdshow or anything else):
[screenshot: mpchc_dxvano.jpg]


0-2% CPU usage :D
 
Scratch that - whilst the CPU usage is now below 2% on 1080p content, I notice that moving the window around is juddery compared to the Default output method, and there are too many framedrops using DXVA (EVR), especially when seeking around in the movie, whereas there are zero framedrops doing the same thing with an external decoder or with DXVA disabled. So it isn't passing cleanly through the GPU.

I shall sacrifice some CPU % and not bother with pure hardware accel in order to keep my zero-framedrop performance, since I don't notice the CPU usage anyway - it's only 2 cores out of 4... :p
 