
Intel graphics drivers employ questionable 3DMark Vantage optimizations

We recently learned AMD has notified Futuremark that Intel's 15.15.4.1872 Graphics Media Accelerator drivers for Windows 7 incorporate performance optimizations that specifically target the benchmark, so we decided to investigate.

We tested 3DMark Vantage 1.0.1 with these drivers on a G41 Express-based Gigabyte GA-G41M-ES2H motherboard running the Windows 7 x64 release-to-manufacturing build, with a Core 2 Duo E6300, 4GB of DDR2-800 memory, and a Raptor WD1500ADFD hard drive.

We first ran the benchmark normally. Then, we renamed the 3DMark executable from "3DMarkVantage.exe" to "3DMarkVintage.exe". And—wouldn't you know it?—there was a substantial performance difference between the two.
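The rename-and-rerun test above can be sketched as a simple score comparison. This is a hypothetical illustration of the methodology only: the function name, score values, and the 10% threshold are assumptions, not figures from the article.

```python
# Sketch of the rename-and-rerun test: run the benchmark under its real
# name, then under a neutral name (e.g. "3DMarkVintage.exe"), and flag a
# suspicious gap between the two sets of scores.

def detection_suspected(original_scores, renamed_scores, threshold=0.10):
    """Return True if the mean score drops by more than `threshold`
    when the executable is renamed, suggesting the driver is
    detecting the benchmark by name."""
    orig = sum(original_scores) / len(original_scores)
    renamed = sum(renamed_scores) / len(renamed_scores)
    return (orig - renamed) / orig > threshold

# Hypothetical example: scores fall sharply once the driver no longer
# recognises the executable name.
print(detection_suspected([2100, 2090, 2110], [1500, 1510, 1495]))  # prints True
```

Any large, repeatable gap here points to application detection rather than a generic optimization, since renaming the executable changes nothing about the workload itself.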

Intel appears to be offloading some of the work associated with the GPU tests onto the CPU in order to improve 3DMark scores. When asked for comment, Intel replied with the following:

"We have engineered intelligence into our 4 series graphics driver such that when a workload saturates graphics engine with pixel and vertex processing, the CPU can assist with DX10 geometry processing to enhance overall performance. 3DMarkVantage is one of those workloads, as are Call of Juarez, Crysis, Lost Planet: Extreme Conditions, and Company of Heroes. We have used similar techniques with DX9 in previous products and drivers. The benefit to users is optimized performance based on best use of the hardware available in the system. Our driver is currently in the certification process with Futuremark and we fully expect it will pass their certification as did our previous DX9 drivers."

This CPU-assisted vertex processing doesn't appear to affect Vantage's image quality. However, Intel is definitely detecting 3DMark Vantage and changing the behavior of its drivers in order to improve performance, which would appear to be a direct contravention of Futuremark's guidelines.
Read more & benches
 
Every GPU manufacturer does this. Just look at the individual game optimisations found in NV and ATI drivers. If this does not affect visual quality, this is nothing more than a total non-story.
 
If they were going to fix the scores they might as well go the whole hog and implement GPGPU elements to improve the CPU tests.

Anyway, we all know 3DMark scores mean jack, so they can fix it all they like :p
 
Well, they're doing it in games, too, so it's not like they're just doing it to boost 3DMark scores. If anything they're just trying to make 3DMark reflect the best you can expect from the system - just like every other GPU manufacturer does.

However, it'd be better if Intel could implement the 'CPU acceleration' as you might call it into a more general solution, but that's probably quite difficult.
 
Every GPU manufacturer does this. Just look at the individual game optimisations found in NV and ATI drivers. If this does not affect visual quality, this is nothing more than a total non-story.

The optimisations are not the issue; it's the fact that the test is not running on the hardware specified by Futuremark's rules.
 
But who really gives a crap until Intel makes an integrated GPU solution that can actually run something?
 
Yeah, I could understand if it was a big powerhouse chip by Nvidia or ATi, but really Intel's integrated solutions need all the help they can get. :p
 
ATI are renowned for this; they normally get caught out about 10 minutes after they moan about someone else.
 
Yeah, I think AMD are the pot calling the kettle black here; pretty much every GPU vendor has its specific optimisations for 3DMark (even S3). The difference is that Intel's GPUs are the only ones that can actually reap a performance benefit from offloading some of the graphics computations to the CPU. With even entry-level ATi or Nvidia cards, the PCI-E bottleneck would make it not worth the effort.
 
There is nothing wrong with optimising, but that's not what's really being done.
It's like me saying I'm optimising my laptop's battery performance test by having the laptop plugged into the mains.
 