This is a core feature of the new 'Panther Point' Z77 boards released along with Ivy Bridge.
I'm struggling to wade through the marketing speak and buzzwords to find out what it actually does for end users, and what difference it will make to me.
For context, I currently have an AMD 880FX chipset and a 1090T, running with a GTX 580. I'm likely going to get a 3770K and a yet-to-be-decided Z77 board.
I know that I'll be gaining a significant amount of processing power: roughly 5,980 PassMarks on the 1090T compared to an estimated 10,400 on the 3770K, assuming a 10% clock-for-clock improvement over the 2700K, and probably more, since it's possible to overclock Ivy/Sandy Bridge without a scary electricity bill.
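As a rough sanity check on those numbers (the 2700K baseline below is back-derived from the 10,400 figure, so it's an assumption, not a published score):

```python
# Rough arithmetic behind the estimate above. The 2700K baseline is
# back-derived from the 10,400 figure, not a published PassMark result.
passmark_1090t = 5980
implied_2700k = 10400 / 1.10               # ~9455 implied baseline
estimated_3770k = implied_2700k * 1.10     # +10% clock for clock

print(round(estimated_3770k))                       # 10400
print(round(estimated_3770k / passmark_1090t, 2))   # ~1.74x the 1090T
```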
What I won't be gaining is anything where I'm playing a game that's GPU bound. Or will I?
This is where Lucid Virtu MVP comes in.
Here's a video of a Gigabyte rep being interviewed by someone from Vortez reviews.
The first thing the video shows is the Street Fighter benchmark run without vsync at some 225 FPS, with tearing; then with vsync, where FPS drops to 60 with no tearing; and finally with Lucid Virtu MVP's "Virtual Vsync" running, which ups the FPS to 150.
To me that's utterly useless. You can't see the frames your monitor doesn't display. Vsync at 60 FPS is perfect because it stops your GPU burning electricity generating frames that will never be seen, and it stops tearing. There's no benefit I can think of in those 150 FPS over the 60 FPS.
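To put numbers on that point (a toy illustration of the waste, not a model of vsync timing): on a fixed 60 Hz panel, at most 60 distinct frames per second ever reach the screen.

```python
# On a 60 Hz panel, at most 60 distinct frames per second reach the screen;
# the rest are rendered and then discarded (or torn into partial frames).
refresh_hz = 60
render_fps = 225

shown = min(render_fps, refresh_hz)
wasted = render_fps - shown
print(f"{shown} frames shown, {wasted} rendered but never fully displayed")
# 60 frames shown, 165 rendered but never fully displayed
```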
The second thing the video shows, crammed in at the very end, is removing redundant rendering tasks. This increases FPS in the benchmark to 460, and that is significant. I think it's absolutely inevitable that it'll reduce quality: there's no way a third party intercepting GPU calls and removing some of them won't result in some decrease in quality. AnandTech, Tom's Hardware et al. will no doubt come up with some kind of analysis of it in a few weeks or months. Still, where you might be getting 30-60 FPS in a game, you could enable this and end up with 50-90 FPS, which could change the whole experience. I would also guess it might be a better experience than lowering the quality settings in game.
What I wanted to find information on, and what isn't covered here, is combining graphics processing power. Intel is upping the ante with Ivy Bridge, claiming some 60% performance increase over Sandy Bridge. That's a fair amount of power to be sitting unused, and I want it to be used.
There are two ways I can think of it being used:
1) Kill off my discrete card unless something needs it. In other words, have the discrete card drop into its lowest power-saving mode and have the Ivy Bridge integrated graphics handle the desktop. This is talked about here, but I'd like to see something more concrete than the equivalent of Lucid telling me these aren't the droids I'm looking for.
2) Combine the GTX 580 and HD 4000 graphics performance in game, some kind of hybrid SLI or Crossfire. The only mention I can find of this is here at Hardware Secrets, which says "Virtu Universal MVP, which allows you to combine the performance of the integrated graphics processor available in the CPU with the performance of any video card installed". That doesn't quite match what Lucid say, although Lucid seem to want to imply it while being careful not to say it.
Edit: A bit more information in that link. It's a preview of a previous version, written before the Z68 boards came out. It talks more about how Virtu works, and my understanding is a little clearer.
It sits between Windows and the graphics card drivers, and it decides who gets what. This lets it feed normal Windows work to the Intel GPU, shipping things out to the discrete card only when they get demanding. In theory the discrete card can then idle down and save power. At the time that article was written there were no power savings, because the cards weren't idling down. I don't expect NVIDIA/AMD have improved their drivers for power efficiency since then; I'd expect they will at some point, though it might take new cards for that to happen. The whole world and their dog is eventually going to have this technology, meaning everyone running a discrete card will want that card to power right down. I'm sure it will get covered in a review at some point after the Ivy Bridge launch, though I don't expect it'll be a top priority at launch.
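The routing described above can be sketched roughly like this (the function name and the "demanding work" heuristic are mine for illustration; Lucid's actual dispatch logic is proprietary):

```python
# Toy sketch of a shim sitting between the OS and two GPU drivers:
# light work goes to the integrated GPU, demanding work wakes the
# discrete card. The workload categories here are illustrative only.
HEAVY = {"3d_render", "gpgpu_compute", "video_encode"}

def route(workload: str) -> str:
    """Pick a GPU for a workload; the heuristic is a stand-in, not Lucid's."""
    return "discrete_gpu" if workload in HEAVY else "integrated_gpu"

print(route("desktop_compose"))  # integrated_gpu -> discrete card can idle
print(route("3d_render"))        # discrete_gpu
```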
I can't find any information about this doing some kind of hybrid SLI/Crossfire, and I think if it were going to, Lucid themselves would mention it. So from what I've managed to glean today, it appears it's not going to help me directly: I don't need higher frame rates under vsync, and I don't think there are any games that'll task my current setup too hard.