Ah ok, I dug into it more and found the full video from the livestream (it's on YouTube). So I'm completely wrong that this is just 'multiresolution shading' renamed. THAT truly was just a flat reduction in resolution at the outside edges of the view. Here's a still image from the same stream:
https://twitter.com/NVIDIAGeForce/status/728770647178891265
Don't forget, NVIDIA multi-projection for VR doesn't just render sections at lower resolution; it also renders both eyes concurrently, so there's an efficiency saving there as well. That's how they can hit a 50% increase in FPS from maybe only a 30% saving in resolution: the render target becomes maybe 130% instead of 170%.
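A quick back-of-envelope sketch of those numbers (all figures are the rough ones from above, not official NVIDIA measurements; I'm assuming the percentages refer to total pixel count of the render target relative to the headset display):

```python
# Illustrative math only: "170%" and "130%" are the ballpark figures
# quoted above, and the 1080x1200 per-eye resolution is assumed
# (original Vive-class panel), not taken from NVIDIA.
display_pixels = 1080 * 1200          # per-eye panel pixels (assumed)

naive_target = 1.70 * display_pixels  # typical pre-distortion render target
lens_matched = 1.30 * display_pixels  # after the projection-corrected savings

saved = 1 - lens_matched / naive_target
print(f"pixel work saved per eye: {saved:.0%}")  # roughly 24%
```

Note that the per-eye pixel saving alone doesn't get you to +50% FPS; the concurrent rendering of both eyes is what makes up the rest.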
This, on the other hand, seems like a clever way to get away with a smaller supersample in general by correcting the projection beforehand, along with using stereo instancing for an extra performance boost.
And it *sounds* like this might actually be a hardware-enabled feature of Pascal, not just a software implementation, but I'm not sure. I know stereo instancing could be done before, but maybe not combined with the projection correction at minimal performance overhead/rendering latency.
So my bad, I've been arguing, thinking we're talking about something fairly different!
Also, I don't think many (or any) games/apps use a 170% supersample; I think 140% is kind of 'the standard'.