Detailed report on LiquidVR
Liquid...get it?
As GDC progresses here in San Francisco, AMD took the wraps off a new SDK that game developers can use to improve experiences with virtual reality (VR) headsets. Called LiquidVR, its goal is to provide a smooth, stutter-free VR experience that is universal across all headset hardware and keeps the wearer, be it a gamer or professional user, immersed.
http://www.pcper.com/files/imagecache/article_max_width/review/2015-03-03/01.jpg
AMD's CTO of Graphics, Raja Koduri, spoke with us about the three primary tenets of the LiquidVR initiative. The 'three Cs', as they are being called, are Comfort, Compatibility and Compelling Content. Ignoring the fact that there are actually four C's in that phrase, the premise is straightforward. Comfortable use of VR means few to no issues with nausea, which can be addressed with ultra-low latency between motion (of your head) and photons (hitting your eyes). For compatibility, AMD would like to ensure that all VR headsets are treated equally and all provide the best experience; Oculus, HTC and others should operate in a simple, plug-and-play style. Finally, the content story is easy to grasp, with a focus on solid games and software that utilize VR, but AMD also wants to ensure that rendering is scalable across different hardware and multiple GPUs.
http://www.pcper.com/files/imagecache/article_max_width/review/2015-03-03/03.jpg
To address these tenets AMD has built four technologies into LiquidVR: late data latching, asynchronous shaders, affinity multi-GPU, and direct-to-display.
http://www.pcper.com/files/imagecache/article_max_width/review/2015-03-03/02.jpg
The idea behind late data latching is to get the absolute most recent raw data from the VR engine to the user's eyes. This means that rather than asking for the head position of a gamer at the beginning of a render job, LiquidVR allows the game to ask for it at the end of the rendering pipeline, which might seem counter-intuitive. With late latching, the user's head movement is tracked until the end of the frame render rather than just until the beginning, potentially saving 5-10 ms of delay.
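The difference is easy to see in a toy simulation. The sketch below is purely conceptual and uses made-up function names, not the actual LiquidVR API; the only thing it demonstrates is how much staler the head pose is at display time when it is sampled before rendering instead of after.

```python
import time

def sample_head_pose(t):
    """Stand-in for a head tracker: the 'pose' here is just its sample time."""
    return t

def early_latch_staleness(render_ms):
    """Traditional path: latch the pose, then render. By the time the frame
    reaches the display, the pose is stale by the full render time."""
    pose = sample_head_pose(time.monotonic())  # latched before rendering
    time.sleep(render_ms / 1000.0)             # simulated GPU render work
    display_time = time.monotonic()
    return (display_time - pose) * 1000.0      # pose age at display, in ms

def late_latch_staleness(render_ms):
    """Late latch: render first, sample the pose at the last possible moment."""
    time.sleep(render_ms / 1000.0)             # simulated GPU render work
    pose = sample_head_pose(time.monotonic())  # latched after rendering
    display_time = time.monotonic()
    return (display_time - pose) * 1000.0

early = early_latch_staleness(8)
late = late_latch_staleness(8)
# Late latching removes roughly the render time (~8 ms in this toy example)
# from the pose's age when the frame is displayed.
```

In a real engine the late-latched pose feeds the warp step rather than the whole render, but the staleness arithmetic is the same.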
http://www.pcper.com/files/imagecache/article_max_width/review/2015-03-03/04.jpg
The next feature, asynchronous shaders, is what allows LiquidVR to handle that late latch properly. By using different ACEs (asynchronous compute engines) on the GCN GPU for different tasks, LiquidVR can execute VR-specific post-processing while other rendering is occurring. This means the time warp function that maps head tracking movement onto the rendered image can be done at the last possible moment. Time warping alters the rendered frame slightly to account for head movement that occurred after the GPU finished drawing the frame. If you have moved your head further to the right after rendering, the warp function shifts the image to the right as well. This is a really complex process, but the fundamental idea is straightforward.
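A drastically simplified version of that warp can be written as a one-dimensional pixel shift. Real time warp is a GPU reprojection of the whole frame, so treat this as an illustration of the idea only; the function name and the shift-per-pixel model are assumptions, not anything from the SDK.

```python
def time_warp(frame, head_delta_px):
    """Toy 'time warp': shift every row of `frame` right by head_delta_px
    pixels (left if negative), padding with 0 (black) where no rendered
    data exists -- the source of the 'edge issues' that appear when the
    warp distance grows."""
    warped = []
    for row in frame:
        if head_delta_px >= 0:
            warped.append([0] * head_delta_px + row[:len(row) - head_delta_px])
        else:
            warped.append(row[-head_delta_px:] + [0] * (-head_delta_px))
    return warped

# A 1x4 'frame'; a 1-pixel rightward head delta shifts the image right
# and leaves one blank column at the edge.
time_warp([[1, 2, 3, 4]], 1)   # -> [[0, 1, 2, 3]]
```

Because the shift is cheap compared to rendering the frame, it can run on a spare ACE at the last moment, which is exactly why it pairs with late latching.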
http://www.pcper.com/files/imagecache/article_max_width/review/2015-03-03/05.jpg
Affinity multi-GPU brings us back to the past: a return of SFR, split frame rendering. AMD realizes, as most of us have, that mapping a GPU to each eye makes the most sense and is surprisingly easy to integrate. The benefit again is lower latency, avoiding the inherent delay of a multi-GPU alternate frame rendering (AFR) system. Developers will also benefit from lower CPU overhead thanks to the removal of operations duplicated between the two eyes. This is not limited to just two GPUs, though - AMD said that 3, 4, or even 5 GPUs could all be supported if the developer builds in support.
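The latency argument reduces to simple arithmetic. The sketch below is a back-of-the-envelope model (my own framing, not AMD's numbers): one GPU must render the two eyes back to back, while two GPUs with per-eye affinity render them concurrently, so a stereo frame costs one eye-render interval instead of two.

```python
def stereo_frame_time_single_gpu(eye_render_ms):
    """One GPU renders the left eye, then the right eye, serially."""
    return 2 * eye_render_ms

def stereo_frame_time_affinity(eye_render_ms, num_gpus=2):
    """With eye-to-GPU affinity, each eye renders on its own GPU at the
    same time, so the stereo frame takes one eye-render interval. With a
    single GPU we fall back to serial rendering."""
    return eye_render_ms if num_gpus >= 2 else 2 * eye_render_ms

# At 8 ms per eye: 16 ms serially vs 8 ms with two GPUs -- and, unlike
# AFR, no extra frame of queuing delay is introduced.
stereo_frame_time_single_gpu(8)      # 16 ms
stereo_frame_time_affinity(8)        # 8 ms
```

The model ignores the per-frame work shared between eyes, which is exactly the duplication the article says affinity multi-GPU lets developers remove on the CPU side.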
Finally we have direct-to-display, a feature mostly meant to promote compatibility between VR headsets. LiquidVR brings native HMD support with direct front-buffer rendering and gives the application direct control of the headset, even in operating systems and environments that didn't plan for it. This might be less useful for Windows gaming environments, where native VR support is expected to arrive, but for professional applications it should ensure a better user experience.
http://www.pcper.com/files/imagecache/article_max_width/review/2015-03-03/07.jpg
AMD then provided several examples following the rendering process to a VR headset, showing the progression of performance and latency with original implementations, the changes made since Oculus launched, and what AMD expects to improve with LiquidVR.
http://www.pcper.com/files/imagecache/article_max_width/review/2015-03-03/08.jpg
There is a lot of data in this one slide, but focusing on a couple of the most important points will help us understand specifically what LiquidVR does. The green circle represents the warping function: a rendered frame is slightly modified to match the additional movement of the headset since the graphics engine started its process. The top example shows the result when a frame renders correctly inside the Vsync window, has time to "late-latch" the data from the CPU/head tracking, and also has time to properly time warp the frame before output to the frame buffer. Everything is great and works as expected.
The bottom example in that slide shows what happens when a frame doesn't render in time to meet the Vsync - in fact, in this case it doesn't render fast enough to meet even the Vsync + time warp requirement. Here, the previous frame (which was already warped and output to the user) is re-warped with the additional head tracking data and sent to the user again. Obviously, the more often this occurs, the more likely you are to run into artifacting and edge issues in the output image, though that depends on the speed and distance of the motion.
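The decision the slide illustrates can be sketched as a small scheduling function. Everything here is assumed for illustration: the function name, the 11.1 ms budget (a 90 Hz refresh interval, not a figure from the slide), and the idea that warp cost is known up front.

```python
def choose_frame(render_ms, vsync_budget_ms, warp_ms, prev_frame, new_frame):
    """Pick which frame gets warped and scanned out this refresh.
    Top case in the slide: the new frame plus its warp fits the Vsync
    budget, so warp and show it. Bottom case: the new frame missed the
    deadline, so re-warp the already-shown previous frame with the
    latest head-tracking data instead."""
    if render_ms + warp_ms <= vsync_budget_ms:
        return ("warp-new", new_frame)
    return ("rewarp-previous", prev_frame)

choose_frame(9, 11.1, 1, "frame N-1", "frame N")   # fits: show the new frame
choose_frame(12, 11.1, 1, "frame N-1", "frame N")  # misses: re-warp previous
```

Re-warping keeps head tracking responsive even on a missed frame, at the cost of the edge artifacts described above when the head has moved far from the previous frame's viewpoint.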