A while ago I was working on a fully immersive stereoscopic “remote head”: on one side the user wears an HMD, and on the other side a servo-actuated stereoscopic camera is programmed to match the orientation of the user’s head-tracking device. Sadly, even with very fast servos it wasn’t possible to move the camera quickly enough to accurately match the user’s perspective. The head-tracking lag would be quite disorienting, unacceptably so even before factoring in the network lag.
For the second prototype, I decided to use a monoscopic 360-degree camera instead. The remote head would transmit the whole image sphere to the user’s machine, and I would clip the viewport on the client side using the tracker information – effectively eliminating head-tracking lag by doing it locally. The overall experience should be great even though the video feed from the remote head could be several milliseconds behind real time.
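To make the client-side clipping concrete, here is a minimal sketch of how a head-tracker yaw/pitch could be mapped to a crop rectangle in an equirectangular 360-degree frame. The function name, FOV values, and frame size are all illustrative assumptions, not from any real HMD SDK; a production renderer would resample per-pixel on the sphere, whereas a rectangular crop is only a rough approximation that holds up near the equator.

```python
# Sketch: clip a viewport from an equirectangular 360-degree frame based on
# head-tracker yaw/pitch. All names and numbers are illustrative assumptions.
# A rectangular crop is a crude stand-in for proper spherical resampling.

def clip_viewport(frame_w, frame_h, yaw_deg, pitch_deg,
                  fov_h_deg=90.0, fov_v_deg=60.0):
    """Return (x, y, w, h) of the crop rectangle in the equirect frame.

    The equirect frame maps yaw over 360 degrees to x in [0, frame_w)
    and pitch over 180 degrees to y in [0, frame_h).
    """
    # Pixels per degree along each axis of the equirectangular image.
    px_per_deg_x = frame_w / 360.0
    px_per_deg_y = frame_h / 180.0

    w = int(fov_h_deg * px_per_deg_x)
    h = int(fov_v_deg * px_per_deg_y)

    # Centre of the viewport in image coordinates.
    cx = (yaw_deg % 360.0) * px_per_deg_x
    cy = (90.0 - pitch_deg) * px_per_deg_y   # pitch +90 = top of frame

    x = int(cx - w / 2) % frame_w            # wraps around horizontally
    y = max(0, min(frame_h - h, int(cy - h / 2)))
    return x, y, w, h
```

Because this runs entirely on the client, turning your head only moves the crop rectangle; no round trip to the remote head is needed.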
And here is how all of this intersects with VR: a cloud-VR server could render a 360-degree image sphere around the player and transmit the whole frame to the client, which would then clip the viewport based on the orientation of the user’s head. It could even be done adaptively to save bandwidth – instead of transmitting the whole image sphere, the server could send only a portion of it, sized by how far the user is likely to turn their head in the next N milliseconds, or at a reduced frame rate. Since the raster viewport is clipped by the client, the user would still be able to look around at 60fps. Input-to-display lag would still exist, but developers could overcome at least some of it by designing around this limitation.
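The adaptive idea boils down to a simple margin calculation: transmit the current viewport plus however far the head could plausibly rotate before the next frame arrives. A minimal sketch, assuming a hypothetical peak head velocity of 300 deg/s (an illustrative figure, not a measurement):

```python
# Sketch of the adaptive-bandwidth idea: send only the viewport plus a
# safety margin sized by worst-case head rotation during one round trip.
# The 300 deg/s peak velocity is an illustrative assumption.

def transmit_region_deg(fov_deg, round_trip_ms, peak_head_vel_deg_s=300.0):
    """Angular width (degrees) of the sphere slice worth transmitting."""
    # Worst-case rotation during one server-to-client round trip.
    margin = peak_head_vel_deg_s * (round_trip_ms / 1000.0)
    # The margin applies on both sides of the viewport; cap at the full sphere.
    return min(360.0, fov_deg + 2.0 * margin)
```

For a 90-degree viewport and a 100 ms round trip this gives a 150-degree slice, well under half the bandwidth of the full sphere; as latency grows the margin quickly approaches the whole sphere, which is where the reduced-frame-rate fallback would kick in.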
The potential end-game could be something like a cloud-powered, Oculus Rift style HMD with no console or PC required – in other words: no-hassle VR that is just plug & immerse. The required tech is already available: both NVIDIA and AMD have announced support for cloud GPU rendering, OculusVR is finally shipping the $300 Rift development kits worldwide, and all the required client-side processing could easily be handled by a $100 Android board.
So who is going to be the pioneer who will make cloud VR a reality?