Durovis Dive 7

With the Durovis Dive 7 you can turn your Android tablet into a reasonably good VR headset.

One of the supported tablets is the NVIDIA SHIELD, which packs a 192-core Kepler GPU delivering two floating-point operations per core per clock at 950 MHz.

That is ~365 gigaflops strapped to your face, roughly the equivalent of the world’s top supercomputer in 1996.
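
The arithmetic is easy to check, assuming each core retires one fused multiply-add (two floating-point operations) per clock:

```python
# Peak throughput of a 192-core Kepler GPU at 950 MHz,
# assuming one fused multiply-add (2 FLOPs) per core per clock.
cores = 192
flops_per_core_per_clock = 2
clock_hz = 950e6

peak_gflops = cores * flops_per_core_per_clock * clock_hz / 1e9
print(f"{peak_gflops:.1f} GFLOPS")  # ~364.8, i.e. roughly 365 gigaflops
```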

Beam+ Telepresence Robot

I just adopted a lovely Beam+ robot and I hope to give it a loving home so when the machines inevitably rise, maybe I will be spared.

Eye Tracking, Immersion and Endless Screens

The Tobii EyeX eye-tracking controller is one of the coolest gadgets I have tried recently. With it you can orient yourself within a 3D world simply by gazing at the corners of the screen, something we all do naturally as we explore a virtual scene. The resulting experience is similar to using an ‘endless’ screen, and when paired with a high-resolution display it feels surprisingly more immersive than the current crop of low-def HMDs.
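
I will not go into the EyeX SDK here, but the basic idea is easy to sketch. In the toy snippet below (with a hypothetical get_gaze_point() standing in for whatever the tracker actually exposes), the camera pans whenever the gaze point drifts into the outer band of the screen:

```python
# Illustrative sketch of gaze-driven panning (not the EyeX SDK itself).
# When the gaze point drifts into the outer band of the screen, the
# virtual camera pans toward it, which is what creates the feeling of
# an "endless" screen.
SCREEN_W, SCREEN_H = 1920, 1080
DEAD_ZONE = 0.7       # no panning while gaze stays in the central 70%
MAX_PAN_SPEED = 60.0  # degrees per second at the very edge of the screen

def pan_delta(gaze_x, gaze_y, dt):
    """Map a gaze point in pixels to (yaw, pitch) camera deltas in degrees."""
    # Normalize to [-1, 1] with the screen centre at (0, 0)
    nx = gaze_x / SCREEN_W * 2.0 - 1.0
    ny = gaze_y / SCREEN_H * 2.0 - 1.0

    def ramp(n):
        # 0 inside the dead zone, growing linearly to +/-1 at the border
        excess = abs(n) - DEAD_ZONE
        if excess <= 0.0:
            return 0.0
        return (excess / (1.0 - DEAD_ZONE)) * (1.0 if n > 0 else -1.0)

    return ramp(nx) * MAX_PAN_SPEED * dt, ramp(ny) * MAX_PAN_SPEED * dt

# Per frame: yaw, pitch = pan_delta(*get_gaze_point(), dt)  # hypothetical gaze source
```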

The Operating System of the Metaverse

At Lucidscape we are building a new kind of massively-distributed 3D simulation engine to power the vast network of interconnected virtual worlds known as the Metaverse.

Handling massive virtual worlds requires a fundamentally different approach to engine design: one that prioritizes the requirements of scale, thoroughly embraces distributed computing and many-core architectures, and is unencumbered by the legacy decisions that hold current engines back.

We recently conducted our first large-scale test, in which we simulated the interactions of 10 million participants on a cluster of over 800 servers.

You can read more about the test here.
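
The write-up does not go into implementation details, and neither will I here, but as a toy illustration (my own sketch, not our actual architecture): one way to spread a simulation across hundreds of servers is to partition world space into cells and let each server own the entities inside its cells.

```python
# Toy illustration of spatial sharding (not Lucidscape's actual design):
# world space is cut into fixed-size cells, and each cell is assigned
# deterministically to one server, so every machine only simulates the
# entities inside the cells it owns.
import hashlib

NUM_SERVERS = 800     # roughly the size of the cluster used in the test
CELL_SIZE = 100.0     # metres per cell edge, an arbitrary choice here

def cell_of(x, y, z):
    """Quantize a world position into integer cell coordinates."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE), int(z // CELL_SIZE))

def server_for(cell):
    """Deterministically map a cell to one of the simulation servers."""
    key = "{}:{}:{}".format(*cell).encode()
    digest = hashlib.sha1(key).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SERVERS

# An entity at this position is owned by exactly one server; interactions
# that cross cell borders are where the hard distributed-systems work lives.
print(server_for(cell_of(1234.5, -42.0, 980.0)))
```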

Through the Eyes of a Robot

Last week I had the opportunity to attend RoboBusiness 2013 (most appropriately) by robot. I “beamed” into a Suitable Technologies Beam remote presence robot and had a very positive experience interacting with everyone at the conference, human and machine alike.

Suitable Technologies did a great job with the Beam: the user interface is polished and the robot handles very well. One-on-one conversations felt natural despite the loud environment; multi-party conversations were challenging at times, but still workable.

Watching a robot demonstration, through the eyes of another robot

Being pitched by an exhibitor

Robot Whispering

Last week I spent some quality time with the PR2 robot at Willow Garage. I enjoyed programming it to (poorly) fold towels. World domination is certain to follow.

National Geographic Interview: Human 2.0

National Geographic, Human 2.0

“Right now it’s easy to distinguish between a human being and a machine. However, this line will become increasingly blurry in the future. [20 years from now] You will start by getting visual and auditory implants, then you are going to have your midlife crisis, and instead of going out and buying a sports car, you will buy a sports heart to boost your athletic performance.

The transition will happen little by little as you opt-in for more enhancements. Then one day you will wake up and realize that you’re more artificial than natural.

Eventually we will not be able to draw a crisp line between human beings and machines. We will reshape ourselves and by changing our bodies we will change the way we relate to the world.

This is just evolution – artificial evolution.”