Wednesday, December 18, 2013

On META's AR headset

Several days ago I posted about AR, specifically the gamut of new companies entering the head-mounted AR display market. META was one of those companies I mentioned, and they started with a pretty cool idea with some serious limitations. Their META.01 version is essentially just a display with some limited distance-sensing capability. The other day, META announced a new version of their headset, called the META Pro.

Highway to the AR zone
As you can see in the image, this new design actually looks fairly cool, if you're into aviators. I hope they come up with some new frame designs for the rest of us, who look much less like hotshot pilots and more like backwater law enforcement when we sport this kind of eyewear.

New aesthetics aside, the Pro version boasts some pretty impressive stats. Twin 1280x720 displays with a binocular 40° field of view give you 15x the screen real estate that Google's Project Glass offers. The sensor suite includes a 9-degree-of-freedom unit with an accelerometer, a magnetometer, and a gyroscope, allowing the glasses to know exactly which way they're facing. Twin color cameras and two distance sensors let the glasses recognize your hands and your surroundings, giving you the ability to use gestures as an input device. The whole thing is powered by a smartphone-sized pocket computer sporting a quad-core Intel processor, 4 GB of RAM, and a 32 Whr battery, enough to handle some pretty sizeable computing.
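For the curious: a 9-DOF suite like this typically fuses its sensors to track orientation, because each sensor alone has a weakness. As a rough illustration (this is my own sketch, not META's actual algorithm, and the function names and blend factor are made up), here's a minimal complementary filter in Python that combines the gyroscope's smooth-but-drifting rate with the accelerometer's noisy-but-drift-free gravity angle:

```python
import math

def accel_to_pitch(ax, ay, az):
    """Estimate pitch (radians) from the gravity vector the
    accelerometer measures while the head is roughly still."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Blend the integrated gyro rate (smooth, but drifts over time)
    with the accelerometer's angle (noisy, but drift-free).
    alpha close to 1 trusts the gyro short-term."""
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Example: even if the estimate starts off wrong, the accelerometer
# term slowly pulls it back toward the true (level) angle.
pitch = 0.5  # bad initial guess, radians
for _ in range(500):
    measured = accel_to_pitch(0.0, 0.0, 1.0)  # device held level
    pitch = complementary_filter(pitch, 0.0, measured, dt=0.01)
```

The magnetometer plays the same corrective role for yaw, where gravity can't help; real headsets use fancier fusion, but the blend-two-sources idea is the same.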

Take a look at the video, which was shot through the glasses themselves.



One of the things that becomes immediately apparent is that the gesture recognition has improved a ton since the original developer edition video. They keep referencing the Iron Man movies, where Tony Stark has a holographic interface, and I think the comparison is warranted. Yeah, it requires that you wear their glasses, but it's still fairly incredible. The ability to interact with your phone and laptop in a completely holographic environment is neat, but, as I mentioned in my previous AR post, this tech will let us do things in a completely new way. Why do we need the visual outline of a laptop or an iPhone when we could just have icons floating around the periphery of the display? Move your hand over to one side to tap a menu button that opens a list of available apps. You don't need a holographic computer, just the info on a floating window. I suppose some people will prefer less change, but I'm all for exploring how much we can make this tech change our world.

The META software is designed to be used with Unity 3D, which, as I've mentioned before, is basically a video game creation program. I've played around with Unity quite a bit and I really like the interface. You can basically create 3D worlds without ever writing a line of code, though if you want stuff to move around and interact, you'll need to be a proficient programmer. But this means it's not terribly difficult to start making applications for META, and there are so many possibilities. The video depicted the user shaping the cone of a rocket engine and adding it to an assembly. There are already hundreds of app ideas on the website, and since META has decided to allow 3rd party companies to develop software for their device, we can expect hundreds more as time goes on. Some of the ideas are pretty cool. I've always loved the concept of using AR for LARPing, so you can have computer-controlled enemies, give the mages some actual magic, even produce a scene that can't be easily created IRL.

One of the ideas I had (and posted on the website) was to incorporate the glasses with a flight simulator. Many people build their own simulator cockpits with varying degrees of complexity, which is pretty neat, but there are still only two options, each with limitations.

First, you can use monitors in place of windscreens. This lets you see the interior of your cockpit, use cool digital indicators, and the like, but the image won't be 3D, and even with decent head tracking it won't mimic the feel of actually being in an airplane, since looking around won't change your view the way it would in a real cockpit.

Second, you can use a VR headset like the Oculus Rift. This gives you a 3D image with good head tracking, but since you can't see anything besides what's on the screen, why even bother building a sim cockpit except to have the feel and the location of the controls? Plus you can't see your hands unless you wear motion tracking gloves and have the program show you an approximation, so you'll spend time fumbling for switches.

My concept would let you build a fairly simple sim cockpit with just the controls and the actual interior, but with a reference pattern printed on any surface meant to be a window or windscreen. The AR glasses would then display the rendered game image anywhere they see that reference pattern. That way you can still see your hands, the interior of your cockpit, even your instruments, but you also get a 3D view with proper head tracking. The software could render the exterior view in a sphere around the player's location, so you'd be able to look all around, even stretch or lean to the sides, and see just like if you were in a real vehicle.
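To make the sphere idea concrete, here's a tiny Python sketch of how a renderer might map the pilot's view direction onto an equirectangular panorama wrapped around the cockpit. The function name and coordinate conventions are hypothetical, just an illustration of the math, not any real META or simulator API:

```python
import math

def direction_to_panorama_uv(yaw, pitch):
    """Map a view direction (radians) to normalized (u, v) texture
    coordinates in an equirectangular panorama surrounding the player.
    yaw: -pi..pi around the vertical axis (0 = straight ahead).
    pitch: -pi/2..pi/2 (positive = looking up)."""
    u = (yaw + math.pi) / (2 * math.pi)   # wraps horizontally around the sphere
    v = (math.pi / 2 - pitch) / math.pi   # 0 at the zenith, 1 at the nadir
    return u, v

# Looking straight ahead lands in the middle of the panorama:
u, v = direction_to_panorama_uv(0.0, 0.0)  # (0.5, 0.5)
```

Combine that lookup with the headset's tracked orientation, and any surface carrying the reference pattern becomes a window onto the right slice of the sphere.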

Obviously the list of possibilities is pretty long. This kind of tech opens up many new doors, even doors we didn't know existed, and that's absolutely fantastic.

