Whether it is Apple releasing the new iPad, Rovio releasing the next iteration of its hugely successful Angry Birds line of casual games, or Tesla Motors releasing its new Model S, companies look to move the bar when they show off new innovations. MicroVision is doing just that.
We announced our new PicoP® Gen2 HD laser display engine based on direct green lasers (PicoP Gen2) at CES 2012. It’s a 720p HD display engine that enables a suite of advanced features, and one of the most intriguing is Touch Interaction. This video that we’ve put together demonstrates, in a simple way, the experience of touching a projected image.
What appears as a simple human interaction with pictures on the wall from a small pico projector is actually a complex technical process. Because our PicoP display engine uses a raster scan method to create a picture pixel by pixel, line by line, it knows at any given time exactly where each pixel is being painted by the engine. This allows us to couple a photo detector with the engine, which can then track a reflective surface by tracking the XY coordinate of the light that is reflected back. In the video, that reflective surface is provided by the small white ring on the end of the finger, but it could also be a metal pen in the hand or a piece of jewelry on the finger; no special object is required.
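The raster-scan tracking described above can be sketched in a few lines: because the beam paints pixels at a fixed rate, the time at which the photo detector sees a reflection identifies which pixel the beam was painting. The timing constants below are purely illustrative, not actual PicoP Gen2 values.

```python
# Hypothetical sketch: recover the (x, y) pixel of a reflector from the
# time at which a photo detector sees the reflected laser light.
# All timing constants here are illustrative, NOT real PicoP Gen2 specs.

FRAME_WIDTH = 1280   # 720p HD: pixels per scan line
FRAME_HEIGHT = 720   # scan lines per frame
PIXEL_TIME_NS = 18   # assumed time to paint one pixel, in nanoseconds
LINE_TIME_NS = FRAME_WIDTH * PIXEL_TIME_NS  # time to paint one full line

def detection_to_xy(detect_time_ns):
    """Map a detection timestamp (ns since frame start) to pixel coords.

    Because the raster scan paints pixels at a known, fixed rate, the
    elapsed time since the start of the frame identifies exactly which
    pixel the beam was painting when the reflection came back.
    """
    line = detect_time_ns // LINE_TIME_NS
    pixel = (detect_time_ns % LINE_TIME_NS) // PIXEL_TIME_NS
    return (int(pixel), int(line))

# A reflection seen 5 full lines plus 100 pixels into the frame maps
# directly back to pixel (100, 5) -- no image processing needed.
print(detection_to_xy(5 * LINE_TIME_NS + 100 * PIXEL_TIME_NS))  # (100, 5)
```

The key point the sketch illustrates is that locating the reflector costs a couple of integer divisions per detection, rather than any per-frame image analysis.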
There are APIs that extract the XY coordinates of the reflector in relation to the image. These coordinates can then be mapped to gesture controls, mouse controls, and so on. By mapping them, you can use the native controls the user is already accustomed to, whether gesture based or mouse based. This ensures a user experience with a low learning curve.

One of the key benefits of using PicoP technology for Touch is the low computational load required to track the reflector. Other technologies need to map the entire image plane, overlay the image, and then perform extensive computations to track every movement. For example, the Microsoft Kinect “sees” the player and builds a skeleton for the player, then overlays an image on that skeleton to represent the player. Through a series of complex algorithms, the Xbox works out the logic so that your left hand isn’t where your right hand should be, or your head isn’t in the middle of your chest. This process takes a lot of computational horsepower, and if not managed well it can lead to a disappointing user experience. Kinect has mastered this application for console-based gaming.

For a mobile experience, MicroVision believes that our low-computational-load method of enabling touch interaction offers a solution with tremendous potential, especially because our Touch projection isn’t disrupted by objects in the foreground or an irregular projection surface. This goes back to those APIs that can provide exact XY coordinates of the reflector within the image. Our patent-pending method of enabling touch with the PicoP Gen2 display engine can definitely move the bar for OEMs looking to tap into the hot mobile gaming market.
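To make the coordinate-mapping idea concrete, here is a minimal sketch of how reflector XY coordinates could be translated into the normalized pointer coordinates and simple gestures that native input stacks expect. The function names, frame dimensions, and the tap heuristic are all assumptions for illustration; they are not MicroVision's actual API.

```python
# Hypothetical sketch: map reflector pixel coordinates (as an XY-tracking
# API might report them) into normalized mouse coordinates, and detect a
# simple "tap" gesture. Names and thresholds are illustrative only.

FRAME_WIDTH, FRAME_HEIGHT = 1280, 720  # assumed 720p projected image

def to_normalized(x, y):
    """Scale pixel coordinates to the 0.0-1.0 range input stacks use."""
    return (x / (FRAME_WIDTH - 1), y / (FRAME_HEIGHT - 1))

def detect_tap(samples, max_jitter_px=10):
    """Treat a short burst of nearly stationary samples as a tap.

    `samples` is a list of (x, y) reflector positions from consecutive
    frames; the reflector disappearing from view ends the burst.
    Returns the normalized tap position, or None if it wasn't a tap.
    """
    if len(samples) < 3:
        return None
    xs = [p[0] for p in samples]
    ys = [p[1] for p in samples]
    if max(xs) - min(xs) <= max_jitter_px and max(ys) - min(ys) <= max_jitter_px:
        # Report the tap at the mean position, normalized.
        cx = round(sum(xs) / len(xs))
        cy = round(sum(ys) / len(ys))
        return to_normalized(cx, cy)
    return None  # moved too much: likely a drag, not a tap

# A finger holding roughly still near the center of the image:
tap = detect_tap([(640, 360), (642, 361), (641, 359)])
print(tap)  # roughly (0.5, 0.5)
```

Because the tracking API hands over exact coordinates directly, this per-event work is trivial compared with skeleton-fitting approaches, which is the computational-load contrast the paragraph above draws.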