Kinect

“InAir TV”. SeeSpace Blurs the Lines of Your TV Experience.

A few years ago, as CES wrapped up, I posted about the beginning of the convergence between TV, your computer, and other content mediums. While the progress has been slow, it is definitely underway. Nowadays almost every TV, Blu-ray player, and DVD player is net-connected, with smart apps that allow for additional content delivery. Other hardware devices like the Roku box, Logitech’s Revue, and Google’s Fiber initiative continue to blur the lines between a passive TV experience and a deeper, richer interactive one.

As CES winds down, one of the more exciting things to come from it this year is SeeSpace’s InAir TV, which brings an augmented reality experience to your TV set. InAir TV places Web content inline with the consumer’s TV viewing experience, without having to switch to a second screen. This creates a completely new dynamic medium, similar to the UI/UX in the movie “Minority Report”.

The InAir TV, which is still in the Kickstarter phase, uses augmented reality technology to overlay additional content derived from what you are watching. If you have a 3D TV, the secondary content floats on a 3D layer between the TV picture and the viewer. This second layer of content creates a more interactive, intuitive, and dynamic viewing experience. For example, if you are watching a Formula 1 race, you would be able to pull up the drivers, their stats, track conditions, leader info, points and standings, and more. Instead of shrinking the picture on your TV screen to fit the additional information in, it floats above or beside the picture, enhancing the viewing experience.


The hardware is controlled in two ways. The first allows you to control InAir TV using your smartphone using its screen as a track pad. The second takes it to the next level by allowing you to use gesture control via a Kinect, or Leap controller.


SeeSpace will launch a Kickstarter campaign later this month, and the device will be available to pre-order for $99 later this year.


Structure Sensor. A 3D scanner for your iPad.

The Structure Sensor has been designed from the ground up to be a fully functional 3D scanner for your tablet. Unlike other 3D scanner technology designed to work with gaming consoles or desktop computer systems, the Structure Sensor is optimized for mobile. The device requires no external power and attaches to the Lightning connector on your iPad. It has a mobile-optimized range, making it ideal for field use. Structure allows the end user to quickly capture objects and the surrounding environment in digital form and export the data to CAD programs for 3D printing or additional modeling and rigging.

Using structured light, the Structure Sensor generates a VGA depth stream at 30 frames per second, where each pixel represents the distance to a real-world point. Structure Sensor’s depth sensing is powered by PrimeSense technology.
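To make that concrete, a depth frame like the one described is just a 640×480 grid of distances. The sketch below, a rough illustration and not the Structure SDK, back-projects such a frame into a point cloud using a pinhole camera model; the focal lengths and principal point are assumed placeholder values, since the real calibration comes from the sensor itself.

```python
import numpy as np

# Hypothetical intrinsics for a VGA (640x480) depth sensor; the real
# Structure Sensor calibration values would come from its SDK.
FX, FY = 570.0, 570.0   # focal lengths in pixels (assumed)
CX, CY = 320.0, 240.0   # principal point (image center, assumed)

def depth_to_points(depth_mm):
    """Convert a 480x640 depth frame (millimeters) to an Nx3 point cloud.

    Each depth pixel is back-projected through a pinhole camera model:
    x = (u - cx) * z / fx,  y = (v - cy) * z / fy.
    """
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float64) / 1000.0   # millimeters -> meters
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]            # drop invalid zero-depth pixels

# Toy input: a flat wall exactly 1 m away fills the whole frame.
frame = np.full((480, 640), 1000, dtype=np.uint16)
cloud = depth_to_points(frame)
```

At 30 frames per second, that is roughly nine million of these 3D points streaming off the sensor every second, which is what makes handheld scanning of whole objects practical.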


Fracture IO. A New Kind of Photo Booth.

MPC Digital has reinvented the photo booth with Fracture IO, an installation that creates generative 3D animated artwork based on the human form. Fracture IO made its debut at the One Show Awards last week at the Bowery Hotel in New York. MPC Digital was approached by JWT and asked to create an innovative piece for the show. The result is an update to something that has remained fairly unchanged for the last 100 years.

Fracture IO (Process) from MPC Digital on Vimeo.

MPC Digital used the Microsoft Kinect to capture both image and depth data in order to create a high-res 3D scan of whoever was in the booth. Using a combination of computer vision algorithms originally developed for robotic navigation, MPC Digital stitched together the image and depth data from four Kinects into a complete, accurate 3D model of whomever, or whatever, was captured.

The captured geometry was used as the basis for dynamically generated artwork. Each piece was then posted online with a unique, mobile-friendly, shareable URL that allowed visitors to experience it in 3D.
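The stitching step above boils down to a classic registration problem: each Kinect sees the subject in its own coordinate frame, and the frames must be mapped into one shared world frame. This is a minimal sketch of that merge, assuming each sensor's pose (rotation R, translation t) is already known from calibration; MPC Digital's actual pipeline used registration algorithms to recover those poses, which this toy example simply hands in.

```python
import numpy as np

def transform_cloud(points, R, t):
    """Map an Nx3 point cloud from sensor coordinates into world coordinates."""
    return points @ R.T + t

def merge_clouds(clouds_with_poses):
    """Concatenate several sensor-local clouds into one world-frame cloud."""
    return np.vstack([transform_cloud(p, R, t) for p, R, t in clouds_with_poses])

# Two toy "sensors" facing each other along the z axis, 2 m apart.
# Both observe the same surface point 1 m in front of themselves.
identity = np.eye(3)
flip = np.diag([-1.0, 1.0, -1.0])          # rotated 180 degrees about y
front = (np.array([[0.0, 0.0, 1.0]]), identity, np.zeros(3))
back  = (np.array([[0.0, 0.0, 1.0]]), flip, np.array([0.0, 0.0, 2.0]))

merged = merge_clouds([front, back])       # both rows land on the same point
```

With four sensors instead of two, the same transform-then-concatenate step covers the subject from all sides, which is why the booth could produce a complete model rather than a single-viewpoint relief.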

Razorfish Emerging Experiences – KinectiChord


About a week ago at the Cannes Lions International Festival of Creativity, the Razorfish Emerging Experiences Lab debuted its latest creation, the KinectiChord. KinectiChord is a multi-user, multisensory experience that blends the physical and the digital in an unexpected and delightful way.

The device was on display in the Microsoft Advertising Beach Club over the course of the festival and participants were encouraged to interact with the device. The KinectiChord experience allows multiple users to see, hear and feel technology like never before. It’s a really nice blend of art and technology that extends the overall user experience. It’ll be interesting to see where Razorfish takes this in the future.

Meet “IllumiRoom” and Immersive Gaming.

A few months ago Microsoft demoed “IllumiRoom”, and while it looked promising, the preview didn’t go into a whole lot of detail. The five-minute YouTube video below from Microsoft goes into greater detail and really shows where Microsoft is pushing the next level of gaming. This is some pretty slick technology with a ton of potential well beyond the gaming world. It’ll be interesting to see how long it takes to make it to your living room, and what the price point will be.

Firewall

This morning I find myself in a situation that says “wait”. Since I have a WiFi connection and my iPad, I have the opportunity to spend some time surfing the web looking for cool stuff I wish I had thought of. Firewall is one of them.

Developed by Aaron Sherwood with Mike Allison, Firewall uses a stretched piece of spandex as the interface, Processing as the software language, and an Arduino and a Kinect as the controllers. Together they created something pretty spectacular.

The Kinect measures the average depth of the spandex from the frame it is mounted on. If the spandex is not being pressed, nothing happens. When someone presses into it, the visuals react around the point of contact and the music is triggered. An algorithm created with Max allows the music to speed up, slow down, and get louder and softer based on the depth of the press. This provides a very expressive musical experience, even for people who have never played music before. A switch built into the frame toggles between two modes; the second mode is a little more aggressive than the first.
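The interaction logic described above can be sketched in a few lines. This is only an illustration of the idea, not the installation's code (the original was built in Processing and Max): the rest depth, press threshold, and tempo/volume mapping below are all assumed values.

```python
import numpy as np

REST_DEPTH = 1.50       # meters from Kinect to the relaxed spandex (assumed)
PRESS_THRESHOLD = 0.03  # how far (m) someone must push before anything triggers
MAX_PRESS = 0.30        # deepest press we map to full intensity (assumed)

def press_intensity(depth_frame):
    """Return 0.0 (untouched) .. 1.0 (deepest press) from average displacement."""
    displacement = REST_DEPTH - depth_frame.mean()  # pressing pulls spandex closer
    if displacement < PRESS_THRESHOLD:
        return 0.0                                  # not being pressed: do nothing
    return min(displacement / MAX_PRESS, 1.0)

def music_params(intensity):
    """Map press depth to tempo (BPM) and volume: deeper = faster and louder."""
    tempo = 60.0 + intensity * 120.0                # 60..180 BPM (illustrative)
    volume = intensity                              # 0..1 gain
    return tempo, volume

# Untouched surface vs. someone pressing ~15 cm in (averaged over the frame):
idle = np.full((480, 640), 1.50)
pressed = np.full((480, 640), 1.35)
```

Averaging the whole frame is what makes the threshold robust: sensor noise on individual pixels washes out, so the music only reacts to a deliberate press.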

Razorfish is Planning to Enhance Your Consumer Experience.

While parts of this video might seem like an impractical way to shop, I guarantee this is in your near future. As smartphones, tablets, interactive signage, and devices like Microsoft’s Surface and Kinect become more ubiquitous, this kind of experience will become more common. The example below centers on shopping for clothes and actually eliminates trying things on. I doubt that step will ever go away, but this kind of digital interaction combined with real-world experiences is coming.