Processing

Firewall

This morning I find myself in a situation that says “wait.” Since I have a WiFi connection and my iPad, I have the opportunity to spend some time surfing the web looking for cool stuff I wish I had thought of. Firewall is one of them.

Developed by Aaron Sherwood with Mike Allison, Firewall uses a stretched piece of spandex as the interface, Processing as the software language, and an Arduino and a Kinect as the controllers. Together they created something pretty spectacular.

The Kinect measures the average depth of the spandex from the frame it is mounted on. If the spandex is not being pressed into, nothing happens. When someone presses into it, the visuals react around the point of contact and the music is triggered. An algorithm created with Max allows the music to speed up, slow down, and get louder or softer based on the depth of the press. This provides a very expressive musical playing experience, even for people who have never played music before. A switch built into the frame toggles between two modes, the second a little more aggressive than the first.
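
To make the interaction a little more concrete, here is a minimal Processing sketch of the same press-detection idea. This is not the authors' code: the depth frame is simulated with the mouse, and the resting depth and trigger threshold are made-up numbers, but the logic of comparing an average depth against a baseline and scaling the response is the same.

```
// Minimal sketch of the press-detection idea behind Firewall (not the
// authors' code): compare the average depth of the fabric against a
// resting baseline and react in proportion to how far it is pushed in.
// The depth value here is simulated with the mouse; a real version
// would read it from a Kinect library instead.

float restingDepth = 1000;   // assumed distance (mm) of the untouched spandex
float triggerOffset = 40;    // how far (mm) it must be pushed before anything happens

void setup() {
  size(640, 480);
  noStroke();
}

void draw() {
  background(0);

  // Stand-in for "average depth of the spandex" from the Kinect:
  // dragging the mouse downward pretends to press the fabric in.
  float averageDepth = restingDepth + map(mouseY, 0, height, 0, 300);

  float press = averageDepth - restingDepth;   // how deep the press is
  if (press > triggerOffset) {
    // Map press depth to visual intensity; the same value could drive
    // tempo and volume messages to a Max patch.
    float intensity = map(press, triggerOffset, 300, 0, 1);
    fill(255, 80 + 175 * intensity, 0);
    ellipse(mouseX, mouseY, 50 + 250 * intensity, 50 + 250 * intensity);
  }
}
```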


Cascades by NYTimes R&D.


OK my inner design geek is going to come out now.

As a designer who works with interactive content, web-enabled content, and content that links back to and through social media, I’ve often wanted a way to visually track the sharing of content. I want to be able to see what happens when a piece of content is shared, and how that shared content propagates across the social media sphere.

The New York Times R&D Lab has done just that. Using Processing, they have developed a dynamic visual application, Cascades, to track what happens to each article published on the New York Times website. The video below explains it.
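
As a rough illustration of what a cascade looks like as data, here is a toy Processing sketch, not the Times’ actual tool: a handful of invented share events, each with a timestamp and a parent, drawn as a small branching tree along a time axis.

```
// Toy version of the cascade idea (not the NYT R&D code): each share is a
// node placed along a horizontal time axis and linked back to the share
// it came from, so a piece of content fans out into a branching tree.

// Invented sample data: minutes after publication, and index of the parent share
int[] shareTime   = {0, 5, 12, 14, 30, 33, 41, 55};
int[] shareParent = {-1, 0, 1, 0, 2, 2, 3, 5};   // -1 = the original article

float[] xs, ys;

void setup() {
  size(640, 360);
  xs = new float[shareTime.length];
  ys = new float[shareTime.length];
  for (int i = 0; i < shareTime.length; i++) {
    xs[i] = map(shareTime[i], 0, 60, 60, width - 60);          // time -> x
    ys[i] = map(i, 0, shareTime.length - 1, 60, height - 60);  // spread vertically
  }
}

void draw() {
  background(255);
  stroke(180);
  for (int i = 1; i < shareTime.length; i++) {
    line(xs[i], ys[i], xs[shareParent[i]], ys[shareParent[i]]);
  }
  noStroke();
  fill(30, 120, 200);
  for (int i = 0; i < shareTime.length; i++) {
    ellipse(xs[i], ys[i], i == 0 ? 16 : 8, i == 0 ? 16 : 8);   // bigger dot = the article
  }
}
```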

Clouds, an Interactive Film from James George and Jonathan Minard.

There is a brave new world of art-meets-programming that has been emerging out of the shadows for a few years now. Artists are using open source tools like Processing and Arduino to take complex ideas and render them into real works of interactive art.

The video below is the trailer for an interactive film that documents and interviews a number of these emerging artists. James George and Jonathan Minard launched a Kickstarter about six months ago to complete the film, and I hope they meet their goal. With 39 days left they have raised about a third of the money they need.

The trailer for “Clouds” has a great look, and listening to the artists who are interviewed is truly inspiring. Take five minutes on this Friday morning and give it a look.

“Over the last year the team has captured interviews with over 30 new media artists, curators, designers, and critics, using this new 3D cinema format called RGBD. CLOUDS presents a generative portrait of this digital arts community in a videogame-like environment. The artists inhabit a shared space with their code-based creations, allowing you to follow your curiosity through a network of stories. The interview subjects in CLOUDS include Bruce Sterling, Casey Reas, Daniel Shiffman, Golan Levin, Greg Borenstein, Jer Thorp, Jesse Louis-Rosenberg, Jessica Rosenkrantz, Josh Nimoy, Karolina Sobecka, Karsten “Toxi” Schmidt, Kyle McDonald, Lindsay Howard, Regine Debatty, Satoru Higa, Shantell Martin, Theodore Watson, Vera Glahn, Zachary Lieberman and many more.”

Patten Studio’s Interactive Chemistry Table.

I am so envious of children’s education these days. I can’t imagine what it would be like to be in school and have computers, iPads, iPhones, touchscreen tables, interactive content, and more at my fingertips all day long: dynamic content that extends the learning experience beyond textbooks and lectures.

Patten Studio has created an interactive exhibit for the Museum of Science and Industry in Chicago. Using RFID chips embedded in the table, as well as tactile pucks, the system allows users to build and create chemical reactions by moving objects on the surface.

Working with Sensetable hardware, the platform allows you to bring the computer interface off the screen and create a more tactile, engaging, interactive experience. Sensetable tracks the position of the pucks on the table surface and uses them as interface inputs to display content that is not only lit from underneath but projected from above as well. The Sensetable concept and initial prototypes were developed by the Tangible Media Group at the MIT Media Lab. Patten Studio has developed a robust and affordable implementation of the Sensetable platform for a variety of commercial applications. (Patten Studio)

This is the kind of technology and human factors design that really makes me think the future of education is good. I know there are studies and people claiming that our memories are being affected, that people are learning less now, and that education is suffering from technology. I don’t believe any of that is true. I think it means things are changing, and not necessarily for the worse.

Sensetable uses the LusidOSC API for application development. This open source API makes it easy for application developers to use tools such as Processing to develop Sensetable applications.
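
As a hedged sketch of what such an application might look like, here is a Processing example that listens for puck positions arriving over OSC, using the oscP5 library. The port number and the “/puck” message layout are placeholders assumed for illustration, not the actual LusidOSC schema.

```
// Sketch of how a Processing application might listen for puck positions
// arriving over OSC (using the oscP5 library). The port number and the
// "/puck" address pattern are placeholders, not the actual LusidOSC
// message schema.
import oscP5.*;
import netP5.*;

OscP5 osc;
float puckX = 0.5, puckY = 0.5;   // last known puck position, normalized 0..1

void setup() {
  size(600, 600);
  osc = new OscP5(this, 12000);   // listen on an assumed port
}

void oscEvent(OscMessage msg) {
  // Assumed message layout: address "/puck", two floats for x and y.
  if (msg.checkAddrPattern("/puck")) {
    puckX = msg.get(0).floatValue();
    puckY = msg.get(1).floatValue();
  }
}

void draw() {
  background(20);
  fill(255, 200, 0);
  ellipse(puckX * width, puckY * height, 40, 40);   // draw content under the puck
}
```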

Projection Mapping On My New Kicks.

Projection mapping on buildings and products is quite the rage these days, it seems. Outside of the United States, that is. I’ve seen this kind of thing in Vegas being done by companies like Monster Media, but nothing like what they are doing in Europe and Asia. The process isn’t that difficult anymore, thanks to major advances in hardware and software over the last few years, which might explain why we are seeing more of it these days.

New Balance hired Smithssi and Flurry Interactive in Korea to create a projection-mapped piece for a new shoe they will be releasing later this year. I’m not sure where this will be applied, but I could see it being used in store fronts and in-store displays to promote the product. What would be really interesting is if they combined this with an application built using something like Processing to create an interactive experience based on human contact with the shoe and its position relative to the projection source.

This, however, shows how simple, easy, and effective projection mapping is, and why you’ll be seeing more of it in the near future.
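
For anyone curious what “simple” means in practice, here is a bare-bones Processing sketch of the corner-pin idea at the heart of projection mapping: draw your content into an offscreen buffer, then stretch it onto a quad whose corners you drag until they line up with the physical object. Real installations use dedicated mapping software or libraries, but the principle is the same.

```
// Bare-bones corner-pin illustration: content is rendered offscreen and
// warped onto a quad whose corners can be dragged with the mouse to match
// the surface being projected onto.

PGraphics content;
float[][] corners = {{100, 100}, {540, 120}, {520, 380}, {120, 360}};
int dragging = -1;

void setup() {
  size(640, 480, P3D);
  content = createGraphics(320, 240, P3D);
}

void draw() {
  background(0);

  // The "content": anything you would normally draw to the screen.
  content.beginDraw();
  content.background(30, 30, 80);
  content.fill(255, 160, 0);
  content.ellipse(content.width/2, content.height/2,
                  frameCount % content.width, frameCount % content.width);
  content.endDraw();

  // Warp the content onto the four draggable corners.
  noStroke();
  beginShape(QUAD);
  texture(content);
  vertex(corners[0][0], corners[0][1], 0, 0);
  vertex(corners[1][0], corners[1][1], content.width, 0);
  vertex(corners[2][0], corners[2][1], content.width, content.height);
  vertex(corners[3][0], corners[3][1], 0, content.height);
  endShape();
}

void mousePressed() {
  for (int i = 0; i < 4; i++) {
    if (dist(mouseX, mouseY, corners[i][0], corners[i][1]) < 15) dragging = i;
  }
}

void mouseDragged() {
  if (dragging >= 0) {
    corners[dragging][0] = mouseX;
    corners[dragging][1] = mouseY;
  }
}

void mouseReleased() {
  dragging = -1;
}
```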

Miska Knapek’s Sculptures of Wind.

Artist Miska Knapek has taken the open source programming language Processing and used it to create some amazing wooden sculptures based on sensor data of wind movement. The Processing application measures wind direction, velocity, and temperature over a set period of time, then transfers the data to an NC milling machine, which carves a wooden block to physically represent the air mass during the time the data was captured.

This image represents five days of air and wind movement captured at a specific location. The direction of the line is the wind’s direction, the width and speed of movement are the wind speed, and the height is the temperature.
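
Here is a rough Processing sketch of that mapping as I understand it, with invented wind samples standing in for the logged sensor data. It is my reconstruction for illustration, not Knapek’s pipeline: direction steers the path, speed sets the width, and temperature sets the height. In the real project, geometry like this would be exported to toolpaths for the milling machine.

```
// Rough reconstruction of the data-to-form mapping (not Knapek's pipeline):
// made-up wind samples become a ribbon whose heading follows wind direction,
// whose width follows wind speed, and whose height follows temperature.

int samples = 200;
float[] direction = new float[samples];   // radians
float[] speed     = new float[samples];   // m/s
float[] temp      = new float[samples];   // degrees C

void setup() {
  size(640, 480, P3D);
  // Invented data standing in for logged sensor readings.
  for (int i = 0; i < samples; i++) {
    direction[i] = noise(i * 0.05) * TWO_PI;
    speed[i]     = 2 + noise(100 + i * 0.05) * 10;
    temp[i]      = 5 + noise(200 + i * 0.05) * 15;
  }
}

void draw() {
  background(240);
  translate(width/2, height/2, -200);
  rotateX(PI/3);
  rotateZ(frameCount * 0.005);

  float x = 0, y = 0;
  noStroke();
  fill(160, 120, 80);
  beginShape(QUAD_STRIP);
  for (int i = 0; i < samples; i++) {
    x += cos(direction[i]) * 4;              // direction steers the path
    y += sin(direction[i]) * 4;
    float w = map(speed[i], 2, 12, 2, 20);   // speed widens the ribbon
    float h = map(temp[i], 5, 20, 5, 60);    // temperature lifts it
    vertex(x - w, y, 0);
    vertex(x + w, y, h);
  }
  endShape();
}
```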

Video of the sculpture creation can be found here.