This is a great little video showing what I do as a User Experience Architect…it beats my usual analogy of designing blueprints for websites the way architects draw blueprints for buildings.
Squiggle is an iPad app that lets you draw lines and play them as strings.
Reading this article and viewing the g-Speak demo (below) inspired some related internet surfing, during which I investigated gestural and other alternative ways of interfacing with computers.
I have compiled links to some of the quirkier things I discovered below.
1. Thought as interface controller
First, I found a really good report on Six Revisions entitled “The Future of User Interfaces”, which showcases various mind-control projects, such as the “thought helmets” being developed by the US Army.
The Honda Research Institute has created a way to control its robot ASIMO by thought.
“The control method has been named Brain Machine Interface (BMI) and just requires the user to wear a head piece covered in sensors monitoring electrical activity and blood flow. Thinking about a specific action will then be transmitted to the robot who will carry out the action. Honda state that the accuracy rate is over 90%, but that the action commands aren’t yet real-time.”
2. Playing music by Gesture
For the musos out there, check out this synth/musical instrument that is played via gestures…awesome! It seems like an evolved version of the theremin the Beach Boys used in the ’60s.
“The Multi-Laser Gestural Interface is an open source and modular “free-gesture” controller that uses beams of laser light along with photo resistors to create a physical, fluid musical instrument.
With the MLGI, Wiley is attempting to bring a physical interactivity to electronic music performance. By removing the performer from behind the laptop, the audience becomes aware of the performer’s interaction with the controller, which creates an instant visual connection between the sound and the performer.”
For more information visit www.cyclespersecond.net.
[ SRC: Synthtopia.com ]
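Out of curiosity, here’s a quick sketch of the core idea behind a laser/photoresistor controller like this one: each beam falls on a photoresistor, and breaking the beam with a hand drops the light reading, which triggers a note. To be clear, the beam-to-note mapping, threshold, and reading values below are my own inventions for illustration, not the MLGI’s actual code.

```python
BEAM_NOTES = {0: "C4", 1: "E4", 2: "G4", 3: "B4"}  # one note per laser beam (invented mapping)
THRESHOLD = 300  # light level below which we treat the beam as broken (0-1023, as from a 10-bit ADC)

def detect_triggers(readings, previous_broken):
    """Compare each beam's light reading against the threshold and
    return the notes for beams that were just interrupted."""
    triggered = []
    now_broken = {}
    for beam, level in readings.items():
        broken = level < THRESHOLD
        now_broken[beam] = broken
        # Fire only on the transition from unbroken -> broken,
        # so holding a hand in the beam doesn't retrigger the note.
        if broken and not previous_broken.get(beam, False):
            triggered.append(BEAM_NOTES[beam])
    return triggered, now_broken

# Simulated frames of photoresistor readings:
state = {}
frames = [
    {0: 900, 1: 880, 2: 910, 3: 895},  # all beams unbroken
    {0: 120, 1: 870, 2: 905, 3: 890},  # hand breaks beam 0
    {0: 110, 1: 100, 2: 900, 3: 885},  # beam 0 held, beam 1 newly broken
]
for frame in frames:
    notes, state = detect_triggers(frame, state)
    if notes:
        print("play:", notes)
```

The edge-triggering is the interesting part: without it, a hand held in the beam would retrigger the note on every sensor read.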
3. Augmented Reality
Augmented reality is discussed a lot on this blog…and utilising augmented reality via mobile devices is all the rage…
BUT a very interesting project that has come to my attention is augmented reality via a contact lens!
It is being used to layer medical data over your vision…such as blood sugar levels for diabetics…awesome technology.
4. Touch screens – touch is the new click.
It was designed prior to the launch of Apple’s touch products like the iPhone and iPod Touch, so alternative, non-touch interaction patterns were used. The device and the plane had an extended production life-cycle, and Lisa noted in her talk that it was really interesting to watch how people try, and expect, to interact with the device using the same gestures they would use on an iPhone.
I must say, after having an iPhone for some time, I recently caught myself swiping the screen of my 5th-generation iPod.
A precedent has been set for touch interaction.
Check out the video about Skinput, a new technology whereby you can control devices by tapping and pinching your skin. It is meant to extend the size of mobile device interfaces without any overhead.
“Skinput uses a bio-acoustic sensing array coupled with a wrist-mounted pico-projector to turn your skin into a touch-screen. Confused? Don’t be. It’s amazingly simple.”
See YouTube for more about this and how it works.
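For the curious, here’s a toy sketch of the classification step a system like this needs: taps at different skin locations produce different acoustic signatures, and a classifier maps a new tap’s features to the nearest known location. The location names, features, and numbers below are all made up for illustration; the real Skinput uses an armband of vibration sensors and a properly trained classifier.

```python
import math

# Pretend training centroids: (dominant frequency in Hz, relative amplitude).
# These values are invented; a real system would learn them from tap samples.
CENTROIDS = {
    "wrist":   (25.0, 0.80),
    "forearm": (40.0, 0.55),
    "palm":    (60.0, 0.30),
}

def classify_tap(features):
    """Return the location whose centroid is closest (Euclidean distance)."""
    return min(CENTROIDS, key=lambda loc: math.dist(features, CENTROIDS[loc]))

print(classify_tap((27.0, 0.75)))  # a tap resembling the wrist profile -> "wrist"
```

A nearest-centroid rule is about the simplest classifier there is; it just makes the point that “turning skin into a touch-screen” is, at heart, a signal-classification problem.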
5. Moving pixels
Check out the video, it’s really cool.
[ SRC: http://creativity-online.com ]
6. General banter about gestural interfaces
Dan Saffer has a talk up called “Tap is the new click” about gestural interfaces which is very interesting.
He states that “we are in the midst of an interaction design revolution…interaction design with our bodies”. If you find this topic interesting, check out his talk – it’s really quite thought-provoking.
The MIT SENSEable City Laboratory, in collaboration with the ARES Lab (Aerospace Robotics and Embedded Systems Laboratory), has created a cool project called Flyfire, in which they have liberated pixels from the two-dimensional plane.
By putting LEDs on small helicopter devices they have created “smart pixels”, through which interesting free-form displays can be created…a spatially animated viewing experience.
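To get a feel for the “liberated pixel” idea, here’s a hypothetical sketch of the assignment step: sample the lit cells of a 2D image and turn each into a 3D hover target for one helicopter. The grid, cell size, and hover height are my own made-up numbers, not anything from the Flyfire project.

```python
def image_to_targets(grid, cell_size=1.0, height=5.0):
    """Turn a 2D brightness grid (0 = dark, 1 = lit) into a list of
    (x, y, z) hover targets, one per lit cell."""
    targets = []
    for row, cells in enumerate(grid):
        for col, lit in enumerate(cells):
            if lit:
                targets.append((col * cell_size, row * cell_size, height))
    return targets

# A tiny 3x4 "image" of an L shape:
shape = [
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
]
print(len(image_to_targets(shape)), "helicopters needed")
```

Animating a display then becomes a matter of recomputing targets frame by frame and routing each helicopter to its new position, which is where the hard robotics problems actually live.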
Some beautiful data visualisation by Aaron Koblin, built from 24 hours of flight data…stunning imagery and animation.
Check out the “flight patterns” data represented on a Google map with filters….
Another really cool work he was involved in with Takashi Kawashima is the Ten Thousand Cents project.
10,000 people were paid 1c each to re-draw a small square of a $100 US bill.
All proceeds from the project went to the One Laptop per Child project….
You can register to be one of the 2000 volunteers required.
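The project’s arithmetic is pleasingly exact; a one-line sanity check using the figures above:

```python
# 10,000 contributors paid one cent each, for a $100 US bill:
contributors = 10_000
payment_cents = 1
total_dollars = contributors * payment_cents / 100
print(total_dollars)  # 100.0 - the payments sum to exactly the bill's face value
```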