Whoa. Just, whoa.
An old high school classmate of mine recently posted this surprising link on Facebook: LED Lights Make Augmented Vision a Reality. The article dates back to January so I may be behind the times, but this is the first I’ve heard of any breakthrough like this.
My first reaction: It’s like Minority Report! Soon we’ll be watching TV with contact lenses and later we’ll be interacting with interfaces with just our eyeballs. Nifty.
The article mentions augmented reality, but beyond that, just think of the implications this will have for HCI. Interfaces floating in front of our eyes… The design of such interfaces will likely change as the way we use them changes. No longer inhibited by monitors and their constraints, designers will have to adapt. I imagine the underlying design principles will remain the same, though, since they're based on human perception; they'll just be applied differently.
The only concern I have is that floating interactive interfaces, such as the ones in Minority Report, seem to require more effort to use. Physically grabbing a window by extending your arm asks a lot of the user. Perhaps that won't be the case, though, and the technology will require only minuscule movements to function: a twitch of a finger, say.