MIT Technology Review ran a brief article at the end of June describing some of the unrealized potential in the iPhone. It turns out that, in addition to having the interface to kill all portable interfaces, it is tricked out with several lightly utilized sensors: an accelerometer, an ambient light sensor, and an infrared proximity sensor. While Apple has applied these to the admirable goals of rotating your screen and adjusting your brightness for you, other smart people have already been busy putting them to more creative ends. Like learning about human nature.
Now, take a step back: accelerometers are motion detectors--they're used to measure distance walked (pedometers) and the severity of car crashes (impact meters), among other things. Some creative designers have even figured out how to make them fun (Nintendo Wii). It's not a huge stretch to combine this sort of data with light, motion, and sound sensing to start building a picture of what a user is doing all day, moment to moment. Standing, sitting, and walking each have a recognizable signature, and from there it's a short computational step to recognizing when a user is cooking, working, hanging out, shopping, and so on. It's like a diary, but honest. It's like Twitter, but less irritating.

Now, take another step back: once again, MIT researchers are way ahead of us. A research group called Reality Mining has been gathering data in this manner from study participants since 2004, combining it with proximity sensing between users, and analyzing the hell out of it. Findings are ongoing, but what's already there is massively intriguing. Social networking in the real world has a statistical signature, and measurable patterns called eigenbehaviors start emerging. It's still mostly the realm of statisticians and analysts, but the trajectory points insistently toward a new and powerful tool for designers.
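To make the "recognizable signatures" idea concrete, here's a minimal sketch of how standing and walking can be told apart from raw accelerometer readings. This is an illustration, not anything from the MIT study or Apple's APIs: the function names and the threshold are made up for the example. The trick is that a stationary phone reports a magnitude pinned near 1 g (gravity alone), while walking adds a periodic wobble that shows up as variance.

```python
import math

def magnitude(sample):
    """Euclidean magnitude of one (x, y, z) accelerometer reading, in g."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def classify_window(samples, walk_threshold=0.15):
    """Label a window of readings by how much the magnitude varies.

    Standing still leaves the magnitude near 1 g; walking superimposes
    a rhythmic bounce, so the standard deviation rises past the
    (illustrative, hand-picked) threshold.
    """
    mags = [magnitude(s) for s in samples]
    mean = sum(mags) / len(mags)
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    return "walking" if std > walk_threshold else "stationary"

# Stationary: readings hover around gravity alone.
still = [(0.0, 0.0, 1.0)] * 20
# Walking: a crude periodic bounce added to gravity.
walking = [(0.0, 0.0, 1.0 + 0.4 * math.sin(i)) for i in range(20)]

print(classify_window(still))    # stationary
print(classify_window(walking))  # walking
```

Real activity-recognition systems use richer features (step frequency, orientation, multiple windows) and trained classifiers rather than a single threshold, but the underlying idea--motion leaves a statistical fingerprint--is the same.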
Potential applications are significant for... well, who aren't they significant for? Consumer electronics designers looking for new interface methods; medical and fitness product designers looking for better ways to move information from users to devices; design researchers who want higher-quality data from less intrusive methods: pay attention. Things are changing.
Comments
I think only once we have collected enough of this kind of data will it make sense to start developing 3D interfaces that are actually self-evident (with a low learning curve, if any) and faster to use than our current 2D desktops.