[FoRK] Skinput: Appropriating the Body as an Input Surface

Stephen Williams sdw at lig.net
Mon Apr 12 09:50:12 PDT 2010


Very nice!  Clearly a big win.

http://www.chrisharrison.net/projects/skinput/
>
> Devices with significant computational power and capabilities can now 
> be easily carried on our bodies. However, their small size typically 
> leads to limited interaction space (e.g., diminutive screens, buttons, 
> and jog wheels) and consequently diminishes their usability and 
> functionality. Since we cannot simply make buttons and screens larger 
> without losing the primary benefit of small size, we consider 
> alternative approaches that enhance interactions with small mobile 
> systems.
>
> One option is to opportunistically appropriate surface area from the 
> environment for interactive purposes. For example, Scratch Input 
> <http://www.chrisharrison.net/projects/scratchinput/index.html> is 
> a technique that allows a small mobile device to turn tables on which it 
> rests into a gestural finger input canvas. However, tables are not 
> always present, and in a mobile context, users are unlikely to want to 
> carry appropriated surfaces with them (at this point, one might as 
> well just have a larger device). However, there is one surface that 
> has been previously overlooked as an input canvas, and one that happens 
> to always travel with us: our skin.
>
> Appropriating the human body as an input device is appealing not only 
> because we have roughly two square meters of external surface area, 
> but also because much of it is easily accessible by our hands (e.g., 
> arms, upper legs, torso). Furthermore, proprioception (our sense of 
> how our body is configured in three-dimensional space) allows us to 
> accurately interact with our bodies in an eyes-free manner. For 
> example, we can readily flick each of our fingers, touch the tip of 
> our nose, and clap our hands together without visual assistance. Few 
> external input devices can claim this accurate, eyes-free input 
> characteristic and provide such a large interaction area.
>
> In the paper linked below, we present our research on Skinput – a 
> method that allows the body to be appropriated for finger input using 
> a novel, non-invasive, wearable bio-acoustic sensor.
>
>
>
> Download Paper (to be released April 12 @ CHI 2010)
>
> Harrison, C., Tan, D., and Morris, D. 2010. Skinput: Appropriating the Body 
> as an Input Surface. To appear in Proceedings of the 28th Annual 
> SIGCHI Conference on Human Factors in Computing Systems (Atlanta, 
> Georgia, April 10 - 15, 2010). CHI '10. ACM, New York, NY.
>
> Researchers
>
> Chris Harrison <mailto:chris.harrison at cs.cmu.edu> - Carnegie Mellon 
> University <http://www.cmu.edu>
> Desney Tan <http://research.microsoft.com/en-us/um/people/desney/> - 
> Microsoft Research <http://research.microsoft.com>
> Dan Morris <http://research.microsoft.com/en-us/um/people/dan/> - 
> Microsoft Research <http://research.microsoft.com>
>

http://www.pddnet.com/news-student-uses-skin-as-input-for-mobile-devices-040710/
>
>
>   Student Uses Skin As Input For Mobile Devices
>
> By Carnegie Mellon University
> Wednesday, April 07, 2010
>
> [Image: Chris Harrison demonstrates Skinput technology. (Credit: Image 
> courtesy of Carnegie Mellon University)]
>
>
>
> A combination of simple bio-acoustic sensors and some sophisticated 
> machine learning makes it possible for people to use their fingers or 
> forearms -- potentially, any part of their bodies -- as touchpads to 
> control smart phones or other mobile devices.
>
> The technology, called Skinput, was developed by Chris Harrison, a 
> third-year Ph.D. student in Carnegie Mellon University's 
> <http://www.cmu.edu/> Human-Computer Interaction Institute (HCII), 
> along with Desney Tan and Dan Morris of Microsoft Research. Harrison 
> will describe the technology in a paper to be presented on April 12, 
> at CHI 2010, the Association for Computing Machinery's annual 
> Conference on Human Factors in Computing Systems in Atlanta, Ga.
>
> Skinput, www.chrisharrison.net/projects/skinput/ 
> <http://www.chrisharrison.net/projects/skinput/>, could help people 
> take better advantage of the tremendous computing power now available 
> in compact devices that can be easily worn or carried. The diminutive 
> size that makes smart phones, MP3 players and other devices so 
> portable, however, also severely limits the size and utility of the 
> keypads, touchscreens and jog wheels typically used to control them.
>
> "With Skinput, we can use our own skin -- the body's largest organ -- 
> as an input device," Harrison says. "It's kind of crazy to think we 
> could summon interfaces onto our bodies, but it turns out to make a 
> lot of sense. Our skin is always with us, and makes the ultimate 
> interactive touch surface."
>
> In a prototype developed while Harrison was an intern at Microsoft 
> Research last summer, acoustic sensors are attached to the upper arm. 
> These sensors capture sound generated by such actions as flicking or 
> tapping fingers together, or tapping the forearm. This sound is not 
> transmitted through the air, but by transverse waves through the skin 
> and by longitudinal, or compressive, waves through the bones.
>
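
(Aside: for anyone curious about the software side of that capture stage, here's a
rough sketch of how tap onsets might be pulled out of the sensor stream -- not the
authors' code, just an illustration; the sample rate, window size, and threshold
below are made-up placeholders.)

    import numpy as np

    RATE = 5500          # assumed per-channel sample rate (Hz), placeholder
    WINDOW = 64          # samples per analysis window, placeholder
    THRESHOLD = 0.05     # assumed onset threshold on windowed RMS

    def detect_taps(signal):
        """Return start indices of candidate tap events in a 1-D sensor stream."""
        taps = []
        armed = True
        for start in range(0, len(signal) - WINDOW, WINDOW):
            rms = np.sqrt(np.mean(signal[start:start + WINDOW] ** 2))
            if armed and rms > THRESHOLD:
                taps.append(start)      # energy burst: likely a tap
                armed = False           # ignore the ring-down of this event
            elif rms < THRESHOLD / 2:
                armed = True            # signal has settled; re-arm detector
        return taps
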
> Harrison and his colleagues found that the tap of each fingertip, a 
> tap to one of five locations on the arm, or a tap to one of 10 
> locations on the forearm produces a unique acoustic signature that 
> machine learning programs could learn to identify. These computer 
> programs, which improve with experience, were able to determine the 
> signature of each type of tap by analyzing 186 different features of 
> the acoustic signals, including frequencies and amplitude.
>
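
(And a toy sketch of the feature-extraction-plus-classifier stage described above.
The article doesn't spell out the 186 features or the classifier used, so the
handful of amplitude and FFT-band features and the off-the-shelf SVM here are
stand-ins, not the paper's actual pipeline.)

    import numpy as np
    from sklearn.svm import SVC

    def extract_features(segment):
        """Toy stand-in for the paper's 186 features: amplitude statistics
        plus coarse FFT band energies for one segmented tap."""
        spectrum = np.abs(np.fft.rfft(segment))
        bands = [band.mean() for band in np.array_split(spectrum, 10)]
        return np.array([segment.max(), segment.std(),
                         np.abs(segment).mean()] + bands)

    def train_classifier(segments, labels):
        """segments: recorded tap waveforms; labels: tap locations
        (e.g., 'thumb', 'pinky', 'forearm_3') from a training phase."""
        model = SVC(kernel="rbf")
        model.fit([extract_features(s) for s in segments], labels)
        return model
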
> In a trial involving 20 subjects, the system was able to classify the 
> inputs with 88 percent accuracy overall. Accuracy depended in part on 
> proximity of the sensors to the input; forearm taps could be 
> identified with 96 percent accuracy when sensors were attached below 
> the elbow, and with 88 percent accuracy when the sensors were above the elbow. 
> Finger flicks could be identified with 97 percent accuracy.
>
> "There's nothing super sophisticated about the sensor itself," 
> Harrison says, "but it does require some unusual processing. It's sort 
> of like the computer mouse -- the device mechanics themselves aren't 
> revolutionary, but are used in a revolutionary way." The sensor is an 
> array of highly tuned vibration sensors -- cantilevered piezo films.
>
> The prototype armband includes both the sensor array and a small 
> projector that can superimpose colored buttons onto the wearer's 
> forearm, which can be used to navigate through menus of commands. 
> Additionally, a keypad can be projected on the palm of the hand. 
> Simple devices, such as MP3 players, might be controlled simply by 
> tapping fingertips, without need of superimposed buttons; in fact, 
> Skinput can take advantage of proprioception -- a person's sense of 
> body configuration -- for eyes-free interaction.
>
> Though the prototype is of substantial size and designed to fit the 
> upper arm, the sensor array could easily be miniaturized so that it 
> could be worn much like a wristwatch, Harrison said.
>
> Testing indicates the accuracy of Skinput is reduced in heavier, 
> fleshier people and that age and sex might also affect accuracy. 
> Running or jogging also can generate noise and degrade the signals, 
> the researchers report, but the amount of testing was limited and 
> accuracy likely would improve as the machine learning programs receive 
> more training under such conditions.
>
> Harrison, who delights in "blurring the lines between technology and 
> magic," is a prodigious inventor. Last year, he launched a company, 
> Invynt LLC, to market a technology he calls "Lean and Zoom," which 
> automatically magnifies the image on a computer monitor as the user 
> leans toward the screen. He also has developed a technique to create a 
> pseudo-3D experience for video conferencing using a single webcam at 
> each conference site. Another project explored how touchscreens can be 
> enhanced with tactile buttons that can change shape as virtual 
> interfaces on the touchscreen change.
>
> Skinput is an extension of an earlier invention by Harrison called 
> Scratch Input, which used acoustic microphones to enable users to 
> control cell phones and other devices by tapping or scratching on 
> tables, walls or other surfaces.
>
> "Chris is a rising star," says Scott Hudson, HCII professor and 
> Harrison's faculty adviser. "Even though he's a comparatively new 
> Ph.D. student, the very innovative nature of his work has garnered a 
> lot of attention both in the HCI research community and beyond."
>
> For more information visit www.cmu.edu <http://www.cmu.edu>
>

sdw




