MIT graduate student Robert Wang and Associate Professor Jovan Popović developed a gesture-based computing system with cheap hardware: an ordinary webcam and a pair of $1 multicolored Lycra gloves.

Other low-cost prototypes, such as the wearable SixthSense, have used tape on the fingertips. Wang said those were limited to “2D information,” where “you don’t even know which fingertip [the tape] is corresponding to.”

Wang and Popović’s system can translate the 3D configuration of your hands and fingers to the screen with almost no lag time. (Screencap from the proof-of-concept video shown below.)

Their software compares webcam images of the gloved hand against a reference database of hand poses. When a match is found, the software renders the corresponding hand position in a fraction of a second.
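
To make the idea concrete, here’s a minimal sketch of that kind of lookup: shrink the glove image to a tiny descriptor and find the nearest entry in a precomputed database of descriptor/pose pairs. The descriptor, the database, and the pose labels below are all placeholders of my own invention, not the authors’ actual implementation.

```python
import numpy as np

def make_descriptor(glove_image, size=40):
    """Downsample an (H, W, 3) glove image into a tiny flattened descriptor.

    Simple block-mean pooling; assumes H and W are multiples of `size`.
    """
    h, w, c = glove_image.shape
    tiny = glove_image.reshape(size, h // size, size, w // size, c).mean(axis=(1, 3))
    return tiny.ravel()

def lookup_pose(descriptor, db_descriptors, db_poses):
    """Return the stored pose whose descriptor is nearest in L2 distance."""
    dists = np.linalg.norm(db_descriptors - descriptor, axis=1)
    return db_poses[int(np.argmin(dists))]

# Usage with fabricated data: 1,000 database entries, 120x160 frames.
rng = np.random.default_rng(0)
db_descriptors = rng.random((1000, 40 * 40 * 3))
db_poses = [f"pose_{i}" for i in range(1000)]  # stand-ins for stored joint angles
frame = rng.random((120, 160, 3))
print(lookup_pose(make_descriptor(frame), db_descriptors, db_poses))
```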

Hand-tracking is made possible by the distinctive glove design. The patchwork arrangement differs between the front and back of the glove, and the colors are distinguishable from one another and from background objects under a range of lighting conditions.
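
That color separability is what makes a first segmentation step cheap. As a rough illustration (the palette values here are made up, not MIT’s actual glove colors), each pixel can simply be assigned to its nearest reference color, or to background if nothing is close enough:

```python
import numpy as np

# Illustrative palette of saturated glove colors (placeholder RGB values).
PALETTE = np.array([
    [220, 40, 40],   # red patch
    [40, 200, 60],   # green patch
    [40, 80, 220],   # blue patch
    [230, 220, 50],  # yellow patch
], dtype=float)

def segment_glove(frame_rgb, background_threshold=80.0):
    """Label each pixel with a palette index, or -1 for background."""
    pixels = frame_rgb.reshape(-1, 3).astype(float)
    dists = np.linalg.norm(pixels[:, None, :] - PALETTE[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    labels[dists.min(axis=1) > background_threshold] = -1
    return labels.reshape(frame_rgb.shape[:2])
```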

Possible applications range from video games to engineering: designers could use the system to manipulate 3D models of commercial products or civic structures.

Wang is expanding his idea and plans to design similarly patterned shirts for use in whole-body motion capture.

-----
Those gloves are pretty rad on their own…is it bad that I want a pair to wear and not compute with? A shirt would be fantastic.
