Touchscreens have revolutionized the way we interact with electronics, but sometimes they can feel a little cramped. Wouldn't it be great if the iPhone's screen were just a bit bigger? One creative solution is Skinput, a device that uses a pico projector to beam graphics (keyboards, menus, etc.) onto a user's palm and forearm, transforming the skin into a computer interface.
The device, built as a collaboration between researchers at Carnegie Mellon University and Microsoft, uses the distinct sounds produced when we tap different parts of our skin (so-called acoustic patterns) to figure out which icon, menu, or key was tapped. Skinput contains five piezoelectric cantilevers that pick up these sound frequencies, letting the system respond to different "skin buttons". And it is surprisingly on target: it can distinguish five skin locations with 95.5% accuracy, about the same as many actual touchscreen devices.
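To get a feel for the idea, here is a minimal sketch of that kind of acoustic classification: synthesize a vibration for each tap location, summarize it as coarse frequency-band energies, and match a new tap to the nearest stored profile. The location names, frequencies, and nearest-centroid matching are all illustrative assumptions, not Skinput's actual sensing pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
FS = 5500  # Hz; toy sampling rate for the simulated vibration signals

# Hypothetical dominant frequencies for five tap locations (illustrative only)
LOCATION_FREQS = {"thumb": 60, "wrist": 120, "forearm": 240,
                  "elbow": 480, "palm": 960}

def simulate_tap(freq_hz, n=1024):
    """Synthesize a decaying sinusoid standing in for a real tap's vibration."""
    t = np.arange(n) / FS
    return np.sin(2 * np.pi * freq_hz * t) * np.exp(-30 * t) \
        + 0.05 * rng.standard_normal(n)

def band_energies(signal, n_bands=32):
    """Coarse spectral features: energy summed over equal-width FFT bands."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    feats = np.array([b.sum() for b in np.array_split(spectrum, n_bands)])
    return feats / feats.sum()  # normalize so tap loudness doesn't matter

# "Train": average the features of a few simulated taps per location
centroids = {
    loc: np.mean([band_energies(simulate_tap(f)) for _ in range(10)], axis=0)
    for loc, f in LOCATION_FREQS.items()
}

def classify(signal):
    """Assign a tap to the location whose stored spectral profile is closest."""
    feats = band_energies(signal)
    return min(centroids, key=lambda loc: np.linalg.norm(feats - centroids[loc]))

print(classify(simulate_tap(LOCATION_FREQS["forearm"])))  # prints "forearm"
```

The real system classifies among many more candidate features and was trained per user, but the core move is the same: different body locations color a tap's vibration differently, and a classifier learns those signatures.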
No word yet on when Skinput might actually be integrated into electronic devices, but we wonder whether the technology could one day eliminate, or at least cut down on, the use of actual touchscreens in mobile devices. Check out the video below.