TapType: AI-assisted hand tracking using only accelerometers

The team at the Sensing, Interaction & Perception Lab at ETH Zurich, Switzerland, has created TapType, an interesting text input method that relies exclusively on a pair of wrist-worn devices that sense acceleration as the wearer taps on any old surface. By feeding the acceleration values from the pair of sensors on each wrist into a neural network classifier using Bayesian inference, which in turn feeds a traditional probabilistic language model (predictive text, to you and me), the resulting text can be entered at up to 19 WPM with a mean error rate of 0.6%. Expert TapTypers report speeds of up to 25 WPM, which could be quite usable.
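The paper has the details of the actual decoder, but the general idea of fusing a per-tap classifier with a predictive language model can be sketched in a few lines. Everything below is illustrative: the finger-to-key zones, the toy bigram probabilities, and the function names are our own assumptions, not TapType's implementation.

```python
import math

# Toy per-finger keyboard zones (an assumption; the real mapping is learned).
FINGER_KEYS = {
    "L_index": "rtfgvb",
    "L_middle": "edc",
    "R_index": "yuhjnm",
    "R_middle": "ik",
}

# Toy bigram language model P(next_char | prev_char), with a uniform fallback.
BIGRAM = {
    ("t", "h"): 0.4,
    ("h", "e"): 0.5,
}

def lm_prob(prev, ch):
    return BIGRAM.get((prev, ch), 0.01)

def decode_tap(finger_probs, prev_char):
    """Fuse classifier output P(finger | accel) with the language model
    P(char | prev_char) and return the most probable character."""
    best_char, best_score = None, -math.inf
    for finger, p_finger in finger_probs.items():
        for ch in FINGER_KEYS.get(finger, ""):
            # Bayes-style fusion in log space: classifier evidence times
            # LM prior, with mass split evenly among a finger's keys.
            score = (math.log(p_finger / len(FINGER_KEYS[finger]))
                     + math.log(lm_prob(prev_char, ch)))
            if score > best_score:
                best_char, best_score = ch, score
    return best_char

# A tap the classifier thinks is probably the right index finger,
# arriving after a "t": the bigram prior pulls the decision toward "h".
print(decode_tap({"R_index": 0.7, "R_middle": 0.3}, "t"))  # → h
```

Note how the language model does the heavy lifting: the accelerometers can only narrow a tap down to a finger, and the prior over likely character sequences resolves which key within that finger's zone was meant.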

The details are a bit sparse (it’s a research project, after all), but the actual hardware seems simple enough, being based on the Dialog DA14695, a nice Cortex-M33-based Bluetooth Low Energy SoC. This is an interesting device in its own right, containing a “sensor node controller” block capable of handling the sensor devices attached to its interfaces independently of the main CPU. The sensor used is the Bosch BMA456 3-axis accelerometer, notable for its low power consumption of a mere 150 μA.

The user can “type” on any convenient surface.

The bracelet units themselves appear to be a combination of a main PCB hosting the BLE chip and supporting circuitry, connected to a flexible PCB with a pair of accelerometer devices at each end. The assembly is then slipped into a flexible bracelet, likely 3D printed from TPU, but we’re only guessing there, as the progression from the initial development platform to the wearable prototype is unclear.

What is clear is that the bracelet itself is just a dumb data-streaming device; all the smart processing happens on the connected device. The system was trained (and the most accurate classification architecture subsequently selected) by recording volunteers “typing” on an A3-sized image of a keyboard, with finger motions tracked by a motion-capture camera while the acceleration streams from both wrists were recorded. There are a few more details in the published paper for those interested in digging a little deeper into this research.
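That training setup amounts to cutting labelled windows out of the raw acceleration streams: for each tap time reported by the motion-capture ground truth, take a short slice of samples around it and attach the finger label. A minimal sketch of that segmentation, with sample rates, window sizes, and names all our own assumptions:

```python
def make_training_windows(accel, sample_rate_hz, tap_events, half_win_s=0.05):
    """Cut labelled windows from an accelerometer stream.

    accel: list of (x, y, z) samples at sample_rate_hz.
    tap_events: list of (time_s, finger_label) pairs from the
    motion-capture ground truth.
    """
    half = int(half_win_s * sample_rate_hz)
    examples = []
    for t, finger in tap_events:
        centre = int(t * sample_rate_hz)
        if centre - half < 0 or centre + half > len(accel):
            continue  # drop taps too close to the edges of the recording
        window = accel[centre - half : centre + half]
        examples.append((window, finger))
    return examples

# One second of fake 200 Hz data and two mocap-labelled taps:
stream = [(0.0, 0.0, 1.0)] * 200
taps = [(0.25, "L_index"), (0.75, "R_middle")]
windows = make_training_windows(stream, 200, taps)
print(len(windows), len(windows[0][0]))  # → 2 20
```

Pairs like these (a 100 ms burst of wrist acceleration plus the finger that caused it) are exactly what a classifier needs to learn the tap-to-finger mapping without the camera being present at inference time.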

Eagle-eyed readers may remember something similar from last year from the same team, which fused bone-conduction sensing with VR hand tracking to generate input events within a VR environment.
