Google recently announced Project Jacquard. The project covers several research and development topics leading to the mass production of fabric sensors that can detect hand gestures. I think it's a defining moment in e-textile research: it has attracted the popular press and introduced e-textiles to a consumer audience that might not have been aware of the field before.
However. The announcement came via a video largely starring Ivan Poupyrev, the project’s founder. I don’t really think of project managers in Mountain View when I think about e-textiles. I think of the amazing work grown from a grassroots level that is shared with the community. Work that has historically had far more women involved than what is typically seen in tech spheres. Project Jacquard feels a little like an erasure of those contributions, so I wanted to honour them in my hack.
So, in 24 hours at the Barcelona Music Hack Day, I decided to make a soft sensor that detects hand gestures. This is Project Jane.
The Concept
The scarf is entirely fabric and thread, with the exception of a single hard circuit board in the centre. A micro USB cable plugs into the board, connecting it to a computer.
The wearer can choose the gestures they would like to use – touching the scarf in different areas, poking it with one finger, or any other hand position that feels natural. A piece of software on the computer observes these different positions and learns them.
The example application used at the Hack Day is a drum machine, but this is just a single use case. The wearer controls the drum machine playback by making the hand positions the computer has just learned.
Hopefully I'll be able to add a video of it in action later, but here's a still from my presentation and a video of the presentation (start at 1:11:14):
The Details
The sensor inside the scarf is 12 hand-stitched lines of conductive thread. They are sewn by hand only because there wasn't a working sewing machine available. The 12 lines create a 6 x 6 grid at the bottom of the scarf. This is the most sensitive part of the sensor, but the stitching all the way to the circuit board also acts as a sensor.
The 12 lines of conductive thread are sewn to the 12 electrodes of the Touch Board by Bare Conductive. The board reads the capacitance value of each electrode and outputs the values via serial. A Python script I wrote runs on the computer, reads that serial data, and outputs it as OSC messages. That's where the fun really starts.
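To give a sense of what that bridge looks like, here's a minimal sketch rather than the actual code in the repo: the serial port path, baud rate, and comma-separated line format are all assumptions, and it leans on the pyserial and python-osc libraries.

    # Minimal serial-to-OSC bridge sketch (assumed port, baud rate,
    # and data format -- the real script lives in the GitHub repo).
    import serial
    from pythonosc.udp_client import SimpleUDPClient

    SERIAL_PORT = "/dev/ttyACM0"  # hypothetical device path for the Touch Board
    BAUD_RATE = 57600             # assumed baud rate

    # Wekinator listens for input features on port 6448 by default.
    client = SimpleUDPClient("127.0.0.1", 6448)
    board = serial.Serial(SERIAL_PORT, BAUD_RATE)

    while True:
        # Assume the board prints one comma-separated line of 12 readings.
        line = board.readline().decode("ascii", errors="ignore").strip()
        try:
            values = [float(v) for v in line.split(",")]
        except ValueError:
            continue  # skip non-numeric lines (e.g. startup messages)
        if len(values) == 12:
            # Wekinator expects a list of floats on /wek/inputs.
            client.send_message("/wek/inputs", values)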
The OSC messages are sent to the Wekinator, an amazing tool for quickly adding machine learning to any data stream. The software listens to the sensor data and trains a model to classify the hand positions. After training, it runs a real-time classifier and outputs an OSC message indicating which hand position was detected. That output can be consumed by any application that understands OSC. I used a demo drum machine that Wekinator provides.
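Listening for the classifier's output on the application side is just as lightweight. Here's a minimal sketch assuming Wekinator's defaults (it sends /wek/outputs to port 12000); trigger_drum is a hypothetical stand-in for whatever the receiving application actually does.

    # Sketch of an OSC listener for Wekinator's classifier output.
    # Port 12000 and /wek/outputs are Wekinator's defaults;
    # trigger_drum is a placeholder, not the demo drum machine's code.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def trigger_drum(address, class_id):
        # One class per learned hand position; map it to a drum pattern here.
        print(f"Detected hand position {int(class_id)}")

    dispatcher = Dispatcher()
    dispatcher.map("/wek/outputs", trigger_drum)

    server = BlockingOSCUDPServer(("127.0.0.1", 12000), dispatcher)
    server.serve_forever()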
I was able to prototype the sensor twice before sewing the final one. I’m definitely getting better at pacing a project through a hack day.
The sensor was sewn in between two layers of fabric and then quilted by hand to keep the internal sensor from folding back on itself and shorting.
The source code for the Python script that generates the OSC messages and its matching Touch Board Arduino sketch are available on GitHub.