Since I was young, I've had a subtle habit of tapping out the beat of a song with my front teeth. My right canines, and the rhythm of the bass drum, specifically. As I got interested in mechanical engineering, I thought it would be a fun idea to embed contact sensors in my mouth, so that I could actually make drum sounds with this tic. My dad's an orthodontist, so this isn't that far out.
But software is eating hardware. Instead of embedding sensors in the mouth, why not use the iPhone X's amazing face tracking camera and software to achieve the same effect? Above is my proof of concept of this technology. I'm working on mapping the MIDI instruments to discrete facial gestures: squint for the snare, blink for the hi-hat, and mouth "pop" for the kick.
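The core of that mapping is simple: read the face-tracking coefficients each frame, and fire a drum note when a gesture crosses a threshold. Here's a rough sketch of that logic in Python. The blend-shape names match ARKit's `ARFaceAnchor` coefficients, but the thresholds, `GESTURE_MAP`, and `DrumTrigger` are my own illustrative stand-ins, not a real API.

```python
# Map a blend-shape name to a (threshold, MIDI note) pair.
# General MIDI percussion: 36 = kick, 38 = snare, 42 = closed hi-hat.
# Thresholds here are guesses; they'd need tuning against real face data.
GESTURE_MAP = {
    "eyeSquintLeft": (0.6, 38),   # squint -> snare
    "eyeBlinkLeft":  (0.8, 42),   # blink  -> hi-hat
    "mouthPucker":   (0.7, 36),   # mouth "pop" -> kick
}

class DrumTrigger:
    """Fires a MIDI note on the rising edge of each gesture,
    so holding a squint plays one snare hit, not one per frame."""
    def __init__(self, gesture_map):
        self.gesture_map = gesture_map
        self.active = {name: False for name in gesture_map}

    def update(self, coefficients):
        """coefficients: dict of blend-shape name -> value in 0..1.
        Returns the list of MIDI notes to play this frame."""
        notes = []
        for name, (threshold, note) in self.gesture_map.items():
            pressed = coefficients.get(name, 0.0) >= threshold
            if pressed and not self.active[name]:
                notes.append(note)   # rising edge: trigger once
            self.active[name] = pressed
        return notes
```

The rising-edge check is the important part: without it, a single blink would machine-gun the hi-hat on every frame the eye stays closed.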
I'd like to take this project even further. Are there other instruments worthy of face mapping? Could I use the iPhone X to write and play a song, just with my face? Could the tech be used by a physically impaired musician? I'd say all of the above.