
The urMus environment can be used to rapidly construct musical instruments. Sample tasks were designed and used in a human-subject usability study; in these examples, a musical keyboard and a multi-touch mixer instrument were built.

Tools: urMus, Lua, iOS
Publication pending
This project uses a Kinect depth sensor to augment a traditional keyboard instrument with a 3D gesture space, with a top-down projection providing visual feedback at the site of the gesture interaction. This novel interaction model enabled us to explore different visualizations.
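The core sensing step is to segment hands hovering above the keyboard plane out of the depth frame, so the projector can draw feedback under each hand. The project itself was built with Processing/OpenFrameworks and OpenCV; the Python sketch below is only an illustrative reconstruction of that step, and the thresholds, names, and synthetic test frame are assumptions, not the study's actual code.

```python
# Hypothetical sketch: isolate blobs between the keyboard surface and a
# hover band above it, then find each blob's centroid for projected feedback.
import numpy as np
import cv2

KEY_PLANE_MM = 900   # assumed depth of the keyboard surface (mm from sensor)
HOVER_BAND_MM = 250  # gestures are tracked within this band above the keys

def hand_centroids(depth_mm: np.ndarray) -> list[tuple[int, int]]:
    """Return (x, y) pixel centroids of blobs hovering above the keys."""
    mask = ((depth_mm > KEY_PLANE_MM - HOVER_BAND_MM) &
            (depth_mm < KEY_PLANE_MM)).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < 500:   # ignore noise specks
            continue
        m = cv2.moments(c)
        centroids.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return centroids

# Synthetic frame standing in for a Kinect capture: one "hand" 150 mm above the keys.
frame = np.full((480, 640), 2000, dtype=np.uint16)
frame[200:260, 300:380] = KEY_PLANE_MM - 150
print(hand_centroids(frame))   # -> one centroid near (339, 229)
```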
Tools: Kinect, Processing, OpenFrameworks, OpenCV
Evaluating Gesture-Augmented Piano Performance. Qi Yang and Georg Essl. Computer Music Journal (CMJ), 2014. PDF
Visual Associations in Augmented Keyboard Performance. Qi Yang and Georg Essl. NIME 2013.
Augmented Piano Performance using a Depth Camera. Qi Yang and Georg Essl. NIME 2012.
Visualizing the contact network between six dormitories, by hour of day
As part of the ExFlu study by the University of Michigan School of Public Health, I cleaned and analyzed multi-sensor data collected from 100 phones over three months, including Bluetooth and Wi-Fi contacts, accelerometer readings, and battery levels. Between-phone Bluetooth contact data were used to visualize social contact between study participants.
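A minimal sketch of that visualization step, assuming the Bluetooth proximity records have already been cleaned into (phone, phone, hour) tuples and each phone is mapped to its dormitory; the record format, sample data, and `dorm_of` mapping below are illustrative, not the study's schema.

```python
# Aggregate phone-to-phone Bluetooth contacts at a given hour into a
# dorm-to-dorm graph, weighting edges by contact count, then draw it.
import networkx as nx
import matplotlib.pyplot as plt

records = [("p01", "p07", 9), ("p01", "p12", 9), ("p07", "p12", 21)]
dorm_of = {"p01": "Dorm A", "p07": "Dorm B", "p12": "Dorm C"}

def dorm_graph(records, hour):
    """Build a weighted dorm-level contact graph for one hour of day."""
    g = nx.Graph()
    for a, b, h in records:
        if h != hour:
            continue
        da, db = dorm_of[a], dorm_of[b]
        w = g[da][db]["weight"] + 1 if g.has_edge(da, db) else 1
        g.add_edge(da, db, weight=w)
    return g

g = dorm_graph(records, hour=9)
pos = nx.circular_layout(g)
widths = [g[u][v]["weight"] for u, v in g.edges]
nx.draw(g, pos, with_labels=True, width=widths, node_color="lightsteelblue")
plt.title("Dorm contact network, 09:00")
plt.savefig("contacts_h09.png")
```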
I also coordinated the collection of GPS positions of local Wi-Fi access points. These data enabled me to localize phones on campus and visualize activity hot spots.
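One simple way to turn those surveyed access-point positions into a heat map is to estimate each phone fix as the signal-weighted centroid of the APs it can see; the sketch below illustrates that idea, and the AP names, RSSI values, and weighting scheme are assumptions, not the study's actual method.

```python
# Estimate phone positions from Wi-Fi scans against surveyed AP locations,
# then render the fixes as a campus activity heat map.
import matplotlib.pyplot as plt

ap_position = {"AP-lib": (42.2780, -83.7382), "AP-quad": (42.2766, -83.7344)}

def estimate_position(scan):
    """scan: list of (ap_id, rssi_dbm). Return the lat/lon centroid of
    visible APs, weighted by linearized signal strength."""
    lat = lon = total = 0.0
    for ap, rssi in scan:
        if ap not in ap_position:
            continue
        w = 10 ** (rssi / 10)          # convert dBm to a linear weight
        a_lat, a_lon = ap_position[ap]
        lat += w * a_lat
        lon += w * a_lon
        total += w
    return (lat / total, lon / total) if total else None

fixes = [estimate_position([("AP-lib", -50), ("AP-quad", -70)])]
lats, lons = zip(*[f for f in fixes if f])
plt.hexbin(lons, lats, gridsize=50, cmap="hot")   # density of localized fixes
plt.savefig("campus_heatmap.png")
```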
Tools: Python, MS SQL, KML, matplotlib
I developed websites as part of the web development team of Harvest Mission Community Church. I am also leading the upcoming redesign of the nonprofit organization's web presence.
Tools: PHP, HTML+CSS, Sketch
In collaboration with Sang Won Lee, I designed the user interface and product concepts for a web-based writing application and a corresponding mobile app that support timed playback of the writing process and enable rich text-based expression. The companion mobile app can capture nuanced typing gestures to enrich texting-like communication.
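Timed playback boils down to logging every edit with a timestamp and replaying the document states at the writer's original pace. The sketch below shows one plausible data structure for this; the event format and speed control are illustrative assumptions, not the product's actual implementation.

```python
# Log timestamped edits, then replay successive document states with the
# original pacing (optionally scaled by a speed factor).
import time
from dataclasses import dataclass

@dataclass
class Edit:
    t: float      # seconds since the session started
    pos: int      # caret position of the edit
    insert: str   # text inserted ("" for a pure deletion)
    delete: int   # number of characters removed at pos

def replay(edits, speed=1.0):
    """Yield document states, sleeping to reproduce the writer's timing."""
    doc, last_t = "", 0.0
    for e in sorted(edits, key=lambda e: e.t):
        time.sleep(max(0.0, (e.t - last_t) / speed))
        doc = doc[:e.pos] + e.insert + doc[e.pos + e.delete:]
        last_t = e.t
        yield doc

session = [Edit(0.0, 0, "Hello", 0), Edit(0.8, 5, " world", 0),
           Edit(1.5, 5, "", 6), Edit(2.0, 5, ", there", 0)]
for state in replay(session, speed=2.0):
    print(state)   # "Hello" -> "Hello world" -> "Hello" -> "Hello, there"
```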


