Shaking has become a widely integrated gesture on smartphones, but we could imagine a mobile device recognizing a much wider set of gestures. Which gestures can be recognized using a standard smartphone's accelerometer and gyroscope? How can the sensor data be visualized to communicate gesture characteristics to the user, such as the speed, force, and pattern of the current gesture compared with previous gestures? This project will include iterative design and usability evaluation.
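As a rough illustration of the kind of sensor processing involved, the sketch below detects a shake by counting acceleration-magnitude peaks above a threshold. The function name `detect_shake` and the threshold and peak-count values are illustrative assumptions, not part of any real smartphone API; a real implementation would read a stream of timestamped samples from the platform's sensor framework.

```python
import math

def detect_shake(samples, threshold=2.5, min_peaks=3):
    """Count acceleration-magnitude peaks above a threshold (in g).

    A burst of several peaks is treated as a shake. The values of
    `threshold` and `min_peaks` here are illustrative, not tuned.
    """
    peaks = sum(
        1 for (x, y, z) in samples
        if math.sqrt(x * x + y * y + z * z) > threshold
    )
    return peaks >= min_peaks

# Synthetic accelerometer samples: resting (~1 g, gravity only),
# then three vigorous spikes as the phone is shaken.
rest = [(0.0, 0.0, 1.0)] * 5
spikes = [(2.0, 1.5, 1.0), (-2.2, 1.8, 0.9), (1.9, -2.1, 1.1)]
print(detect_shake(rest))            # no shake while at rest
print(detect_shake(rest + spikes))   # shake detected
```

Richer gestures (flicks, twists, taps) would need more than a magnitude threshold, e.g. combining gyroscope rotation rates with the accelerometer signal over a sliding time window, which is part of what the project would explore.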
Lab allocations have not been finalised.