
Research at St Andrews

WatchMI: pressure touch, twist and pan gesture input on unmodified smartwatches

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

DOI

Open Access permissions: Open

Author(s)

Hui Shyong Yeo, Juyoung Lee, Andrea Bianchi, Aaron John Quigley

School/Research organisations

Abstract

The screen size of a smartwatch provides limited space to enable expressive multi-touch input, resulting in a markedly difficult and limited experience. We present WatchMI: Watch Movement Input, which enhances touch interaction on a smartwatch to support continuous pressure touch, twist, pan gestures and their combinations. Our novel approach relies on software that analyzes, in real time, the data from a built-in Inertial Measurement Unit (IMU) in order to determine, with great accuracy and at different levels of granularity, the actions performed by the user, without requiring additional hardware or modification of the watch. We report the results of an evaluation of the system, and demonstrate that the three proposed input interfaces are accurate, noise-resistant, easy to use and can be deployed on a variety of smartwatches. We then showcase the potential of this work with seven different applications, including map navigation, an alarm clock, a music player, pan gesture recognition, text entry, a file explorer and controlling remote devices or a game character.
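To make the approach concrete, the sketch below shows one way the idea could be realized on an Android smartwatch: a gyroscope listener integrates angular velocity only while the screen is touched, treating rotation in the screen plane as twist and tilt as pan (the same tilt signal could serve as a coarse proxy for pressure). This is a hypothetical sketch under those assumptions, not the authors' implementation; the class name, thresholds and gesture mapping are illustrative.

```java
// Hypothetical sketch, not the authors' implementation: a gyroscope listener that
// integrates angular velocity only while the screen is touched, reading rotation in
// the screen plane as twist and tilt as pan. Thresholds and mappings are illustrative.
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.view.MotionEvent;

public class WatchGestureActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;
    private boolean touching = false;     // true while a finger is on the screen
    private float tiltX = 0f, tiltY = 0f; // integrated tilt about the x/y axes (rad)
    private float twist = 0f;             // integrated rotation about the z axis (rad)
    private long lastTimestamp = 0L;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Gate the IMU integration on touch, so ordinary wrist motion is ignored.
        int action = event.getActionMasked();
        touching = action != MotionEvent.ACTION_UP && action != MotionEvent.ACTION_CANCEL;
        if (!touching) { tiltX = tiltY = twist = 0f; } // reset on release
        return true;
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (touching && lastTimestamp != 0L) {
            float dt = (event.timestamp - lastTimestamp) * 1e-9f; // ns -> s
            tiltX += event.values[0] * dt;
            tiltY += event.values[1] * dt;
            twist += event.values[2] * dt;

            // Illustrative thresholds (radians); a real system would need filtering
            // and per-device calibration.
            if (Math.abs(twist) > 0.3f) {
                onTwist(twist);              // e.g. rotate a map, adjust volume
            } else if (Math.abs(tiltX) > 0.2f || Math.abs(tiltY) > 0.2f) {
                onPan(tiltX, tiltY);         // e.g. continuous scrolling
            }
        }
        lastTimestamp = event.timestamp;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void onTwist(float angle) { /* application-specific handling */ }
    private void onPan(float dx, float dy) { /* application-specific handling */ }
}
```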

Details

Original language: English
Title of host publication: Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services
Subtitle of host publication: MobileHCI '16
Publisher: ACM
Pages: 394-399
ISBN (Electronic): 978-1-4503-4408-1
DOIs
State: Published - 7 Sep 2016
Event: 18th International Conference on Human-Computer Interaction with Mobile Devices and Services - Florence, Italy

Conference

Conference: 18th International Conference on Human-Computer Interaction with Mobile Devices and Services
Abbreviated title: MobileHCI 2016
Country: Italy
City: Florence
Period: 6/09/16 - 9/09/16
Internet address

Research areas

  • Wearable, Mobile, Interaction, HCI, Novel interaction


Related by author

  1. Workshop on object recognition for input and mobile interaction

    Yeo, H. S., Laput, G., Gillian, N. & Quigley, A. J. 16 Mar 2017 Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct. ACM

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. TiTAN: exploring midair text entry using freehand input

    Yeo, H. S., Phang, X-S., Ha, T., Woo, W. & Quigley, A. J. 6 May 2017 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York: ACM, p. 3041-3049

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Investigating tilt-based gesture keyboard entry for single-handed text entry on large devices

    Yeo, H. S., Phang, X-S., Castellucci, S. J., Kristensson, P. O. & Quigley, A. J. 2 May 2017 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York: ACM, p. 4194-4202

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. Out of sight: a toolkit for tracking occluded human joint positions

    Wu, C-J., Quigley, A. J. & Harris-Birtill, D. C. C. Feb 2017 In: Personal and Ubiquitous Computing, 21(1), p. 125-135

    Research output: Contribution to journal › Article

  5. RadarCat: Radar Categorization for input & interaction

    Yeo, H. S., Flamich, G., Schrempf, P., Harris-Birtill, D. C. C. & Quigley, A. J. 16 Oct 2016 Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, p. 833-841

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 245646463