
Research at St Andrews

Sidetap & slingshot gestures on unmodified smartwatches

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

Hui Shyong Yeo, Juyoung Lee, Andrea Bianchi, Aaron John Quigley

Abstract

We present a technique for detecting gestures on the edge of an unmodified smartwatch. We demonstrate two exemplary gestures: i) Sidetap, tapping on any side of the watch, and ii) Slingshot, pressing on the edge and then releasing quickly. Our technique is lightweight, relying solely on data from the internal inertial measurement unit (IMU). These two gestures expand the input expressiveness of a smartwatch, allowing users to perform intuitive gestures with natural tactile feedback, e.g., rapidly navigating a long list of items with a tap, or triggering shortcut commands to launch applications. They also enable eyes-free or subtle interaction when visual attention is unavailable.
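
As an illustration of the kind of IMU-only detection the abstract describes, the following is a minimal, hypothetical Python sketch. The thresholds, window length, synthetic trace, and the simple spike-versus-press heuristic are assumptions made for illustration only; they are not the authors' implementation.

    # Hypothetical sketch (not the authors' implementation): classifying Sidetap
    # versus Slingshot from a stream of lateral accelerometer samples, with
    # gravity already removed. All thresholds and window lengths are illustrative
    # assumptions.

    TAP_SPIKE = 2.0      # m/s^2: a short, sharp impulse suggests a Sidetap
    PRESS_LEVEL = 0.6    # m/s^2: a sustained offset suggests the edge is pressed
    PRESS_SAMPLES = 25   # samples the offset must persist to count as a press
    RELEASE_SPIKE = 1.5  # m/s^2: a fast snap-back after a press suggests a Slingshot

    def detect(lateral_accel):
        """Yield 'sidetap' and 'slingshot' events from lateral acceleration samples."""
        press_count = 0
        for a in lateral_accel:
            mag = abs(a)
            if press_count >= PRESS_SAMPLES:
                # The edge has been held for a while; a fast return spike means
                # the press was released quickly, i.e. a Slingshot.
                if mag > RELEASE_SPIKE:
                    yield "slingshot"
                    press_count = 0
                elif mag < PRESS_LEVEL:
                    press_count = 0   # slow release: no gesture
            elif mag > TAP_SPIKE:
                yield "sidetap"       # sharp impulse without a preceding press
                press_count = 0
            elif mag > PRESS_LEVEL:
                press_count += 1      # possible start of an edge press
            else:
                press_count = 0

    # Synthetic trace: one tap spike, then a 30-sample press and a quick release.
    trace = [0.0] * 10 + [2.5] + [0.0] * 10 + [0.8] * 30 + [1.8] + [0.0] * 10
    print(list(detect(trace)))  # -> ['sidetap', 'slingshot']

In practice the paper's approach would operate on the full IMU signal rather than a single pre-cleaned axis; this sketch only conveys the idea that a sharp impulse and a sustained press followed by a rapid release produce distinguishable accelerometer signatures.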

Details

Original language: English
Title of host publication: Adjunct Proceedings of the 29th Annual Symposium on User Interface Software and Technology
Place of publication: New York
Publisher: ACM
Pages: 189-190
ISBN (Print): 9781450345316
DOIs
Publication status: Published - 16 Oct 2016
Event: 29th ACM User Interface Software and Technology Symposium, Hitotsubashi Hall, National Center of Sciences Building, Tokyo, Japan
Duration: 16 Oct 2016 - 19 Oct 2016
Conference number: 29
http://uist.acm.org/uist2016/

Conference

Conference: 29th ACM User Interface Software and Technology Symposium
Abbreviated title: UIST
Country: Japan
City: Tokyo
Period: 16/10/16 - 19/10/16
Internet address: http://uist.acm.org/uist2016/


Related by author

  1. Breathin: a breath pattern sensing approach for user computer interaction

    Hundia, R. & Quigley, A., 2 Dec 2019, OZCHI'19: Proceedings of the 31st Australian Conference on Human-Computer-Interaction. New York: ACM, p. 581-584

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. SnapChart: an augmented reality analytics toolkit to enhance interactivity in a collaborative environment

    Jing, A., Xiang, C., Kim, S., Billinghurst, M. & Quigley, A., 14 Nov 2019, Proceedings - VRCAI 2019: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. Spencer, S. N. (ed.). New York: Association for Computing Machinery, Inc., p. 55, 2 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Making waves and combining strands at CHI 2021

    Sari, E., Quigley, A. & Kitamura, Y., Nov 2019, In: Interactions. 26, 6, p. 84-85, 2 p.

    Research output: Contribution to journal › Comment/debate

  4. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Special issue on highlights of ACM intelligent user interface (IUI) 2018

    Billinghurst, M., Burnett, M. & Quigley, A., 1 Oct 2019, In: ACM Transactions on Interactive Intelligent Systems. 10, 1, 1.

    Research output: Contribution to journal › Editorial

ID: 245678878
