Research at St Andrews

Investigating tilt-based gesture keyboard entry for single-handed text entry on large devices

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

Hui Shyong Yeo, Xiao-Shen Phang, Steven J. Castellucci, Per Ola Kristensson, Aaron John Quigley

Abstract

The popularity of mobile devices with large screens is making single-handed interaction difficult. We propose and evaluate a novel design point around a tilt-based text entry technique that supports single-handed usage. Our technique is based on the gesture keyboard (shape writing). However, instead of drawing gestures with a finger or stylus, users articulate a gesture by tilting the device. This can be especially useful when the user’s other hand is otherwise encumbered or unavailable. We show that novice users achieve an entry rate of 15 words per minute (wpm) after minimal practice. A pilot longitudinal study reveals that a single participant achieved an entry rate of 32 wpm after approximately 90 minutes of practice. Our data indicate that tilt-based gesture keyboard entry enables walk-up use, provides a suitable text entry rate for occasional use, and can act as a promising alternative to single-handed typing in certain situations.
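
The technique described in the abstract replaces finger-drawn gestures with device tilt: the phone's orientation steers a cursor over the on-screen keyboard, and the resulting trace is decoded by a shape-writing engine. The Kotlin sketch below is purely illustrative and is not taken from the paper; it assumes an Android rotation-vector sensor and a simple position-control mapping from pitch and roll to cursor coordinates. The class name, the onTrace callback, and the 30-degree gain are hypothetical choices.

    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager

    // Illustrative sketch only: steer a keyboard cursor with device tilt.
    // The trace produced here would be handed to a gesture keyboard
    // (shape writing) decoder; the decoder itself is not shown.
    // Assumes the device is held roughly flat; a real implementation
    // would calibrate against the user's neutral pose.
    class TiltCursor(
        private val sensorManager: SensorManager,
        private val keyboardWidthPx: Float,
        private val keyboardHeightPx: Float,
        private val onTrace: (x: Float, y: Float) -> Unit  // hypothetical callback into the decoder
    ) : SensorEventListener {

        // Assumed gain: +/-30 degrees of tilt spans the whole keyboard.
        private val maxTiltRad = Math.toRadians(30.0).toFloat()

        private val rotationMatrix = FloatArray(9)
        private val orientationAngles = FloatArray(3)

        fun start() {
            val sensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR) ?: return
            sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_GAME)
        }

        fun stop() = sensorManager.unregisterListener(this)

        override fun onSensorChanged(event: SensorEvent) {
            if (event.sensor.type != Sensor.TYPE_ROTATION_VECTOR) return
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
            SensorManager.getOrientation(rotationMatrix, orientationAngles)
            val pitch = orientationAngles[1]  // forward/backward tilt -> vertical motion
            val roll = orientationAngles[2]   // left/right tilt -> horizontal motion

            // Clamp tilt to the assumed range, normalise to [0, 1], scale to keyboard pixels.
            val nx = ((roll / maxTiltRad).coerceIn(-1f, 1f) + 1f) / 2f
            val ny = ((pitch / maxTiltRad).coerceIn(-1f, 1f) + 1f) / 2f
            onTrace(nx * keyboardWidthPx, ny * keyboardHeightPx)
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
    }

An absolute position-control mapping like this is only one option; rate control, non-linear gain, or filtering of the sensor signal would fit the same pipeline.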

Details

Original language: English
Title of host publication: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
Place of Publication: New York
Publisher: ACM
Pages: 4194-4202
ISBN (Print): 9781450346559
DOIs
Publication status: Published - 2 May 2017
Event: ACM CHI 2017 Conference on Human Factors in Computing Systems - Colorado Convention Center, Denver, United States
Duration: 6 May 2017 - 11 May 2017
https://chi2017.acm.org/

Conference

Conference: ACM CHI 2017 Conference on Human Factors in Computing Systems
Abbreviated title: CHI
Country: United States
City: Denver
Period: 6/05/17 - 11/05/17
Internet address: https://chi2017.acm.org/

Research areas

  • Text entry, Single-handed, Tilt, Shape writing, Gesture keyboard, Phablets


Related by author

  1. Breathin: a breath pattern sensing approach for user computer interaction

    Hundia, R. & Quigley, A., 2 Dec 2019, OZCHI'19: Proceedings of the 31st Australian Conference on Human-Computer-Interaction. New York: ACM, p. 581-584

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. SnapChart: an augmented reality analytics toolkit to enhance interactivity in a collaborative environment

    Jing, A., Xiang, C., Kim, S., Billinghurst, M. & Quigley, A., 14 Nov 2019, Proceedings - VRCAI 2019: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. Spencer, S. N. (ed.). New York: Association for Computing Machinery, Inc, 2 p. 55

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Making waves and combining strands at CHI 2021

    Sari, E., Quigley, A. & Kitamura, Y., Nov 2019, In: Interactions, 26, 6, p. 84-85, 2 p.

    Research output: Contribution to journal › Comment/debate

  4. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Special issue on highlights of ACM intelligent user interface (IUI) 2018

    Billinghurst, M., Burnett, M. & Quigley, A., 1 Oct 2019, In: ACM Transactions on Interactive Intelligent Systems, 10, 1, 1.

    Research output: Contribution to journal › Editorial

ID: 248900588
