
Research at St Andrews

SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

Hyung-il Kim, Juyoung Lee, Hui Shyong Yeo, Aaron John Quigley, Woontack Woo

Abstract

In this demonstration, we will show a prototype system that uses a sensor fusion approach to robustly track the six degrees of freedom of hand movement and to support intuitive hand-gesture interaction and 3D object manipulation for Mixed Reality head-mounted displays. Robust tracking of the hand and fingers with an egocentric camera remains a challenging problem, especially under self-occlusion, for example when the user tries to grab a virtual object in mid-air by closing the palm. Our approach uses a common smart watch worn on the wrist to provide more reliable palm and wrist orientation data, and fuses this data with the camera to achieve robust hand motion and orientation for interaction.
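
The paper itself does not give implementation details on this page, so the sketch below is only an illustrative, hypothetical reading of the idea the abstract describes: palm position taken from the egocentric camera, with orientation switched over to the smart watch IMU when camera confidence drops under self-occlusion. The class names, fields, and threshold are assumptions for illustration, not the authors' implementation.

    from dataclasses import dataclass

    @dataclass
    class CameraSample:
        """One egocentric-camera hand-tracking sample (hypothetical format)."""
        position: tuple       # palm position (x, y, z) in metres, HMD frame
        orientation: tuple    # palm orientation quaternion (w, x, y, z)
        confidence: float     # tracker confidence in [0, 1]

    @dataclass
    class WatchSample:
        """One smart-watch IMU sample (hypothetical format)."""
        orientation: tuple    # wrist orientation quaternion (w, x, y, z)

    def fuse_hand_pose(cam, watch, conf_threshold=0.5):
        """Confidence-gated fusion: position always comes from the camera
        (the watch IMU cannot observe translation); orientation falls back
        to the watch whenever camera confidence drops, e.g. during a
        closed-palm grab that self-occludes the fingers."""
        if cam.confidence >= conf_threshold:
            orientation = cam.orientation    # camera sees the hand clearly
        else:
            orientation = watch.orientation  # occluded: trust the wrist IMU
        return {"position": cam.position, "orientation": orientation}

    # Example: camera confidence collapses during a mid-air grab.
    cam = CameraSample(position=(0.10, -0.05, 0.40),
                       orientation=(1.0, 0.0, 0.0, 0.0),
                       confidence=0.2)
    watch = WatchSample(orientation=(0.92, 0.0, 0.38, 0.0))
    print(fuse_hand_pose(cam, watch))

A production system would likely blend the two orientation estimates continuously (e.g. a complementary or Kalman filter) rather than hard-switching, but the gating above captures the core role the watch plays when the camera cannot see the hand.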

Details

Original language: English
Title of host publication: Adjunct Proceedings - 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 428-429
Number of pages: 2
ISBN (Electronic): 9781538675922
DOIs
Publication status: Published - 25 Apr 2019
Event: 17th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018 - Munich, Germany
Duration: 16 Oct 2018 - 20 Oct 2018
https://www.ismar2018.org/

Conference

Conference: 17th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018
Country: Germany
City: Munich
Period: 16/10/18 - 20/10/18
Internet address: https://www.ismar2018.org/

Research areas

  • Augmented reality, Wearable computing, 3D user interfaces, Hand interaction, Virtual 3D object manipulation


Related by author

  1. Deb8: a tool for collaborative analysis of video

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. & Quigley, A. J., 4 Jun 2019, Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video (TVX '19). ACM, p. 47-58

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. Walkers—encoding multivariate data into human motion sequences

    Carson, I., Hinrichs, U. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

  3. Deb8: collaborative fact checking

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. G. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

  4. RotoSwype: word-gesture typing using a ring

    Gupta, A., Ji, C., Yeo, H. S., Quigley, A. J. & Vogel, D., 2 May 2019, Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19). New York: ACM, 12 p. 14

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Augmented sports for learning using wearable head-worn and wrist-worn devices

    Yeo, H. S., Koike, H. & Quigley, A. J., 24 Mar 2019, (Accepted/In press) The First IEEE VR Workshop on Human Augmentation and Its Applications. IEEE Computer Society

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 257369125