
Automated data gathering and training tool for personalized "Itchy Nose"

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

Juyoung Lee, Hui Shyong Yeo, Thad Starner, Aaron John Quigley, Kai Kunze, Woontack Woo


Abstract

In "Itchy Nose" we proposed a sensing technique for detecting finger movements on the nose for supporting subtle and discreet interaction. It uses the electrooculography sensors embedded in the frame of a pair of eyeglasses for data gathering and uses machine-learning technique to classify different gestures. Here we further propose an automated training and visualization tool for its classifier. This tool guides the user to make the gesture in proper timing and records the sensor data. It automatically picks the ground truth and trains a machine-learning classifier with it. With this tool, we can quickly create trained classifier that is personalized for the user and test various gestures.
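The abstract describes a cue-driven loop: prompt the user, record the EOG stream, derive ground-truth labels from the cue timing, and train a personalized classifier. The sketch below is a minimal illustration of that idea only, not the authors' implementation; the sampling rate, window length, features, and choice of an SVM are all assumptions.

```python
# Hypothetical sketch of cue-driven auto-labeling and training.
# All constants and the feature set are assumptions, not from the paper.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 100          # assumed EOG sampling rate (Hz)
WINDOW = FS       # one 1-second window per cued gesture

def extract_features(window):
    """Simple per-channel statistics over one EOG window (illustrative)."""
    return np.concatenate([window.mean(axis=0),
                           window.std(axis=0),
                           np.ptp(window, axis=0)])

def auto_label(stream, cue_times, labels):
    """Cut a fixed window at each visual cue; the cue timestamp stands in
    for manually picked ground truth."""
    X, y = [], []
    for t, label in zip(cue_times, labels):
        start = int(t * FS)
        window = stream[start:start + WINDOW]
        if len(window) == WINDOW:
            X.append(extract_features(window))
            y.append(label)
    return np.array(X), np.array(y)

# stream: (n_samples, n_channels) EOG recording made while the tool cues
# the user; cue_times and gestures come from the tool's prompt schedule.
stream = np.random.randn(60 * FS, 2)             # placeholder recording
cue_times = np.arange(1, 55, 3, dtype=float)     # one cue every 3 s
gestures = (["flick", "push", "rub"] * 6)[:len(cue_times)]

X, y = auto_label(stream, cue_times, gestures)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)                                    # personalized model
```

Because the tool controls when each gesture is prompted, the cue schedule itself supplies the labels, which is what lets a new user train a personalized classifier in one short session without manual annotation.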

Details

Original language: English
Title of host publication: AH '18 Proceedings of the 9th Augmented Human International Conference
Place of publication: New York
Publisher: ACM
Number of pages: 3
ISBN (Electronic): 9781450354158
DOIs:
Publication status: Published - 7 Feb 2018
Event: 9th Augmented Human International Conference - Seoul, Korea, Republic of
Duration: 7 Feb 2018 – 9 Feb 2018
Conference number: 9
http://www.sigah.org/AH2018/

Conference

Conference: 9th Augmented Human International Conference
Abbreviated title: AH '18
Country: Korea, Republic of
City: Seoul
Period: 7/02/18 – 9/02/18
Internet address: http://www.sigah.org/AH2018/

Research areas

  • Nose gesture, Subtle interaction, EOG, Wearable computer, Smart eyeglasses, Smart eyewear, Training tool, Online classification

Related by author

  1. Deb8: a tool for collaborative analysis of video

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. & Quigley, A. J., 4 Jun 2019, Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video (TVX '19). ACM, p. 47-58

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. Walkers—encoding multivariate data into human motion sequences

    Carson, I., Hinrichs, U. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

  3. Deb8: collaborative fact checking

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. G. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

  4. RotoSwype: word-gesture typing using a ring

    Gupta, A., Ji, C., Yeo, H. S., Quigley, A. J. & Vogel, D., 2 May 2019, Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19). New York: ACM, Paper 14, 12 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays

    Kim, H., Lee, J., Yeo, H. S., Quigley, A. J. & Woo, W., 25 Apr 2019, Adjunct Proceedings - 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018. Institute of Electrical and Electronics Engineers Inc., p. 428-429, 2 p., article 8699201

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 252305862