Research at St Andrews

Automated data gathering and training tool for personalized "Itchy Nose"

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

Juyoung Lee, Hui Shyong Yeo, Thad Starner, Aaron John Quigley, Kai Kunze, Woontack Woo

Abstract

In "Itchy Nose" we proposed a sensing technique for detecting finger movements on the nose for supporting subtle and discreet interaction. It uses the electrooculography sensors embedded in the frame of a pair of eyeglasses for data gathering and uses machine-learning technique to classify different gestures. Here we further propose an automated training and visualization tool for its classifier. This tool guides the user to make the gesture in proper timing and records the sensor data. It automatically picks the ground truth and trains a machine-learning classifier with it. With this tool, we can quickly create trained classifier that is personalized for the user and test various gestures.

Details

Original language: English
Title of host publication: AH '18 Proceedings of the 9th Augmented Human International Conference
Place of Publication: New York
Publisher: ACM
Number of pages: 3
ISBN (Electronic): 9781450354158
State: Published - 7 Feb 2018
Event: 9th Augmented Human International Conference - Seoul, Korea, Republic of
Duration: 7 Feb 2018 - 9 Feb 2018
Conference number: 9
http://www.sigah.org/AH2018/

Conference

Conference: 9th Augmented Human International Conference
Abbreviated title: AH '18
Country: Korea, Republic of
City: Seoul
Period: 7/02/18 - 9/02/18
Internet address: http://www.sigah.org/AH2018/

Research areas

Nose gesture, Subtle interaction, EOG, Wearable computer, Smart eyeglasses, Smart eyewear, Training tool, Online classification

Related by author

  1. Exploring tangible interactions with radar sensing

    Yeo, H. S., Minami, R., Rodriguez, K., Shaker, G. & Quigley, A. J., 27 Dec 2018, In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(4), 25 p., 200.

    Research output: Contribution to journal › Article

  2. SWAG Demo: Smart Watch Assisted Gesture Interaction for Mixed Reality Head-mounted Displays

    Kim, H., Lee, J., Yeo, H. S., Quigley, A. J. & Woo, W., 16 Oct 2018, p. 428-429.

    Research output: Contribution to conference › Other

  3. Towards end-user development for chronic disease management

    Rough, D. J. & Quigley, A. J., 1 Oct 2018, Designing Technologies to Support Human Problem Solving: A Workshop in Conjunction with VL/HCC 2018 in Lisbon, Portugal, Oct. 1, 2018. IEEE Computer Society.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. End-user development in social psychology research: factors for adoption

    Rough, D. J. & Quigley, A. J., 1 Oct 2018, IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC 2018). Cunha, J., Fernandes, J. P., Kelleher, C., Engels, G. & Mendes, J. (eds.). IEEE, p. 75-83, 8506573.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Change blindness in proximity-aware mobile interfaces

    Brock, M. O., Quigley, A. J. & Kristensson, P. O., 21 Apr 2018, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). New York, NY: ACM, 7 p., 43.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 252305862