Research at St Andrews

BreathIn: a breath pattern sensing approach for user computer interaction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Rohan Hundia, Aaron Quigley

New interaction modalities in human-computer interaction often build on common sensory inputs such as touch, voice, gesture, or motion. However, these modalities are not inclusive of the entire population: they cannot be used by people with an impairment of the corresponding sensory channel. Here we propose BreathIn: an interface tool for interacting with computer applications using discreet exhalation patterns. The intent is that such patterns can be issued by anyone who can breathe. Our concept is based on detecting a user's forced exhalation patterns over a time window using a MEMS microphone placed below the user's nose. We break the signal down into its FFT components, identify the peak frequencies of forced voluntary "breath events", and use these in real time to distinguish "exhalation events" from noise. We show two major applications of such an interaction tool: a) controlling computer applications with breath, and b) using the breath interface as a discreet emergency signal for prospective victims of crime.
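The detection step described above can be sketched in a few lines: take the FFT of a short microphone frame and test whether the spectral energy concentrates in a band typical of forced exhalation. This is a minimal illustration, not the authors' implementation — the frequency band and energy-ratio threshold below are assumptions chosen for the example.

```python
import numpy as np

def detect_breath_event(frame, sample_rate, band=(200.0, 2000.0), energy_ratio=0.6):
    """Classify a short audio frame as a forced-exhalation event or noise.

    Illustrative sketch: the paper reports FFT peak-frequency analysis;
    the band limits and threshold here are assumed values, not the
    parameters used in the published system.
    """
    # Window the frame to reduce spectral leakage, then take the real FFT.
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)

    # Fraction of spectral energy falling inside the assumed exhalation band.
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum.sum() + 1e-12  # guard against an all-zero frame
    return bool(spectrum[in_band].sum() / total >= energy_ratio)

# Synthetic demo: an exhalation-like frame (energy at 500/900 Hz)
# versus low-frequency hum (50 Hz), in place of real microphone input.
sr = 16_000
t = np.arange(1024) / sr
exhale_like = np.sin(2 * np.pi * 500 * t) + 0.5 * np.sin(2 * np.pi * 900 * t)
hum = np.sin(2 * np.pi * 50 * t)
```

In a real-time system this function would run on successive microphone frames, with consecutive positive frames grouped into timed "breath events" as the abstract describes.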


Original language: English
Title of host publication: OZCHI'19
Subtitle of host publication: Proceedings of the 31st Australian Conference on Human-Computer-Interaction
Place of Publication: New York
ISBN (Electronic): 9781450376969
Publication status: Published - 2 Dec 2019
Event: OZCHI'19: 31st Australian Conference on Human-Computer Interaction - Perth/Freemantle, Australia
Duration: 2 Dec 2019 - 5 Dec 2019
Conference number: 31


Abbreviated title: OZCHI'19

    Research areas

  • Breath, BreathIn, Breath sensing, Exhale

Related by author

  1. SnapChart: an augmented reality analytics toolkit to enhance interactivity in a collaborative environment

    Jing, A., Xiang, C., Kim, S., Billinghurst, M. & Quigley, A., 14 Nov 2019, Proceedings - VRCAI 2019: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. Spencer, S. N. (ed.). New York: Association for Computing Machinery, Inc, 2 p. 55

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. Making waves and combining strands at CHI 2021

    Sari, E., Quigley, A. & Kitamura, Y., Nov 2019, In: Interactions. 26, 6, p. 84-85 2 p.

    Research output: Contribution to journal › Comment/debate

  3. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. Special issue on highlights of ACM intelligent user interface (IUI) 2018

    Billinghurst, M., Burnett, M. & Quigley, A., 1 Oct 2019, In: ACM Transactions on Interactive Intelligent Systems. 10, 1, 1.

    Research output: Contribution to journal › Editorial

  5. WRIST: Watch-Ring Interaction and Sensing Technique for wrist gestures and macro-micro pointing

    Yeo, H. S., Lee, J., Kim, H., Gupta, A., Bianchi, A., Vogel, D., Koike, H., Woo, W. & Quigley, A. J., 1 Oct 2019, Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '19). New York: ACM, 15 p. 19

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 265750444