Research at St Andrews

Gaze-based awareness in complex healthcare environments

Research output: Contribution to conference › Paper

Author(s)

Juan E. Garrido, Victor M R Penichet, María D. Lozano, P. O. Kristensson, Aaron Quigley

Abstract

Medical staff work in collaborative environments and require information about workmates, patients and resources, as well as data related to the completion of ongoing tasks. Healthcare systems provide a large quantity of information, and current applications usually involve the simultaneous use of many different devices. A system might monitor several patients, provide alerts and warnings, and supply information on pending tasks, among many other demanding workloads. It is therefore an open question whether a professional can attend to a rehabilitation process involving technology and still remain aware of all notifications provided by different devices. In this paper, gaze-based awareness is presented as a natural evolution, through current technology, of the common awareness concept. The key idea is to treat the user's gaze as fundamental to personalizing how the system subtly notifies users about changes on unattended screens. To this end, different levels of notification subtlety are considered based on where the user is looking together with the user's working conditions. We present a realization of gaze-based awareness in a real healthcare system named Ubi4health, in which this awareness was considered an essential element during development.
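The abstract describes the mechanism only at a high level: the system picks a subtlety level for each notification from where the user is looking and from the user's working conditions. As a minimal illustrative sketch of that idea in Python, assuming hypothetical subtlety levels and a hypothetical decision rule (the abstract does not specify Ubi4health's actual levels or policy):

```python
from enum import Enum


class Subtlety(Enum):
    # Hypothetical levels; the paper only states that "different
    # levels of subtlety" exist, not what they are.
    NONE = 0        # change is on the screen the user is watching
    AMBIENT = 1     # e.g. a soft peripheral colour change
    VISUAL_CUE = 2  # e.g. an icon or banner on the attended screen
    INTERRUPT = 3   # e.g. an audible alert for urgent events


def choose_subtlety(gazed_screen: str, event_screen: str,
                    urgency: int, busy: bool) -> Subtlety:
    """Pick how subtly to notify, given where the user is looking
    (gazed_screen), where the change occurred (event_screen), the
    event urgency (0-3), and whether the user is mid-task (busy).

    Illustrative rule only, not Ubi4health's actual policy.
    """
    if gazed_screen == event_screen:
        return Subtlety.NONE        # already in view, no extra cue
    if urgency >= 3:
        return Subtlety.INTERRUPT   # safety-critical: always interrupt
    if busy:
        return Subtlety.AMBIENT     # don't break an ongoing task
    return Subtlety.VISUAL_CUE      # otherwise, a visible but quiet cue


# Example: the user is focused on a rehabilitation exercise screen
# while a non-urgent change happens on the patient-monitor screen.
print(choose_subtlety("rehab_exercise", "patient_monitor",
                      urgency=1, busy=True))   # Subtlety.AMBIENT
```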


Details

Original language: English
Pages: 410-413
Number of pages: 4
Publication status: Published - 1 Jan 2014
Event: Workshop on REHAB 2014 - Oldenburg, Germany
Duration: 20 May 2014 → …

Conference

Conference: Workshop on REHAB 2014
Country: Germany
City: Oldenburg
Period: 20/05/14 → …

Research areas

  • Awareness, Collaboration, Context-awareness, Gaze-based, Healthcare


Related by author

  1. Breathin: a breath pattern sensing approach for user computer interaction

    Hundia, R. & Quigley, A., 2 Dec 2019, OZCHI'19: Proceedings of the 31st Australian Conference on Human-Computer-Interaction. New York: ACM, p. 581-584

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. SnapChart: an augmented reality analytics toolkit to enhance interactivity in a collaborative environment

    Jing, A., Xiang, C., Kim, S., Billinghurst, M. & Quigley, A., 14 Nov 2019, Proceedings - VRCAI 2019: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. Spencer, S. N. (ed.). New York: Association for Computing Machinery, Inc, 2 p. 55

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Making waves and combining strands at CHI 2021

    Sari, E., Quigley, A. & Kitamura, Y., Nov 2019, In : Interactions. 26, 6, p. 84-85 2 p.

    Research output: Contribution to journal › Comment/debate

  4. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Special issue on highlights of ACM intelligent user interface (IUI) 2018

    Billinghurst, M., Burnett, M. & Quigley, A., 1 Oct 2019, In : ACM Transactions on Interactive Intelligent Systems. 10, 1, 1.

    Research output: Contribution to journal › Editorial

ID: 255434187
