
Research at St Andrews

Subtle gaze-dependent techniques for visualising display changes in multi-display environments

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

Jakub Dostal, Per Ola Kristensson, Aaron John Quigley

Abstract

This paper explores techniques for visualising display changes in multi-display environments. We present four subtle gaze-dependent techniques for visualising change on unattended displays called FreezeFrame, PixMap, WindowMap and Aura. To enable the techniques to be directly deployed to workstations, we also present a system that automatically identifies the user's eyes using computer vision and a set of web cameras mounted on the displays. An evaluation confirms this system can detect which display the user is attending to with high accuracy. We studied the efficacy of the visualisation techniques in a five-day case study with a working professional. This individual used our system eight hours per day for five consecutive days. The results of the study show that the participant found the system and the techniques useful, subtle, calm and non-intrusive. We conclude by discussing the challenges in evaluating intelligent subtle interaction techniques using traditional experimental paradigms.
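The abstract describes a system that uses display-mounted web cameras and computer vision to work out which display the user is currently attending to. As a rough illustration only (not the authors' implementation, which is described in the paper), the Python sketch below uses OpenCV's stock Haar cascades to guess which display a user is facing; the camera device indices and the simple face-plus-eyes scoring rule are assumptions made for this example.

# Minimal sketch (not the paper's system): estimating which display the user
# is attending to, assuming one webcam per display and OpenCV's bundled Haar
# cascades for frontal-face and eye detection.
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
EYE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def attention_score(frame):
    """Score how frontally the user faces this camera: 0 if no face is
    detected, otherwise 1 for the face plus 1 per eye found inside it."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return 0
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
    eyes = EYE_CASCADE.detectMultiScale(gray[y:y + h, x:x + w])
    return 1 + len(eyes)

def attended_display(cameras):
    """Return the index of the display whose camera sees the user most
    frontally, or None if no camera detects a face."""
    best_idx, best_score = None, 0
    for idx, cam in enumerate(cameras):
        ok, frame = cam.read()
        if not ok:
            continue
        score = attention_score(frame)
        if score > best_score:
            best_idx, best_score = idx, score
    return best_idx

if __name__ == "__main__":
    # One cv2.VideoCapture per display-mounted webcam; device indices 0 and 1
    # are placeholders for illustration.
    cams = [cv2.VideoCapture(0), cv2.VideoCapture(1)]
    print("Attended display:", attended_display(cams))
    for cam in cams:
        cam.release()

The heuristic here is simply that the camera the user is looking towards sees a frontal face with both eyes visible; the system evaluated in the paper is more robust than this, so treat the sketch only as a conceptual starting point.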

Details

Original language: English
Title of host publication: Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013)
Publisher: ACM
Number of pages: 11
ISBN (Print): 978-1-4503-1965-2
Publication status: Accepted/In press - 2013

Related by author

  1. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. WRIST: Watch-Ring Interaction and Sensing Technique for wrist gestures and macro-micro pointing

    Yeo, H. S., Lee, J., Kim, H., Gupta, A., Bianchi, A., Vogel, D., Koike, H., Woo, W. & Quigley, A. J., 1 Oct 2019, Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '19). New York: ACM, 15 p. 19

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Augmented learning for sports using wearable head-worn and wrist-worn devices

    Yeo, H. S., Koike, H. & Quigley, A., 15 Aug 2019, 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc., p. 1578-1580 3 p. 8798054

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. Deb8: a tool for collaborative analysis of video

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. & Quigley, A. J., 4 Jun 2019, Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video (TVX '19). ACM, p. 47-58

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Walkers—encoding multivariate data into human motion sequences

    Carson, I., Hinrichs, U. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

ID: 43517251
