Research at St Andrews

Subtle gaze-dependent techniques for visualising display changes in multi-display environments

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

Jakub Dostal, Per Ola Kristensson, Aaron John Quigley

School/Research organisations

School of Computer Science

Abstract

This paper explores techniques for visualising display changes in multi-display environments. We present four subtle gaze-dependent techniques for visualising changes on unattended displays: FreezeFrame, PixMap, WindowMap and Aura. To enable the techniques to be deployed directly to workstations, we also present a system that automatically identifies the user's eyes using computer vision and a set of web cameras mounted on the displays. An evaluation confirms that this system can detect, with high accuracy, which display the user is attending to. We studied the efficacy of the visualisation techniques in a five-day case study with a working professional, who used our system for eight hours per day over five consecutive days. The results show that the participant found the system and the techniques useful, subtle, calm and non-intrusive. We conclude by discussing the challenges of evaluating intelligent, subtle interaction techniques using traditional experimental paradigms.
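
To illustrate the attention-detection idea in the abstract, here is a minimal sketch, not the authors' implementation: it assumes one webcam per display (the device indices 0 and 1 and the attended_display helper are placeholders) and uses OpenCV's stock frontal-face Haar cascade, treating the display whose camera currently sees the largest frontal face as the attended one.

import cv2

# Stock frontal-face detector shipped with OpenCV; a frontal face in a
# display's camera is taken as a proxy for attention to that display.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Assumption: one capture device per display; indices 0 and 1 are placeholders.
cameras = [cv2.VideoCapture(i) for i in (0, 1)]

def attended_display():
    """Return the index of the display whose camera currently sees the
    largest frontal face, or None if no camera sees a face."""
    best_display, best_area = None, 0
    for idx, cam in enumerate(cameras):
        ok, frame = cam.read()
        if not ok:
            continue
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (_x, _y, w, h) in cascade.detectMultiScale(
                gray, scaleFactor=1.1, minNeighbors=5):
            if w * h > best_area:
                best_display, best_area = idx, w * h
    return best_display

A deployed system would likely smooth this signal over several frames before treating a display as unattended, so that a single missed detection does not trigger a change visualisation.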

Details

Original language: English
Title of host publication: Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013)
Publisher: ACM
Number of pages: 11
ISBN (Print): 978-1-4503-1965-2
Publication status: Accepted/In press - 2013

Related by author

  1. Deb8: a tool for collaborative analysis of video

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. & Quigley, A. J., 4 Jun 2019, Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video (TVX '19). ACM, p. 47-58

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. Walkers—encoding multivariate data into human motion sequences

    Carson, I., Hinrichs, U. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

  3. Deb8: collaborative fact checking

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. G. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

  4. RotoSwype: word-gesture typing using a ring

    Gupta, A., Ji, C., Yeo, H. S., Quigley, A. J. & Vogel, D., 2 May 2019, Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19). New York: ACM, 12 p., paper 14

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays

    Kim, H., Lee, J., Yeo, H. S., Quigley, A. J. & Woo, W., 25 Apr 2019, Adjunct Proceedings - 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018. Institute of Electrical and Electronics Engineers Inc., p. 428-429, 2 p., article 8699201

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 43517251