
Research at St Andrews

Subtle gaze-dependent techniques for visualising display changes in multi-display environments

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Author(s)

Jakub Dostal, Per Ola Kristensson, Aaron John Quigley


Abstract

This paper explores techniques for visualising display changes in multi-display environments. We present four subtle gaze-dependent techniques for visualising change on unattended displays: FreezeFrame, PixMap, WindowMap and Aura. To enable the techniques to be deployed directly to workstations, we also present a system that automatically identifies the user's eyes using computer vision and a set of web cameras mounted on the displays. An evaluation confirms that this system can detect which display the user is attending to with high accuracy. We studied the efficacy of the visualisation techniques in a five-day case study with a working professional, who used our system eight hours per day for five consecutive days. The results show that the participant found the system and the techniques useful, subtle, calm and non-intrusive. We conclude by discussing the challenges of evaluating intelligent subtle interaction techniques using traditional experimental paradigms.

Details

Original language: English
Title of host publication: Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013)
Publisher: ACM
Number of pages: 11
ISBN (Print): 978-1-4503-1965-2
Publication status: Accepted/In press - 2013


Related by author

  1. Exploring tangible interactions with radar sensing

    Yeo, H. S., Minami, R., Rodriguez, K., Shaker, G. & Quigley, A. J., 27 Dec 2018, In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2, 4, 25 p., 200.

    Research output: Contribution to journal › Article

  2. SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays

    Kim, H., Lee, J., Yeo, H. S., Quigley, A. J. & Woo, W., 16 Dec 2018, 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE Computer Society, p. 428-429

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Towards end-user development for chronic disease management

    Rough, D. J. & Quigley, A. J., 1 Oct 2018, Designing Technologies to Support Human Problem Solving: A Workshop in Conjunction with VL/HCC 2018 in Lisbon, Portugal, Oct. 1, 2018. IEEE Computer Society

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. End-user development in social psychology research: factors for adoption

    Rough, D. J. & Quigley, A. J., 1 Oct 2018, IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC 2018). Cunha, J., Fernandes, J. P., Kelleher, C., Engels, G. & Mendes, J. (eds.). IEEE, p. 75-83 8506573

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. The evolution of SIGCHI conferences and the future of CHI

    Terveen, L., Mentis, H., Quigley, A. & Palanque, P., 22 Sep 2018, In: Interactions, 25, 5, p. 84-85, 2 p.

    Research output: Contribution to specialist publication › Article

ID: 43517251