
Research at St Andrews

Subtle gaze-dependent techniques for visualising display changes in multi-display environments

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

DOI
https://doi.org/10.1145/2449396.2449416

Standard

Subtle gaze-dependent techniques for visualising display changes in multi-display environments. / Dostal, Jakub; Kristensson, Per Ola; Quigley, Aaron John.

Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013). ACM, 2013.


Harvard

Dostal, J, Kristensson, PO & Quigley, AJ 2013, Subtle gaze-dependent techniques for visualising display changes in multi-display environments. in Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013). ACM. https://doi.org/10.1145/2449396.2449416

APA

Dostal, J., Kristensson, P. O., & Quigley, A. J. (2013). Subtle gaze-dependent techniques for visualising display changes in multi-display environments. In Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013). ACM. https://doi.org/10.1145/2449396.2449416

Vancouver

Dostal J, Kristensson PO, Quigley AJ. Subtle gaze-dependent techniques for visualising display changes in multi-display environments. In Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013). ACM. 2013. https://doi.org/10.1145/2449396.2449416

Author

Dostal, Jakub ; Kristensson, Per Ola ; Quigley, Aaron John. / Subtle gaze-dependent techniques for visualising display changes in multi-display environments. Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013). ACM, 2013.

BibTeX

@inproceedings{648c6af4ac6c43f087e3b902f2f24f7b,
title = "Subtle gaze-dependent techniques for visualising display changes in multi-display environments",
abstract = "This paper explores techniques for visualising display changes in multi-display environments. We present four subtle gaze-dependent techniques for visualising change on unattended displays called FreezeFrame, PixMap, WindowMap and Aura. To enable the techniques to be directly deployed to workstations, we also present a system that automatically identifies the user's eyes using computer vision and a set of web cameras mounted on the displays. An evaluation confirms this system can detect which display the user is attending to with high accuracy. We studied the efficacy of the visualisation techniques in a five-day case study with a working professional. This individual used our system eight hours per day for five consecutive days. The results of the study show that the participant found the system and the techniques useful, subtle, calm and non-intrusive. We conclude by discussing the challenges in evaluating intelligent subtle interaction techniques using traditional experimental paradigms.",
author = "Jakub Dostal and Kristensson, {Per Ola} and Quigley, {Aaron John}",
note = "This work was supported by the Engineering and Physical Sciences Research Council (grant number EP/H027408/1) and the Scottish Informatics and Computer Science Alliance",
year = "2013",
doi = "10.1145/2449396.2449416",
language = "English",
isbn = "978-1-4503-1965-2",
booktitle = "Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013)",
publisher = "ACM",
address = "United States",
}

RIS (suitable for import to EndNote)

TY - GEN

T1 - Subtle gaze-dependent techniques for visualising display changes in multi-display environments

AU - Dostal, Jakub

AU - Kristensson, Per Ola

AU - Quigley, Aaron John

N1 - This work was supported by the Engineering and Physical Sciences Research Council (grant number EP/H027408/1) and the Scottish Informatics and Computer Science Alliance

PY - 2013

Y1 - 2013

N2 - This paper explores techniques for visualising display changes in multi-display environments. We present four subtle gaze-dependent techniques for visualising change on unattended displays called FreezeFrame, PixMap, WindowMap and Aura. To enable the techniques to be directly deployed to workstations, we also present a system that automatically identifies the user's eyes using computer vision and a set of web cameras mounted on the displays. An evaluation confirms this system can detect which display the user is attending to with high accuracy. We studied the efficacy of the visualisation techniques in a five-day case study with a working professional. This individual used our system eight hours per day for five consecutive days. The results of the study show that the participant found the system and the techniques useful, subtle, calm and non-intrusive. We conclude by discussing the challenges in evaluating intelligent subtle interaction techniques using traditional experimental paradigms.

AB - This paper explores techniques for visualising display changes in multi-display environments. We present four subtle gaze-dependent techniques for visualising change on unattended displays called FreezeFrame, PixMap, WindowMap and Aura. To enable the techniques to be directly deployed to workstations, we also present a system that automatically identifies the user's eyes using computer vision and a set of web cameras mounted on the displays. An evaluation confirms this system can detect which display the user is attending to with high accuracy. We studied the efficacy of the visualisation techniques in a five-day case study with a working professional. This individual used our system eight hours per day for five consecutive days. The results of the study show that the participant found the system and the techniques useful, subtle, calm and non-intrusive. We conclude by discussing the challenges in evaluating intelligent subtle interaction techniques using traditional experimental paradigms.

U2 - 10.1145/2449396.2449416

DO - 10.1145/2449396.2449416

M3 - Conference contribution

SN - 978-1-4503-1965-2

BT - Proceedings of the 18th ACM International Conference on Intelligent User Interfaces (IUI 2013)

PB - ACM

ER -
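
The abstract above describes a system that detects which display the user is attending to using computer vision and one web camera per display. As a rough, purely illustrative sketch of that idea (not the authors' implementation, which is not included in this record), the following Python/OpenCV code polls one camera per display and treats the display whose camera sees the largest frontal face as the attended one; the camera device indices and the face-size heuristic are assumptions made for this example.

# Illustrative sketch only: infer the attended display by comparing
# frontal-face detections across one webcam per display.
# Not the authors' implementation; the camera indices and the
# face-size heuristic are assumptions.
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def largest_face_area(frame):
    # Return the area of the largest frontal face in the frame, or 0 if none.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return max((w * h for (_x, _y, w, h) in faces), default=0)

def attended_display(camera_indices):
    # Grab one frame from each display's camera and return the index of the
    # display whose camera sees the largest (closest / most frontal) face.
    best_display, best_area = None, 0
    for display, cam_index in enumerate(camera_indices):
        cap = cv2.VideoCapture(cam_index)
        ok, frame = cap.read()
        cap.release()
        if ok and frame is not None:
            area = largest_face_area(frame)
            if area > best_area:
                best_display, best_area = display, area
    return best_display  # None if no camera saw a face

if __name__ == "__main__":
    # Example: two displays with webcams at device indices 0 and 1.
    print("Attended display:", attended_display([0, 1]))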

Related by author

  1. WRIST: Watch-Ring Interaction and Sensing Technique for wrist gestures and macro-micro pointing

    Yeo, H. S., Lee, J., Kim, H., Gupta, A., Bianchi, A., Vogel, D., Koike, H., Woo, W. & Quigley, A. J., 1 Oct 2019, Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '19). New York: ACM, 15 p. 19

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. Augmented learning for sports using wearable head-worn and wrist-worn devices

    Yeo, H. S., Koike, H. & Quigley, A., 15 Aug 2019, 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc., p. 1578-1580 3 p. 8798054

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Deb8: a tool for collaborative analysis of video

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. & Quigley, A. J., 4 Jun 2019, Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video (TVX '19). ACM, p. 47-58

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. Walkers—encoding multivariate data into human motion sequences

    Carson, I., Hinrichs, U. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

  5. Deb8: collaborative fact checking

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. G. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

