Research at St Andrews

Perceptual and social challenges in body proximate display ecosystems

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Open Access permissions: Open

Author(s)

Aaron John Quigley, Jens Grubert

Abstract

Coordinated multi-display environments, from the desktop and second screen to gigapixel display walls, are increasingly common. Personal and intimate display devices such as head-mounted displays, smartwatches, smartphones and tablets are rarely part of such a multi-display ecosystem. This presents an opportunity to realise "body proximate" display environments, which employ displays on and around the body. These can be formed by combining multiple handheld, head-mounted, wrist-worn or other personal or appropriated displays. However, such an ecosystem, encapsulating ever more interaction points, is not yet well understood. For example, does it trap the user in an "interaction bubble" even more than interaction with individual displays such as smartphones does? Within this paper, we investigate the perceptual and social challenges that could inhibit the adoption and acceptance of interactive proximate display ecosystems. We conclude with a series of research questions raised in the consideration of such environments.
Details

Original language: English
Title of host publication: MobileHCI '15 Proceedings of the 17th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct
Place of publication: New York
Publisher: ACM
Pages: 1168-1174
Number of pages: 7
ISBN (Electronic): 9781450336536
State: Published - 24 Aug 2015
Scopus citations: 3
Event: Mobile Collocated Interactions with Wearables: Workshop at MobileHCI 2015 - Copenhagen, Denmark

Workshop

Workshop: Mobile Collocated Interactions with Wearables: Workshop at MobileHCI 2015
Country: Denmark
City: Copenhagen
Period: 24/08/15 → …
Related by author

  1. Workshop on object recognition for input and mobile interaction

    Yeo, H. S., Laput, G., Gillian, N. & Quigley, A. J. 16 Mar 2017 Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct. ACM

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. TiTAN: exploring midair text entry using freehand input

    Yeo, H. S., Phang, X-S., Ha, T., Woo, W. & Quigley, A. J. 6 May 2017 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York: ACM, p. 3041-3049

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Investigating tilt-based gesture keyboard entry for single-handed text entry on large devices

    Yeo, H. S., Phang, X-S., Castellucci, S. J., Kristensson, P. O. & Quigley, A. J. 2 May 2017 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York: ACM, p. 4194-4202

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. Out of sight: a toolkit for tracking occluded human joint positions

    Wu, C-J., Quigley, A. J. & Harris-Birtill, D. C. C. Feb 2017 In: Personal and Ubiquitous Computing. 21, 1, p. 125-135 11 p.

    Research output: Contribution to journal › Article

  5. RadarCat: Radar Categorization for input & interaction

    Yeo, H. S., Flamich, G., Schrempf, P., Harris-Birtill, D. C. C. & Quigley, A. J. 16 Oct 2016 Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, p. 833-841

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 204343425