
Face-to-Face Collaborative Interfaces

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Author(s)

Aaron J. Quigley, Florin Bodea
Abstract

This chapter introduces research and developments in multitouch input technologies that can be used to realize large interactive tabletop or "surface user interfaces." Such hardware systems, along with supporting software, allow applications to be controlled through direct touch or multitouch. The chapter also reviews gestural interactions and design guidelines for collaborative surface user interfaces. Multitouch surface user interfaces (SUIs) can be mounted vertically on walls or horizontally on tables, and they sense the locations of one or more fingers in contact with the surface. SUIs are used in public settings (kiosks, ATMs) and in small personal devices (PDAs, iPhones) where a separate keyboard and mouse cannot or should not be used. Basic SUIs have been common for over 20 years in the form of interactive kiosks, ATMs, and point-of-sale systems, which rely on touch-screen technology with simple button interfaces. The current generation of SUIs suitable for face-to-face interaction is built on LCD displays or form-factored into walls and coffee tables; in this form they cannot yet be considered basic, everyday objects. However, display technologies are now ubiquitous, and if SUI interaction styles can be woven into the environments and activities of everyday life, and their industrial design improved, invisibility in action can be achieved. The ultimate goal of surface user interfaces in collaborative face-to-face activities is for people not to feel they are using a computer; instead, the visual elements should naturally support their actions. In time, SUIs will become so commonplace in everyday life that no one will notice their presence. They will be aesthetic and powerful, enhancing our lives, but they will also be commonplace, obvious, and boring.
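To ground the sensing model the abstract describes, the sketch below shows how a surface application might track multiple simultaneous finger contacts and derive a simple two-finger pinch gesture. This is an illustrative sketch, not code from the chapter: it assumes a browser-hosted surface display exposing the standard DOM TouchEvent API, and the 'surface' element id is a placeholder.

```typescript
// Minimal sketch (not from the chapter): track multiple finger contacts
// on a touch surface and recognise a two-finger pinch, using the
// standard DOM TouchEvent API available in browsers.

interface Contact {
  id: number; // stable per-finger identifier, from Touch.identifier
  x: number;  // current position in client coordinates
  y: number;
}

const contacts = new Map<number, Contact>();
let pinchStartDistance: number | null = null;

function distance(a: Contact, b: Contact): number {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function updateContacts(event: TouchEvent): void {
  event.preventDefault(); // keep the browser from panning/zooming
  contacts.clear();
  for (const t of Array.from(event.touches)) {
    contacts.set(t.identifier, { id: t.identifier, x: t.clientX, y: t.clientY });
  }
  const pts = Array.from(contacts.values());
  if (pts.length === 2) {
    const d = distance(pts[0], pts[1]);
    if (pinchStartDistance === null) {
      pinchStartDistance = d; // first frame with two fingers down
    } else {
      // scale > 1 means the fingers are spreading, < 1 means pinching in
      const scale = d / pinchStartDistance;
      console.log(`pinch scale: ${scale.toFixed(2)}`);
    }
  } else {
    pinchStartDistance = null; // gesture ends when the finger count changes
  }
}

// 'surface' is a hypothetical element id for the interactive display area.
const surface = document.getElementById('surface')!;
surface.addEventListener('touchstart', updateContacts, { passive: false });
surface.addEventListener('touchmove', updateContacts, { passive: false });
surface.addEventListener('touchend', updateContacts, { passive: false });
```

Real tabletop systems typically layer richer gesture recognition (rotation, flicks, multi-user disambiguation) over this kind of contact-tracking loop; the pinch case is shown only because it is the smallest gesture that genuinely requires multitouch sensing.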


Details

Original language: English
Title of host publication: Human-Centric Interfaces for Ambient Intelligence
Publisher: Elsevier Inc.
Pages: 3-32
Number of pages: 30
ISBN (Print): 9780123747082
Publication status: Published - 1 Dec 2010


Related by author

  1. Breathin: a breath pattern sensing approach for user computer interaction

Hundia, R. & Quigley, A., 2 Dec 2019, OZCHI'19: Proceedings of the 31st Australian Conference on Human-Computer Interaction. New York: ACM, p. 581-584

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. SnapChart: an augmented reality analytics toolkit to enhance interactivity in a collaborative environment

Jing, A., Xiang, C., Kim, S., Billinghurst, M. & Quigley, A., 14 Nov 2019, Proceedings - VRCAI 2019: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. Spencer, S. N. (ed.). New York: Association for Computing Machinery, article 55, 2 p.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Making waves and combining strands at CHI 2021

Sari, E., Quigley, A. & Kitamura, Y., Nov 2019, In: Interactions, 26(6), p. 84-85, 2 p.

Research output: Contribution to journal › Comment/debate

  4. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971, 9 p.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Special issue on highlights of ACM intelligent user interface (IUI) 2018

Billinghurst, M., Burnett, M. & Quigley, A., 1 Oct 2019, In: ACM Transactions on Interactive Intelligent Systems, 10(1), article 1.

Research output: Contribution to journal › Editorial
