
Research at St Andrews

Exploring tangible interactions with radar sensing

Research output: Contribution to journal › Article

Open Access permissions: Open

Author(s)

Hui Shyong Yeo, Ryosuke Minami, Kirill Rodriguez, George Shaker, Aaron John Quigley

Abstract

Research has explored miniature radar as a promising sensing technique for the recognition of gestures, objects, users’ presence and activity. However, within Human-Computer Interaction (HCI), its use remains underexplored, particularly in Tangible User Interfaces (TUIs). In this paper, we explore two research questions with radar as a platform for sensing tangible interaction: the counting, ordering and identification of objects, and the tracking of their orientation, movement and distance. We detail the design space and practical use cases for such interaction, which allows us to identify a series of design patterns that go beyond static interaction to continuous and dynamic interaction. With a focus on planar objects, we report on a series of studies which demonstrate the suitability of this approach. This exploration is grounded in both a characterization of the radar sensing and our rigorous experiments, which show that such sensing is accurate with minimal training. With these techniques, we envision both realistic and future applications and scenarios. The motivation for what we refer to as Solinteraction is to demonstrate the potential for radar-based interaction with objects in HCI and TUI.
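The abstract's claim that radar-based object sensing "is accurate with minimal training" can be illustrated with a template-matching sketch. The code below is not the authors' pipeline: the range-profile shape, bin spacing, and one-template-per-class scheme are all hypothetical, standing in for whatever features a Soli-class sensor would actually produce. It shows the general idea of counting stacked tokens by matching a noisy signal against a single stored template per token count.

```python
# Illustrative sketch (NOT the paper's method): nearest-template
# classification of synthetic radar "range profiles" to count stacked
# tokens. Assumes each added token contributes one reflection peak;
# peak position and spacing are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def make_profile(n_tokens, n_bins=64, noise=0.05):
    """Synthetic range profile: one Gaussian peak per stacked token."""
    bins = np.arange(n_bins)
    profile = np.zeros(n_bins)
    for k in range(n_tokens):
        centre = 8 + 6 * k  # hypothetical range-bin offset per token
        profile += np.exp(-((bins - centre) ** 2) / 4.0)
    return profile + noise * rng.standard_normal(n_bins)

# "Minimal training": a single noise-free template per class (1-4 tokens).
templates = {n: make_profile(n, noise=0.0) for n in range(1, 5)}

def count_tokens(profile):
    """Return the token count whose template is nearest (Euclidean)."""
    return min(templates, key=lambda n: np.linalg.norm(profile - templates[n]))

prediction = count_tokens(make_profile(3))  # classify a noisy 3-token sample
```

With templates this well separated, the nearest-template rule tolerates the added noise; the real work in a system like Solinteraction lies in characterizing what the radar signal actually looks like for each object configuration.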

Details

Original language: English
Article number: 200
Number of pages: 25
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Volume: 2
Issue number: 4
Publication status: Published - 27 Dec 2018

Research areas

  • Tangible interaction, Tangible user interface, Radar sensing, Soli, Ubiquitous computing, Context-aware interaction, Token+constraint, Machine learning


Related by author

  1. SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays

    Kim, H., Lee, J., Yeo, H. S., Quigley, A. J. & Woo, W., 16 Dec 2018, 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). IEEE Computer Society, p. 428-429

    Research output: Chapter in Book/Report/Conference proceedingConference contribution

  2. Towards end-user development for chronic disease management

    Rough, D. J. & Quigley, A. J., 1 Oct 2018, Designing Technologies to Support Human Problem Solving: A Workshop in Conjunction with VL/HCC 2018 in Lisbon, Portugal, Oct. 1, 2018. IEEE Computer Society

    Research output: Chapter in Book/Report/Conference proceedingConference contribution

  3. End-user development in social psychology research: factors for adoption

    Rough, D. J. & Quigley, A. J., 1 Oct 2018, IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC 2018). Cunha, J., Fernandes, J. P., Kelleher, C., Engels, G. & Mendes, J. (eds.). IEEE, p. 75-83 8506573

    Research output: Chapter in Book/Report/Conference proceedingConference contribution

  4. The evolution of SIGCHI conferences and the future of CHI

    Terveen, L., Mentis, H., Quigley, A. & Palanque, P., 22 Sep 2018, Interactions, 25, 5, p. 84-85 2 p.

    Research output: Contribution to specialist publicationArticle

  5. Change blindness in proximity-aware mobile interfaces

    Brock, M. O., Quigley, A. J. & Kristensson, P. O., 21 Apr 2018, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI'18). New York, NY: ACM, 7 p. 43

    Research output: Chapter in Book/Report/Conference proceedingConference contribution

ID: 256914564