
Research at St Andrews

Exploring tangible interactions with radar sensing

Research output: Contribution to journal › Article

DOI: https://doi.org/10.1145/3287078

Open Access permissions: Open

Standard

Exploring tangible interactions with radar sensing. / Yeo, Hui Shyong; Minami, Ryosuke; Rodriguez, Kirill; Shaker, George; Quigley, Aaron John.

In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Vol. 2, No. 4, 200, 27.12.2018.

Research output: Contribution to journal › Article

Harvard

Yeo, HS, Minami, R, Rodriguez, K, Shaker, G & Quigley, AJ 2018, 'Exploring tangible interactions with radar sensing', Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 2, no. 4, 200. https://doi.org/10.1145/3287078

APA

Yeo, H. S., Minami, R., Rodriguez, K., Shaker, G., & Quigley, A. J. (2018). Exploring tangible interactions with radar sensing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(4), [200]. https://doi.org/10.1145/3287078

Vancouver

Yeo HS, Minami R, Rodriguez K, Shaker G, Quigley AJ. Exploring tangible interactions with radar sensing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2018 Dec 27;2(4). 200. https://doi.org/10.1145/3287078

Author

Yeo, Hui Shyong ; Minami, Ryosuke ; Rodriguez, Kirill ; Shaker, George ; Quigley, Aaron John. / Exploring tangible interactions with radar sensing. In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2018 ; Vol. 2, No. 4.

BibTeX

@article{f708db6336b5446584d74a5913c96e28,
title = "Exploring tangible interactions with radar sensing",
abstract = "Research has explored miniature radar as a promising sensing technique for the recognition of gestures, objects, users’ presence and activity. However, within Human-Computer Interaction (HCI), its use remains underexplored, in particular in Tangible User Interface (TUI). In this paper, we explore two research questions with radar as a platform for sensing tangible interaction with the counting, ordering, identification of objects and tracking the orientation, movement and distance of these objects. We detail the design space and practical use-cases for such interaction which allows us to identify a series of design patterns, beyond static interaction, which are continuous and dynamic. With a focus on planar objects, we report on a series of studies which demonstrate the suitability of this approach. This exploration is grounded in both a characterization of the radar sensing and our rigorous experiments which show that such sensing is accurate with minimal training. With these techniques, we envision both realistic and future applications and scenarios. The motivation for what we refer to as Solinteraction, is to demonstrate the potential for radar-based interaction with objects in HCI and T",
keywords = "Tangible interaction, Tangible user interface, Radar sensing, Soli, Ubiquitous computing, Context-aware interaction, Token+constraint, Machine learning",
author = "Yeo, {Hui Shyong} and Ryosuke Minami and Kirill Rodriguez and George Shaker and Quigley, {Aaron John}",
year = "2018",
month = "12",
day = "27",
doi = "10.1145/3287078",
language = "English",
volume = "2",
journal = "Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies",
issn = "2474-9567",
publisher = "ACM",
number = "4",

}

RIS (suitable for import to EndNote)

TY - JOUR

T1 - Exploring tangible interactions with radar sensing

AU - Yeo, Hui Shyong

AU - Minami, Ryosuke

AU - Rodriguez, Kirill

AU - Shaker, George

AU - Quigley, Aaron John

PY - 2018/12/27

Y1 - 2018/12/27

N2 - Research has explored miniature radar as a promising sensing technique for the recognition of gestures, objects, users’ presence and activity. However, within Human-Computer Interaction (HCI), its use remains underexplored, in particular in Tangible User Interface (TUI). In this paper, we explore two research questions with radar as a platform for sensing tangible interaction with the counting, ordering, identification of objects and tracking the orientation, movement and distance of these objects. We detail the design space and practical use-cases for such interaction which allows us to identify a series of design patterns, beyond static interaction, which are continuous and dynamic. With a focus on planar objects, we report on a series of studies which demonstrate the suitability of this approach. This exploration is grounded in both a characterization of the radar sensing and our rigorous experiments which show that such sensing is accurate with minimal training. With these techniques, we envision both realistic and future applications and scenarios. The motivation for what we refer to as Solinteraction is to demonstrate the potential for radar-based interaction with objects in HCI and TUI.

AB - Research has explored miniature radar as a promising sensing technique for the recognition of gestures, objects, users’ presence and activity. However, within Human-Computer Interaction (HCI), its use remains underexplored, in particular in Tangible User Interface (TUI). In this paper, we explore two research questions with radar as a platform for sensing tangible interaction with the counting, ordering, identification of objects and tracking the orientation, movement and distance of these objects. We detail the design space and practical use-cases for such interaction which allows us to identify a series of design patterns, beyond static interaction, which are continuous and dynamic. With a focus on planar objects, we report on a series of studies which demonstrate the suitability of this approach. This exploration is grounded in both a characterization of the radar sensing and our rigorous experiments which show that such sensing is accurate with minimal training. With these techniques, we envision both realistic and future applications and scenarios. The motivation for what we refer to as Solinteraction is to demonstrate the potential for radar-based interaction with objects in HCI and TUI.

KW - Tangible interaction

KW - Tangible user interface

KW - Radar sensing

KW - Soli

KW - Ubiquitous computing

KW - Context-aware interaction

KW - Token+constraint

KW - Machine learning

U2 - 10.1145/3287078

DO - 10.1145/3287078

M3 - Article

VL - 2

JO - Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

JF - Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

SN - 2474-9567

IS - 4

M1 - 200

ER -
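
The RIS block above is plain tagged text, so it can also be read programmatically before handing it to EndNote or similar tools. Below is a minimal sketch in Python, assuming the record has been saved to a local file named record.ris (a hypothetical filename); real RIS importers handle repeated records and more tag edge cases than this.

    from collections import defaultdict

    def parse_ris(text: str) -> dict:
        """Parse a single RIS record into a field -> list-of-values mapping."""
        fields = defaultdict(list)
        for line in text.splitlines():
            line = line.strip()
            # RIS lines look like "TY  - JOUR"; skip blanks and the "ER -" terminator.
            if not line or line.startswith("ER"):
                continue
            tag, _, value = line.partition("-")
            fields[tag.strip()].append(value.strip())
        return dict(fields)

    if __name__ == "__main__":
        with open("record.ris", encoding="utf-8") as fh:
            record = parse_ris(fh.read())
        print(record["T1"])  # ['Exploring tangible interactions with radar sensing']
        print(record["AU"])  # one entry per author
        print(record["DO"])  # ['10.1145/3287078']

Splitting on the first "-" only is deliberate, so values that themselves contain hyphens (such as the ISSN 2474-9567) survive intact.
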

Related by author

  1. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. WRIST: Watch-Ring Interaction and Sensing Technique for wrist gestures and macro-micro pointing

    Yeo, H. S., Lee, J., Kim, H., Gupta, A., Bianchi, A., Vogel, D., Koike, H., Woo, W. & Quigley, A. J., 1 Oct 2019, Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '19). New York: ACM, 15 p. 19

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Augmented learning for sports using wearable head-worn and wrist-worn devices

    Yeo, H. S., Koike, H. & Quigley, A., 15 Aug 2019, 26th IEEE Conference on Virtual Reality and 3D User Interfaces, VR 2019 - Proceedings. Institute of Electrical and Electronics Engineers Inc., p. 1578-1580 3 p. 8798054

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. Deb8: a tool for collaborative analysis of video

    Carneiro, G., Nacenta, M., Toniolo, A., Mendez, G. & Quigley, A. J., 4 Jun 2019, Proceedings of the 2019 ACM International Conference on Interactive Experiences for TV and Online Video (TVX '19). ACM, p. 47-58

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Walkers—encoding multivariate data into human motion sequences

    Carson, I., Hinrichs, U. & Quigley, A. J., 5 May 2019.

    Research output: Contribution to conference › Paper

ID: 256914564
