
Research at St Andrews

SpeCam: sensing surface color and material with the front-facing camera of a mobile device

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

Hui Shyong Yeo, Juyoung Lee, Andrea Bianchi, David Harris-Birtill, Aaron John Quigley


Abstract

SpeCam is a lightweight surface color and material sensing approach for mobile devices that uses only the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices (placing them face-down) to detect the material underneath and thereby infer the location or placement of the device. SpeCam can then be used to support discreet micro-interactions that avoid the numerous distractions users face daily with today's mobile devices. Our two-part study shows that SpeCam can i) recognize colors in the HSB space spaced 10 degrees apart near the three dominant colors and 4 degrees apart otherwise, and ii) recognize 30 types of surface materials with 99% accuracy. These findings are further supported by a spectroscopy study. Finally, we suggest a series of applications based on simple mobile micro-interactions suitable for use when the phone is placed face-down.
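
The sensing pipeline described above is straightforward to sketch: the display flashes a sequence of known colors, the front-facing camera records how the surface underneath reflects each one, and the resulting reflectance features are fed to a classifier. The following is a minimal Python sketch of that loop, not the authors' implementation: set_display_color() and capture_frame() are hypothetical stand-ins for platform display and camera APIs, the illuminant sequence is assumed, and the k-NN classifier merely stands in for whatever model the paper uses.

    # Illustrative SpeCam-style sensing loop (a sketch, not the authors' code).
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # Assumed illuminant sequence: solid display colors shown one at a time.
    ILLUMINANTS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]

    def set_display_color(rgb):
        """Hypothetical helper: fill the screen with one solid color."""
        raise NotImplementedError("platform-specific display API")

    def capture_frame():
        """Hypothetical helper: return a front-camera frame as an (H, W, 3) array."""
        raise NotImplementedError("platform-specific camera API")

    def reflectance_features():
        """Mean R, G, B camera response under each illuminant (12-D vector)."""
        feats = []
        for rgb in ILLUMINANTS:
            set_display_color(rgb)   # illuminate the surface with a known color
            frame = capture_frame()  # observe the reflection off the surface
            feats.extend(frame.reshape(-1, 3).mean(axis=0))
        return np.array(feats)

    # Train on feature vectors captured over labeled surfaces, then classify the
    # surface a face-down phone is resting on:
    clf = KNeighborsClassifier(n_neighbors=3)
    # clf.fit(train_features, train_labels)   # labels such as "wood" or "fabric"
    # material = clf.predict([reflectance_features()])[0]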

Details

Original language: English
Title of host publication: Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services
Place of Publication: New York, NY
Publisher: ACM
Number of pages: 9
ISBN (Print): 9781450350754
DOIs
State: Published - 4 Sep 2017
Event: 19th International Conference on Human-Computer Interaction with Mobile Devices and Services, Aula der Wissenschaft – Hall of Science, Vienna, Austria
Duration: 4 Sep 2017 – 7 Sep 2017
Conference number: 19
https://mobilehci.acm.org/2017/index.html

Conference

Conference: 19th International Conference on Human-Computer Interaction with Mobile Devices and Services
Abbreviated title: MobileHCI
Country: Austria
City: Vienna
Period: 4/09/17 – 7/09/17
Internet address: https://mobilehci.acm.org/2017/index.html

Research areas

  • Surface detection, Color detection, Material detection


Related by author

  1. Out of sight: a toolkit for tracking occluded human joint positions

    Wu, C-J., Quigley, A. J. & Harris-Birtill, D. C. C., Feb 2017, In: Personal and Ubiquitous Computing, 21(1), p. 125-135, 11 p.

    Research output: Contribution to journal › Article

  2. RadarCat: Radar Categorization for input & interaction

    Yeo, H. S., Flamich, G., Schrempf, P., Harris-Birtill, D. C. C. & Quigley, A. J., 16 Oct 2016, Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, p. 833-841.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Change blindness in proximity-aware mobile interfaces

    Brock, M. O., Quigley, A. J. & Kristensson, P. O., 21 Apr 2018, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI'18). New York, NY: ACM, 7 p., paper 43.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. AdaM: adapting multi-user interfaces for collaborative environments in real-time

    Park, S., Gebhardt, C., Rädle, R., Feit, A., Vrzakova, H., Dayama, N., Yeo, H. S., Klokmose, C., Quigley, A. J., Oulasvirta, A. & Hilliges, O., 21 Apr 2018, Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI'18). New York, NY: ACM, 14 p., paper 184.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Automated data gathering and training tool for personalized "Itchy Nose"

    Lee, J. U., Yeo, H. S., Starner, T., Quigley, A. J., Kunze, K. A. & Woo, W. O., 7 Feb 2018, AH '18: Proceedings of the 9th Augmented Human International Conference. New York: ACM, 3 p., paper 43.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 249811373