SpeCam: sensing surface color and material with the front-facing camera of a mobile device

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Open Access permissions

Open

Author(s)

Hui Shyong Yeo, Juyoung Lee, Andrea Bianchi, David Harris-Birtill, Aaron John Quigley

School/Research organisations

Abstract

SpeCam is a lightweight surface color and material sensing approach for mobile devices that uses only the front-facing camera and the display as a multi-spectral light source. We leverage the natural use of mobile devices (placing them face-down) to detect the material underneath and thereby infer the location or placement of the device. SpeCam can then be used to support discreet micro-interactions that avoid the numerous distractions users face daily with today's mobile devices. Our two-part study shows that SpeCam can i) recognize colors in the HSB space that are 10 degrees apart near the 3 dominant colors and 4 degrees apart otherwise, and ii) recognize 30 types of surface materials with 99% accuracy. These findings are further supported by a spectroscopy study. Finally, we suggest a series of applications based on simple mobile micro-interactions suitable for using the phone when it is placed face-down.
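
To make the sensing pipeline described in the abstract concrete, the following is a minimal sketch in Python of how display-as-illuminant sensing could work; it is not the authors' implementation. The hooks set_display_color() and capture_mean_rgb() are hypothetical stand-ins for device-specific display and camera APIs, and the nearest-neighbour match is only a placeholder for whatever classifier the paper actually uses.

    import numpy as np

    # Display colors flashed in sequence; the front camera records what each one
    # reflects off the surface the phone is lying face-down on.
    ILLUMINANTS = {
        "red":   (255, 0, 0),
        "green": (0, 255, 0),
        "blue":  (0, 0, 255),
        "white": (255, 255, 255),
    }

    def sense_surface(set_display_color, capture_mean_rgb):
        """Flash each illuminant and collect the mean reflected RGB per flash."""
        feature = []
        for rgb in ILLUMINANTS.values():
            set_display_color(rgb)              # hypothetical call: fill the screen with this color
            feature.extend(capture_mean_rgb())  # hypothetical call: mean (R, G, B) of one camera frame
        return np.asarray(feature, dtype=float)  # one reflectance feature vector per surface

    def classify(feature, reference_features, labels):
        """Nearest-neighbour match against feature vectors recorded on known materials."""
        dists = np.linalg.norm(np.asarray(reference_features) - feature, axis=1)
        return labels[int(np.argmin(dists))]

The essential design choice is that the display doubles as a controllable multi-spectral light source: flashing several known colors and measuring the reflected response turns the surface's reflectance into a short feature vector that can be matched against surfaces recorded earlier.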

Details

Original language: English
Title of host publication: Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services
Place of publication: New York, NY
Publisher: ACM
Number of pages: 9
ISBN (Print): 9781450350754
State: Published - 4 Sep 2017
Event: 19th International Conference on Human-Computer Interaction with Mobile Devices and Services - Vienna, Austria

Conference

Conference: 19th International Conference on Human-Computer Interaction with Mobile Devices and Services
Abbreviated title: MobileHCI
Country: Austria
City: Vienna
Period: 4/09/17 - 7/09/17

Research areas

  • Surface detection, Color detection, Material detection

Related by author

  1. Out of sight: a toolkit for tracking occluded human joint positions

    Wu, C-J., Quigley, A. J. & Harris-Birtill, D. C. C., Feb 2017, In: Personal and Ubiquitous Computing, 21(1), p. 125-135.

    Research output: Contribution to journal › Article

  2. RadarCat: Radar Categorization for input & interaction

    Yeo, H. S., Flamich, G., Schrempf, P., Harris-Birtill, D. C. C. & Quigley, A. J., 16 Oct 2016, Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, p. 833-841.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Jeeves - an Experience Sampling study creation tool

    Rough, D. J. & Quigley, A. J., 6 Sep 2017, BCS Health Informatics Scotland (HIS). BCS, (BCS Electronic Workshops in Computing (eWiC)).

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  4. Workshop on object recognition for input and mobile interaction

    Yeo, H. S., Laput, G., Gillian, N. & Quigley, A. J., 4 Sep 2017, Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct. ACM.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Overcoming mental blocks: a blocks-based approach to Experience Sampling studies

    Rough, D. J. & Quigley, A. J., 17 Aug 2017, 2017 IEEE Blocks and Beyond Workshop. IEEE, 4 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

ID: 249811373