
Research at St Andrews

Jeeves – a visual programming environment for mobile experience sampling

Research output: Contribution to conference › Paper

Author(s)

Daniel John Rough, Aaron John Quigley

School/Research organisations

Abstract

The Experience Sampling Method (ESM) captures participants’ thoughts and feelings in their everyday environments. Mobile and wearable technologies afford us opportunities to reach people using ESM in varying contexts. However, a lack of programming knowledge often hinders researchers in creating ESM applications. In practice, they rely on specialised tools for app creation. Our initial review of these tools indicates that most are expensive commercial services, and none utilise the full potential of sensors for creating context-aware applications. We present “Jeeves”, a visual language to facilitate ESM application creation. Inspired by successful visual languages in literature, our block-based notation enables researchers to visually construct ESM study specifications. We demonstrate its applicability by replicating existing ESM studies found in medical and psychology literature. Our preliminary study with 20 participants demonstrates that both non-programmers and programmers are able to successfully utilise Jeeves. We discuss future work in extending Jeeves with alternative mobile technologies.

Details

Original language: English
Number of pages: 9
Publication status: Published - 20 Oct 2015
Event: IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC) - Atlanta, United States
Duration: 18 Oct 2015 - 22 Oct 2015

Conference

Conference: IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC)
Country: United States
City: Atlanta
Period: 18/10/15 - 22/10/15


Related by author

  1. Breathin: a breath pattern sensing approach for user computer interaction

Hundia, R. & Quigley, A., 2 Dec 2019, OZCHI'19: Proceedings of the 31st Australian Conference on Human-Computer Interaction. New York: ACM, p. 581-584

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. SnapChart: an augmented reality analytics toolkit to enhance interactivity in a collaborative environment

    Jing, A., Xiang, C., Kim, S., Billinghurst, M. & Quigley, A., 14 Nov 2019, Proceedings - VRCAI 2019: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. Spencer, S. N. (ed.). New York: Association for Computing Machinery, Inc, 2 p. 55

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Making waves and combining strands at CHI 2021

    Sari, E., Quigley, A. & Kitamura, Y., Nov 2019, In: Interactions, 26, 6, p. 84-85, 2 p.

    Research output: Contribution to journal › Comment/debate

  4. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Special issue on highlights of ACM intelligent user interface (IUI) 2018

    Billinghurst, M., Burnett, M. & Quigley, A., 1 Oct 2019, In: ACM Transactions on Interactive Intelligent Systems, 10, 1, 1.

    Research output: Contribution to journal › Editorial

ID: 215194314
