
Research at St Andrews

TAPping into mental models with blocks

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Author(s)

D. Rough, A. Quigley

School/Research organisations

Abstract

Trigger-Action Programming (TAP) has been shown to support end-users' rule-based mental models of context-aware applications. However, as desired behaviours grow in complexity, ambiguity can arise over events, states, and how they can be combined in meaningful ways. Blocks programming could provide a solution through constrained editing of visual triggers, conditions, and actions. We observed slips and mistakes made by users performing TAP with Jeeves, our domain-specific blocks environment, and propose solutions.
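To illustrate the rule structure the abstract describes (this is a minimal sketch of the general TAP pattern, not the Jeeves implementation; all class and method names are hypothetical), a trigger-action rule can be modelled as an event trigger guarded by state conditions:

```python
# Hypothetical sketch of a trigger-action (TAP) rule: an event trigger,
# guarded by state conditions, firing one or more actions. The distinction
# between the event (trigger) and the states (conditions) is exactly the
# ambiguity the abstract says complex behaviours can confuse.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Rule:
    trigger: str                                   # event name, e.g. "survey_time"
    conditions: List[Callable[[Dict], bool]] = field(default_factory=list)
    actions: List[Callable[[Dict], None]] = field(default_factory=list)

    def handle(self, event: str, state: Dict) -> bool:
        """Fire the actions only if the event matches and every state condition holds."""
        if event != self.trigger:
            return False
        if not all(cond(state) for cond in self.conditions):
            return False
        for action in self.actions:
            action(state)
        return True

# Example: prompt the user at survey time, but only while they are at home.
log: List[str] = []
rule = Rule(
    trigger="survey_time",
    conditions=[lambda s: s.get("location") == "home"],
    actions=[lambda s: log.append("show_survey_prompt")],
)
rule.handle("survey_time", {"location": "work"})   # state condition fails: no action
rule.handle("survey_time", {"location": "home"})   # trigger and condition hold: fires
```

A constrained blocks editor can enforce this separation syntactically, by only letting event blocks fill the trigger slot and state blocks fill the condition slots.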

Details

Original language: English
Title of host publication: 2017 IEEE Blocks and Beyond Workshop (B&B)
Publisher: IEEE
Pages: 115-116
Number of pages: 2
ISBN (Electronic): 9781538624807
ISBN (Print): 9781538624814
DOIs
Publication status: Published - 10 Oct 2017
Event: 2nd Workshop on Lessons and Directions for First Programming Environments - Raleigh, United States
Duration: 10 Oct 2017 → …
Conference number: 2
http://cs.wellesley.edu/~blocks-and-beyond/

Workshop

Workshop: 2nd Workshop on Lessons and Directions for First Programming Environments
Country: United States
City: Raleigh
Period: 10/10/17 → …
Internet address: http://cs.wellesley.edu/~blocks-and-beyond/

Research areas

• Cognition, Mobile computing, Jeeves, TAP, Blocks programming, Context-aware applications, Domain-specific blocks environment, Rule-based mental models, Trigger-action programming, Visual actions, Visual conditions, Visual triggers, Cognitive science, Programming, Smart phones, Usability, Visualization


Related by author

  1. Breathin: a breath pattern sensing approach for user computer interaction

    Hundia, R. & Quigley, A., 2 Dec 2019, OZCHI'19: Proceedings of the 31st Australian Conference on Human-Computer-Interaction. New York: ACM, p. 581-584

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. SnapChart: an augmented reality analytics toolkit to enhance interactivity in a collaborative environment

    Jing, A., Xiang, C., Kim, S., Billinghurst, M. & Quigley, A., 14 Nov 2019, Proceedings - VRCAI 2019: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. Spencer, S. N. (ed.). New York: Association for Computing Machinery, Inc, 2 p. 55

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Making waves and combining strands at CHI 2021

    Sari, E., Quigley, A. & Kitamura, Y., Nov 2019, In : Interactions. 26, 6, p. 84-85 2 p.

Research output: Contribution to journal › Comment/debate

  4. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Special issue on highlights of ACM intelligent user interface (IUI) 2018

    Billinghurst, M., Burnett, M. & Quigley, A., 1 Oct 2019, In : ACM Transactions on Interactive Intelligent Systems. 10, 1, 1.

Research output: Contribution to journal › Editorial

ID: 252534733
