
Research at St Andrews

Effective temporal graph layout: A comparative study of animation versus static display methods

Research output: Contribution to journal › Article



Michael Farrugia, Aaron John Quigley


Graph drawing algorithms have classically addressed the layout of static graphs. However, the need to draw evolving or dynamic graphs has brought into question many of the assumptions, conventions and layout methods designed to date. For example, social scientists studying evolving social networks have created a demand for visual representations of graphs changing over time. Two common approaches to representing temporal information in graphs are animation of the network and static snapshots of the network at different points in time. Here, we report on two experiments, one in a laboratory environment and another using an asynchronous remote web-based platform, Mechanical Turk, to compare the efficiency of animated displays versus static displays. Four tasks are studied with each visual representation: two characterise overview-level information presentation, and two characterise micro-level analytical tasks. For the tasks studied in these experiments, and within the limits of the experimental system, the results indicate that static representations are generally more effective, particularly in terms of time performance, when compared to fully animated movie representations of dynamic networks. Information Visualization (2011) 10, 47-64. doi: 10.1057/ivs.2010.10



Original language: English
Pages (from-to): 47-64
Number of pages: 18
Journal: Information Visualization
Volume: 10
Issue number: 1
Publication status: Published - Jan 2011


Related by author

  1. Breathin: a breath pattern sensing approach for user computer interaction

    Hundia, R. & Quigley, A., 2 Dec 2019, OZCHI'19: Proceedings of the 31st Australian Conference on Human-Computer-Interaction. New York: ACM, p. 581-584

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  2. SnapChart: an augmented reality analytics toolkit to enhance interactivity in a collaborative environment

    Jing, A., Xiang, C., Kim, S., Billinghurst, M. & Quigley, A., 14 Nov 2019, Proceedings - VRCAI 2019: 17th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry. Spencer, S. N. (ed.). New York: Association for Computing Machinery, 55. 2 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  3. Making waves and combining strands at CHI 2021

    Sari, E., Quigley, A. & Kitamura, Y., Nov 2019, In : Interactions. 26, 6, p. 84-85 2 p.

    Research output: Contribution to journal › Comment/debate

  4. Opisthenar: hand poses and finger tapping recognition by observing back of hand using embedded wrist camera

    Yeo, H. S., Wu, E., Lee, J., Quigley, A. J. & Koike, H., 17 Oct 2019, Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (UIST 2019). New York: ACM, p. 963-971 9 p.

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  5. Special issue on highlights of ACM intelligent user interface (IUI) 2018

    Billinghurst, M., Burnett, M. & Quigley, A., 1 Oct 2019, In : ACM Transactions on Interactive Intelligent Systems. 10, 1, 1.

    Research output: Contribution to journal › Editorial

Related by journal

  1. Introduction to the special section on Visual Movement Analytics

    Demšar, U., Slingsby, A. & Weibel, R., 8 Nov 2018, In : Information Visualization. Online First, 5 p.

    Research output: Contribution to journal › Editorial

  2. Exploring the spatio-temporal dynamics of geographical processes with geographically weighted regression and geovisual analytics

    Demsar, U., Fotheringham, A. S. & Charlton, M., 2008, In : Information Visualization. 7, 3-4, p. 181-197 17 p.

    Research output: Contribution to journal › Article

ID: 17986939