2025
Running with Data: a Survey of the Current Research and a Design Exploration of Future Immersive Visualisations
Ang Li, Charles Perin, Gianluca Demartini, Stephen Viller, Jarrod Knibbe, and Maxime Cordeil
In IEEE TVCG (IEEE VIS Special Issue), 2025
This work investigates the current research on in-situ visualisations for running: visualisations of data that runners refer to during the running activity. We analyse 47 papers from 33 Human-Computer Interaction and Visualisation venues and identify six dimensions of a design space of in-situ running visualisations. Our analysis of this design space highlights an emerging trend: a shift from on-body, peripersonal visualisations (i.e., in the space within direct reach, such as visualisations on a smartwatch or a mobile phone display) towards extrapersonal displays (i.e., in the space beyond immediate reach, such as visualisations in immersive augmented reality displays) that integrate data into the runner's surrounding environment. We explore this opportunity by conducting a series of workshops with 10 active runners in total, eliciting design concepts for running visualisations and interactions beyond conventional 2D displays. We find that runners show a strong interest in visualisation designs that favour more context-aware, interactive, and unobtrusive experiences that seamlessly integrate with their run. These findings inform a set of design considerations for future immersive running visualisations and highlight directions for further research.
@inproceedings{Li25_EmbeddedRunningVis,author={Li, Ang and Perin, Charles and Demartini, Gianluca and Viller, Stephen and Knibbe, Jarrod and Cordeil, Maxime},title={Running with Data: a Survey of the Current Research and a Design Exploration of Future Immersive Visualisations},year={2025},publisher={IEEE},doi={10.1109/TVCG.2025.3634631},booktitle={IEEE TVCG (IEEE VIS Special Issue)},keywords={running, jogging, survey, taxonomy, human-subjects qualitative studies, personal visual analytics, mobile, augmented/mixed/extended reality, immersive},location={Vienna, Austria},series={IEEE VIS '25},dimensions={true},}
Embedded and Situated Visualisation in Mixed Reality to Support Interval Running
Ang Li, Charles Perin, Jarrod Knibbe, Gianluca Demartini, Stephen Viller, and Maxime Cordeil
In Computer Graphics Forum (EuroVis Special Issue), 2025
We investigate the use of mixed reality visualisations to support pace tracking during interval running. We introduce three immersive visual designs to support pace tracking. Our designs leverage two properties afforded by mixed reality environments to display information: the space in front of the user and the physical environment in which pace visualisations can be embedded. In this paper, we report on the first design exploration and controlled study of mixed reality technology to support pace tracking during interval running on an outdoor running track. Our results show that mixed reality and immersive visualisation designs for interval training offer a viable option to help runners (a) maintain a regular pace, (b) maintain running flow, and (c) reduce mental task load.
@inproceedings{Li25_EmbeddedRunningVit,author={Li, Ang and Perin, Charles and Knibbe, Jarrod and Demartini, Gianluca and Viller, Stephen and Cordeil, Maxime},title={Embedded and Situated Visualisation in Mixed Reality to Support Interval Running},year={2025},publisher={Wiley},doi={10.1111/cgf.70133},booktitle={Computer Graphics Forum (EuroVis Special Issue)},keywords={running, immersive analytics, mixed reality},location={Luxembourg City, Luxembourg},series={EuroVis '25},dimensions={true},}
2023
GestureExplorer: Immersive Visualisation and Exploration of Gesture Data
Ang Li, Jiazhou Liu, Maxime Cordeil, Jack Topliss, Thammathip Piumsomboon, and Barrett Ens
In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 2023
This paper presents the design and evaluation of GestureExplorer, an Immersive Analytics tool that supports interactive exploration, classification, and sensemaking with large sets of 3D temporal gesture data. GestureExplorer features 3D skeletal and trajectory visualisations of gestures combined with abstract visualisations of clustered sets of gestures. By leveraging the large immersive space afforded by a Virtual Reality interface, our tool allows free navigation and control of the viewing perspective, so users can gain a better understanding of gestures. We explored a selection of classification methods to provide an overview of the dataset, linked to a detailed view of the data that offered different visualisation modalities. We evaluated GestureExplorer in two user studies and collected feedback from participants with diverse visualisation and analytics backgrounds. Our results demonstrate the promising capability of GestureExplorer to provide a useful and engaging experience for exploring and analysing gesture data.
@inproceedings{10.1145/3544548.3580678,author={Li, Ang and Liu, Jiazhou and Cordeil, Maxime and Topliss, Jack and Piumsomboon, Thammathip and Ens, Barrett},title={GestureExplorer: Immersive Visualisation and Exploration of Gesture Data},year={2023},isbn={9781450394215},publisher={Association for Computing Machinery},address={New York, NY, USA},url={https://doi.org/10.1145/3544548.3580678},doi={10.1145/3544548.3580678},booktitle={Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems},articleno={380},numpages={16},keywords={gesture elicitation study, immersive analytics, virtual reality},location={Hamburg, Germany},series={CHI '23},dimensions={true},video={https://youtu.be/_5Y5uh3a_pE},}
A Visual Survey of In-Situ Running Analytics
Ang Li, Stephen Viller, Gianluca Demartini, and Maxime Cordeil
In IEEE VIS 2023 Posters, 2023
@inproceedings{123456,author={Li, Ang and Viller, Stephen and Demartini, Gianluca and Cordeil, Maxime},title={A Visual Survey of In-Situ Running Analytics},year={2023},publisher={IEEE},booktitle={IEEE VIS 2023 Posters},keywords={jogging, immersive analytics, survey},location={Melbourne, Australia},series={VIS '23},dimensions={true},}
2022
Initial Evaluation of Immersive Gesture Exploration with GestureExplorer
Ang Li, Jiazhou Liu, Max Cordeil, and Barrett Ens
In 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 2022
@inproceedings{9757455,author={Li, Ang and Liu, Jiazhou and Cordeil, Max and Ens, Barrett},booktitle={2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},title={Initial Evaluation of Immersive Gesture Exploration with GestureExplorer},year={2022},pages={580--581},doi={10.1109/VRW55335.2022.00141},dimensions={true},}
Demonstrating Immersive Gesture Exploration with GestureExplorer (Honourable Mention Award)
Ang Li, Jiazhou Liu, Max Cordeil, and Barrett Ens
In 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 2022
@inproceedings{9757612,author={Li, Ang and Liu, Jiazhou and Cordeil, Max and Ens, Barrett},booktitle={2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)},title={Demonstrating Immersive Gesture Exploration with GestureExplorer},note={Honourable Mention Award},year={2022},pages={980--981},doi={10.1109/VRW55335.2022.00341},dimensions={true},}