Integral immersive analytics for cancer biology data

This project develops integral immersive analytics methods, combining visualisation and interaction techniques to support effective deep analysis and exploration of data. The project uses extended reality (XR), including virtual reality (VR), augmented reality (AR), and mixed reality (MR), to focus on specific genetic and biological information in large cohorts of young people with cancer, in an effort to identify unique traits and compare patients of interest to learn which treatments are effective.

D. R. Catchpoole, S. J. Simoff, P. J. Kennedy, and Q. V. Nguyen, Data Driven Science for Clinically Actionable Knowledge in Diseases. CRC Press, 2024.

C. W. Lau, Z. Qu, D. Draper, R. Quan, A. Braytee, A. Bluff, D. Zhang, A. Johnston, P. J. Kennedy, S. Simoff, Q. V. Nguyen, and D. Catchpoole, “Virtual reality for the observation of oncology models (VROOM): immersive analytics for oncology patient cohorts,” Scientific Reports, vol. 12, no. 1, p. 11337, 2022, doi: 10.1038/s41598-022-15548-1.

Q. V. Nguyen, N. H. Khalifa, P. Alzamora, A. Gleeson, D. Catchpoole, P. J. Kennedy, and S. Simoff, “Visual Analytics of Complex Genomics Data to Guide Effective Treatment Decisions,” Journal of Imaging, vol. 2, no. 4, pp. 1-17, 2016, doi: 10.3390/jimaging2040029.
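
To make the kind of processing behind such a cohort view concrete, here is a minimal sketch, assuming a synthetic patient-by-gene expression matrix; the data, feature counts, and neighbour comparison are illustrative, not the VROOM pipeline. It reduces each patient's profile to 3D coordinates that an XR scene could place in space and retrieves the most similar patients for comparison.

```python
# Minimal sketch: project a patient-by-gene expression matrix into 3D
# coordinates suitable for placing patient representations in an XR scene.
# The random data and the nearest-neighbour comparison are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
expression = rng.normal(size=(100, 2000))   # 100 patients x 2000 genes (synthetic)

# Reduce to 3 dimensions so each patient gets an (x, y, z) position.
coords = PCA(n_components=3).fit_transform(expression)

# Find the most similar patients to a patient of interest, e.g. to
# compare treatments given to genomically similar cases.
nn = NearestNeighbors(n_neighbors=6).fit(coords)
_, idx = nn.kneighbors(coords[[0]])          # patient 0 and its 5 neighbours
print("Patients most similar to patient 0:", idx[0][1:])
```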


Interpretability and trust for cancer biology data analytics

This project aims to make complex computational models, such as deep learning, more transparent and interpretable through user-centric visualisations, reducing the learning curve and improving domain users' adoption of and trust in computational models and processes. The project also focuses on generating interpretable presentations and visualisations for domain specialists.

Z. Qu, Y. Tegegne, S. J. Simoff, P. J. Kennedy, D. R. Catchpoole, and Q. V. Nguyen, “Enhancing Understandability of Omics Data with SHAP, Embedding Projections and Interactive Visualisations,” in Data Mining, Sydney, Australia: Springer Nature Singapore, 2022, pp. 58-72.

Z. Qu, Q. V. Nguyen, Y. Zhou, and D. R. Catchpoole, “Using Visualization to Illustrate Machine Learning Models for Genomic Data,” in Proceedings of the Australasian Computer Science Week Multiconference (ACSW 2019), Sydney, NSW, Australia, 2019.
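
A minimal sketch of the idea named in the first reference above, pairing SHAP attributions with an embedding projection; the synthetic data, the model choice, and the t-SNE projection are assumptions for illustration rather than the published pipeline.

```python
# Minimal sketch: explain a classifier on synthetic omics-like data with SHAP,
# then project the samples to 2D so attributions can be explored visually.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))               # 200 samples x 50 features (synthetic)
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # label driven by two features

model = RandomForestClassifier(n_estimators=100, random_state=1).fit(X, y)

# Per-sample, per-feature attributions for the positive class.
sv = shap.TreeExplainer(model).shap_values(X)
# Depending on the shap version, sv is a list per class or a 3D array;
# take the attributions for class 1 either way.
sv = sv[1] if isinstance(sv, list) else sv[..., 1]

# 2D projection of the samples; in an interactive view, colouring points
# by the SHAP value of a chosen feature links the embedding to the model.
embedding = TSNE(n_components=2, random_state=1).fit_transform(X)
print(embedding.shape, sv.shape)             # (200, 2) (200, 50)
```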

Biological and clinical domain user behaviour and user studies in immersive environments

This project studies the intricacies of clinical decisions around oncology data and interconnected technologies and translates them into holistic visualisations for an immersive system, engaging domain experts and clinicians in domain-specific narratives and empirical experiments.

A. Gronowski, D. Arness, J. Ng, Z. Qu, C. W. Lau, D. Catchpoole, and Q. V. Nguyen, “The Impact of Presence on User Experience and Performance in Virtual and Augmented Reality,” Virtual Reality, vol. 28, no. 3, p. 133, 2024, doi: 10.1007/s10055-024-01032-w.

J. Ng, D. Arness, A. Gronowski, Z. Qu, C. W. Lau, D. Catchpoole, and Q. V. Nguyen, “Exocentric and Egocentric Views for Biomedical Data Analytics in Virtual Environments—A Usability Study,” Journal of Imaging, vol. 10, no. 1, 2024, doi: 10.3390/jimaging10010003.

Z. Qu, Q. V. Nguyen, C. W. Lau, A. Johnston, P. J. Kennedy, S. Simoff, and D. Catchpoole, “Understanding Cancer Patient Cohorts in Virtual Reality Environment for Better Clinical Decisions: A Usability Study,” BMC Medical Informatics and Decision Making, 2023.

Analytical models and framework for cancer biology data analytics

This project develops new analytical models and frameworks that connect immersive analytics components with interaction for analysing large and complex information in real time, focusing on 3D extended reality (XR), including virtual reality (VR), augmented reality (AR), and mixed reality (MR).

Federated omniverse facilities for smart digital futures

Together with Macquarie University, The University of Sydney, The University of Newcastle, University of Wollongong, and CSIRO, our goal is to develop AuVerse into a federated, cross-disciplinary, cross-domain smart omniverse research network. The project will enable future-oriented facilities for designing, intelligentising, integrating, governing and showcasing smart metaverse devices, platforms, technologies, capabilities, and case studies. AuVerse will include infrastructure and facilities for: i) designing omniverse systems, such as immersive omniverse platforms and digital twins, 3D modelling and interaction, interactive XR and HCI, and visualisation; ii) intelligentising the smart omniverse, such as AI, analytics and deep learning-oriented storage, computing, analytics and learning, and their conversation, interaction, visualisation and automation in a large-scale, cloud-based, automated visual environment; and iii) integrating omniverse ecotech, such as cloud/edge computing, networking, and communication between AuVerse Central and nodes and between nodes.

The AuVerse node at Western Sydney University provides devices and facilities for 3D creation, design, mapping, modelling and reconstruction, and large-scale visualisation. This will facilitate the future design, development, and demonstration of omniverse technologies and of AuVerse facilities and functions in medical data analytics and other applications.

Intelligent assistance with game theory

This project focuses on enhancing decision-making and intuitive interaction in immersive environments using game theory. By mimicking a real-world situation in which multiple professionals hold differing opinions, game theory can help guide them while maximising their payoffs. This will facilitate decision-making processes that identify which options are preferred for a particular scenario with the given information.

C. W. Lau, D. Catchpoole, S. Simoff, D. Zhang, and Q. V. Nguyen, “A Game-Theoretical Approach to Clinical Decision Making with Immersive Visualisation,” Applied Sciences, vol. 13, no. 18, p. 10178, 2023, doi: 10.3390/app131810178.
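
A minimal sketch of the underlying game-theoretic idea, assuming an invented two-clinician, two-treatment payoff matrix (the numbers and option names are not taken from the paper): each professional's preferences become payoffs, and a pure Nash equilibrium marks a joint choice that neither would unilaterally abandon.

```python
# Minimal sketch: two clinicians each choose between two treatment options; a
# pure Nash equilibrium is a pair of choices where neither gains by switching.
# The payoff numbers are invented for illustration.
import numpy as np

options = ["chemo", "immunotherapy"]
# payoff_a[i, j]: clinician A's payoff when A picks option i and B picks j.
payoff_a = np.array([[3, 1],
                     [2, 4]])
payoff_b = np.array([[3, 2],
                     [1, 4]])

equilibria = []
for i in range(2):
    for j in range(2):
        a_best = payoff_a[i, j] >= payoff_a[1 - i, j]   # A cannot improve
        b_best = payoff_b[i, j] >= payoff_b[i, 1 - j]   # B cannot improve
        if a_best and b_best:
            equilibria.append((options[i], options[j]))

print("Stable joint recommendations:", equilibria)
# [('chemo', 'chemo'), ('immunotherapy', 'immunotherapy')]
```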

Intelligent interaction in immersive environment

Voice-command interfaces are necessary in immersive environments when hand interaction cannot be used, for example because of domain specialists' unfamiliarity with interaction devices and the cognitive overhead those devices impose. This project explores immersive technology combined with generative-AI voice commands, aiming to create an intuitive and adaptable interface through which domain specialists can interact intelligently.
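
As a rough illustration of the interaction loop, the sketch below maps a transcribed phrase to a scene action; the command vocabulary and action names are invented, and a simple keyword table stands in for the speech-to-text and generative-AI components of the real system.

```python
# Minimal sketch: map a transcribed voice phrase to an immersive-scene action.
# A real system would use a speech-to-text engine plus a generative model for
# intent parsing; here a keyword table stands in for both (all names invented).
from typing import Optional

COMMANDS = {
    "zoom in": "camera.zoom(+1)",
    "zoom out": "camera.zoom(-1)",
    "show cohort": "scene.show_layer('cohort')",
    "compare patients": "scene.open_comparison_panel()",
}

def parse_command(transcript: str) -> Optional[str]:
    """Return the scene action whose trigger phrase appears in the transcript."""
    text = transcript.lower()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None  # no match: fall back to asking the user to rephrase

print(parse_command("Could you zoom in on that cluster?"))     # camera.zoom(+1)
print(parse_command("please compare patients five and nine"))  # scene.open_comparison_panel()
```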

Flow Cytometry data analytics and visualisation

In collaboration with the J. Craig Venter Institute, USA, this project develops machine learning, gating, and interactive visualisation methods to support the diagnosis of cancer cases using flow cytometry data.

Y. Tegegne, Z. Qu, Y. Qian, and Q. V. Nguyen, “Parallel Nonlinear Dimensionality Reduction Using GPU Acceleration,” in 19th Australasian Conference on Data Mining (AusDM 2021), Brisbane, Australia, 2021, pp. 3-15.
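
A minimal sketch of the gating step on synthetic data; the channel names, distributions, and thresholds are invented, whereas real gates are drawn by experts or learned from labelled samples.

```python
# Minimal sketch: rectangular gating on two flow-cytometry channels.
# Channel names and thresholds are invented for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
events = pd.DataFrame({
    "CD45": rng.lognormal(mean=2.0, sigma=0.6, size=10_000),
    "SSC":  rng.lognormal(mean=3.0, sigma=0.5, size=10_000),
})

# Keep events inside the gate: CD45-bright, low side scatter.
gate = events["CD45"].between(8, 40) & (events["SSC"] < 20)
population = events[gate]

print(f"{gate.mean():.1%} of events fall inside the gate "
      f"({len(population)} cells)")
```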

Human perception on charts on geographical maps

Geographic visualisation is an effective way to display quantitative information on a map by exploiting the familiarity of geographical context. Overlaying visual cues, such as glyphs or charts, provides further information about individual items thanks to their visual compactness. This project studies human perception of geographic visualisation to evaluate the effectiveness of, and user preference for, charts on maps for multi-dimensional data. The results will inform future designs, ensuring effective learning and perception, and may encourage the adoption of geographic visualisation for data analysis.
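
A minimal sketch of the kind of stimulus under study, drawing a small bar-chart glyph at each geographic location with matplotlib; the cities, coordinates, and values are invented, and a real study would render over an actual basemap.

```python
# Minimal sketch: overlay small bar-chart glyphs on a map-like scatter of
# city coordinates. Locations and values are invented for illustration.
import matplotlib.pyplot as plt

cities = {  # (longitude, latitude), three data dimensions per city
    "Sydney":    ((151.2, -33.9), [4, 7, 2]),
    "Melbourne": ((145.0, -37.8), [6, 3, 5]),
    "Brisbane":  ((153.0, -27.5), [2, 5, 6]),
}

fig, ax = plt.subplots()
for name, ((lon, lat), values) in cities.items():
    # Each glyph is a tiny set of bars centred on the city's position.
    for k, v in enumerate(values):
        ax.bar(lon + 0.3 * (k - 1), v * 0.1, width=0.25, bottom=lat,
               color=f"C{k}")
    ax.annotate(name, (lon, lat), textcoords="offset points",
                xytext=(0, -12), ha="center")
ax.set_xlabel("longitude")
ax.set_ylabel("latitude")
plt.show()
```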

Information Visualisation

This research includes various projects that develop methods, models and tools in the information visualisation domain, such as i) a visual analytics tool for tabular data, ii) relational data visualisation, and iii) diagrammatic visualisation for multi-dimensional port data.

Q. V. Nguyen, N. Miller, D. Arness, W. Huang, M. L. Huang, and S. Simoff, “Evaluation on interactive visualization data with scatterplots,” Visual Informatics, vol. 4, no. 4, pp. 1-10, 2020, doi: 10.1016/j.visinf.2020.09.004.

Q. V. Nguyen, D. Arness, C. J. Sanderson, S. Simoff, and M. L. Huang, “Enabling Effective Tree Exploration Using Visual Cues,” Journal of Visual Languages and Computing, vol. 47, pp. 44-61, 2018, doi: 10.1016/j.jvlc.2018.06.001.

Q. V. Nguyen, K. Zhang, and S. Simoff, “Unlocking the Complexity of Port Data With Visualization,” IEEE Transactions on Human-Machine Systems, vol. 45, no. 2, pp. 272-279, 2015, doi: 10.1109/thms.2014.2369375.