Research at the intersection of artificial intelligence (AI) and extended reality (XR) has produced a substantial body of literature over the past 20 years. Applications cover a broad spectrum, from visualising neural networks in virtual reality to interacting with conversational agents. However, a systematic overview is currently missing.

This micro-project addresses this gap with a scoping review covering two main objectives: First, it aims to give an overview of the research conducted at the intersection of AI and XR. Second, we are particularly interested in revealing how XR can be used to improve interactive grounding in human-AI interaction. In summary, the review focuses on the following guiding questions: What are the typical AI methods used in XR research? What are the main use cases at the intersection of AI and XR? How can XR serve as a tool to enhance interactive grounding in human-AI interaction?


Conference or journal paper co-authored by the proposers (possibly with external partners)

Dataset of the papers including codes

Project Partners:

  • Københavns Universitet (UCPH), Teresa Hirzle
  • Københavns Universitet (UCPH), Kasper Hornbæk
  • Ludwig-Maximilians-Universität München (LMU), Florian Müller


Primary Contact: Teresa Hirzle, University of Copenhagen, Department of Computer Science

Results Description

We conducted a scoping review covering 311 papers published between 2017 and 2021. First, we screened 2619 publications from 203 venues to cover the broad spectrum of XR and AI research. For the search, we inductively built a set of XR and AI terms. The venues include research from XR, AI, Human-Computer Interaction, Computer Graphics, Computer Vision, and others. After a two-phase screening process, we reviewed and extracted data from 311 full papers based on a codebook with 26 codes about the research direction, contribution, and topics of the papers, as well as the algorithms, tools, datasets, models, and data types the researchers used to address research questions on XR and AI. The extracted data for these codes form the basis for our predominantly narrative synthesis. As a result, we found five main topics at the intersection of XR and AI: (1) Using AI to create XR worlds (28.6%), (2) Using AI to understand users (19.3%), (3) Using AI to support interaction (15.4%), (4) Investigating interaction with intelligent virtual agents (IVAs) (8.0%), and (5) Using XR to support AI research (2.3%). The remaining 23.8% of the papers apply XR and AI to an external problem, such as medical training applications (3.5%) or simulation purposes (3.0%). Finally, we summarise our findings in 13 research opportunities and present ideas and recommendations for how to address them in future work. Some of the most pressing issues are a lack of generative use of AI to create worlds, understand users, and enhance interaction; a lack of generalisability and robustness; and a lack of discussion about ethical and societal implications.
In terms of the call topics, we analysed whether XR can serve as a tool to establish and enhance interactive grounding in human-AI interaction. Here, we found that there is a lack of understanding of user experience during human-AI interaction using XR technology. Typically, AI is used for content creation and to enhance interaction techniques. However, we did not find many papers that use XR to support human-AI interaction. Some works look into artificial agents and how interaction with them can be realised through XR, but most of these do not yet run in real time and are largely based on mock-up scenes.
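As a minimal illustration of the kind of aggregation behind the topic shares reported above, the following sketch tallies topic codes from a coding spreadsheet. The code labels and counts here are hypothetical stand-ins, not the actual codebook entries or per-paper data from the review.

```python
from collections import Counter

# Hypothetical per-paper topic codes (one code per reviewed paper);
# the labels and counts are illustrative only.
topic_codes = (
    ["create_xr_worlds"] * 89       # e.g. Using AI to create XR worlds
    + ["understand_users"] * 60     # e.g. Using AI to understand users
    + ["support_interaction"] * 48  # e.g. Using AI to support interaction
)

counts = Counter(topic_codes)
total = len(topic_codes)
for topic, n in counts.most_common():
    print(f"{topic}: {n} papers ({100 * n / total:.1f}%)")
```

In the actual review, such tallies were computed over all 26 codes and 311 papers to produce the percentages in the narrative synthesis.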


Teresa Hirzle, Florian Müller, Fiona Draxler, Martin Schmitz, Pascal Knierim, and Kasper Hornbæk. 2023. When XR and AI Meet – A Scoping Review on Extended Reality and Artificial Intelligence. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), April 23–28, 2023, Hamburg, Germany. ACM, New York, NY, USA, 45 pages.

Links to Tangible results

Reviewed Papers and Coding Spreadsheet:
Videos: There will be talk videos at a later stage.