Contact person: Teresa Hirzle (tehi@di.ku.dk)

Internal Partners:

  1. UCPH, Teresa Hirzle, kash@di.ku.dk
  2. LMU, Florian Müller, albrecht.schmidt@ifi.lmu.de

External Partners:

  1. Saarland University, Martin Schmitz
  2. Universität Innsbruck, Pascal Knierim  

Research on Extended Reality (XR) and Artificial Intelligence (AI) is booming, which has led to an emerging body of literature at their intersection. However, the main topics in this intersection are unclear, as are the benefits of combining XR and AI. This paper presents a scoping review that highlights how XR is applied in AI research and vice versa. We screened 2619 publications from 203 international venues published between 2017 and 2021, followed by an in-depth review of 311 papers. Based on our review, we identify five main topics at the intersection of XR and AI and show how the two fields can benefit each other. Furthermore, we present a list of commonly used datasets, software, libraries, and models to help researchers interested in this intersection. Finally, we present 13 research opportunities and recommendations for future work in XR and AI research.

Results Summary

We conducted a scoping review covering 311 papers published between 2017 and 2021.

First, we screened 2619 publications from 203 venues to cover the broad spectrum of XR and AI research. For the search, we inductively built a set of XR and AI terms. The venues include research from XR, AI, Human-Computer Interaction, Computer Graphics, Computer Vision, and other fields. After a two-phase screening process, we reviewed and extracted data from 311 full papers based on a codebook with 26 codes covering the research direction, contribution, and topics of the papers, as well as the algorithms, tools, datasets, models, and data types the researchers used to address research questions on XR and AI. The extracted data for these codes form the basis for our predominantly narrative synthesis. As a result, we found five main topics at the intersection of XR and AI: (1) Using AI to create XR worlds (28.6%), (2) Using AI to understand users (19.3%), (3) Using AI to support interaction (15.4%), (4) Investigating interaction with intelligent virtual agents (IVAs) (8.0%), and (5) Using XR to support AI research (2.3%).

The remaining 23.8% of the papers apply XR and AI to an external problem, such as medical training applications (3.5%) or simulation purposes (3.0%). Finally, we summarise our findings in 13 research opportunities and present ideas and recommendations for how to address them in future work. Some of the most pressing issues are a lack of generative use of AI to create worlds, understand users, and enhance interaction; a lack of generalisability and robustness; and a lack of discussion of ethical and societal implications.

In terms of the call topics, we analysed whether XR can serve as a tool to establish and enhance interactive grounding in human-AI interaction. Here, we found a lack of understanding of user experience during human-AI interaction with XR technology. Typically, AI is used for content creation and to enhance interaction techniques; we found few papers that use XR to support human-AI interaction. Some works investigate artificial agents and how interaction with them can be realised through XR, but most of these do not yet run in real time and are largely based on mock-up scenes.
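As an illustration of the keyword-based search described above, the sketch below filters a list of publications by whether their title or abstract mentions both an XR and an AI term within the 2017-2021 window. The term lists and the Publication structure are hypothetical placeholders for illustration only; they are not the actual query set or tooling used in the review, where the two-phase screening and the 26-code extraction were carried out by the reviewers.

    # Minimal sketch of a keyword-based retrieval step (illustrative only).
    # XR_TERMS, AI_TERMS, and Publication are hypothetical placeholders,
    # not the actual term set or data model used in the scoping review.
    from __future__ import annotations
    from dataclasses import dataclass

    # Hypothetical excerpts of inductively built XR and AI term sets.
    XR_TERMS = {"extended reality", "virtual reality", "augmented reality", "mixed reality"}
    AI_TERMS = {"artificial intelligence", "machine learning", "deep learning", "neural network"}

    @dataclass
    class Publication:
        title: str
        abstract: str
        venue: str
        year: int

    def matches(pub: Publication, terms: set[str]) -> bool:
        """True if any term occurs in the title or abstract (case-insensitive)."""
        text = f"{pub.title} {pub.abstract}".lower()
        return any(term in text for term in terms)

    def candidate_publications(pubs: list[Publication]) -> list[Publication]:
        """Keep 2017-2021 papers that mention both an XR term and an AI term."""
        return [
            p for p in pubs
            if 2017 <= p.year <= 2021 and matches(p, XR_TERMS) and matches(p, AI_TERMS)
        ]

    if __name__ == "__main__":
        pubs = [
            Publication("Neural rendering for virtual reality scenes", "...", "ExampleConf", 2020),
            Publication("A survey of database indexing", "...", "ExampleConf", 2019),
        ]
        print(len(candidate_publications(pubs)))  # -> 1

Such term matching would only produce candidate papers; the two-phase screening and the codebook-based data extraction described above remain manual review steps.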

Tangible Outcomes

  1. Teresa Hirzle, Florian Müller, Fiona Draxler, Martin Schmitz, Pascal Knierim, and Kasper Hornbæk. 2023. When XR and AI Meet – A Scoping Review on Extended Reality and Artificial Intelligence. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), April 23–28, 2023, Hamburg, Germany. ACM, New York, NY, USA, 45 pages. https://doi.org/10.1145/3544548.3581072 https://thirzle.com/pdf/chi23_xrai_scoping_review_hirzle.pdf
  2. Reviewed Papers and Coding Spreadsheet: https://thirzle.com/supplement/chi23_xrai_scoping_review_hirzle.zip 
  3. CHI’23 conference presentation: https://youtu.be/VDg-2Pz9lj8?feature=shared