We propose to research how autobiographical recall can be detected in virtual reality (VR). Specifically, we experimentally investigate which physiological parameters accompany interaction with autobiographical memories in VR. We consider VR an important setting for Human-AI collaboration.

For this, we plan to (1) record an EEG dataset of people’s reactions and responses when recalling an autobiographical memory, (2) label the dataset, and (3) perform an initial analysis of the dataset to inform the design of autobiographical VR experiences. We will automate data collection as much as possible so that more data can easily be added over time.
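Labeling such a dataset typically means cutting the continuous EEG recording into fixed windows around event markers sent by the VR application (e.g., the moment a memory cue is shown). The sketch below is a minimal, hypothetical illustration of that step; the function name, window lengths, and the 256 Hz sampling rate are assumptions for illustration, not the project's actual pipeline (which could equally use a library such as MNE-Python).

```python
from dataclasses import dataclass
from typing import List, Tuple

SAMPLE_RATE = 256  # Hz; an assumed, common EEG sampling rate


@dataclass
class Epoch:
    """A labeled window of samples around one recall event."""
    label: str
    samples: List[float]


def extract_epochs(signal: List[float],
                   events: List[Tuple[int, str]],
                   pre_s: float = 0.5,
                   post_s: float = 2.0,
                   rate: int = SAMPLE_RATE) -> List[Epoch]:
    """Cut fixed windows around event markers.

    signal: continuous 1-D sample sequence from one EEG channel
    events: (sample_index, label) markers, e.g., logged by the VR app
    pre_s / post_s: window extent before/after each marker, in seconds
    """
    pre, post = int(pre_s * rate), int(post_s * rate)
    epochs = []
    for idx, label in events:
        start, end = idx - pre, idx + post
        if start < 0 or end > len(signal):
            continue  # skip markers too close to the recording edges
        epochs.append(Epoch(label=label, samples=signal[start:end]))
    return epochs
```

Keeping the labeling step this mechanical is what makes automated data collection feasible: each new session only needs a signal file and a marker log to produce comparable labeled epochs.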

This will contribute to a longer-term effort in model and theory formation. The main contribution is to WP3. The work is set in Task 3.2 (Human-AI interaction/collaboration paradigms) and aims at a better understanding of user emotion in VR in order to model self-relevance in AI collaboration (Task 3.4).

Output

  • A dataset on autobiographical recall in VR
  • A manuscript describing the dataset and initial insights into autobiographical recall in VR
  • Presentations

Project Partners:

  • Ludwig-Maximilians-Universität München (LMU), Albrecht Schmidt
  • German Research Centre for Artificial Intelligence (DFKI), Paul Lukowicz and Patrick Gebhard

Primary Contact: Albrecht Schmidt, Ludwig-Maximilians-Universität München

Main results of micro project:

We have developed VR experiences for research on autobiographical recall in virtual reality (VR). These allow us to experimentally investigate which physiological parameters accompany self-relevant memories elicited by digital content. We have piloted the experiment and are currently recording more data on the recall of autobiographical memories. Once data collection is complete, we will label the dataset and perform an initial analysis to inform the design of autobiographical VR experiences. We have also co-hosted a workshop on AI and human memory.

Contribution to the objectives of HumaneAI-net WPs

The main contribution is to WP3. The work is set in Task 3.2 (Human-AI interaction/collaboration paradigms) and aims at a better understanding of user emotion in VR in order to model self-relevance in AI collaboration (Task 3.4). The VR experience is implemented in Unity, and we are happy to share it in the context of a joint project.

Tangible outputs