Contact person: Eric Blaudez, (eric.blaudez@thalesgroup.com)
Internal Partners:
- Thales, Eric Blaudez, eric.blaudez@thalesgroup.com
- Unibo, Paolo Torroni, p.torroni@unibo.it
- CNRS
External Partners:
- LISN, Christophe Servan, c.servan@qwant.com
The micro-project provides a demonstration of the hierarchical framework for collaboration described in the Humane-AI Net revised strategic work plan, by constructing a multimodal and multilingual conversational agent focused on search. The framework is based on hierarchical levels of abilities:
- Reactive (sensori-motor) Interaction: Interaction is tightly coupled perception-action, where the actions of one agent are immediately sensed and interpreted as actions by the other. Examples include greetings, polite conversation, and emotional mirroring.
- Situated (Spatio-temporal) Interaction: Interactions are mediated by a shared model of objects and relations (states) and by shared models of roles and interaction protocols.
In this micro-project, we focused on the first two levels (Reactive and Situated) and designed the overall framework architecture as a Proof of Concept (PoC).
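The two-level architecture above can be sketched in code. The following is a minimal illustrative sketch, not the project's actual implementation: all class and method names (`ReactiveLayer`, `SituatedLayer`, `Agent.handle`) are hypothetical, chosen only to show how a reactive reflex table can be tried first, with a fall-back to a shared situated state model.

```python
class ReactiveLayer:
    """Tightly coupled perception-action: a cue maps directly to a response."""
    def __init__(self):
        # Hypothetical reflex table for greetings / polite conversation.
        self.reflexes = {"hello": "Hello!", "thanks": "You're welcome."}

    def react(self, cue):
        # Return an immediate response, or None if no reflex applies.
        return self.reflexes.get(cue)


class SituatedLayer:
    """Interaction mediated by a shared model of objects, relations, and roles."""
    def __init__(self):
        self.state = {}                      # shared object/relation model
        self.roles = {"user": "asker", "agent": "answerer"}

    def observe(self, obj, relation):
        # Update the shared spatio-temporal model.
        self.state[obj] = relation

    def resolve(self, query):
        # Answer from the shared state; "unknown" if the object was never observed.
        return self.state.get(query, "unknown")


class Agent:
    """Try the reactive level first; fall back to the situated level."""
    def __init__(self):
        self.reactive = ReactiveLayer()
        self.situated = SituatedLayer()

    def handle(self, cue):
        response = self.reactive.react(cue)
        if response is not None:
            return response
        return self.situated.resolve(cue)
```

For example, a greeting is answered reflexively, while a question about an observed object is resolved through the shared state model.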
Results Summary
We show that the proposed approach provides high-quality semantic segmentation from the robot's perspective, with accuracy comparable to that of the original viewpoint. In addition, we exploited the gained information to improve the recognition performance of the deep network at lower viewpoints, and showed that the small robot alone is capable of generating high-quality semantic maps for the human partner. The computations run close to real time, so the approach enables interactive applications.
Tangible Outcomes
- T-KEIR: https://github.com/ThalesGroup/t-keir
- erc-unibo-module: https://github.com/helemanc/erc-unibo-module