Adapting user interfaces (UIs) requires accounting for both the positive and negative effects that changes may have on the user. A carelessly chosen adaptation can impose high costs, for example due to surprise or relearning effort. It is also essential to consider individual differences, as the effect of an adaptation depends on the user's strategies, e.g. how each user searches for information in a UI. This micro-project extends an earlier collaboration between partners on model-based reinforcement learning for adaptive UIs by developing methods that account for individual differences. We first develop computational models to explain and predict users' visual search and pointing strategies when searching within a UI. We then apply these models to infer user strategies from interaction history and adapt UIs accordingly. The outcomes of this project will be (1) a publication at the ACM CHI conference and (2) integration into our platform for adaptive UIs.
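To illustrate the kind of inference described above, the sketch below shows one simple way a system might identify a user's search strategy from logged interaction history. All names, strategy models, and parameters here are illustrative assumptions, not the project's actual models: two candidate strategies (serial top-down scanning vs. memory-based recall) each predict selection times, and the strategy whose predictions best fit the log is selected by likelihood comparison.

```python
import math

# Hypothetical sketch (models and parameters are illustrative, not the
# project's): infer which search strategy a user follows from observed
# selection times by comparing the likelihood of each candidate model.

def serial_search_time(position, n, t_item=0.3):
    # Serial search: scan items top-down, so time grows with item position.
    return t_item * (position + 1)

def memory_search_time(position, n, t_fixed=0.6):
    # Memory-based search: the user recalls the item location, so time is
    # roughly constant regardless of position.
    return t_fixed

def gaussian_log_likelihood(observed, predicted, sigma=0.2):
    # Log-likelihood of an observed time under a Gaussian noise model.
    return (-0.5 * math.log(2 * math.pi * sigma ** 2)
            - (observed - predicted) ** 2 / (2 * sigma ** 2))

def infer_strategy(history, n):
    # history: list of (item_position, observed_time_in_seconds) pairs
    # taken from the interaction log of a menu with n items.
    models = {"serial": serial_search_time, "memory": memory_search_time}
    scores = {
        name: sum(gaussian_log_likelihood(t, model(pos, n))
                  for pos, t in history)
        for name, model in models.items()
    }
    return max(scores, key=scores.get), scores

# A user whose selection times grow with item position is best explained
# by the serial-search model.
history = [(0, 0.35), (3, 1.1), (6, 2.2), (1, 0.7)]
strategy, _ = infer_strategy(history, n=8)
```

An adaptive UI could then condition its adaptation policy on the inferred strategy, e.g. keeping item positions stable for users who appear to rely on spatial memory.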


  • A model of visual search and pointing in menus; the code will be available on GitHub.
  • Integration of the model into our platform for adaptive UIs; the code will be available on GitHub.
  • A demo of the system, available online.
  • A publication at the ACM CHI conference.


Project Partners:

  • Sorbonne Université, Gilles Bailly
  • Aalto University, Kashyap Todi


Primary Contact: Gilles Bailly, Sorbonne Université, CNRS, ISIR

Main results of micro project:

This micro-project reinforces the collaboration between Sorbonne Université, Aalto University, and the University of Luxembourg through weekly meetings. It aims at elaborating computational models of visual search in adaptive user interfaces. We defined different visual search strategies in adaptive menus, as well as promising interactive mechanisms for rethinking how menus are designed. The elaboration of the model is in progress.

Contribution to the objectives of HumaneAI-net WPs

The micro-project aims at empowering humans with advanced user interfaces that can adapt to the user's goals. More precisely, it aims at designing user interfaces that increase usability and allow humans and machines to collaborate more effectively. We plan to integrate our model into our platform for adaptive UIs and to make a demo of the system available online.

Tangible outputs