Intention recognition for multiple agents

Zhang Zhang, Yifeng Zeng*, Wenhui Jiang, Yinghui Pan*, Jing Tang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Discovering the common intentions of multiple agents is an important way to detect the tendency of their collaborative behaviours. Existing work mainly focuses on intention recognition in a single-agent setting and uses a descriptive model, e.g. Bayesian networks, in the recognition process. In this article, we develop a new approach to identifying intentions for multiple agents by analysing their behaviours over time. We first define a prescriptive, behavioural model for a single agent that represents the agent's behaviours, with intentions hidden in the plan execution. We introduce landmarks into the behavioural model, thereby enhancing the informative features used to identify common intentions among multiple agents. Subsequently, we refine the model by focusing only on the action sequences in agents' plans, yielding a light model for identifying and comparing their intentions. The new model provides a simple approach to grouping agents' common intentions from the partial plans observed in agents' interactions. We then transform intention recognition into an unsupervised learning problem and adapt a clustering algorithm to group the intentions of multiple agents by comparing their behavioural models. The clustering process measures the similarity of probability distributions over potential landmarks in the behavioural models so as to discover agents' common intentions. Finally, we examine the new intention recognition approaches in two problem domains. We demonstrate the importance of recognising the common intentions of multiple agents in achieving their goals and provide experimental results to show the performance of the new approaches.
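The clustering step described in the abstract can be sketched in a few lines of code. The snippet below is illustrative only, not the paper's actual method: it assumes each agent's behavioural model yields a probability distribution over a fixed set of candidate landmarks, measures pairwise similarity with the Jensen-Shannon divergence (one common choice for comparing distributions; the paper's similarity measure and clustering algorithm may differ), and greedily groups agents whose distributions fall within a threshold. The distribution values and threshold are made-up examples.

```python
import math

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions (base 2)."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    def kl(a, b):
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def group_agents(dists, threshold=0.1):
    """Greedily group agents whose landmark distributions are close.

    Each cluster is represented by its first member's distribution; an agent
    joins the first cluster whose representative is within the threshold.
    """
    clusters = []  # each cluster is a list of agent indices
    for i, d in enumerate(dists):
        for cluster in clusters:
            if js_divergence(d, dists[cluster[0]]) <= threshold:
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Hypothetical landmark distributions for four agents over three landmarks.
agent_dists = [
    [0.70, 0.20, 0.10],
    [0.65, 0.25, 0.10],
    [0.10, 0.20, 0.70],
    [0.15, 0.15, 0.70],
]
print(group_agents(agent_dists))  # → [[0, 1], [2, 3]]
```

Agents 0 and 1 concentrate probability on the first landmark and agents 2 and 3 on the third, so the greedy pass recovers two groups, each interpreted as a shared intention.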

Original language: English
Pages (from-to): 360-376
Number of pages: 17
Journal: Information Sciences
Early online date: 20 Feb 2023
Publication status: Published - 1 May 2023
