The Centre for Speech Technology Research, The University of Edinburgh

AMI: Augmented Multiparty Interaction

Project Summary

AMI is an EU Integrated Project on computer-enhanced multimodal interaction in the context of meetings.

Project Details

AMI is concerned with new multimodal technologies to support human interaction, in the context of instrumented meeting rooms and remote meeting assistants. The project aims to enhance the value of multimodal meeting recordings and to make human interaction more effective in real time. These goals are being pursued by developing new tools for computer-supported cooperative work and by designing new ways to search and browse meetings as part of integrated multimodal group communication, captured from a wide range of devices.

This Integrated Project addresses a wide range of critical multi-disciplinary activities and applications, including: multimodal input interfaces (primarily speech and visual input); integration of and coordination among modalities, e.g. (asynchronous) multi-channel processing; meeting dynamics and human-human interaction modelling; content abstraction, including multimodal information indexing, summarising, and retrieval; technology transfer; and training activities, including an international exchange programme.

Examples of the research being carried out in Edinburgh for the AMI project may be seen in the following demonstration videos: Beamforming, and Summarisation and Segmentation.

Edinburgh is the joint coordinator of the project (with IDIAP), and researchers from both CSTR and HCRC are involved. Our involvement includes meeting data collection and annotation (using the CSTR Instrumented Meeting Room), speech recognition, microphone arrays, multimodal content extraction, and development and support of the NITE XML annotation toolkit.

Project homepage:


Funding Source

EU IST programme (Framework VI)