AMI: Augmented Multiparty Interaction
AMI is an EU Integrated Project about computer enhanced multi-modal interaction in the context of meetings
AMI is concerned with new multimodal technologies to support human interaction, in the context of instrumented meeting rooms and remote meeting assistants. The project aims to enhance the value of multimodal meeting recordings and to make human interaction more effective in real time. These goals are being pursued by developing new tools for computer-supported cooperative work and by designing new ways to search and browse meetings as part of integrated multimodal group communication, captured from a wide range of devices.
This Integrated Project addresses a wide range of critical multi-disciplinary activities and applications, including: multimodal input interfaces (primarily speech and visual input); integration of and coordination among modalities, e.g. (asynchronous) multi-channel processing; meeting dynamics and human-human interaction modelling; content abstraction, including multimodal information indexing, summarising, and retrieval; technology transfer; and training activities, including an international exchange programme.
Edinburgh is the joint coordinator of the project (with IDIAP), and researchers from both CSTR and HCRC are involved. Our involvement includes meeting data collection and annotation (using the CSTR Instrumented Meeting Room), speech recognition, microphone arrays, multimodal content extraction, and development and support of the NITE XML annotation toolkit.
Project homepage: http://www.amiproject.org
- Steve Renals
- Jean Carletta
- Johanna Moore
- Mike Lincoln (Quorate Technology)
- Giulia Garau
- Weiqun Xu (HCRC)
- Alfred Dielmann
- Jonathan Kilgour
EU IST programme (Framework VI)