Universal Cognitive User Interface (UCUI, 2015–2018)

Abstract

Recent speech dialog systems and cognitive user interfaces allow natural verbal human-machine interaction and achieve excellent performance. However, leading commercial solutions heavily rely on transmitting sensitive user information (personal data, voice recordings, etc.) through public networks and on processing, storing, and analyzing these data on the service providers' servers.

The goal of the UCUI project is to develop a cognitive user interface for intuitive interaction with arbitrary electronic devices that ensures privacy by design: all information processing is done on the device, and no user data ever leave the interface. To this end, we are developing a stand-alone hardware module that performs all signal, speech, and cognitive information processing. The main technical challenge lies in building a small, energy-efficient system that achieves acceptable performance without relying on external computational power and memory.

Interaction with the UCUI device takes place through speech, acoustic and visual symbols, and gestures. The system shall be capable of learning from user behavior in order to improve its function. Multiple modules will be able to cooperate (distributed microphone array, task assignment, etc.) over a strongly encrypted wireless connection. The system design is based on a study of user-machine interactions in a real home-automation scenario and will take relevant legal and ethical aspects into account.

Matthias Wolff, 2016/09/23

Project Facts

Term: 06/2015 - 05/2018
Partners:
Funding: BMBF, IKT 2020 - Forschung und Innovationen
  • Total subsidy: 1.6 Mio. EUR
  • BTU budget: 454,000 EUR (grant #16SV7304)
Contact: Prof. Dr.-Ing. habil. Matthias Wolff