Design of a Multimodal Hearing System

Bernd Tessendorf1, Matjaz Debevc2, Peter Derleth3, Manuela Feilner3, Franz Gravenhorst1, Daniel Roggen1, Thomas Stiefmeier1 and Gerhard Tröster1

  1. Wearable Computing Lab., ETH Zurich
    Gloriastr. 35, 8092 Zurich, Switzerland
  2. University of Maribor
    Smetanova ulica 17, 2000 Maribor, Slovenia
  3. Phonak AG
    Laubisrütistrasse 28, 8712 Stäfa, Switzerland


Hearing instruments (HIs) have become context-aware devices that analyze the acoustic environment in order to automatically adapt their sound processing to the user's current hearing wish. However, in the same acoustic environment an HI user may have different hearing wishes, each requiring a different behavior from the hearing instrument. In such cases, the audio signal alone carries too little contextual information to determine the user's hearing wish. Modalities beyond sound can provide the missing information and improve the adaptation. In this work, we review additional sensor modalities for HIs and present a prototype of a newly developed wireless multimodal hearing system. The platform takes additional modalities into account, such as the user's body movement and location. We characterize the system in terms of runtime, latency, and reliability of the wireless connection, and point out possibilities arising from this novel approach.

Key words

multimodal hearing instrument, assistive technology

Publication information

Volume 10, Issue 1 (January 2013)
Year of Publication: 2013
ISSN: 1820-0214 (Print) 2406-1018 (Online)
Publisher: ComSIS Consortium

Full text

Available in PDF (Portable Document Format)

How to cite

Tessendorf, B., Debevc, M., Derleth, P., Feilner, M., Gravenhorst, F., Roggen, D., Stiefmeier, T., Tröster, G.: Design of a Multimodal Hearing System. Computer Science and Information Systems, Vol. 10, No. 1 (2013).