FHWS students win Best Paper Award at the conference Advances in Computer-Human Interactions (ACHI 2020)

Fri, 26 Jun 2020 | newsletter, master's studies

In face-to-face conversations, a multitude of communication channels is used. However, not everybody is able to perceive a communication situation through all of these channels. For example, not all non-verbal information is available to the visually impaired. As a result, a visually impaired person may only be able to analyse a communication situation to a limited extent.

The professor of Socio-Informatics at the University of Applied Sciences Würzburg-Schweinfurt (FHWS) is currently working on a solution to this problem. A work-in-progress paper on this topic was submitted to the conference Advances in Computer-Human Interactions (ACHI 2020), the 13th edition of the international conference dedicated to this topic. Anna Kushnir, who is currently completing her Master’s in Information Systems at the Faculty of Computer Science and Business Information Systems at FHWS, was honoured with the Best Paper Award in the category “Interfaces” for her project, conducted within the research professorship for Socio-Informatics.

In her contribution, she describes the aim of developing a wearable vision-substitution prototype for blind and visually impaired people that assists in everyday conversations. The paper outlines the idea behind this substitution prototype, as well as the work in progress: the system converts visual stimuli into tactile stimuli with the help of the Facial Action Coding System (FACS) so that they become perceivable to its user. The underlying motivation is that widespread substitution solutions for the visually impaired mostly rely on audio-based feedback. Such solutions provide support in various life situations, but they are not suitable for supporting communication, because audio-based feedback can mix with verbal communication and thus distract from it.

Therefore, Anna Kushnir’s substitution solution uses tactile feedback. The status of the prototype at the time the paper was written is shown in the first picture (bottom left). The task of the system is to convert the emotional valence, which can be recognised from the facial expressions of the interlocutor, into tactile stimuli and thus make it perceptible to the impaired user. Vibration rings worn on the fingers serve as the tactile interface. To analyse and classify facial expressions, the FACS-assisted facial expression analysis software FaceReader by Noldus is used.
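
To give a rough idea of what such a valence-to-vibration mapping could look like, the following is a minimal sketch and not the implementation described in the paper: the valence scale in [-1, 1], the threshold, and the VibrationRing interface with its set_intensity method are assumptions made purely for illustration, and the valence value is assumed to come from an external facial expression analysis step such as FaceReader's output.

```python
# Minimal illustrative sketch (not the authors' implementation).
# Assumption: an external analysis step (e.g. FaceReader) delivers a valence
# value in [-1, 1]; the vibration-ring interface below is hypothetical.

from dataclasses import dataclass


@dataclass
class VibrationRing:
    """Hypothetical driver for one finger-worn vibration ring."""
    finger: str

    def set_intensity(self, level: float) -> None:
        # A real prototype would address the ring's actuator here;
        # this sketch only prints the command that would be sent.
        print(f"{self.finger}: vibrate at {level:.2f}")


def valence_to_feedback(valence: float,
                        positive_ring: VibrationRing,
                        negative_ring: VibrationRing,
                        threshold: float = 0.2) -> None:
    """Map recognised emotional valence to tactile feedback on two rings.

    Positive valence drives one ring, negative valence the other;
    values near zero (|valence| < threshold) produce no vibration.
    """
    if valence > threshold:
        positive_ring.set_intensity(min(valence, 1.0))
        negative_ring.set_intensity(0.0)
    elif valence < -threshold:
        negative_ring.set_intensity(min(-valence, 1.0))
        positive_ring.set_intensity(0.0)
    else:
        positive_ring.set_intensity(0.0)
        negative_ring.set_intensity(0.0)


if __name__ == "__main__":
    index, middle = VibrationRing("index"), VibrationRing("middle")
    for v in (0.8, -0.5, 0.05):  # sample valence readings
        valence_to_feedback(v, index, middle)
```

In such a design, the sign of the valence decides which ring vibrates and its magnitude sets the intensity, so the wearer can distinguish positive from negative emotional signals without any audio channel being occupied.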

The prototype is currently being expanded with additional functionality as part of a master’s thesis and will then be evaluated in a qualitative study. The second picture (bottom right) shows the current state of the prototype as it was presented at the FutureCode Conference in Würzburg. With the aim of providing the user with even more detailed information about the communication situation, the team is currently working on extending the prototype to cover the seven basic emotions according to Ekman.