Authors: Applewhite, Timothy; Zhong, Jia; Dornberger, Rolf
Date issued: 2021
Date available: 2024-04-19
ISBN: 978-1-7281-9048-8
DOI: 10.1109/SSCI50451.2021.9659935
URI: https://irf.fhnw.ch/handle/11654/43162

Title: Novel bidirectional multimodal system for affective human-robot engagement

Abstract: Multimodal interaction is an essential prerequisite for affective human-robot engagement. Research on bidirectional, affective multimodal interaction systems investigates systems that recognize a user's affect and generate an emotional response based on it. The presented work investigates a novel bidirectional, affective multimodal interaction system combining a social robot with an open-source dialogue system framework; a prototype is developed based on Pepper and Rasa. Compared to specialized laboratory robotics systems, the proposed system is more attainable, while incorporating eye gaze, alongside speech and facial expression, as one of the major input channels for conveying emotion. The system generates and emulates emotional output behaviors based on the user's affect using speech, gestures, and emojis. This paper describes the concrete implementation and evaluation of the proposed system. The evaluation shows that, although the recognition accuracies of the individual input channels differ, the system can derive well-defined rule-based emotional output behaviors with a high multimodal accuracy rate in the given test scenarios.

Language: en
Subject: 330 - Wirtschaft (Economics)
Type: 04B - Conference paper (Beitrag Konferenzschrift)