Our projects on affective robot design focus on integrating moods, feelings, and attitudes into robot design to improve human-robot interaction.


Emotion Perception and Modeling of Robot Sound

We study sound as a modality through which robots can express emotion and which can shape people’s perceptions of robots. With the former aim in mind, we are implementing an emotion recognition algorithm on the robot Haru, which will be tested in both individual and group settings to monitor interaction. The purpose of this research is to use emotion-based data to improve the design of “at-home” robots and create more positive and natural interactions between humans and robots.

We are also interested in how the sounds robots make affect people’s perceptions of and attitudes toward robots. In one study, we examined whether a mismatch between a robot’s appearance (android, humanoid, minimalist) and its voice (human or robotic) affected people’s evaluations of the robot’s discomfort, competence, eeriness, warmth, humanness, and attractiveness. This research helps identify which combinations of appearance and sound people find most appealing when interacting with robots.

Researchers: Kyrie Jig Amon, Selma Šabanović


Minimalist Robot for Affective Expressions (MiRAE)

MiRAE (Minimalist Robot for Affective Expressions) is a robot face that can perform an array of facial expressions and neck motions. We use it to study various aspects of robotic face design and the cognitive mechanisms behind social human-robot interaction, as well as to develop inexpensive and replicable robotic faces for experimental purposes. The robot is specifically designed to be easily reproducible by incorporating affordable consumer electronics components. Through experimental studies, we seek to understand how certain design components – facial expressions, the degree of realism of facial features, and rhythmicity and coordination – affect human-robot interaction, so that we can build robots that interact with people in more natural ways. In collaboration with researchers from Bielefeld University, we use the robot to explore the perception of both the audio and visual components of interaction.

Researchers: Casey Bennett, Angelika Hönemann, Christopher Myles, Selma Šabanović, Marlena Fraune, Katherine Shaw

Publications:

  • Hönemann, A., Bennett, C., Wagner, P., & Šabanović, S. (2019). Audio-visual synthesized attitudes presented by the German speaking robot SMiRAE. In Proceedings of the 15th International Conference on Auditory-Visual Speech Processing.
  • Bennett, C. C. (2015). Robotic Faces: Exploring Dynamical Patterns of Social Interaction between Humans and Robots. PhD Dissertation, Indiana University Bloomington.
  • Doyle, L., Bennett, C. C., & Šabanović, S. (2015, March). MiRAE: My Inner Voice. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts (pp. 287-287). ACM.
  • Bennett, C. C. (2015). The effects of culture and context on perceptions of robotic facial expressions. Interaction Studies, 16(2), 272-302.
  • Bennett, C. C., Šabanović, S., Fraune, M. R., & Shaw, K. (2014, August). Context congruency and robotic facial expressions: Do effects on human perceptions vary across culture? In The 23rd IEEE International Symposium on Robot and Human Interactive Communication (pp. 465-470). IEEE.
  • Bennett, C. C., & Šabanović, S. (2014). Deriving minimal features for human-like facial expressions in robotic faces. International Journal of Social Robotics, 6(3), 367-381.
  • Bennett, C. C., & Šabanović, S. (2013, March). Perceptions of affective expression in a minimalist robotic face. In Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (pp. 81-82). IEEE Press.

Lifelikeness in Robot Design

Robotic technologies seem to occupy a space somewhere between animacy and inanimacy; they are at once subject and object, tool and agent. An important issue for robot design is understanding how different factors relating to the design of the robot and the context of human-robot interaction affect people’s perceptions of a robot’s lifelikeness. Through several studies of robot motion and interactive behavior, as well as of people’s predispositions toward empathizing with and anthropomorphizing robots, we seek to better understand how lifelikeness can be designed and used in human-robot interaction.

Researchers: Haodan Tan, Selma Šabanović

Publications:

  • Tan, H., Wang, D., & Šabanović, S. (2018, August). Projecting Life Onto Robots: The Effects of Cultural Factors and Design Type on Multi-Level Evaluations of Robot Anthropomorphism. In 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 129-136). IEEE.
  • Tan, H., & Šabanović, S. (2017, March). Designing Lifelikeness in Interactive and Robotic Objects. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (pp. 381-382). ACM.
  • Tan, H., Sun, L., & Šabanović, S. (2016, August). Feeling green: Empathy affects perceptions of usefulness and intention to use a robotic recycling bin. In 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 1051-1056). IEEE.
  • Tan, H., Tiab, J., Šabanović, S., & Hornbæk, K. (2016, June). Happy moves, sad grooves: Using theories of biological motion and affect to design shape-changing interfaces. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems (pp. 1282-1293). ACM.