Affective Computing: Mimicking Human Behavior

Synchrony can be thought of as a behavioral dance that occurs when we communicate with each other [1,2]. It is a natural part of communication that includes mimicking each other in language and movement during conversation [1,2]. Through synchrony, we display engagement and understanding when we communicate with one another [1,2]. Synchrony appears throughout nature in different forms; here, however, we will focus on human-to-human communication styles and how they translate to human-to-computer interaction. So, what happens when we take human-to-human behavioral practices and begin integrating those concepts into computing?

In the sci-fi and horror genres, the all too real machines that display synchrony are often portrayed as the villains of the story and rarely the heroes. These machines produce the uncanny effect: human-like machines that radiate an unsettling quality. Programming machines with lifelike features that display human levels of conversational engagement often leaves the human in the scenario feeling uneasy. The human participant is left asking: how much of the interaction is genuine, and how much of it is programmed and artificial?

Curiosity and pushing the limits of comfort are traits that humans have harnessed to drive advancements within society. Harnessing the power of synchrony will allow computers to become more integrated and ubiquitous in our day-to-day lives. Synchrony can provide a more natural way of interacting with technology, potentially increasing access to technology and allowing for more meaningful communication.

We can already see this being implemented, to some capacity, in voice-assisted technology. We are able to call upon a disembodied voice and have a conversation or engage with an artificial being. These disembodied voices exhibit vocal inflection and various accent styles, all while learning how to "hear" the human voice and provide feedback. While the disembodied voice may not be able to hold a full conversation or have a physical presence other than a box, it is just the beginning of developing systems that incorporate synchrony.

Developing and understanding synchrony requires a multidisciplinary approach. By incorporating other disciplines in the development of machine-human synchrony, we can hopefully put appropriate checks and balances in place. One of the biggest hurdles that developers, designers, and researchers will have to face is the set of ethical considerations and questions raised by AI with synchrony abilities.

Unlike other disciplines that involve the human element, there is no official governing body or shared moral and ethical code when it comes to technology and its various offshoots. While HCC/HCI (Human-Centered Computing / Human-Computer Interaction) relies on other disciplines to fill knowledge gaps and to borrow practices and theories from, there is no governing body that provides guidance for developing technology that impacts people. We rely on best judgment and the hope that we simply won't take it too far.

Developing technology that mimics human behavior needs to be governed, or at the very least monitored. While some may argue the good outweighs the harm, we need to consider what happens when technology with highly sophisticated synchrony abilities is used to manipulate and harm vulnerable populations. While the therapeutic implications are significant, so are the social and economic implications.

We currently live in a society that is divided by technology through memes and tweets. Technology that has the ability to mimic human communication will have far-reaching consequences. Sci-fi and horror genres may depict the extreme scenarios, but what makes them successful is their ability to speak to the human psyche. As our interaction with technology evolves, the best and worst parts of humanity will be highlighted through our interaction with technology that is designed to mimic life.

Article References

[1] Emilie Delaherche, Mohamed Chetouani, Ammar Mahdhaoui, Catherine Saint-Georges, Sylvie Viaux, and David Cohen. 2012. Interpersonal Synchrony: A Survey of Evaluation Methods across Disciplines. IEEE Transactions on Affective Computing 3, 3 (July 2012), 349–365. DOI:https://doi.org/10.1109/t-affc.2012.12

[2] Petr Slovák, Paul Tennent, Stuart Reeves, and Geraldine Fitzpatrick. 2014. Exploring skin conductance synchronisation in everyday interactions. Proceedings of the 8th Nordic Conference on Human-Computer Interaction Fun, Fast, Foundational – NordiCHI ’14 (2014). DOI:https://doi.org/10.1145/2639189.2639206
