It was 1982 and E.T. the Extra-Terrestrial was hitting the box office. Hollywood was showing us a gentler side of a hypothetical and unknown form of life that the media had portrayed as evil for many years. That led to a whole new way of thinking about aliens and how they might come to interact with us humans. No longer were aliens pictured as being totally evil. Instead, they were seen as complex organisms, capable of experiencing emotions and empathizing with other beings. The possibility of aliens interacting with humans in an emotional way was something that the mainstream embraced.
We are currently at the crossroads of technology where something similar is happening. After decades of picturing computers as an emotionless threat to humanity, we are finally opening up to the possibility of humans interacting with computers in ways that resemble empathic interactions with other living beings. The idea of computer-human interaction is no longer limited to a screen, a keyboard and a mouse. Thanks to new technologies that allow computers to understand the emotional state of humans, we can now think of a world where Affective Computing exists.
In this post we will discuss what Affective Computing is, how it can impact the app world, and its implications for your business.
What Is Affective Computing?
The term Affective Computing was coined by MIT Media Lab professor Rosalind Picard, who introduced it in a 1995 technical report and popularized it with her 1997 book of the same name. Picard defined Affective Computing as “computing that relates to, arises from, or deliberately influences emotion or other affective phenomena.” This introduced the world to the idea of a form of computing able to interact with human emotions; in other words, it introduced the notion of computer emotional intelligence. Since then, technology has evolved a great deal. Many disciplines working together have managed to put some of Picard’s ideas into practice, as the field takes shape at the intersection of computer science, psychology and cognitive science.
In practice, Affective Computing means that a computer is able to recognize certain emotional patterns in humans. Using sensors and other specialized devices, the computer retrieves data, processes it and simulates empathy. This allows computers to identify a user’s emotional state, but it also works the other way around: a computer can convey emotions to the user based on the input it receives. For example, if a user is feeling sad, the computer can identify this and adapt to the user’s emotions, performing actions like changing the color of the screen to counteract the mood or adjusting the room lighting. The key element in this entire process is Artificial Intelligence, more specifically, Machine Learning.
Machine Learning Powered Affective Computing
We’re still a long way from having robots that emulate humans perfectly (if we ever get there), but the world has made considerable progress. Thanks to powerful Machine Learning tools and apps, as well as the vast amounts of data computers can collect and process, it is possible for algorithms to identify human emotions through pattern recognition. It’s not that the algorithm knows that someone is happy; it simply learns to identify happiness from the data it processes. One of the most popular ways to do this is through a method called supervised learning, in which the Machine Learning algorithm is fed labeled examples of the case being considered, such as faces tagged as happy or sad. From these examples, it learns to recognize and evaluate what certain expressions or gestures mean.
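The supervised learning idea above can be sketched in a few lines. This is a toy 1-nearest-neighbor classifier written from scratch; the feature values and labels are invented illustrations of pre-extracted facial measurements, not output from any real face-analysis system.

```python
# A minimal sketch of supervised emotion recognition: a tiny
# 1-nearest-neighbor classifier over hand-labeled examples.
# Features are hypothetical, e.g. [mouth_curvature, eyebrow_raise].
import math

# Labeled training examples: (features, label)
training_data = [
    ([0.8, 0.2], "happy"),   # upturned mouth, relaxed brow
    ([0.7, 0.3], "happy"),
    ([-0.6, 0.1], "sad"),    # downturned mouth
    ([-0.5, 0.0], "sad"),
]

def classify(sample):
    """Return the label of the closest labeled training example."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    _, label = min(training_data, key=lambda ex: distance(ex[0], sample))
    return label

# The algorithm never "knows" happiness; it only learned the
# statistical pattern that separates the labeled examples.
print(classify([0.75, 0.25]))   # → happy
print(classify([-0.55, 0.05]))  # → sad
```

Real systems use far richer models and thousands of labeled samples, but the principle is the same: correct examples in, pattern recognition out.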
Affective Computing can go beyond facial expressions. Its principles can be applied to analyze biometrics like pulse or temperature, and even variations in speech and voice. As a result, computers can process and simulate emotions from a variety of data points.
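Combining several such data points can be sketched as a simple fusion function. Everything below, including the baselines, weights and thresholds, is a made-up illustration with no physiological meaning.

```python
# A toy sketch of fusing biometric signals (pulse, skin temperature,
# speech rate) into a single arousal estimate. All constants are
# invented for illustration only.

RESTING_PULSE = 70.0   # beats per minute (assumed baseline)
RESTING_TEMP = 36.6    # degrees Celsius (assumed baseline)
RESTING_SPEECH = 2.5   # words per second (assumed baseline)

def arousal_score(pulse_bpm, skin_temp_c, speech_rate_wps):
    """Combine deviations from baseline into a score capped at 1.0."""
    pulse_term = max(0.0, (pulse_bpm - RESTING_PULSE) / 50.0)
    temp_term = max(0.0, (skin_temp_c - RESTING_TEMP) / 2.0)
    speech_term = max(0.0, (speech_rate_wps - RESTING_SPEECH) / 2.5)
    return min(1.0, 0.5 * pulse_term + 0.2 * temp_term + 0.3 * speech_term)

calm = arousal_score(72, 36.6, 2.4)       # near baseline, low score
agitated = arousal_score(110, 37.2, 4.0)  # elevated on all signals
print(calm, agitated)
```

A production system would learn such weightings from data rather than hard-code them, but the sketch shows how independent signals can reinforce one estimate.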
This is not something new. A now famous example of biometrics being measured and used in a video game is Tetris 64. Released for the Nintendo 64 in 1998, it included a biosensor that measured whether the player was stressed out. Using this metric, the game would adjust the difficulty of the gameplay, displaying a reaction to an emotion in a simple but creative way.
Affect detection devices have come a long way since then. Currently, a wide range of sensors exists that can detect emotions in a number of ways. Some of the best known are keyboards that monitor typing speed and accuracy to infer the user’s emotional state.
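One way such a keyboard might work can be sketched as a rule over keystroke timing and correction rate. The thresholds here are illustrative assumptions, not values from any real product.

```python
# A hypothetical sketch of inferring stress from keystroke dynamics.
# Heuristic: typing that is erratic (high timing jitter) AND
# error-prone (many corrections) is flagged as stressed.

def estimate_stress(intervals_ms, backspace_ratio):
    """intervals_ms: delays between consecutive keystrokes.
    backspace_ratio: fraction of keystrokes that were corrections."""
    mean = sum(intervals_ms) / len(intervals_ms)
    variance = sum((x - mean) ** 2 for x in intervals_ms) / len(intervals_ms)
    erratic = variance ** 0.5 > 0.5 * mean   # jitter large relative to speed
    error_prone = backspace_ratio > 0.15     # invented threshold
    return erratic and error_prone

print(estimate_stress([120, 130, 125, 118], 0.05))  # steady typing: False
print(estimate_stress([60, 300, 50, 280], 0.25))    # erratic, many fixes: True
```

A shipping product would calibrate per user and likely replace the hand-written rule with a trained model, but the signal being exploited is the same.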
Design plays a major role in Affective Computing. It has the power to make interactions more potent, both by identifying a user’s emotional state and by influencing it. This is something that apps will need to consider in the future, as it will surely become an industry best practice. To a certain extent, this is already being done. UX designers take many factors into account when designing interfaces and interactions, but Affective Computing will add new levels of complexity. As a result, UI design will need to rethink many of its best practices.
An Affective Computing Practical Example
A popular example can help illustrate some of the practical applications of Affective Computing. Imagine that you are texting someone about something exciting that just happened to you. Sometimes the text alone is not enough to convey the emotion behind it, making it seem that the message gets lost somewhere along the way. Using Affective Computing, a messaging app can identify, by analyzing the user’s face or other data, whether the message is a happy or a sad one. If the message is a happy one, the app might format it with a background that represents happiness.
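The messaging example can be sketched as a simple mapping from a detected emotion label to a presentation style. The emotion label is assumed to come from some detection model (stubbed out here), and the style names are invented for illustration.

```python
# A hypothetical sketch of the messaging example: a detected emotion
# label selects a background style for the outgoing message.

BACKGROUNDS = {
    "happy": "sunny-yellow",
    "sad": "soft-blue",
    "neutral": "plain-white",
}

def format_message(text, detected_emotion):
    """Attach a background matching the sender's detected emotion,
    falling back to a neutral style for unrecognized labels."""
    style = BACKGROUNDS.get(detected_emotion, "plain-white")
    return {"text": text, "background": style}

msg = format_message("I got the job!", "happy")
print(msg)  # {'text': 'I got the job!', 'background': 'sunny-yellow'}
```

The hard part in practice is the detection step feeding this function; the adaptation itself is ordinary app logic.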
Affective Computing Apps
We expect to see Affective Computing gaining relevance in the coming years. In particular, we are looking forward to apps that use empathy-related features to deliver affective interactions. Apps are the most obvious and accessible way for users to interact with technology, especially on mobile devices. Since this technology relies on Machine Learning, there are many ways in which it can be implemented. Although practically any industry may implement and benefit from Affective Computing, here are some that we foresee getting the most out of it.
There is great potential for education technologies to implement features based on the user’s emotional state. By taking it into account, education apps can identify how a user is feeling and accommodate the learning experience to that emotional context.
FinTech traders and investors can use Affective Computing to help them make better decisions. An algorithm could help identify when a user is in no condition to make a decision because of their emotional state.
Customer service is one of the industries with the greatest potential. Together with recent findings in the behavioral sciences, empathy algorithms can help deliver better customer service and journey experiences.
Wrapping It Up
As with many other HiTech developments, Affective Computing is sure to revolutionize our world. There are great expectations about how it will evolve, but also uncertainty. Truly emotional computers are still far from reality, but the huge advances already made cannot be denied.
One of the things that interests us the most at Koombea is how businesses will respond to it. There is a lot of room for creativity, as there are no playbooks defining how this technology needs to be implemented. However, one thing to keep in mind is that users’ data privacy must always be protected. That aside, the world could use more empathy, even if it comes from apps.
Empathic app development is relatively new, but like any good product, it starts with great design. At Koombea we have more than 12 years of experience designing and developing world-class apps. The Indigo Awards won by three of our clients are proof of this. If you want to make your app more empathic, contact us. We’d be happy to help you out!