Do you ever wish your phone understood what you wanted without you having to type it out or resort to voice commands? What if the technology around you could function in a way that reflects your mood and works to improve it? Rana el Kaliouby, co-founder of a tech startup called Affectiva, has been working to find ways to incorporate emotions into technology. This technology, called “affective computing,” adds the component of human emotion to computers.
While el Kaliouby sees a variety of ways this new advance could benefit people, her main goal is to apply the technology to healthcare. Specifically, she thinks that face recognition paired with emotion detection could give researchers more accurate feedback during clinical trials, so patients do not feel like they need to please the doctor. If they are uncomfortable or feel pain, the software would detect it, and appropriate adjustments could be made to the product.
The program is trained to recognize “action units,” or tiny muscle movements happening across the face, twenty times per second. These action units include blinks, winks, lip puckers, inner and outer brow raises, and many other small movements. Affectiva’s program analyzes the movements and categorizes them into seven basic emotions: happiness, sadness, surprise, fear, anger, disgust, and contempt. After analyzing countless videos of facial expressions, the company’s database has archived over 40 billion “emotion data points.” Ideally, computers, phones, and even our refrigerators would use this technology to recognize how we are feeling and react accordingly, or at least make a suggestion.
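To make the idea concrete, here is a minimal sketch of how per-frame action-unit readings could be scored against the seven basic emotions. This is not Affectiva’s actual software; the action-unit names, weights, and scoring rule are invented purely for illustration.

```python
# Toy illustration: map detected facial "action unit" intensities (0.0 to 1.0)
# for a single video frame to one of the seven basic emotions.
# The action units and weights below are hypothetical, not Affectiva's.

ACTION_UNIT_WEIGHTS = {
    "happiness": {"lip_corner_pull": 1.0, "cheek_raise": 0.8},
    "sadness":   {"inner_brow_raise": 0.9, "lip_corner_depress": 1.0},
    "surprise":  {"outer_brow_raise": 1.0, "jaw_drop": 0.7},
    "fear":      {"inner_brow_raise": 0.6, "upper_lid_raise": 1.0},
    "anger":     {"brow_lower": 1.0, "lid_tighten": 0.8},
    "disgust":   {"nose_wrinkle": 1.0, "upper_lip_raise": 0.9},
    "contempt":  {"unilateral_lip_corner_pull": 1.0},
}

def classify_frame(action_units):
    """Score each emotion from the frame's action-unit intensities
    and return the highest-scoring emotion label."""
    scores = {
        emotion: sum(weight * action_units.get(au, 0.0)
                     for au, weight in weights.items())
        for emotion, weights in ACTION_UNIT_WEIGHTS.items()
    }
    return max(scores, key=scores.get)

# Example: a strong smile with raised cheeks scores highest as happiness.
print(classify_frame({"lip_corner_pull": 0.9, "cheek_raise": 0.7}))
```

A real system would run this kind of classification on every frame, roughly twenty times per second, and smooth the results over time before reacting.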
Emotion-processing technology could drastically change the way we use our devices, the way we interact with them, and the way they interact with us. Would you want your phone to react to your facial expressions and suggest things to do, or would you rather have technology stay out of your feelings and just do as you tell it to?
Erina Taradai (Group 3)