Technology Can Detect Human Emotions
In this modern era, technology has advanced beyond complex algorithms and robots towards an almost human-like perception, most recently with the detection of micro-expressions on our faces. With over 5 billion people worldwide owning some type of smart device, it seems ever more likely that technology will enter our personal space and diagnose our emotions before we can. This is termed ‘empathetic technology’, and it is gradually becoming our reality.
According to a definition from Stanford University, empathetic technology is technology that identifies and recognises human emotions in order to generate a response. Studies of this nature date back to the 1970s, when renowned psychologist Daniel Kahneman brought pupillometry research to prominence. This research examined how pupil dilation increases with the cognitive demand of a task. Since then, a surge of reports on the relationship between pupil dilation and performance has produced consistent results, and these findings have become the foundation for today’s technological advancements.
Signs of such progress are already visible in face-recognition features on phones and other consumer devices. In 2017, for example, Amazon’s Alexa team was one of the few groups to begin analysing users’ voices to gauge their emotional state. For the healthcare industry, however, these techniques have been a crucial find. Further experiments have shown how skin conductance, or galvanic skin response, relates to emotional arousal and how it can be measured: researchers determined that the amount of sweat secreted, and the resulting fluctuations in the electrical resistance of the skin, are valuable indicators of emotions such as anger, stress, excitement or frustration. Furthermore, measuring chemicals that humans exhale, such as carbon dioxide and isoprene, can reveal feelings of fear and loneliness.
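The skin-conductance idea can be sketched as a simple rule: arousal raises sweat secretion, which raises conductance above a person's resting baseline. The function and thresholds below are purely illustrative, not taken from any clinical system.

```python
# Illustrative sketch with hypothetical thresholds: galvanic skin
# response (GSR) rises with emotional arousal, so a baseline-relative
# rule can flag aroused states such as stress or excitement.

def classify_arousal(conductance_uS: float, baseline_uS: float) -> str:
    """Label a skin-conductance reading (in microsiemens) relative to
    a resting baseline. The 20%/50% cut-offs are illustrative only."""
    rise = (conductance_uS - baseline_uS) / baseline_uS
    if rise > 0.5:
        return "high arousal"      # e.g. acute stress or excitement
    if rise > 0.2:
        return "moderate arousal"  # e.g. mild frustration
    return "calm"

print(classify_arousal(6.3, 4.0))  # 57% above baseline -> high arousal
print(classify_arousal(4.5, 4.0))  # 12% above baseline -> calm
```

Real systems fuse GSR with other signals rather than relying on fixed cut-offs, since baselines drift with temperature, movement and skin contact.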
For medical use, empathetic technology becomes practical through the application of cameras, thermal imaging and breath-measuring devices. A study at the Computer Science and Artificial Intelligence Laboratory (CSAIL) at the Massachusetts Institute of Technology concluded that applying artificial intelligence to data on syntactic patterns and vocal pitch can accurately detect conditions such as schizophrenia, Alzheimer’s disease and even depression. Such technology has also improved product design and user experience: the University of Cambridge, UK, for example, has developed a prototype arthritis-simulation glove that mimics the mobility problems patients face in common activities, giving designers an empathetic experience of the condition.
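One prosodic cue such speech-analysis systems draw on can be sketched in a few lines. The CSAIL model combines many text and audio features; the sketch below uses a single hypothetical one, pitch variability, since unusually flat ("monotone") speech is associated with depressed mood in the clinical literature. The 15 Hz threshold and the pitch contours are invented for illustration.

```python
# Illustrative sketch with hypothetical values: flag unusually flat
# speech from the variability of the fundamental frequency (f0).
from statistics import stdev

def flat_prosody(f0_hz: list[float], threshold_hz: float = 15.0) -> bool:
    """Return True when pitch variation falls below the threshold.
    The threshold is illustrative, not clinically validated."""
    return stdev(f0_hz) < threshold_hz

animated = [180, 210, 160, 230, 175, 205]   # lively, varied pitch contour
monotone = [150, 152, 149, 151, 150, 148]   # flat pitch contour

print(flat_prosody(animated))  # False: varied pitch
print(flat_prosody(monotone))  # True: flat pitch
```

A deployed screening model would learn weights over hundreds of such features rather than apply a single hand-set threshold.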
Looking ahead, technology that incorporates emotional intelligence may pave the way for a new form of social interaction between medical professionals and their patients. Although the daunting idea of replacing physicians with gesture-coded humanoids is still some way off, using artificial intelligence to recognise human emotion and express compassion can already offer patients much-needed temporary comfort.
In conclusion, technology surrounds us and brings ease to our daily lives. It has driven the enhancement of AI-based digital tools in the medical field and points towards possible solutions to pressing global problems.
- Technology that knows what you’re feeling, TED Talk
- Technology Can Sense What We Want, Medium, April 2018
- Pupil dilation as an index of effort in cognitive control tasks: A review, Psychonomic Bulletin & Review, December 2018
- ‘Empathetic technology’: Can devices know what you’re feeling?, Medical News Today, April 2019
- Should Algorithms and Robots Mimic Empathy?, The Medical Futurist, September 2017