{"id":1321,"date":"2023-03-31T08:45:19","date_gmt":"2023-03-31T08:45:19","guid":{"rendered":"https:\/\/robotics24.net\/blog\/?p=1321"},"modified":"2023-04-20T15:00:13","modified_gmt":"2023-04-20T15:00:13","slug":"emotion-ai-technology-that-understands-our-emotions","status":"publish","type":"post","link":"https:\/\/robotics24.net\/blog\/emotion-ai-technology-that-understands-our-emotions\/","title":{"rendered":"Emotion AI: Technology that understands our emotions"},"content":{"rendered":"\n<p>Emotional Artificial Intelligence (EAI) is a branch of artificial intelligence that deals with the &#8220;<strong>emotional<\/strong>&#8221; side of the <strong>human<\/strong> being: it aims to <strong>understand<\/strong> our emotions and, where appropriate, <strong>emulate<\/strong> them and respond helpfully.\u00a0The term dates back to 1995, when researcher\u00a0<a href=\"https:\/\/web.media.mit.edu\/~picard\/\" target=\"_blank\" rel=\"noreferrer noopener\">Rosalind Picard<\/a>\u00a0published\u00a0<a href=\"https:\/\/mitpress.mit.edu\/9780262661157\/affective-computing\/\" target=\"_blank\" rel=\"noreferrer noopener\">\u201cAffective Computing\u201d<\/a>.\u00a0<\/p>\n\n\n\n<p>There are different types of EAI, depending on the signal analysed: <\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Words (text)<\/li>\n\n\n\n<li>Tone of voice<\/li>\n\n\n\n<li>Facial expressions<\/li>\n\n\n\n<li>Body language<\/li>\n\n\n\n<li>Multimodal (a combination of the above)<\/li>\n<\/ul>\n\n\n\n<p>For example, if a person addresses a <strong>voice<\/strong> assistant in an <strong>angry<\/strong> or frustrated tone, the assistant can <strong>detect<\/strong> these emotions and <strong>adapt<\/strong> its response accordingly. 
Similarly, a <strong><a href=\"https:\/\/robotics24.net\/blog\/glossary\/robot\/\" data-type=\"glossary\" data-id=\"787\">robot<\/a><\/strong> can use EAI to interpret the facial <strong>expressions<\/strong> of the people it interacts with and adapt its behaviour accordingly.<\/p>\n\n\n\n<p>Since then, many <strong>studies<\/strong> in this field have sought to interpret and <strong>understand<\/strong> human <strong>emotions<\/strong>: from facial expressions and gestures to tone of voice, and even purchasing choices made while browsing the web.&nbsp;<\/p>\n\n\n\n<figure class=\"wp-block-image size-full is-resized\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robotics24.net\/blog\/wp-content\/uploads\/2023\/03\/Emotion-AI-sound-wave.jpg\" alt=\"Emotion AI sound wave\" class=\"wp-image-1389\" width=\"607\" height=\"488\"\/><\/figure>\n\n\n\n<p>In November 2022,&nbsp;<a href=\"https:\/\/www.forbes.com\/sites\/forbestechcouncil\/?sh=2909ccba649b\" target=\"_blank\" rel=\"noreferrer noopener\">Forbes<\/a>&nbsp;published an&nbsp;<a href=\"https:\/\/www.forbes.com\/sites\/forbestechcouncil\/2022\/11\/23\/emotion-ai-why-its-the-future-of-digital-health\/?sh=190590ce6516\" target=\"_blank\" rel=\"noreferrer noopener\">article<\/a> arguing that EAI is the future of digital health.<\/p>\n\n\n\n<p>One <strong>example<\/strong> of an EAI application is the&nbsp;<a href=\"https:\/\/www.lucidtherapeutics.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">LUCID<\/a>&nbsp;project, which harnesses the therapeutic effects of music to create digital <strong>therapies<\/strong>, accessible to all, that improve human health.&nbsp;<strong>Empathy<\/strong>, the capacity to understand the emotional states of other human beings, is a complex concept, rich in meanings and sometimes with unknown implications.&nbsp;<\/p>\n\n\n\n<p>In theory, if <strong>machines <\/strong>can reach this level of understanding, 
they can help us all the more, <strong>boosting <\/strong>performance in <strong>healthcare <\/strong>and personal care.<\/p>\n\n\n\n<p>Science journalist&nbsp;<a href=\"https:\/\/builtin.com\/authors\/stephen-gossett\" target=\"_blank\" rel=\"noreferrer noopener\">Stephen Gossett<\/a>, in his article&nbsp;<a href=\"https:\/\/builtin.com\/artificial-intelligence\/emotion-ai\" target=\"_blank\" rel=\"noreferrer noopener\">Affective Computing<\/a>, describes the advantages and disadvantages of EAI.<\/p>\n\n\n\n<p>Chief among the advantages is the ability to detect an <strong>individual<\/strong>&#8217;s needs with good <strong>precision<\/strong> and thus provide targeted products and services, <strong>optimizing <\/strong>time and costs.&nbsp;<\/p>\n\n\n\n<p>Among the disadvantages, besides the <strong>privacy <\/strong>issue, is the risk of decisions based on <strong>erroneous <\/strong>interpretations, derived from the inevitable cognitive <strong>biases <\/strong>of the algorithm&#8217;s programmers and those intrinsic to human learning.&nbsp;<\/p>\n\n\n\n<p>The same article offers four important perspectives for understanding EAI, from four experts in the field:&nbsp;<a href=\"http:\/\/altaplana.com\/grimes.html\" target=\"_blank\" rel=\"noreferrer noopener\">Seth Grimes<\/a>, founder of Alta Plana and natural language processing consultant;&nbsp;<a href=\"https:\/\/www.linkedin.com\/in\/ranagujral\/\" target=\"_blank\" rel=\"noreferrer noopener\">Rana Gujral<\/a>, CEO of Behavioral Signals;&nbsp;<a href=\"https:\/\/www.linkedin.com\/in\/skylerplace\/\" target=\"_blank\" rel=\"noreferrer noopener\">Skyler Place<\/a>, Cogito&#8217;s chief behavioural science officer;&nbsp;and&nbsp;<a href=\"https:\/\/scholar.google.com\/citations?user=m7Jr-b4AAAAJ&amp;hl=en\" target=\"_blank\" rel=\"noreferrer noopener\">Daniel McDuff<\/a>, former principal researcher at Microsoft AI.<\/p>\n\n\n\n<figure class=\"wp-block-image 
size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"1000\" height=\"748\" src=\"https:\/\/robotics24.net\/blog\/wp-content\/uploads\/2023\/03\/Emotion-AI-face-recognition.jpg\" alt=\"Emotion AI face recognition\" class=\"wp-image-1325\" srcset=\"https:\/\/robotics24.net\/blog\/wp-content\/uploads\/2023\/03\/Emotion-AI-face-recognition.jpg 1000w, https:\/\/robotics24.net\/blog\/wp-content\/uploads\/2023\/03\/Emotion-AI-face-recognition-980x733.jpg 980w, https:\/\/robotics24.net\/blog\/wp-content\/uploads\/2023\/03\/Emotion-AI-face-recognition-480x359.jpg 480w\" sizes=\"auto, (min-width: 0px) and (max-width: 480px) 480px, (min-width: 481px) and (max-width: 980px) 980px, (min-width: 981px) 1000px, 100vw\" \/><\/figure>\n\n\n\n<p>Another concern is that the technology could be used to <strong>manipulate <\/strong>people&#8217;s <strong>emotions<\/strong>.<\/p>\n\n\n\n<p>Behavioural scientist&nbsp;<a href=\"https:\/\/www.lboro.ac.uk\/schools\/social-sciences-humanities\/team\/pragya-agarwal\/#:~:text=Pragya%20Agarwal%20is%20a%20behavioural,Universities%20for%20over%2012%20years.\" target=\"_blank\" rel=\"noreferrer noopener\">Pragya Agarwal<\/a>&nbsp;has published an article&nbsp;in&nbsp;<a href=\"https:\/\/www.wired.co.uk\/profile\/pragya-agarwal\" target=\"_blank\" rel=\"noreferrer noopener\">WIRED<\/a> where she argues that EAI is not, and never will be, a substitute for empathy, which is and will always remain the exclusive prerogative of human beings.&nbsp;<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>2023 will be the year when EAI will become one of the dominant applications of machine learning.<\/p>\n<cite>Pragya Agarwal<\/cite><\/blockquote>\n\n\n\n<p>For example,&nbsp;<a href=\"https:\/\/hume.ai\/\" target=\"_blank\" rel=\"noreferrer noopener\">Hume AI<\/a>, founded by&nbsp;former Google researcher&nbsp;<a 
href=\"https:\/\/scholar.google.com\/citations?user=-i9gbsAAAAAJ&amp;hl=en\" target=\"_blank\" rel=\"noreferrer noopener\">Alan Cowen<\/a>, is developing tools to measure emotion from verbal, facial, and vocal expressions. Swedish company&nbsp;<a href=\"https:\/\/smarteye.se\/\" target=\"_blank\" rel=\"noreferrer noopener\">Smart Eye<\/a>&nbsp;recently acquired Affectiva, the MIT Media Lab spinoff that developed the&nbsp;<a href=\"https:\/\/soundnet.net\/music-library-services\" target=\"_blank\" rel=\"noreferrer noopener\">SoundNet neural network<\/a>, an algorithm that classifies emotions such as anger from audio samples in less than 1.2 seconds.<\/p>\n\n\n\n<p>Video platform Zoom is also introducing Zoom IQ, a feature that will soon provide users with real-time analysis of emotion and <strong>engagement <\/strong>during a virtual <strong>meeting<\/strong>.<\/p>\n\n\n\n<p>Emotional artificial intelligence has also become common in <strong>schools<\/strong>.&nbsp;In Hong Kong, some secondary schools already use an artificial intelligence program, developed by&nbsp;<a href=\"https:\/\/www.findsolutionai.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Find Solutions AI<\/a>, that measures the <strong>micro-movements<\/strong> of muscles on students&#8217; faces and identifies a range of negative and positive emotions.&nbsp;<\/p>\n\n\n\n<p>Teachers use this system to monitor <strong>students<\/strong>&#8217; emotional changes, as well as their motivation and <strong>concentration<\/strong>, allowing them to intervene early if a student is losing interest.<\/p>\n\n\n\n<p><a href=\"https:\/\/research.aimultiple.com\/author\/cem-dilmegani\/\" target=\"_blank\" rel=\"noreferrer noopener\">Cem Dilmegani<\/a>&nbsp;in his&nbsp;<a href=\"https:\/\/research.aimultiple.com\/emotional-ai-examples\/\" target=\"_blank\" rel=\"noreferrer noopener\">article<\/a>&nbsp;describes the Emotion Detection and Recognition (EDR) 
market as worth over <strong>35 billion dollars<\/strong>, growing at 17% annually through <strong>2030<\/strong>. <\/p>\n\n\n\n<p>He also presents the projects of ten companies operating successfully in <strong>various fields<\/strong>, from marketing and travel to personal assistance, education, gaming, driving assistance, and disaster and emergency forecasting, among many others, as described in another&nbsp;<a href=\"https:\/\/research.aimultiple.com\/affective-computing-applications\/\" target=\"_blank\" rel=\"noreferrer noopener\">article<\/a>&nbsp;by the same author.<\/p>\n\n\n\n<p class=\"has-small-font-size\"><\/p>\n\n\n\n<div class=\"wp-block-columns v-center is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<figure class=\"wp-block-video\"><video height=\"480\" style=\"aspect-ratio: 480 \/ 480;\" width=\"480\" autoplay loop muted src=\"https:\/\/robotics24.net\/blog\/wp-content\/uploads\/2023\/03\/emotiva-emotion-ai.mp4\"><\/video><\/figure>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<p>Analyse human attentive and affective states<\/p>\n\n\n\n<p>Credits: <a href=\"https:\/\/emotiva.it\/en\" target=\"_blank\" rel=\"noreferrer noopener\">Emotiva<\/a><\/p>\n<\/div>\n<\/div>\n\n\n\n<p>Professor&nbsp;<a href=\"https:\/\/scholar.google.com\/citations?user=w2K_CmkAAAAJ&amp;hl=en\" target=\"_blank\" rel=\"noreferrer noopener\">Peter Mantello<\/a>&nbsp;and psychologist&nbsp;<a href=\"https:\/\/scholar.google.com\/citations?hl=en&amp;user=_5HEYN0AAAAJ\" target=\"_blank\" rel=\"noreferrer noopener\">Manh Tung Ho<\/a>&nbsp;explain in their&nbsp;<a href=\"https:\/\/link.springer.com\/article\/10.1007\/s00146-022-01576-y#citeas\" target=\"_blank\" rel=\"noreferrer noopener\">article<\/a>, which I recommend reading in its entirety, why we should be wary of EAI.&nbsp;The 
authors identify five main tensions in the diffusion of EAI through the social fabric.&nbsp;<\/p>\n\n\n\n<p>The first is that the technology relies on <strong>invisible <\/strong>data tracking, which can lead to improper, unethical or <strong>malicious <\/strong>use.&nbsp;<\/p>\n\n\n\n<p>The second concerns the cultural tensions that arise because these <strong>emotional AI <\/strong>technologies cross national and cultural borders:&nbsp;while emotion-<strong>sensing <\/strong>technologies are predominantly designed in the West, they are sold to a global market.<\/p>\n\n\n\n<p>The problem is that as these devices cross international <strong>borders<\/strong>, their algorithms are rarely modified to account for racial, cultural, ethnic or gender differences, which can lead to <a href=\"https:\/\/boa.unimib.it\/handle\/10281\/357623\" target=\"_blank\" rel=\"noreferrer noopener\">false positive identifications<\/a> with a negative <strong>impact <\/strong>on the individuals concerned.&nbsp;<\/p>\n\n\n\n<p>Third, there is the lack of industry standards: the hidden <strong>data-gathering<\/strong> activities of many smart technologies, developed as a <strong>proprietary <\/strong>layer in many products, will make collective <strong>regulation <\/strong>very difficult.&nbsp;A striking example is the automotive industry.&nbsp;<\/p>\n\n\n\n<p>Fourth, existing ethical <strong>frameworks <\/strong>for emotional AI are often vague and inflexible, because companies from different <strong>cultural <\/strong>backgrounds have different <strong>rationales <\/strong>and goals for adopting the new technology.&nbsp;<\/p>\n\n\n\n<p>Finally, there is the shaky science behind the emotion recognition <strong>industry<\/strong>:&nbsp;a growing number of critics question whether <strong>emotions <\/strong>can be made <strong>computable<\/strong> at all.<\/p>\n\n\n\n<p>As I often like to remind you, a <strong>technology <\/strong>cannot be <strong>judged 
<\/strong>good or bad in itself; it is how it is used, and the purpose of its application, that each <strong>human <\/strong>being can accept or reject.<\/p>\n\n\n\n<p>In conclusion, <strong>a question <\/strong>naturally arises.&nbsp;<\/p>\n\n\n\n<p>Man created <strong>robots <\/strong>for repetitive jobs and for those processes where it is appropriate to <strong>make decisions<\/strong> based on <strong>statistical <\/strong>data rather than <strong>emotional <\/strong>choices.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>What if robots or algorithms started making decisions based not on data but on emotions?&nbsp;<\/p>\n<\/blockquote>\n\n\n\n<p>They would probably cease to be useful as robots, and humans would have to <strong>create <\/strong>another <strong>generation <\/strong>of purely computational robots.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>An analysis of Emotion AI, an Artificial Intelligence that can read and understand human emotions, and its potential applications in various fields<\/p>\n","protected":false},"author":3,"featured_media":1324,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"off","_et_pb_old_content":"","_et_gb_content_width":"","footnotes":""},"categories":[7],"tags":[],"class_list":["post-1321","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-news"],"_links":{"self":[{"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/posts\/1321","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/comments?post=1321"}],"version-history":[{"count":20,"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/posts\/1321\/revisions"}],"predecessor-version":[{"id":1445,"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/posts\/1321\/revisions\/1445"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/media\/1324"}],"wp:attachment":[{"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/media?parent=1321"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/categories?post=1321"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/robotics24.net\/blog\/wp-json\/wp\/v2\/tags?post=1321"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}