Affectiva was an artificial intelligence software company, acquired by Smart Eye in 2021. The company claimed its AI understood human emotions, cognitive states, activities and the objects people use by analyzing facial and vocal expressions.[11] A spin-off of the MIT Media Lab,[12] Affectiva created a new technological category, Artificial Emotional Intelligence, or Emotion AI.[13]
Affectiva was acquired by Smart Eye, a Swedish former competitor, in a mostly-stock deal worth $73.5 million.[16]
Technology
The company has expanded its Emotion AI technology to detect more than facial expressions, reactions and emotions. Affectiva's software detects complex and nuanced emotions; cognitive states such as drowsiness and distraction; certain activities; and the objects people use. It does so by analyzing the human face, vocal intonations and body posture.
Affectiva's AI is built with deep learning, computer vision, speech analytics and large amounts of data collected in real-world scenarios.[17] The AI uses an optical sensor, such as a webcam or smartphone camera, to identify a human face in real time.[17] Computer vision algorithms then identify key features on the face, which are analyzed by deep learning algorithms to classify facial expressions; these expressions are in turn mapped to emotions. One journal paper found that results from the Affectiva iMotions Facial Expression Analysis Software are comparable to those obtained with facial electromyography.[18] Affectiva also uses computer vision to detect objects such as a cellphone[19] and car seat,[20] as well as body key points, which track body joints to determine movement and location.[21] The company's speech technology works by analyzing audio segments for their acoustic-prosodic features, such as changes in tone, loudness and tempo.[22]
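Affectiva's SDK is proprietary, so the following is only a minimal sketch of the general face-analysis pipeline described above. It uses OpenCV's bundled Haar-cascade detector for the face-detection step, and a placeholder classify_expression function stands in for the deep-learning expression classifier; the placeholder and its output labels are assumptions for illustration, not Affectiva's model.

```python
# Sketch of a webcam-to-emotion pipeline: detect a face, crop it,
# classify its expression, and map the result to emotion scores.
import cv2

# Haar cascade face detector shipped with OpenCV (stands in for a
# production face detector).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_img):
    """Placeholder for a trained deep-learning expression classifier.

    A real model would score facial expressions (smile, brow furrow, ...)
    and map them to emotion labels such as joy, anger or surprise.
    """
    return {"joy": 0.0, "anger": 0.0, "surprise": 0.0}

capture = cv2.VideoCapture(0)      # optical sensor: a standard webcam
for _ in range(300):               # process a short burst of frames for the demo
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Step 1: locate faces in the frame.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # Step 2: crop the face region and classify its expression.
        emotions = classify_expression(frame[y:y + h, x:x + w])
        print(emotions)
capture.release()
```

A similarly rough sketch of the audio side extracts simple acoustic-prosodic features (loudness, pitch and tempo) with the open-source librosa library; the input file name is hypothetical, and this is not Affectiva's speech pipeline.

```python
# Extract frame-level loudness, a pitch contour and a rough tempo estimate
# from a speech clip, mirroring the prosodic features named above.
import librosa
import numpy as np

y, sr = librosa.load("speech_clip.wav")          # hypothetical audio file
loudness = librosa.feature.rms(y=y)[0]           # energy per frame (loudness proxy)
f0, voiced_flag, voiced_prob = librosa.pyin(     # pitch contour (tone)
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)
tempo, _ = librosa.beat.beat_track(y=y, sr=sr)   # rough tempo estimate

print("mean loudness:", float(np.mean(loudness)))
print("mean pitch (Hz):", float(np.nanmean(f0)))
print("tempo estimate:", tempo)
```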
Affectiva has collected large amounts of data that are used to train and test the company's deep learning algorithms and to provide insight into human emotional reactions and engagement. The company has analyzed more than 10 million face videos from 90 countries, making it one of the largest data repositories of its kind.[23] Affectiva has also collected more than 19,000 hours of automotive in-cabin data from 4,000 unique individuals.[24] This automotive data is used to adapt its algorithms to varying camera angles, lighting and other environmental conditions in a vehicle.
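As an illustration of how in-cabin data can help make models robust to varying camera angles and lighting, the sketch below applies standard image augmentations with torchvision. This is a generic technique, not a description of Affectiva's training pipeline, and the parameter values are arbitrary.

```python
# Generic training-time augmentation for in-cabin face images:
# simulate lighting shifts and camera-mounting differences.
from torchvision import transforms

in_cabin_augmentation = transforms.Compose([
    transforms.ColorJitter(brightness=0.5, contrast=0.4),        # day/night lighting shifts
    transforms.RandomRotation(degrees=10),                       # slight camera tilt
    transforms.RandomPerspective(distortion_scale=0.2, p=0.5),   # different mounting angles
    transforms.ToTensor(),
])

# Usage (PIL image in, tensor out):
# from PIL import Image
# augmented = in_cabin_augmentation(Image.open("driver_frame.jpg"))
```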
Applications
Affectiva's AI had many applications, but the company's primary focus was on Automotive and Media Analytics.[25] Other uses of Affectiva's AI included applications in healthcare[26][27] and mental health,[28] robotics,[29] conversational interfaces,[30] education,[31][32] gaming,[33][34] and more.
Media Analytics
Affectiva's technology was first deployed in media analytics, for market research purposes. The company had since tested more than 53,000 ads in 90 countries.[35] Brands, advertising agencies and insights firms used the company's Emotion AI to measure the unfiltered and unbiased emotional responses consumers had when viewing video ads and movie trailers.[36] These insights helped improve brand and media content and predict key advertising metrics such as sales lift, purchase intent and virality.[37][28] Affectiva's technology was also used in qualitative research.[38]
Affectiva had partnered with insights firms such as Kantar, LRW, Added Value and Unruly.[39][40] Through these collaborations, 28 percent of Fortune Global 500 companies and 70 percent of the world's largest advertisers used Affectiva's Emotion AI.[35]
On September 5, 2019, Affectiva announced the appointment of Graham Page, a seasoned Kantar executive, as Global Managing Director of Media Analytics to expand on the company's existing footprint in the media analytics space.[41]
Automotive
On March 21, 2018, Affectiva launched Affectiva Automotive AI, the first multi-modal in-cabin sensing system designed to understand what is happening with people in a vehicle.[42] It used in-car cameras to measure, in real time, the state of the driver, the occupants and the vehicle interior (cabin). This insight helped car manufacturers, fleet management companies and rideshare providers improve road safety and build better driver monitoring systems by detecting dangerous driver states such as drowsiness, distraction and anger.[43] It was also used to create more comfortable and enjoyable transportation experiences by understanding how passengers react to the environment, such as content they consume in the back of the car.[25]
In addition to understanding driver and occupant emotional and cognitive states, Affectiva Automotive AI could also detect contextual cabin information, such as the number of passengers, where they are sitting and whether objects such as a cellphone or car seat are present.
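As a purely illustrative example of how per-frame driver-state estimates might be turned into a real-time drowsiness signal of the kind described above, the sketch below implements PERCLOS, a widely used driver-monitoring heuristic (the fraction of recent frames in which the eyes are closed). Affectiva's actual drowsiness model is proprietary; the class name, window size and threshold here are assumptions.

```python
# PERCLOS-style drowsiness flag: keep a rolling window of per-frame
# eye-closure estimates and alert when the closed fraction is too high.
from collections import deque

class DrowsinessMonitor:
    def __init__(self, window_frames=900, threshold=0.3):
        # 900 frames is roughly 30 seconds at 30 fps; both values are
        # illustrative, not taken from Affectiva.
        self.history = deque(maxlen=window_frames)
        self.threshold = threshold

    def update(self, eyes_closed: bool) -> bool:
        """Record one frame's eye state; return True if drowsiness is flagged."""
        self.history.append(1 if eyes_closed else 0)
        perclos = sum(self.history) / len(self.history)
        return perclos > self.threshold

# Example: feed per-frame eye-closure estimates from any eye tracker or
# facial-landmark model.
monitor = DrowsinessMonitor()
for eyes_closed in [False, True, True, False, True]:
    if monitor.update(eyes_closed):
        print("Drowsiness alert: prompt the driver to take a break")
```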