Apple’s Emotion Detection research shows how the company is leveraging AI


Emotion detection from user engagement has been a widely debated feature in recent years.

With the advent of new technologies, many tech companies and academic researchers have refined their thinking and approach to this hard engineering problem.

Apple’s research team is not far behind. The Cupertino giant has been working on this problem for years, as evident from a recent patent filing approved this week.

Recent success stories in automated object and face recognition, partly fueled by deep learning artificial neural network (ANN) architectures, have led to the advancement of biometric research platforms and, to some extent, a resurgence of interest in Artificial Intelligence (AI).


Emotion detection is a multi-faceted process

Within this context, it turns out that automating emotion recognition is far from straightforward, with challenges arising on both the science side (e.g., methodology underpinned by psychology) and the technology side (e.g., biometric research platforms such as iMotions).

Research has shown that skin conductance is a sensitive autonomic measure of emotional arousal (Boucsein et al., 2012).

The higher the arousal, the higher the skin conductance for both positive (“happy” or “joyful”) and negative (“threatening” or “saddening”) stimuli. Accessory and smartwatch makers have been studying this and introducing GSR sensors on wearables.
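Because skin conductance rises with arousal for both positive and negative stimuli, a common processing step on GSR data is counting rapid conductance rises (skin-conductance responses, or SCRs). The sketch below is purely illustrative: the threshold, sampling values, and function name are assumptions for demonstration, not anything from a wearable maker's actual implementation.

```python
# Hypothetical sketch: counting skin-conductance responses (SCRs) in a
# GSR trace as a rough proxy for emotional arousal. The rise threshold
# and sample values (in microsiemens) are illustrative assumptions.

def count_scr_peaks(samples, rise_threshold=0.05):
    """Count rapid sample-to-sample rises larger than `rise_threshold`.
    Consecutive rising samples are counted as a single response."""
    peaks = 0
    rising = False
    for prev, cur in zip(samples, samples[1:]):
        if cur - prev > rise_threshold:
            if not rising:
                peaks += 1   # a new phasic response began
            rising = True
        else:
            rising = False   # signal leveled off or fell
    return peaks

# A calm trace vs. a more aroused trace (illustrative values):
calm = [2.00, 2.01, 2.00, 2.02, 2.01, 2.00]
aroused = [2.00, 2.10, 2.30, 2.05, 2.40, 2.10]
print(count_scr_peaks(calm), count_scr_peaks(aroused))  # → 0 2
```

Note that more SCRs indicate higher arousal but not its valence: the signal alone cannot distinguish a joyful stimulus from a threatening one, which is why GSR is typically combined with other signals.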

Apple’s latest research patent, ‘Emotion detection,’ originally filed in 2019, explores the feature using convolutional neural networks (CNNs). The main idea is to run 2D images through CNNs to detect the emotions expressed in each image.
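To make the pipeline concrete, here is a minimal, untrained forward pass of a CNN classifier over a 2D image, written in plain NumPy. Everything here is an assumption for illustration: the emotion label set, the layer sizes, and the random weights are mine, not Apple's, and a real system would be trained on labeled face images.

```python
# Illustrative sketch only: a tiny conv -> ReLU -> global-average-pool ->
# dense -> softmax pipeline, showing the shape of CNN-based emotion
# classification. Labels, sizes, and weights are assumptions.
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]  # assumed labels
rng = np.random.default_rng(0)

def conv2d(image, kernel):
    """Valid 2D convolution of an HxW image with a kxk kernel."""
    k = kernel.shape[0]
    h, w = image.shape[0] - k + 1, image.shape[1] - k + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i+k, j:j+k] * kernel)
    return out

def classify(image, kernels, dense_w, dense_b):
    """Return a probability per emotion label for one grayscale image."""
    # One scalar feature per filter: ReLU then global average pooling.
    feats = np.array([np.maximum(conv2d(image, k), 0).mean() for k in kernels])
    logits = feats @ dense_w + dense_b
    exp = np.exp(logits - logits.max())   # numerically stable softmax
    probs = exp / exp.sum()
    return dict(zip(EMOTIONS, probs))

# Random 32x32 "face" and random (untrained) weights, for illustration:
image = rng.standard_normal((32, 32))
kernels = rng.standard_normal((4, 3, 3))          # four 3x3 filters
dense_w = rng.standard_normal((4, len(EMOTIONS)))
dense_b = np.zeros(len(EMOTIONS))
probs = classify(image, kernels, dense_w, dense_b)
print(max(probs, key=probs.get))  # most probable label (random weights here)
```

In practice, a production model would stack many convolutional layers and be trained end to end, but the structure above, convolutions extracting visual features that a final layer maps to emotion probabilities, is the core idea the patent describes.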

The patent highlights machine learning-based techniques that can detect emotions and help developers build appropriate visionOS apps for AR/VR environments.


Apple also addresses concerns around user privacy and government regulation in this area.

For instance, in the US, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA).

In other countries, health data may be subject to different regulations and policies and should be handled accordingly.

Hence, distinct privacy practices would need to be maintained for different personal data types in each country, complicating many aspects of this feature.

Apple has acquired a few companies to help with some of the engineering challenges in this area. The earliest of the acquisitions in this space was a Swedish company in 2010 called Polar Rose.

Apple’s computer vision-based application build-out started taking a more serious shape in 2016/2017, as evidenced by the numerous acquisitions.

The company acquired Emotient Inc., a startup that uses AI to read people’s emotions, in early 2016. It followed up in 2017 with RealFace, an Israeli startup whose technology helped build out the components for Face ID, now a core feature of the Apple ecosystem.

When it comes to health, this technology can be powerful. In 2017, researchers showed how a system to detect facial expressions could help care for dementia patients suffering from pain.

With the advent of more powerful neural networks and associated chipsets, we are not far from seeing use cases on Apple’s products centered on emotion recognition.

Sudz Niel Kar
