Apple exploring machine learning-based blood texture maps to enable more realistic avatars


Apple is exploring a machine learning-based blood flow tracking feature that could help build more realistic avatars, according to a patent filing published today.

The patent, ‘Machine Learning-based Blood Flow Tracking,’ proposes deriving a ‘blood texture map’ to render an avatar’s facial expressions and emotional states more realistically.

The patent’s abstract reads as follows:

“Rendering an avatar may include determining an expression to be represented by an avatar, obtaining a blood texture map associated with the expression, wherein the blood texture map represents an offset of coloration from an albedo map for the expression, and rendering the avatar utilizing the blood texture map.”
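
Read as pseudocode, the abstract outlines a three-step pipeline: determine the expression, look up the blood texture map associated with it, and render with that map. Here is a minimal Swift sketch of that flow; every type and function name below is an assumption for illustration, not from the patent.

// Illustrative only: none of these types or names come from Apple's patent.
enum Expression { case neutral, smile, frown }

struct AlbedoMap { /* base skin texture under perfectly diffused light */ }
struct BloodTextureMap { /* per-texel coloration offsets from the albedo map */ }
struct Avatar { /* renderable face */ }

func renderAvatar(expression: Expression,
                  albedo: AlbedoMap,
                  bloodMaps: [Expression: BloodTextureMap]) -> Avatar? {
    // Step 1: the expression to be represented (here, supplied by a face tracker).
    // Step 2: obtain the blood texture map associated with that expression.
    guard let bloodMap = bloodMaps[expression] else { return nil }
    // Step 3: render the avatar utilizing the blood texture map.
    return shade(albedo: albedo, bloodMap: bloodMap)
}

func shade(albedo: AlbedoMap, bloodMap: BloodTextureMap) -> Avatar {
    // Placeholder: add the blood-map offsets to the albedo texels, then shade the mesh.
    Avatar()
}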

Current challenges with avatar emotional rendering

Existing avatar systems generally cannot communicate nuanced facial expressions or emotional states.

Apple’s research aims to make these facial expressions realistic enough to convey the user’s actual emotional state.

Existing systems also tend to be computationally intensive, requiring high-performance general-purpose and graphics processors, and generally do not work well on mobile devices such as smartphones and tablets.

Today, avatars across numerous platforms take various forms, including virtual humans, animals, and plant life.

This research, however, focuses on products whose avatars’ facial expressions are driven by the user’s own facial expressions.

One use of face-driven avatars is communication, where your device’s camera and microphone transmit audio and a real-time 2D or 3D avatar of you to other users on Apple devices.

How would this feature technically work?

Blood flow can be mimicked based on a subject’s facial expressions to generate a more photorealistic avatar.

Blood moves around your face differently when you talk, express emotions, make different facial expressions, or perform any other movement that deforms the face.

As blood moves, the coloration of the subject’s face may change depending on where the blood is concentrated under the skin.
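
One plausible way to drive this at runtime (my assumption, not spelled out in the filing) is to blend per-expression blood texture maps using the blend-shape weights a face tracker already produces. A hedged Swift sketch:

import simd

// Sketch under assumptions: blood maps stored as per-texel RGB offsets, and a
// face tracker reporting a 0...1 weight per expression. Names are illustrative.
struct BloodTextureMap {
    var offsets: [SIMD3<Float>]  // coloration offsets from the albedo map, per texel
}

/// Blend expression-specific blood maps relative to a neutral map, weighted
/// by how strongly each expression is currently active.
func blendedBloodMap(neutral: BloodTextureMap,
                     expressionMaps: [BloodTextureMap],
                     weights: [Float]) -> BloodTextureMap {
    var result = neutral
    for (map, weight) in zip(expressionMaps, weights) {
        for i in result.offsets.indices {
            // Add each expression's deviation from the neutral map, scaled by its weight.
            result.offsets[i] += weight * (map.offsets[i] - neutral.offsets[i])
        }
    }
    return result
}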

Blood flow may be determined by extracting the lighting component from captured images and measuring how the remaining coloration is displaced from the albedo map.

The albedo map describes the face’s texture under perfectly diffused light, representing a static, neutral version of the subject’s skin. The measured displacement from the albedo map therefore indicates the coloration offset for a particular expression.
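
In other words, once lighting is factored out, a texel’s captured color is the albedo plus a blood-flow offset, so extraction and rendering are simple inverses of each other. A minimal sketch, assuming de-lit per-texel colors are available (names are mine, not the patent’s):

import simd

/// Extraction: the blood texture offset is the de-lit captured color
/// minus the static albedo color at the same texel.
func bloodOffset(delitColor: SIMD3<Float>, albedo: SIMD3<Float>) -> SIMD3<Float> {
    delitColor - albedo
}

/// Rendering: add the stored offset back onto the albedo and clamp to a
/// valid color range before lighting is reapplied.
func texelColor(albedo: SIMD3<Float>, offset: SIMD3<Float>) -> SIMD3<Float> {
    simd_clamp(albedo + offset,
               SIMD3<Float>(repeating: 0),
               SIMD3<Float>(repeating: 1))
}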

AI-powered avatars will likely become commonplace as we adopt more powerful computing platforms.

One can easily imagine life-like avatars in the Vision Pro’s future iterations.

App makers such as Replika are already leveraging machine learning and AI to build more realistic avatars for AI companion apps.

Meta already operates a storefront, the Meta Avatars Store, suggesting that as we move into mixed-reality environments, realistic avatars that can communicate facial expressions and emotions will be an important component of these computing systems.
