INSIGHTS

Azafran Portfolio Case Study: Emoshape

Issue 20 - Has COVID-19 Helped Finally Usher in the Golden Age of Robotics?

The content and distribution of Azafran’s INSIGHTS newsletter are focused on our LP, incubator, research, investment, and partner ecosystem. As we look to build a two-way dialogue benefitting our collective efforts, each month we highlight important news and our approach to the emerging intersection of deep technology with end-to-end solutions and platforms driven by voice, acoustics/sensory data, and imagery.

This article is one section of an entire issue of INSIGHTS. Please sign up to receive access to past and new issues as they are published.



Emoshape predicts that before the end of this century, humans will talk more to sentient machines than to other humans. Emotion remains a fundamental human need, one that today’s emotion technology cannot address.

One of the factors holding back robotics, and the human-machine interface in general, is the absence of emotion, emotional intelligence, and empathy on the machine side of the equation. The emotion processing unit, or EPU, developed by Emoshape can enable any AI system to understand the range of emotions experienced by humans. Emoshape is fundamentally different from other solutions on the market: rather than reading your feelings or personality, the EPU learns its own feelings.
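To make that distinction concrete, the toy appraisal loop below keeps the machine’s own emotion intensities and updates them from events the machine experiences, rather than classifying the human’s face or voice. The axis names, update rule, and numbers are illustrative assumptions for this Python sketch, not Emoshape’s algorithm.

    from dataclasses import dataclass, field

    # Toy appraisal model: the agent maintains ITS OWN emotion intensities
    # and updates them from events it experiences. A deliberately simplified
    # illustration, not Emoshape's algorithm.

    @dataclass
    class AppraisalAgent:
        # Internal emotion intensities in [0, 1]; axis names are illustrative.
        emotions: dict = field(
            default_factory=lambda: {"joy": 0.5, "fear": 0.1, "anger": 0.1})
        decay: float = 0.9  # emotions relax toward a neutral baseline over time

        def appraise(self, event_valence: float, relevance: float) -> None:
            """Update the agent's own state from an event it experienced.
            event_valence: -1 (bad for the agent) .. +1 (good for the agent)
            relevance:      0 (irrelevant)        .. 1 (highly relevant)
            """
            for name in self.emotions:
                self.emotions[name] *= self.decay  # gradual return to baseline
            if event_valence >= 0:
                self.emotions["joy"] = min(
                    1.0, self.emotions["joy"] + relevance * event_valence)
            else:
                self.emotions["fear"] = min(
                    1.0, self.emotions["fear"] + relevance * -event_valence)

    agent = AppraisalAgent()
    agent.appraise(event_valence=+0.8, relevance=0.9)  # e.g. the user praised the robot
    agent.appraise(event_valence=-0.6, relevance=0.5)  # e.g. an obstacle blocked its task
    print(agent.emotions)  # the machine's own feelings, not a reading of the user's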

The EPU can compute any of 64 trillion possible emotional states every 1/10th of a second. Emoshape is dedicated to providing an edge computing solution (cloud/chip) that teaches intelligent objects how to interact with humans to yield favorable, positive results. Emoshape’s emotion synthesis chip (EPU) technology represents a massive leap for Artificial Intelligence, especially in the realms of self-driving cars, personal robotics, sentient virtual reality, affective toys, IoT, pervasive computing, and other major consumer electronic devices.
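As a back-of-the-envelope illustration of that scale, quantizing a dozen emotion axes into fourteen intensity levels each yields tens of trillions of distinct combinations, the same order of magnitude as the figure quoted above. The axis count, level count, and read_emotion_state call in this Python sketch are assumptions for illustration, not Emoshape’s published interface.

    import random
    import time

    # --- Illustrative combinatorics (assumed numbers, not Emoshape's design) ---
    # 14 intensity levels across 12 emotion axes gives 14**12 ~= 5.7e13 states,
    # i.e. tens of trillions, comparable to the 64 trillion quoted for the EPU.
    NUM_AXES = 12   # assumed number of primary-emotion axes
    LEVELS = 14     # assumed quantization steps per axis

    print(f"{LEVELS ** NUM_AXES:.2e} representable emotional states")

    def read_emotion_state():
        """Hypothetical stand-in for a device/cloud query; returns one
        intensity level (0..LEVELS-1) per emotion axis."""
        return [random.randrange(LEVELS) for _ in range(NUM_AXES)]

    # --- 10 Hz polling loop, matching the stated 1/10th-second refresh ---
    for _ in range(5):  # a few iterations for demonstration
        state = read_emotion_state()
        dominant = max(range(NUM_AXES), key=lambda i: state[i])
        print(f"state={state} dominant_axis={dominant}")
        time.sleep(0.1)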

Applications include human-machine interaction, emotional speech synthesis, emotional awareness, emotion reasoning, machine emotional intimacy, AI personalities, machine learning, and affective computing. Fields as diverse as medicine, advertising, and gaming will benefit significantly from the Emotion Processing Unit (EPU II). The growing presence of AI, robotics, and virtual reality in society dictates that meaningful emotional interaction is core to removing the barriers to widespread adoption.

Excerpt from an interview with the American Society of Mechanical Engineers: “The robots that assemble our cars, that mow our lawns, and that take our orders at Pizza Hut are heartless, logical creatures, if, admittedly, sometimes cute. Unlike the humans they serve, they don’t respond to the events and interactions around them with joy, frustration, anger, sadness, or any of the other subtle feelings that accompany our species through life. Though artificial intelligence and neural networks are making robots smarter and more autonomous every day, robots are not likely to seem human—and make decisions like humans—until they feel emotion.”

Emoshape founder, CEO and architect Patrick Levy-Rosenthal: “Machines need emotions. Artificial Intelligence can potentially be very useful for humankind. It’s the way that we adopt the technology that will determine our fate. Machines need empathy in the same way that humans need empathy. In being able to empathize with humans, machines can evolve with us and not in spite of us.”


© Copyright 2017 - 2021, Azafran Capital Partners, Inc.