Transforming Conversational AI: Introducing ERA (Emotionally Responsive Assistant)
audEERING’s ERA demo showcasing the integration of Voice AI into conversational AI agents marks a significant milestone in the realm of affective computing. By enabling AI assistants to comprehend and respond to human emotional expression, this innovation revolutionizes our interactions with virtual companions.
Affective Avatars & The Voice AI Solution
We use avatars to show our identity or assume other identities, and we want to make sure we express ourselves the way we intend. The key factor in expression is emotion. Without recognizing emotions, we have no way of modifying a player’s avatar to reflect their emotions and individuality.
With entertAIn play, recognizing emotion becomes possible.
A Devcom 2021 Panel Review: AI for Playtesting & VR
Just like last year, we are proud that we got the chance to present our ideas and be part of the devcom Developer Conference 2021. In a panel discussion, we talked about the entertAIn brand. If you missed the talk, watch it on our YouTube channel.
GaCha 2019: The Top 3 Emotion AI Game Concepts
Young developers from all over the world used audEERING’s emotion detection to create their own game. Have a look and enjoy the new way of gaming.
Transparent AI Part 4: AI Applications, Types and the AI Effect
Here we are at the end of our series of Transparent AI with episode 4 of this series, talking about the applications of AI, types of AI, and the AI effect.
Transparent AI Part 3: Machine Learning and AI Processes
In the past two episodes, we clarified the definition and learned one of the common ways to model emotions. In this episode, we want to see what machine learning and artificial intelligence are, and how we can use them to create an AI agent.
Transparent AI Part 2: Modeling Emotions
In part 2 of the Transparent AI series, we lay the foundation for the modeling of emotions in affective computing. Perceived emotional expression can be described in a three-dimensional space built from the dimensions valence, arousal, and dominance.
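To make the valence–arousal–dominance idea concrete, here is a minimal Python sketch of points in that 3D space and a nearest-neighbor lookup from coordinates to a discrete emotion label. The specific labels and coordinate values are illustrative assumptions for this example, not audEERING’s actual mapping, which is learned from annotated speech corpora.

```python
from dataclasses import dataclass


@dataclass
class EmotionPoint:
    """A point in the valence-arousal-dominance (VAD) space.

    Each axis is normalized to [-1.0, 1.0]:
      valence   -- how pleasant the expression is perceived to be
      arousal   -- how activated or energetic it is
      dominance -- how much control the speaker conveys
    """
    valence: float
    arousal: float
    dominance: float


# Hypothetical coordinates for a few discrete emotions (illustrative only).
EMOTIONS = {
    "joy":     EmotionPoint(valence=0.8,  arousal=0.6,  dominance=0.4),
    "anger":   EmotionPoint(valence=-0.6, arousal=0.8,  dominance=0.6),
    "sadness": EmotionPoint(valence=-0.7, arousal=-0.5, dominance=-0.6),
    "calm":    EmotionPoint(valence=0.4,  arousal=-0.6, dominance=0.1),
}


def nearest_emotion(p: EmotionPoint) -> str:
    """Return the label whose VAD coordinates are closest (Euclidean distance)."""
    def dist(a: EmotionPoint, b: EmotionPoint) -> float:
        return ((a.valence - b.valence) ** 2
                + (a.arousal - b.arousal) ** 2
                + (a.dominance - b.dominance) ** 2) ** 0.5
    return min(EMOTIONS, key=lambda name: dist(p, EMOTIONS[name]))


print(nearest_emotion(EmotionPoint(0.7, 0.5, 0.3)))  # joy
```

A dimensional representation like this is what lets an emotion AI system express degrees and blends of emotion, rather than forcing every utterance into one of a handful of categories.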
Transparent AI Part 1: Define in Order to Unite
In this video series, we go over some definitions, learn the basics of modeling emotions, present machine learning methods to train models, and look at some applications of emotion AI. The journey begins with the delimitation of terms we mostly use in daily life without a clear definition: emotion, feeling, affect, mood.
Integrating Emotions into Your Game
If you have been following our blog posts, you know that we explained the importance of emotions and their role in our daily interactions (link to the first blog). Then, we focused on human-computer interaction in the context of video games (link to the second blog post), and finally, we explained how emotions can be the missing layer for human-computer interaction (link to the third blog post). This week, we will see how integrating this emotional layer into your products can benefit you and, as a result, make the world a better place.
Emotional Interaction – The Missing Link
In our last blog posts, we explained the role of emotions in video games and gave an overview of human-computer interaction and what it means for the video game industry. We found that, from a technical perspective, we have reached almost full freedom in the 3D environment. For example, CAVEs can give you the freedom to navigate naturally through a virtual environment in real-time.
New Ways of Human-Machine Interaction in Video Games
When you write a text message to your friend, call them, or meet them in a cafe, you are interacting with them. Any conversation can be an interaction. Interaction happens when two or more objects or agents affect one another. Some centuries ago, humans interacted mostly with each other and their domestic animals, and thanks to evolution, we are well equipped for that.