Blog

In the past two episodes, we clarified the definition and learned one of the common ways to model emotions. In this episode, we want to see what machine learning and artificial intelligence are, and how we can use them to create an AI agent.
What can AI do in the fight against the coronavirus? Audio AI can really move the needle in medicine, as the journal Nature recently reported. Diseases like Parkinson’s, depression, and COVID-19 have one thing in common: they affect the way a person’s voice sounds.
In part 2 of the Transparent AI series, we lay the foundation for modeling emotions in affective computing. The perception of emotional expression can be described in a three-dimensional space built from the dimensions valence, arousal, and dominance.
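As a minimal sketch of the idea, each emotion can be treated as a point in that three-dimensional valence-arousal-dominance space, and an observed expression can be mapped to the closest labeled emotion. The coordinates below are illustrative assumptions for the sketch, not values from any published model.

```python
import math

# Hypothetical VAD coordinates (valence, arousal, dominance) on a -1..1 scale.
# These values are illustrative assumptions, not a published emotion model.
EMOTIONS = {
    "happy":   ( 0.8,  0.5,  0.4),
    "angry":   (-0.6,  0.7,  0.6),
    "sad":     (-0.7, -0.5, -0.4),
    "relaxed": ( 0.6, -0.4,  0.2),
}

def nearest_emotion(valence, arousal, dominance):
    """Map a point in VAD space to the closest labeled emotion
    by Euclidean distance."""
    point = (valence, arousal, dominance)
    return min(EMOTIONS, key=lambda name: math.dist(point, EMOTIONS[name]))

print(nearest_emotion(0.7, 0.4, 0.3))  # closest to "happy"
```

The dimensional view makes such geometric comparisons possible, which is one reason it is preferred over discrete emotion categories in affective computing.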
In this video series, we go over some definitions, learn the basics of modeling emotions, present machine learning methods to train models, and look at some of the applications of emotion AI. The journey begins with the delimitation of terms we mostly use in daily life without a clear definition: emotion, feeling, affect, and mood.
Mental health is an increasingly important topic in today’s fast-moving societies. Young people are heavily affected by the demands they face during their education and in social groups. According to the German Association of Psychotherapists, 20 % of all youths in Germany experience mental health issues.
We are happy to announce the availability of the next major release 3.0 of openSMILE, audEERING's open-source, cross-platform audio feature extractor. With more than 150,000 downloads since its first publication in 2010 and more than 2650 citations in academic papers, openSMILE has become an immensely popular tool among the research community, audio-related companies and individuals. We strongly believe that researchers and enthusiasts should have free and unrestricted access to the fundamental tools that they need for their work and to make new advances in these fields.
If you have been following our blog posts, you know that we explained the importance of emotions and their role in our daily interactions (link to the first blog). Then, we focused on human-computer interaction in the context of video games (link to the second blog post), and finally, we explained how emotions can be the missing layer of human-computer interaction (link to the third blog post). This week, we will see how integrating this emotional layer into your products can benefit you and, as a result, make the world a better place.
In our last blog posts, we explained the role of emotions in video games and gave an overview of human-computer interaction and what it means for the video game industry. We found that, from a technical perspective, we have reached almost full freedom in 3D environments. For example, CAVEs can give you the freedom to navigate naturally through a virtual environment in real time.
Virtual Reality and Artificial Intelligence can take the training of hospital staff to the next level. Wondering how? In this blog post, we show the amazing possibilities these innovative technologies open up in the health sector, illustrated by a current project in this field.
When you write a text message to your friend, call them, or meet them in a cafe, you are interacting with them. Any conversation can be an interaction. Interaction happens when two or more objects or agents affect one another. Some centuries ago, humans mostly interacted with each other and their domestic animals, and thanks to evolution, we are well equipped for that.
The role of emotions in video games is undeniable. Countless tears have been shed over unforgettable moments in video games. Whether it’s yelling in excitement or screaming in fear, there are a thousand cases where games made us emotional. Those were the moments we were so connected to a character, a story, an event, or even an object that its elevation or destruction caused a shared reaction among players all over the world.