Integrating Emotions into Your Game
If you have been following our blog posts, you know that we explained the importance of emotions and their role in our daily interactions (link to the first blog). Then, we focused on human-computer interaction in the context of video games (link to the second blog post), and finally, we explained how emotions can be the missing layer of human-computer interaction (link to the third blog post). This week, we will see how integrating this emotional layer into your products can benefit you and, as a result, make the world a better place.
Emotional Interaction – The Missing Link
In our previous blog posts, we explained the role of emotions in video games, gave an overview of human-computer interaction, and discussed what it means for the video game industry. We found that, from a technical perspective, we have achieved almost full freedom of movement in 3D environments. For example, CAVE systems let you navigate a virtual environment naturally and in real time.
Affective Computing in the Game Industry: From Machine Learning to Game User Interface
Two weeks ago, we wrote a short general article about emotions in video games. Make sure to check it out HERE (link to last article) if you missed it. This week, we want to get a bit more specific and look at how it all started and how it actually works.
Affective Computing Gets even Your Grandma Excited about VR
Trees were starting to blossom in the spring of 2016 when the Oculus Rift was released as a consumer product. You could put it on and simply teleport into the virtual realm. It has been enhanced with more features ever since, helping users feel even more immersed in the environment.