Roboface: an Interactive Face Animation Tracker

Published: August 17, 2025 at 11:19 PM UTC+0200
Last edited: August 17, 2025 at 10:10 PM UTC+0200
Author: Richard Djarbeng

Imagine a character that tracks your mouse and expresses emotions based on how you interact with it. Its moods include happy, surprised, bored, and neutral. The character in this case is a 2D face, a playful experiment in building something that feels alive. See it on GitHub or try the live demo.

Face animation gif with mouse

😄 What It Does

The face shifts between emotional states—happy, surprised, bored, or neutral—based on mouse position and activity, acting like a finite state machine:

Face animation demo screenshot
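The state switching described above can be sketched as a small pure function. This is a minimal illustration, not the project's actual code; the signal names (`idleMs`, `speed`, `nearFace`) and thresholds are assumptions for the sketch:

```javascript
// Pick an emotion from simple mouse-derived signals.
// idleMs   - milliseconds since the mouse last moved
// speed    - cursor speed in px/s
// nearFace - whether the cursor is close to the face
function pickEmotion({ idleMs, speed, nearFace }) {
  if (idleMs > 5000) return "bored";    // no activity for a while
  if (speed > 1500) return "surprised"; // sudden fast movement
  if (nearFace) return "happy";         // cursor hovering near the face
  return "neutral";                     // default resting state
}
```

Because the function is pure, each `mousemove` (or an idle timer) can recompute the state and apply it, e.g. as a CSS class on the face element, leaving the transitions to the stylesheet.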

Smooth transitions between states are handled with CSS and JavaScript, which adjust eye movement, mouth shape, and blush.
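One way this kind of eye tracking can work (a sketch under assumptions, not the project's exact implementation): compute a small offset from each eye toward the cursor, clamp it so the pupil stays inside the eye, and let a CSS `transition` smooth the movement. The names `pupilOffset` and `.pupil` are illustrative:

```javascript
// Offset a pupil toward the mouse, clamped to the eye's radius.
function pupilOffset(eyeX, eyeY, mouseX, mouseY, eyeRadius) {
  const dx = mouseX - eyeX;
  const dy = mouseY - eyeY;
  const dist = Math.hypot(dx, dy);
  if (dist === 0) return { x: 0, y: 0 };
  // Scale the direction vector so its length never exceeds eyeRadius.
  const scale = Math.min(dist, eyeRadius) / dist;
  return { x: dx * scale, y: dy * scale };
}

// In the page, the offset would drive a transform on each pupil, e.g.:
//   pupil.style.transform = `translate(${x}px, ${y}px)`;
// with CSS like `.pupil { transition: transform 0.1s ease-out; }`
// providing the smoothing between mousemove events.
```

Keeping the math in a pure function like this makes the clamping easy to test independently of the DOM.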

The current release is version 2. Here’s a screenshot of an earlier version where I tried adding eyebrows to the face avatar to make it more expressive.

Face animation with eyebrows


🧠 HRI Inspiration

This project came from a human-robot interaction course at Carnegie Mellon, where I explored how robots evoke emotions. The Keepon robot, with its simple, expressive movements, inspired me to create a virtual version with similar charm.

The project borrows HRI concepts such as emotional expressiveness and reacting to a user’s movement and proximity.

Note: It feels alive, but it’s just pixels driven by code.


🌟 Other Projects

Since building this in May 2025, I’ve noticed similar projects. xAI’s Grok Companions (July 2025) introduced animated characters like ā€œAniā€ for premium users. Emotionally adaptive AI companions, discussed in tech blogs around August 2025, focus on emotional support but lack a specific project page. KEYi Robot’s automated companions (August 2025) also emerged. These focus on conversation and emotional support but don’t react to user movement or proximity like this face does.

This is my take on blending HRI with code to create something lively and fun.