Roboface: An Interactive Face Animation Tracker
I want you to imagine a character that tracks your mouse and expresses emotions based on how you interact with it. Emotions include happy, surprised, bored, and neutral. The character in this case is a 2D face, a playful experiment to build something that feels alive. See it on GitHub or try the live demo.
What It Does
The face shifts between emotional states (happy, surprised, bored, or neutral) based on mouse position and activity, acting like a finite state machine:
- Eyes track the cursor for a lively effect.
- The face gets happier (smiling, blushing) as the mouse moves closer.
- It shows surprise after a pause in interaction.
- It turns bored after long inactivity.
- It settles back to neutral as activity winds down.
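The state logic above can be sketched as a small function that picks the current emotion from cursor distance and idle time. This is a minimal illustration, not the project's actual code; the threshold names and values are assumptions.

```javascript
// Hypothetical emotion state machine. Thresholds (pixels / milliseconds)
// are illustrative tuning values, not taken from the project.
const NEAR_PX = 120;    // cursor within this distance => happy
const PAUSE_MS = 2000;  // a short pause => surprised
const BORED_MS = 10000; // long inactivity => bored

function nextEmotion(distancePx, idleMs) {
  if (idleMs >= BORED_MS) return "bored";
  if (idleMs >= PAUSE_MS) return "surprised";
  if (distancePx <= NEAR_PX) return "happy";
  return "neutral";
}
```

In the real app, `distancePx` would be the cursor's distance to the face center and `idleMs` the time since the last `mousemove` event.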
Smooth transitions come from CSS and JavaScript, adjusting eye movement, mouth shape, and blush.
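The eye tracking, for instance, comes down to offsetting each pupil toward the cursor while keeping it inside the eye; a CSS `transition` on the pupil element then smooths the motion. Here is a hedged sketch of that math (the function name and parameters are mine, not the project's):

```javascript
// Illustrative pupil-tracking math: move the pupil toward the cursor,
// clamped so it never leaves the eye. Not the project's actual code.
function pupilOffset(eyeX, eyeY, mouseX, mouseY, maxOffset) {
  const dx = mouseX - eyeX;
  const dy = mouseY - eyeY;
  const dist = Math.hypot(dx, dy);
  if (dist === 0) return { x: 0, y: 0 };
  // Scale the cursor direction down to at most maxOffset pixels.
  const scale = Math.min(dist, maxOffset) / dist;
  return { x: dx * scale, y: dy * scale };
}
```

Each `mousemove`, the returned offset would be applied to the pupil via a CSS `transform: translate(...)`.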
The current version is v2. Here's a screenshot of an earlier version where I tried adding eyebrows to the face avatar to make it more expressive.
HRI Inspiration
This project came from a human-robot interaction course at Carnegie Mellon, where I explored how robots evoke emotions. The Keepon robot, with its simple, expressive movements, inspired me to create a virtual version with similar charm.
HRI concepts in the project:
- Feedback Loop: Boredom prompts mouse movement, which shifts the face to happiness, creating an interaction cycle.
- Agency: The face seems to "choose" emotions (like happiness when approached), despite being algorithmic.
- Affective Interaction: Emotions like happiness or surprise aim to spark a response in the user.
- Social Presence: The face feels like a social entity through its reactions.
- Proxemics: Emotions change with the mouse's virtual proximity, mimicking real-world closeness.
- Uncanny Valley: A cartoonish style keeps it approachable, avoiding eerie realism.
Note: It feels alive, but it's just pixels driven by code.
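The proxemics point can be made concrete by mapping cursor distance to an emotion intensity. The linear falloff and the distance bounds below are assumptions for illustration, not the project's actual mapping:

```javascript
// Hypothetical proxemics mapping: the closer the cursor, the stronger
// the "happiness" (1.0 at minDist, 0.0 at maxDist, linear in between).
// minDist and maxDist are illustrative tuning values.
function happinessLevel(distancePx, minDist = 50, maxDist = 400) {
  const clamped = Math.min(Math.max(distancePx, minDist), maxDist);
  return (maxDist - clamped) / (maxDist - minDist);
}
```

A value like this could drive how wide the smile is drawn or how opaque the blush appears.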
Other Projects
Since building this in May 2025, I've noticed similar projects. xAI's Grok Companions (July 2025) introduced animated characters like "Ani" for premium users. Emotionally adaptive AI companions, discussed in tech blogs around August 2025, focus on emotional support but lack a specific project page. KEYi Robot's automated companions (August 2025) also emerged. These center on conversation and emotional support, but they don't react to user movement or proximity the way this face does.
This is my take on blending HRI with code to create something lively and fun.