Feelings and Faces

This site is a collection of related (and ongoing) projects that originated while I was in batch at the Recurse Center.

The impetus was to explore a domain where lay intuition is quite strong but exact modeling is quite slippery -- like natural language, but not so vast. The question of how to define emotions ended up in the crosshairs.

The theory of emotions is a topic I had enjoyed reading about in the past, especially since it sits at the intersection of many different disciplines, including cultural anthropology, evolutionary biology, linguistics, health, and philosophy. A few impromptu conversations with fellow Recurser Gage Krause (an awesome coder with an academic background in the philosophy of psychology) helped germinate the ideas.

The following is a roughly chronological ordering of various implementations and experiments that have grown out of the initial idea.

Expression Synthesizer

A simple interface for controlling the expression of a 3d facemesh. Built using a vector displacement model trained on a handful of tagged image datasets from Kaggle. Early attempts used a pixel-based "eigenface" approach, then the dlib face detection model, and finally the 3d facemesh model from MediaPipe. The displacement vectors were generated using principal component analysis.
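
For illustration, here is a minimal sketch of how a displacement model of this kind could be fit and sampled, assuming the facemesh landmarks have already been extracted and flattened into a NumPy array. The function names, shapes, and number of components are illustrative, not the actual implementation.

```python
# Minimal sketch of the displacement-model idea, assuming landmarks have
# already been extracted (e.g. 468 MediaPipe facemesh points per image)
# and flattened into rows of a NumPy array. Names and shapes are illustrative.
import numpy as np
from sklearn.decomposition import PCA

def fit_displacement_model(landmarks, neutral_mean=None, n_components=8):
    """landmarks: (n_samples, n_points * 3) array of flattened 3d facemesh points."""
    if neutral_mean is None:
        neutral_mean = landmarks.mean(axis=0)
    displacements = landmarks - neutral_mean        # per-image offset from the mean face
    pca = PCA(n_components=n_components).fit(displacements)
    return neutral_mean, pca

def synthesize(neutral_mean, pca, weights):
    """weights: one slider value per principal component."""
    offset = np.asarray(weights) @ pca.components_  # weighted sum of displacement modes
    return (neutral_mean + offset).reshape(-1, 3)   # back to (n_points, 3) mesh coordinates
```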

Emotional Arithmetic Tables

Using the model created for the expression synthesizer, I was interested in visualizing combinations of the 8 "primary" emotions and matching them to emotional vocabulary via an analogous superposition of word vectors in a semantic embedding space.
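
A minimal sketch of the kind of word-vector superposition involved, assuming an embedding lookup (word to vector) such as GloVe or word2vec is already loaded; the helper names and the "joy + trust" example are illustrative rather than taken from the actual tables.

```python
# Minimal sketch of the word-vector "arithmetic", assuming an embedding
# dict (word -> vector) is already loaded, e.g. from GloVe or word2vec.
import numpy as np

def blend_emotions(embeddings, a, b):
    """Superpose two primary-emotion vectors and normalize the result."""
    v = embeddings[a] + embeddings[b]
    return v / np.linalg.norm(v)

def nearest_word(embeddings, query, exclude=()):
    """Return the vocabulary word whose vector is most cosine-similar to query."""
    best_word, best_sim = None, -1.0
    for word, vec in embeddings.items():
        if word in exclude:
            continue
        sim = float(np.dot(query, vec) / (np.linalg.norm(vec) * np.linalg.norm(query)))
        if sim > best_sim:
            best_word, best_sim = word, sim
    return best_word

# Illustrative usage, e.g. "joy" + "trust" lands somewhere near "love":
# blended = blend_emotions(emb, "joy", "trust")
# print(nearest_word(emb, blended, exclude={"joy", "trust"}))
```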

Emotional Transference

A grid-based time-step simulation that allows for simplified interactions between cells. Each cell comprises an internal emotional state visualized as a face, as well as a single "neuron" (weight matrix and bias vector) that "learns" to move towards a user-controlled "homeostasis" state (using gradient descent) while "training" on the emotional states of its neighbors.
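
A minimal sketch of a single cell under these rules, assuming the emotional state is a small vector (e.g. weights over the eight primaries). The class layout, learning rate, and squared-error loss are assumptions, not the implementation itself.

```python
# Minimal sketch of one grid cell: its state is a small emotion vector,
# and each step it "trains" on the mean of its neighbors' states while
# gradient descent pulls its output toward the user-set homeostasis target.
import numpy as np

class Cell:
    def __init__(self, dim=8, lr=0.05, rng=None):
        rng = rng or np.random.default_rng()
        self.W = rng.normal(scale=0.1, size=(dim, dim))  # the single "neuron"
        self.b = np.zeros(dim)
        self.state = rng.random(dim)
        self.lr = lr

    def step(self, neighbor_states, homeostasis):
        x = np.mean(neighbor_states, axis=0)   # input: neighbors' emotional states
        y = self.W @ x + self.b                # proposed next internal state
        err = y - homeostasis                  # distance from the target state
        # gradient descent on 0.5 * ||y - homeostasis||^2 w.r.t. W and b
        self.W -= self.lr * np.outer(err, x)
        self.b -= self.lr * err
        self.state = self.W @ x + self.b       # updated emotional state (drawn as a face)
        return self.state
```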

Emotional Pareidolia

A whimsical experiment in which the emotional expression model is applied to "face-like" static images using the same basic interface as the expression synthesizer. The image deformations are obtained using thin-plate spline calculations based on the same underlying vector displacement model.
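
For reference, a minimal CPU-only sketch of a 2d thin-plate spline fit and warp, assuming the control points come as matched source/target pairs (source points on the "face-like" image, targets offset by the displacement model); the function names are illustrative.

```python
# Minimal sketch of a 2d thin-plate spline warp. src/dst are matched
# (n, 2) control points; warp_points maps arbitrary points (e.g. a pixel grid).
import numpy as np

def _tps_kernel(r2):
    # U(r) = r^2 log(r^2), with U(0) defined as 0
    return np.where(r2 == 0, 0.0, r2 * np.log(r2 + 1e-12))

def fit_tps(src, dst):
    """Solve the standard TPS linear system for the spline parameters."""
    n = len(src)
    d2 = np.sum((src[:, None, :] - src[None, :, :]) ** 2, axis=-1)
    K = _tps_kernel(d2)
    P = np.hstack([np.ones((n, 1)), src])      # affine part [1, x, y]
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    rhs = np.zeros((n + 3, 2))
    rhs[:n] = dst
    return np.linalg.solve(A, rhs)             # (n + 3, 2): kernel weights + affine terms

def warp_points(points, src, params):
    """Apply the fitted spline to (m, 2) points."""
    d2 = np.sum((points[:, None, :] - src[None, :, :]) ** 2, axis=-1)
    U = _tps_kernel(d2)                        # (m, n) kernel matrix against control points
    P = np.hstack([np.ones((len(points), 1)), points])
    return U @ params[:len(src)] + P @ params[len(src):]
```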

Pareidolia Webcam Filter

An extension of the pareidolia experiment that is untethered from the emotional expression model. Instead, it uses facemesh displacement data taken directly from a user's webcam. The thin-plate spline calculations were optimized to run in real time on the GPU.
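
A minimal sketch of the webcam side, using the classic MediaPipe Solutions FaceMesh API with OpenCV. Treating the first detected frame as the "neutral" reference is an assumption here, and the GPU spline step is omitted.

```python
# Minimal sketch: pull live facemesh landmarks from a webcam and compute
# per-point displacements relative to the first detected ("neutral") frame.
import cv2
import mediapipe as mp
import numpy as np

face_mesh = mp.solutions.face_mesh.FaceMesh(
    static_image_mode=False, max_num_faces=1, refine_landmarks=True)

cap = cv2.VideoCapture(0)
neutral = None                                    # landmarks from the first detected frame
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        pts = np.array([(p.x, p.y) for p in lm])  # normalized image coordinates
        if neutral is None:
            neutral = pts
        displacement = pts - neutral              # what the spline warp would consume
    cv2.imshow('webcam', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```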