
An AI-driven affective spatial system that senses mood and adaptively modifies interior stimuli to steer emotions toward positivity.
MAZE demonstrates a working closed-loop prototype that modulates colour in real time from live affect signals and shows early shifts toward neutral/positive valence. It establishes a reproducible sensing → inference → spatial-output framework; next steps expand the output channels to sound, luminance and texture, backed by controlled studies and metaverse-scale learning.
MAZE is a master’s thesis exploring how interior space can read human emotion and respond in real time. Using multi-modal sensing (e.g., facial-expression cues and behavioural signals), the system infers the user’s current mood and then modulates environmental parameters—starting with colour values—to gently nudge affect toward a more positive state. The aim is a closed-loop spatial experience where people leave feeling better than when they arrived. Supervision: Prof. Daniel Arztmann & Prof. Hans Sachs.
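To make the loop concrete, the sketch below wires the three stages (sense → infer → actuate) into a single cycle. All function bodies are placeholders standing in for the prototype's actual sensing, inference and colour-control modules; the cue names, fusion rule and hue mapping are illustrative assumptions, not the thesis implementation.

```python
import random
import time

def sense_affect_cues() -> dict:
    """Placeholder for multi-modal sensing (facial-expression and behavioural cues)."""
    return {"face_valence": random.uniform(-1.0, 1.0),
            "behaviour_valence": random.uniform(-1.0, 1.0)}

def infer_valence(cues: dict) -> float:
    """Placeholder inference: fuse cues into one valence estimate in [-1, 1]."""
    return 0.5 * (cues["face_valence"] + cues["behaviour_valence"])

def actuate_colour(valence: float) -> None:
    """Placeholder actuation: map valence to a hue and hand it to the lighting layer."""
    hue = 200 - 160 * (valence + 1.0) / 2.0  # cool (~200 deg) for low valence, warm (~40 deg) for high
    print(f"valence={valence:+.2f} -> hue={hue:.0f} deg")

def run_loop(cycles: int, period_s: float) -> None:
    """Run the closed loop: sense, infer, actuate, wait, repeat."""
    for _ in range(cycles):
        actuate_colour(infer_valence(sense_affect_cues()))
        time.sleep(period_s)

if __name__ == "__main__":
    run_loop(cycles=3, period_s=0.1)
```

In the real system each stage would be swapped for the corresponding module (camera-based affect detection, the trained inference model, the lighting/colour controller) while the loop structure stays the same.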
- Measure emotional state continuously without adding friction or breaking immersion.
- Translate noisy behavioural signals into reliable, privacy-aware affect estimates.
- Convert affect estimates into spatial changes (colour, light) that feel natural, not intrusive (see the colour-mapping sketch after this list).
- Maintain comfort, safety and repeatability across different users and contexts.
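For the "convert affect into spatial changes" point above, one plausible mapping is to interpolate hue between a cool blue (low valence) and a warm amber (high valence), keeping saturation and brightness muted so the shift stays subtle. The specific hues and levels below are assumed for illustration; the prototype's calibrated palette may differ.

```python
import colorsys

def valence_to_rgb(valence: float,
                   low_hue: float = 0.58,   # cool blue for low valence (assumed)
                   high_hue: float = 0.12   # warm amber for high valence (assumed)
                   ) -> tuple[int, int, int]:
    """Map a valence estimate in [-1, 1] to an RGB colour by hue interpolation."""
    v = max(-1.0, min(1.0, valence))
    t = (v + 1.0) / 2.0                              # normalise to 0..1
    hue = low_hue + t * (high_hue - low_hue)
    r, g, b = colorsys.hsv_to_rgb(hue, 0.45, 0.90)   # muted saturation/brightness for comfort
    return round(r * 255), round(g * 255), round(b * 255)

print(valence_to_rgb(-0.8))  # subdued cool tone
print(valence_to_rgb(0.0))   # neutral
print(valence_to_rgb(0.7))   # warmer tone
```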
- Sensing stack that combines face-based affect cues with behavioural patterns to estimate valence.
- AI inference loop that watches trends (not single frames) and filters out false positives (a trend-filter sketch follows this list).
- Real-time actuation pipeline that adjusts colour palettes and lighting to test micro-shifts in mood.
- Simple UX framing so users “just experience” the space: no controls, no setup, minimal friction.
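The "trends, not single frames" idea in the inference loop can be sketched as a rolling average with hysteresis: per-frame valence estimates are smoothed over a short window, and a new target is only reported once the trend has moved past a threshold, so one-off spikes never reach the actuation layer. The window length and threshold below are illustrative assumptions, not tuned values from the study.

```python
from collections import deque
from typing import Optional

class TrendFilter:
    """Smooths per-frame valence estimates and reports a change only when the
    rolling trend clears a hysteresis threshold, suppressing single-frame spikes."""

    def __init__(self, window: int = 30, threshold: float = 0.15):
        self.window = deque(maxlen=window)
        self.threshold = threshold
        self.current = 0.0  # last valence actually sent to the actuation layer

    def update(self, frame_valence: float) -> Optional[float]:
        """Add one frame's estimate; return a new target valence, or None if unchanged."""
        self.window.append(frame_valence)
        if len(self.window) < self.window.maxlen:
            return None                              # not enough history to call it a trend
        trend = sum(self.window) / len(self.window)
        if abs(trend - self.current) >= self.threshold:
            self.current = trend
            return trend
        return None

filt = TrendFilter(window=5, threshold=0.2)
for v in [0.0, 0.9, 0.0, 0.1, 0.6, 0.7, 0.8, 0.7]:
    target = filt.update(v)
    if target is not None:
        print(f"actuate towards valence {target:+.2f}")
```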
