After dangerous or stressful situations, our parasympathetic nervous system (PSNS) helps us relax and regulate bodily functions. Parasympathy was inspired by the PSNS and its crucial role in physical and mental restoration. With Parasympathy, we wondered whether emotions, unspoken feelings, fears, and desires could manifest as architectural elements that reflect the experiences and feelings of a community, perhaps even expressing empathy. How could architecture be a more active contributor to our social and psychological wellbeing?

This installation serves as an example of socially responsive architecture, placing users’ emotions at the center of the space. Parasympathy is an interactive spatial experience operating as an extension of visitors’ minds. The objective is to deploy emerging technologies in support of the wellbeing of the community, especially in relation to social matters such as inclusion and social justice in our built environment. By using each individual’s biosignature as a noticeable trace and by breaking away from the traditional designer-centric concept, this user-centric installation is a medium that actively responds to the mood of the community, helping to promote communication for those who often go unheard.

This installation demonstrates responsible uses of emerging technology that can promote social awareness, enhance the agency of the democratic populace, and advance equitable design. It contributes to research on cyber-physical design and the interaction of technology and empathy. The project had a singular objective: to reconcile the relationship between humans and architecture and redefine it as one of emotional empathy and active compassion.



By integrating Artificial Intelligence (AI), wearable technologies, affective computing, and neuroscience, this project blurs the lines between the physical, digital, and biological spheres and empowers users’ brains to solicit positive changes from their spaces based on their real-time biophysical reactions and emotions. It does so by responding in real time to the emotional states of the individuals within the space.

The project leveraged AI as extended intelligence and relied on the human brain for information processing, employing wearable technology and sensory environments to foster a process in which synapses in the brain triggered responses in the installation, ultimately modulating emotion. The method is unique in its use of wearable technology (i.e., the Empatica E4 and OpenBCI EEG) as prostheses to collect data and in its integration of AI for real-time emotion detection and communication with an intelligent interactive installation, synchronizing changes in the space to the emotional data received. Acknowledging the role of Machine Learning (ML) in performance-based design since the mid-1990s, here ML is used to detect emotions and translate them into design data, with the aim of creating more inclusive spaces.
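The text does not specify the classification model, so as a minimal illustrative sketch only: a nearest-centroid classifier over a few wearable features. The feature set, centroid values, and emotion labels below are all hypothetical assumptions, not the project's actual model.

```python
import math

# Hypothetical per-emotion feature centroids, as if learned offline from
# labeled wearable data: (heart rate bpm, electrodermal activity uS, skin temp C).
# A real pipeline would normalize features so no single unit dominates.
CENTROIDS = {
    "calm":     (65.0, 2.0, 33.5),
    "stressed": (95.0, 8.0, 34.5),
    "excited":  (90.0, 5.0, 33.0),
}

def classify_emotion(sample):
    """Return the emotion label whose centroid is nearest to the sample."""
    def distance(centroid):
        return math.sqrt(sum((s - c) ** 2 for s, c in zip(sample, centroid)))
    return min(CENTROIDS, key=lambda label: distance(CENTROIDS[label]))
```

In this sketch, each incoming window of wristband readings is reduced to a feature tuple and assigned the label of the closest prototype; the project's actual ML algorithm may differ substantially.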

Parasympathy is made of a series of kinetic reflective tiles that fold and fluctuate in a calculated rhythm, producing a spectacle of color and patterns akin to the northern lights. The effect is contingent on the involvement of its users: using a smart wristband, biophysical data (i.e., heart rate, electrodermal activity, blood volume, and temperature) was gathered, analyzed by our ML algorithm, and translated into emotion categories. The installation was then calibrated to respond actively to these data by changing patterns and colors to create an ambience that would improve the users’ emotions. For example, if stress was detected, the space morphed and its colors shifted to calming bright colors such as blue. Based on earlier color studies, we assigned different colors to different emotions to navigate and index the moods of the users.
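The emotion-to-color assignment could be expressed as a simple lookup table. Only the stress-to-calming-blue mapping comes from the text; the RGB values, pattern names, and non-stress entries below are illustrative assumptions.

```python
# Hypothetical emotion-to-response table. The source confirms only that
# detected stress triggered a shift toward calming blues; everything else
# here is an assumed placeholder.
EMOTION_RESPONSES = {
    "stressed": {"color": (70, 130, 235), "pattern": "slow_wave"},    # calming blue
    "calm":     {"color": (120, 200, 160), "pattern": "gentle_drift"},
    "excited":  {"color": (245, 180, 60),  "pattern": "ripple"},
}

# Neutral fallback when the detector returns an unmapped label.
DEFAULT_RESPONSE = {"color": (255, 255, 255), "pattern": "idle"}

def response_for(emotion):
    """Look up the LED color and kinetic pattern for a detected emotion."""
    return EMOTION_RESPONSES.get(emotion, DEFAULT_RESPONSE)
```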

The installation comprised four 4’ x 8’ panels positioned at 90 degrees in the corner of a gallery, drawing users to the center of the space to amplify the sense of immersion. Each panel consisted of a grid of retractable tiles that acted as the kinetic component of the installation. Nestled within each module was a colored LED that activated in concert with the module’s movement and the detected emotion. A Raspberry Pi computer queried the webserver every six seconds to read the last predicted emotional state and check for changes, calibrating the rhythm and sequence of the mechanisms accordingly.
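The Pi's polling loop might be sketched as below. The six-second interval and the change-check come from the description; the function names and the injected fetch/apply callables are assumptions made so the logic is self-contained and testable.

```python
import time

POLL_INTERVAL_S = 6  # the Pi queried the webserver every six seconds

def poll_emotion(fetch_latest, apply_response, iterations, sleep=time.sleep):
    """Poll for the last predicted emotion; recalibrate only on a change.

    fetch_latest: callable returning the most recent emotion label
                  (e.g. an HTTP GET against the webserver).
    apply_response: callable that retimes the tile mechanisms and LEDs.
    """
    last_emotion = None
    for _ in range(iterations):
        emotion = fetch_latest()
        if emotion != last_emotion:   # skip recalibration if nothing changed
            apply_response(emotion)
            last_emotion = emotion
        sleep(POLL_INTERVAL_S)
    return last_emotion
```

Injecting `sleep` and `fetch_latest` keeps the control logic separate from the hardware and network details, so the same loop can be exercised with stubs.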

Made aware of their mental states through the wristband and the cell phone provided, visitors found that the therapeutic immersion in color and light increased their sense of self-awareness, as their involvement played a key role in activating the space. Users learned about their emotional and physiological states and thus acquired a tool to enhance, mitigate, or simply become aware of their emotions.

This cognition-emotion-space interaction system has the potential to serve as a method of remedial therapy and to provide augmented assisted living for people with physical and mental disabilities and for the elderly, ultimately empowering them to regain control over their environments and live more equal and independent lives.

Link to the video of the project:

Team Morphogenesis Lab – Washington State University
Morphogenesis Lab Director: Mona Ghandi
Design: Mona Ghandi, Mohamed Ismail
Fabrication: Mohamed Ismail, Shanle Lin, Aisha Marcos, Ruri Adams, Jessie Lu, Marcus Blaisdell
Programming & Electrical: Marcus Blaisdell
Cinematography: Nicole Liu, Mohamed Ismail