Author’s description:
An Architecture of Emotive Intelligence: A Biofeedback Empathetic Response to Space

Current adaptive-architecture approaches mostly focus on facade kinetics, using environmental data for energy-efficiency purposes and thereby neglecting users' physiological and emotional needs. This project is an adaptive wall that reacts to users' behavioral patterns in real time. By sensing users' emotions and desires through biological signals, it adjusts its shape and configuration based on their physiological and psychological feedback. By collecting and using bio- and neuro-signals, the structure enhances environmental quality and promotes more flexible, human-centered design. Using affective computing, the project integrates users, adaptive structures, and sensing technologies into a human-computer interaction that draws on cognitive and emotional signals to help people with physical and psychological disabilities.

Part of this project is concerned with a sensory network for collecting data and understanding the human condition (affective computing). The other part involves kinetic structures, actuation systems, and innovative materials. Fuzzy logic and machine learning (deep neural networks) are used to create emotion-recognition algorithms, trained on ground-truth data gathered through interactive collection sessions. Adaptive wall structure: the wall consists of a series of triangular panels connected to each other with flexible hinges. By changing position, the panels offer different shapes, openings, forms, lighting conditions, and functions: they control light and natural ventilation, express the user's emotions and desires, or serve users functionally by forming seats, tables, stairs, and so on.
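
As an illustration of the emotion-recognition step, here is a minimal Python sketch, not the project's actual model: a small feed-forward network that maps a vector of biosignal features to a discrete emotion label. The feature set, the four emotion classes, and the use of PyTorch are all assumptions for illustration; in the project, such a network would be trained on the interactively collected ground-truth data.

import torch
import torch.nn as nn

# Assumed emotion labels; the project's actual label set is not specified.
EMOTIONS = ["calm", "stressed", "depressed", "excited"]

# Assumed input features per sample: skin conductance, body temperature,
# heart rate, and EEG alpha- and beta-band power -> 5 values.
model = nn.Sequential(
    nn.Linear(5, 32),
    nn.ReLU(),
    nn.Linear(32, 16),
    nn.ReLU(),
    nn.Linear(16, len(EMOTIONS)),  # logits over the emotion classes
)

def classify(features: torch.Tensor) -> str:
    """Return the most likely emotion for one 5-value feature vector."""
    with torch.no_grad():
        logits = model(features)
    return EMOTIONS[int(logits.argmax())]

# One synthetic sample; the untrained network's answer is arbitrary until
# the model is fit to ground-truth recordings.
sample = torch.tensor([0.42, 36.9, 78.0, 0.31, 0.55])
print(classify(sample))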

Data collection: the adaptive wall changes its shape based on a network of ambient sensors, e.g., an EEG headset, wearable biological sensors, an eye tracker, a voice detector, and facial- and gesture-recognition devices. Biological data (body temperature, skin conductance, etc.) and neurological data (brainwaves) are collected through these sensors, and changes are implemented on the wall as needed. For example, if the user feels hot or depressed, the wall opens up to raise the level of natural light, provide fresh air, and offer a view that ameliorates the user's condition. Data processing and actuation: microcontrollers use the collected data to make decisions and forward them to the actuators. The computing system includes a central brain (a Raspberry Pi) that processes user data and sends commands to microcontrollers that interact with the wall. This interaction manifests as reconfiguration or illumination of the wall through kinetic components, responsive materials, and actuators such as servos, pneumatic systems (soft robotics), and programmable materials (shape-memory alloys).
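
To make the decision-and-actuation pipeline concrete, here is a minimal Python sketch of a control loop as it might run on the central Raspberry Pi. The sensor-reading function, the serial port, the one-byte command protocol to the microcontroller, and the temperature threshold are all hypothetical; only the rule itself mirrors the hot-or-depressed example above.

import time
import serial  # pyserial

HOT_SKIN_TEMP_C = 37.5  # assumed comfort threshold, not a calibrated value

# Assumed serial link from the Raspberry Pi to the actuator microcontroller.
link = serial.Serial("/dev/ttyUSB0", 9600)

def read_biosignals() -> dict:
    """Placeholder for the ambient sensor network (EEG headset, wearables,
    eye tracker, etc.); returns the latest fused readings."""
    return {"skin_temp_c": 37.8, "emotion": "depressed"}

while True:
    signals = read_biosignals()
    # Mirror the rule above: if the user feels hot or depressed, open the
    # wall to admit natural light, fresh air, and a view.
    if signals["skin_temp_c"] > HOT_SKIN_TEMP_C or signals["emotion"] == "depressed":
        link.write(b"O")  # 'O' = open the panels (hypothetical command byte)
    else:
        link.write(b"C")  # 'C' = return the panels to the closed state
    time.sleep(1.0)  # re-evaluate once per second

In a fuller version, the crisp threshold could be replaced by the fuzzy-logic membership functions mentioned earlier, so the wall opens gradually rather than switching between two discrete states.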

The project has significant implications in the medical field: it can provide augmented assisted living for people with physical disabilities and neuromuscular diseases, giving them a greater role in shaping spaces independently using their minds and physiology. The wall can also make caregiving institutions and caregivers aware of the feelings, thoughts, and activities of people with PTSD and autism. For example, it can help children with Autism Spectrum Disorder (ASD) compensate for their nervous system's inability to filter sensory input and determine an appropriate response, by providing a sensory-regulating environment that integrates physical and visual feedback. Also, by autonomously responding to the occupant's need for light, heat, ventilation, and view through biological data, the wall improves occupants' well-being and has a positive impact on energy consumption.


