zero dogma [ Think Tank ]
Unit / Spatial Forces
Integrating AI and Biosensing for Closed-Loop Design Pipelines
This research explores how artificial intelligence (AI) and biosensing can be integrated to create a closed-loop design pipeline. The goal is to develop an adaptive system that uses human responses to guide and improve design outcomes in real time.
The process began with the collection of text and image data used to train an AI model capable of generating images of patient rooms. These generated images were then shown to subjects while their emotional responses (inferred from facial expressions) and eye movements were tracked to capture their reactions.
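The project does not specify its biosensing stack; as a minimal sketch, assuming a webcam-facing subject and the open-source `fer` package with OpenCV, per-frame facial-emotion scores could be logged while each generated image is displayed:

```python
# Hedged sketch: score a viewer's facial emotions per frame while a generated
# patient-room image is on screen. Library choice (OpenCV + `fer`) and the
# frame count are assumptions, not the project's documented pipeline.
import cv2
from fer import FER

detector = FER()                       # pretrained facial-expression model
cap = cv2.VideoCapture(0)              # webcam facing the subject

emotion_log = []                       # one entry per captured frame
while len(emotion_log) < 300:          # roughly 10 s at 30 fps per stimulus
    ok, frame = cap.read()
    if not ok:
        break
    faces = detector.detect_emotions(frame)      # [{'box': ..., 'emotions': {...}}]
    if faces:
        emotion_log.append(faces[0]["emotions"])  # scores for the detected face

cap.release()
# emotion_log now holds per-frame scores (happy, sad, neutral, ...) that can be
# averaged and paired with the stimulus image shown during capture.
```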
Through this, we gathered data on where subjects were looking and how they felt in response to specific design features. Heatmaps were created from the time spent viewing particular areas of each image, and masks were applied to highlight the most-viewed regions. Chronological eye-tracking provided a sequence of focus points, which helped us analyze each subject's visual journey through the image.
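A minimal sketch of the heatmap and mask step, assuming fixations arrive as (x, y, dwell-time) tuples in image-pixel coordinates (names and thresholds are illustrative, not the project's actual code):

```python
# Accumulate dwell time per pixel, blur into a smooth attention heatmap, and
# threshold it to obtain a mask of the most-viewed regions.
import numpy as np
from scipy.ndimage import gaussian_filter

def attention_heatmap(fixations, height, width, sigma=40):
    """Sum dwell time (ms) at each fixation point, then Gaussian-blur."""
    heat = np.zeros((height, width), dtype=float)
    for x, y, dwell_ms in fixations:
        if 0 <= y < height and 0 <= x < width:
            heat[int(y), int(x)] += dwell_ms
    heat = gaussian_filter(heat, sigma=sigma)
    return heat / heat.max() if heat.max() > 0 else heat

def most_viewed_mask(heatmap, threshold=0.6):
    """Binary mask covering the top fraction of visual attention."""
    return heatmap >= threshold

# Example: three fixations on a 768x1024 render of a patient room.
fixations = [(200, 300, 450), (210, 310, 800), (600, 150, 250)]
heat = attention_heatmap(fixations, height=768, width=1024)
mask = most_viewed_mask(heat)
```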
Using positive emotional responses as a filter, we removed less favorable design elements and refined the AI’s training data. This created a closed feedback loop in which human reactions directly informed future design iterations.
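One way to realize this filter, sketched under assumed field names and thresholds (the project does not publish its exact criterion), is to keep only the images whose averaged emotion scores skew positive and feed those into the next training round:

```python
# Hedged sketch of the feedback-loop filter: retain images that elicited
# clearly positive reactions for the next iteration of training data.
from dataclasses import dataclass

@dataclass
class Trial:
    image_path: str
    emotions: dict   # averaged scores, e.g. {"happy": 0.62, "sad": 0.05, ...}

def valence(e: dict) -> float:
    """Crude positive-minus-negative score from averaged emotion probabilities."""
    positive = e.get("happy", 0.0) + e.get("surprise", 0.0)
    negative = (e.get("sad", 0.0) + e.get("angry", 0.0)
                + e.get("fear", 0.0) + e.get("disgust", 0.0))
    return positive - negative

def select_for_retraining(trials, min_valence=0.2):
    """Image paths whose responses pass the positivity threshold."""
    return [t.image_path for t in trials if valence(t.emotions) >= min_valence]
```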
In parallel, we employed image-to-3D generative AI systems to translate these generated images into pseudo-3D environments, which were then evaluated using the same biosensing techniques.
This research bridges the gap between biosensing and design, positioning the human subject as a guiding force in the design process and enabling more personalized, emotionally responsive environments.
Project Team
Firas Safieddine