CAVE-CAD Software Will Help Mine Human Brain to Improve Architectural Design

July 8, 2011 / By Chris Palmer

San Diego, Calif., July 6, 2011 — New software and hardware being developed at the University of California at San Diego make it possible for people to communicate their experience of architectural design through physiological cues — an important consideration for those with healthcare conditions, such as Alzheimer’s disease, that can make verbal communication difficult.

CAVE-CAD Integration with Neuroscience Research Technology

Researchers at the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2) have developed innovative CAD (computer-aided design) software called CAVE-CAD that, when integrated with novel hardware to monitor human neurological and physiological responses, makes architectural design more efficient. CAVE-CAD also adds an important feature missing in conventional CAD: the ability to immediately experience the consequences of modifying a design.

“CAVE-CAD is an ideal tool for the architectural designer to create an experience of space on a human scale that allows the potential user or client to respond in real time to ideas and concepts while immersed in the design,” says Eduardo Macagno, Calit2-affiliated professor and founding dean of the UCSD Division of Biological Sciences and past president of the Academy of Neuroscience for Architecture.

A ‘neuro-architectural’ team developed CAVE-CAD for use in Calit2’s StarCAVE, a 360-degree, 16-panel immersive virtual reality environment that enables researchers to interact with virtual architectural renderings in three dimensions, in real time and at actual scale. The team has also developed new systems that synchronize the CAVE-CAD experience with electrophysiological measures of cortical brain function and emotional responses to the virtual built environments, enabling study of navigation and communication in complex settings such as hospitals and schools.

Calit2-affiliated neuroscientist Eve Edelstein, who is also trained in architecture, says that many visualization platforms allow users to perceive building designs in three dimensions, but the experience is often a passive “fly-through”: architects must first imagine themselves inside the building, then make design changes offline and reload them, a process that can be time-intensive.

[Photo: Eve Edelstein. Credit: Gian Mario Maggio]
“What’s lost in a fly-through of a building is a sense of immersion, a sense of volume and the first-person perspective,” says Edelstein.

CAVE-CAD allows users to navigate building designs on their own terms, spending as much time as they want examining any particular design feature. Changes to the design can be made in real time using a hand-held remote to manipulate intuitive icons and drop-down menus, which are projected wherever the user is looking, thanks to head-tracking equipment worn by the user. The Calit2 team has developed a beta version of the software that minimizes the use of menus and makes the process even more intuitive.

“Other software lets you move through a fixed space. We let you push back the walls or move the ceiling…in real time, and experience changing form without relying on your imagination,” says Edelstein.

She adds, “When I saw ‘Inception’, I said in the middle of the theater, ‘This is my world! This is what I do!’ That scene where the female character walks through downtown, and the architecture and the urban setting folds up, morphs and twists over according to her visualizations…that’s what we get to do.”

Edelstein’s father, Hal Edelstein, was one of the 10 founders of the world-famous Gensler architectural design firm. She says they spent countless hours together talking about architecture and visiting buildings he designed.  “Oh, he would have loved this,” says Edelstein about her father, who wrote his firm’s first computer modeling software system – a kind of proto-CAD. “He would have been the first one to incorporate these technologies.”

Funded with grants from Calit2’s Strategic Research Opportunities program and a major gift from San Diego-based HMC Architects, the CAVE-CAD project was originally the vision of Macagno, who recognized the potential of studying how people use cues to navigate built environments in the virtual reality confines of the StarCAVE. Macagno and Edelstein currently direct the project. Calit2 Staff Research Associate Lelin Zhang and Research Scientist Jürgen Schulze developed the software, along with neuroscientists, designers and architects.   

[Photo: User demonstrates CAVE-CAD rendering of hospital environment. Credit: Gian Mario Maggio]
CAVE-CAD also incorporates technology from another Calit2 project to precisely place sounds in the 3D environment. This technology, called SoniCAVE, allows CAVE-CAD researchers to simulate the acoustical environment in a given building design. Edelstein and SoniCAVE developer Peter Otto, a professor in UCSD’s Department of Music and director of Calit2’s Sonic Arts Research and Development group, have used CAVE-CAD and SoniCAVE to simulate real hospital environments, where the sound of ambient patient monitoring equipment can often interfere with conversations between medical personnel — and even lead to medical error.

Edelstein’s studies found that noise during shift change in a hospital peaks at 120 decibels, a level comparable to a jet engine. Ongoing software development by doctoral candidate Joachim Gossman may motivate a redesign of hospital environments, for example by testing virtual sound scenes that use building materials chosen to strategically absorb sound.
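Decibels are a logarithmic scale, which is why a 120 dB peak is so striking: every 20 dB step multiplies the underlying sound pressure by ten. As a small illustrative sketch (not part of the CAVE-CAD software), the standard conversion from sound pressure level to pascals looks like this:

```python
def spl_to_pascals(db_spl):
    """Convert sound pressure level (dB SPL) to RMS pressure in pascals,
    using the standard 20 micropascal reference pressure."""
    P_REF = 20e-6  # reference pressure for dB SPL, in pascals
    return P_REF * 10 ** (db_spl / 20)

# 120 dB SPL corresponds to 20 Pa of RMS pressure -- one million
# times the reference, and near the threshold of pain.
peak_pressure = spl_to_pascals(120)
```

So the 120 dB shift-change peak carries a million times the sound pressure of the 0 dB reference, which is roughly the threshold of hearing.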

Integration with Neural and Physiological Measurements

Integrated into the CAVE-CAD system is innovative sensor technology that monitors users’ responses to architectural renderings projected in the StarCAVE. A portable, dry electrode electroencephalography (EEG) system, designed by Gert Cauwenberghs, professor of Bioengineering and Biology, along with his students Michael Chi and Cory Stevenson, will capture a user’s brainwaves as they interact with the computer model.

The EEG system consists of a cap with 8 to 32 electrode channels feeding into a small amplifier/transmitter that fits in a pocket and wirelessly streams the data to a computer outside the CAVE, allowing the user complete freedom of movement inside the StarCAVE – and, ultimately, outside in real-world built settings. This work benefits from collaboration with Dr. Tzyy-Ping Jung and his colleagues at the Swartz Center for Computational Neuroscience at UCSD, whose work on EEG is internationally renowned.
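The article does not describe the wire format of that wireless data stream, so purely as an illustrative sketch, a multichannel EEG sample frame might be serialized and parsed like this (the 8-channel count, the frame layout, and the little-endian float32 encoding are all assumptions for illustration, not details of the Calit2 system):

```python
import struct

N_CHANNELS = 8  # assumed channel count; the cap supports 8 to 32

def pack_frame(sample_index, samples_uV):
    """Pack one EEG frame: a 4-byte unsigned sample counter followed by
    one little-endian float32 microvolt reading per channel."""
    assert len(samples_uV) == N_CHANNELS
    return struct.pack("<I%df" % N_CHANNELS, sample_index, *samples_uV)

def unpack_frame(frame):
    """Inverse of pack_frame: return (sample_index, list of readings)."""
    vals = struct.unpack("<I%df" % N_CHANNELS, frame)
    return vals[0], list(vals[1:])
```

The sample counter lets the receiving computer detect dropped frames, a common concern for any wireless physiological stream.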

“The portable EEG device will add a significantly more objective approach to gaining evidence about human responses to the buildings we use and live in,” says Macagno.

“Expanded into the realm of cognitively impaired individuals, where direct verbal communication is often not possible, it may for the first time offer us an objective tool to assay and improve how Alzheimer’s patients, for example, interact with the institutional structures to which they are confined,” he adds.

[Photo: An electrooculography mask allows tracking of gaze in three dimensions for the first time. Credit: Gian Mario Maggio]
The researchers can also integrate information about where the user is looking using 3D electrooculography (EOG). EOG is the measurement of the electrical activity of the retina and the muscles that control eye movements. Users in the StarCAVE environment can don a mask with sensors on the top, bottom, left and right side of each eye to determine the direction of gaze. By incorporating signals from both eyes, the team is developing the means to measure the depth of gaze, allowing researchers to track the eyes in three dimensions for the first time.
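The geometric idea behind depth from two eyes is vergence: when both gaze rays are known, their intersection fixes the fixation point in depth. A minimal sketch of that triangulation in the horizontal plane (the interpupillary distance and sign conventions are illustrative assumptions, not the team's actual algorithm):

```python
import math

def gaze_depth(theta_left_deg, theta_right_deg, ipd_m=0.063):
    """Estimate fixation depth (meters) from binocular yaw angles.

    Angles are measured from straight ahead, positive toward the
    viewer's right; for a converging gaze the left eye rotates
    inward (positive) and the right eye inward (negative).
    """
    t = (math.tan(math.radians(theta_left_deg))
         - math.tan(math.radians(theta_right_deg)))
    if t <= 0:
        raise ValueError("gaze rays do not converge in front of the viewer")
    return ipd_m / t

def fixation_point(theta_left_deg, theta_right_deg, ipd_m=0.063):
    """Return (lateral offset x, depth z) of the fixation point,
    intersecting the two gaze rays from eyes at x = -ipd/2 and +ipd/2."""
    z = gaze_depth(theta_left_deg, theta_right_deg, ipd_m)
    x = -ipd_m / 2 + z * math.tan(math.radians(theta_left_deg))
    return x, z
```

With a symmetric 2-degree inward rotation of each eye, for example, the estimated fixation depth comes out to roughly 0.9 m, which is why small vergence changes are informative at room scale but noisy for distant targets.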

“Though architects try to predict where attention will be directed, we do not know where the eye actually looks. However, in the CAVE, we seek to record exactly where a user is looking in space and tie measurements of brain activity to the individual design features being looked at,” says Edelstein.

“An architect can spend days perfecting a design feature, but when an observer looks at it, it may not even register as a blip on the brain wave activity recording, meaning it has not made any impact on the observer.”

Recently, one of Edelstein’s students at the NewSchool of Architecture and Design in San Diego (where Edelstein teaches the course ‘Neuroscience for Architecture’) made a realistic model of the courtyard at the Louis Kahn-designed Salk Institute for Biological Studies. Driven by hypotheses based on architectural theory, the student manipulated several subtle features of the Salk’s design, and found increased preference for the modified design among a set of observers.

“Architects have long theorized about design preference. Now with CAVE-CAD, EEG, and 3D EOG, we can begin to test if these hypotheses are valid,” says Edelstein, who demonstrates a CAVE-CAD rendering of the Barcelona Pavilion in an upcoming episode of the BBC television show “The Secret of Buildings.”

CAVE-CAD is Networkable and Scalable

The team also demonstrated CAVE-CAD's networking capability, dubbed “Collaborative CAVE”, at a 2010 meeting of approximately 150 UCSD medical leaders. The observers, who gathered in Calit2’s headquarters in Atkinson Hall, watched as users in several different CAVE environments navigated through the same model of a virtual hospital. Observers in each location were able to direct the remote users to go back and spend more time in a particular patient room, change the design to modify the daylight entering the intensive care unit, or try to find the way to the nurses’ station.

[Photo: "Collaborative CAVE" demonstration at Calit2. Credit: Gian Mario Maggio]
The demonstration was simultaneously experienced by users halfway around the globe in a NexCAVE (a 3D virtual reality display made from passive stereo HDTVs) at the King Abdullah University of Science and Technology (KAUST) in Saudi Arabia, which is a special partner of Calit2.

In addition to its networking capability, Edelstein says CAVE-CAD’s scalability contributes to its collaborative potential. She envisions the goal of “Collaborative CAVE” as serving “global, remote 24-7 teams working together to create a single piece of design – or observe a single surgical procedure.”

“The key is interaction, whether you use CAVE-CAD in the CAVE, at a visualization lab, or on a desktop.”

Macagno adds that as the next generation of CAVEs being designed at Calit2 by Tom DeFanti, Greg Dawe and Jürgen Schulze become significantly cheaper and easier to use and maintain, “the market for new CAVE technology and CAVE-CAD will be every architectural firm, every hospital and medical school, every training facility.”

“CAVEs running CAVE-CAD have an enormous range of possible uses.”

Notes Edelstein: “There are opportunities for people to collaborate with us on multiple planes, both as a service where we work with clients on specific projects or as research partners. There are a number of ways that we can bring this technology to design today.”

Related Links


Architecture in the Age of Neuroscience

UCSD Neuroscientists Study How Way-Finding Affects the Brain

Salk Institute

HMC Architects

Academy of Neuroscience for Architecture (ANFA)

NewSchool of Architecture and Design

King Abdullah University of Science and Technology

Media Contacts

Chris Palmer, 858-534-4763