Yale researchers reconstruct facial images locked in a viewer’s mind

Using only data from an fMRI scan, researchers led by a Yale University undergraduate have accurately reconstructed images of human faces as viewed by other people.
Diagram showing facial reconstruction using brain activity readings. (Graphic by Alan Cowen)

“It is a form of mind reading,” said Marvin Chun, professor of psychology, cognitive science, and neurobiology, and an author of the paper in the journal NeuroImage.

The increasing sophistication of fMRI analysis has already enabled scientists to take brain scans recorded as individuals view scenes and predict whether a subject was looking at, for instance, a beach or a city scene, an animal or a building.

“But they can only tell you they are viewing an animal or a building, not what animal or building,” Chun said. “This is a different level of sophistication.”

One of Chun’s students, Alan S. Cowen, then a Yale junior and now pursuing an advanced degree at the University of California, Berkeley, wanted to know whether it would be possible to reconstruct a human face from patterns of brain activity. The task was daunting because faces resemble one another far more closely than buildings do. In addition, large areas of the brain are recruited in processing human faces, a testament to the importance of face perception in survival.

“We perceive faces in a much greater level of detail than we perceive other things,” Cowen said.

Working with funding from the Yale Provost’s office, Cowen and postdoctoral researcher Brice Kuhl, now an assistant professor at New York University, showed six subjects 300 different “training” faces while the subjects underwent fMRI scans. They used the data to create a sort of statistical library of how those brains responded to individual faces. They then showed the six subjects new sets of faces while they were being scanned. Using that fMRI data alone, the researchers drew on their statistical library to reconstruct the faces their subjects were viewing.
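The general flow of this kind of decoding can be illustrated in a few lines of Python. The sketch below is not the authors’ actual pipeline; it assumes made-up data shapes and uses an off-the-shelf principal-component “eigenface” code plus ridge regression to stand in for the statistical library described above.

    # Minimal sketch of the decoding idea, under assumed data shapes:
    # learn a mapping from fMRI voxel patterns to a low-dimensional
    # face-image code on training faces, then invert that code to
    # reconstruct unseen faces. Illustration only, not the study's method.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import Ridge

    # Placeholder data standing in for real recordings:
    #   train_voxels: (300, n_voxels) fMRI patterns for 300 training faces
    #   train_faces:  (300, n_pixels) the training face images, flattened
    #   test_voxels:  (n_test, n_voxels) patterns evoked by new, unseen faces
    rng = np.random.default_rng(0)
    train_voxels = rng.standard_normal((300, 5000))
    train_faces = rng.standard_normal((300, 64 * 64))
    test_voxels = rng.standard_normal((10, 5000))

    # 1. Compress the face images into a small set of "eigenface" components.
    pca = PCA(n_components=50)
    train_codes = pca.fit_transform(train_faces)

    # 2. Build the "statistical library": a regularized linear map from
    #    voxel activity to face components, fit on the training faces.
    decoder = Ridge(alpha=1.0)
    decoder.fit(train_voxels, train_codes)

    # 3. For new scans, predict the face components and invert the PCA
    #    to obtain a reconstructed image for each unseen face.
    predicted_codes = decoder.predict(test_voxels)
    reconstructions = pca.inverse_transform(predicted_codes)  # (n_test, n_pixels)

In this sketch, accuracy would be assessed by comparing each reconstruction with the actual face the subject viewed; the choices of 50 components and a ridge penalty of 1.0 are arbitrary placeholders.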

Cowen said the accuracy of these facial reconstructions will improve with time, and he envisions that they could be used as a research tool, for instance in studying how autistic children respond to faces.

Chun said the study shows the value of funding research ambitions of Yale undergraduates.

“I would never have received external funding for this; it was too novel,” Chun said.


Media Contact

Bill Hathaway: william.hathaway@yale.edu, 203-432-1322