Monday, November 30, 2009

Video Scenes Pulled from People's Thoughts

“Some scenes decode better than others,” said Gallant. “We can decode talking heads really well. But a camera panning quickly across a scene confuses the algorithm.”
Clipped from www.livescience.com:
"Dr. Gallant and his colleague Shinji Nishimoto have used fMRI to scan the brains of two patients as they watched videos.


"A computer program was used to search for links between the
configuration of shapes, colors and movements in the videos, and
patterns of activity in the patients’ visual cortex.


"It was later fed more than 200 days’ worth of YouTube internet
clips and asked to predict which areas of the brain the clips would
stimulate if people were watching them.


"Finally, the software was used to monitor the two patients’ brains
as they watched a new film and to reproduce what they were seeing based
on their neural activity alone.


"Remarkably, the computer program was able to display continuous
footage of the films they were watching — albeit with blurred images."


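For readers curious about the mechanics, the "search for links" step described above amounts to fitting an encoding model: summarize each clip with visual features, then regress those features onto each voxel's response, so the fitted model can predict activity even for clips nobody was scanned on (the 200 days of YouTube footage). Below is a minimal sketch in Python/NumPy under those assumptions; the toy grid features, function names, and ridge parameter are illustrative stand-ins of mine, not the study's actual motion-energy pipeline.

    import numpy as np

    def clip_features(clip_frames, grid=8):
        """Crude per-clip feature vector: a time-averaged coarse spatial grid.
        A stand-in for the motion-energy features used in the actual study."""
        frames = np.asarray(clip_frames, dtype=float)    # (n_frames, H, W) grayscale
        n, h, w = frames.shape
        h, w = h - h % grid, w - w % grid                # crop so the grid divides evenly
        blocks = frames[:, :h, :w].reshape(n, grid, h // grid, grid, w // grid)
        return blocks.mean(axis=(0, 2, 4)).ravel()       # (grid * grid,)

    def fit_encoding_model(X, Y, ridge=1.0):
        """Ridge regression from clip features X (clips x features) to
        measured fMRI responses Y (clips x voxels); returns weights W."""
        n_feat = X.shape[1]
        return np.linalg.solve(X.T @ X + ridge * np.eye(n_feat), X.T @ Y)

    def predict_responses(W, X_new):
        """Predicted voxel responses for clips the subjects never saw,
        e.g. a large library of YouTube clips."""
        return X_new @ W                                 # (clips, voxels)
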
This appears to be the first instance in which watched video scenes have been reconstructed from brain activity; previous fMRI work recovered spatial memories from activity in the hippocampus.

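The reconstruction itself can then be sketched as a lookup against that clip library: for each segment of the new film, compare the measured activity with the activity the encoding model predicts for every candidate clip, and average the frames of the best matches, which is presumably why the recovered footage looks blurred. The correlation score, top-k averaging, and names below are again an illustrative guess at this style of decoder, not the published algorithm.

    import numpy as np

    def decode_segment(observed, library_predicted, library_frames, top_k=30):
        """observed:          (voxels,) measured response for one time segment
        library_predicted: (n_clips, voxels) responses the encoding model
                           predicts for each candidate clip
        library_frames:    (n_clips, H, W) one representative frame per clip
        Returns the average of the top_k best-matching frames."""
        obs = observed - observed.mean()
        pred = library_predicted - library_predicted.mean(axis=1, keepdims=True)
        # Correlation-like score between measured and predicted activity patterns
        scores = (pred @ obs) / (np.linalg.norm(pred, axis=1) * np.linalg.norm(obs) + 1e-12)
        best = np.argsort(scores)[-top_k:]               # indices of the closest matches
        return library_frames[best].mean(axis=0)         # averaging yields the blurred image

Run over consecutive time segments, the averaged frames string together into the kind of continuous but blurred footage the article describes.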