Which would, honestly, amount to reading someone's mind.
And a significant step has been taken toward that goal by a team of neuroscientists at Kyoto University and the ATR Computational Neuroscience Laboratories. In a paper published just two weeks ago, the scientists, Guohua Shen, Tomoyasu Horikawa, Kei Majima, and Yukiyasu Kamitani, describe a technique that takes a person's measured brain activity and reconstructs an image of what that person was looking at.
The paper, called "Deep Image Reconstruction from Human Brain Activity," is available as an open-access preprint on bioRxiv, and all of you should take the time to read it, because this quick look won't do it justice. The idea is that the researchers take a novel approach to decoding patterns of brain activity recorded with fMRI, and from those patterns reconstruct images that are nothing short of astonishing.
The authors write:
Here, we present a novel image reconstruction method, in which the pixel values of an image are optimized to make its [deep neural network] features similar to those decoded from human brain activity at multiple layers. We found that the generated images resembled the stimulus images (both natural images and artificial shapes) and the subjective visual content during imagery. While our model was solely trained with natural images, our method successfully generalized the reconstruction to artificial shapes, indicating that our model indeed ‘reconstructs’ or ‘generates’ images from brain activity, not simply matches to exemplars. A natural image prior introduced by another deep neural network effectively rendered semantically meaningful details to reconstructions by constraining reconstructed images to be similar to natural images.

I'm not going to show you all of the results -- like I said, I want you to take a look at the paper itself -- but here are the results for some images, using three different human subjects:
The top is the image the subject was shown, and underneath are the images the software came up with.
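To get a feel for the optimization the authors describe -- adjusting an image's pixel values until its network features match the features decoded from brain activity -- here is a deliberately tiny sketch. Everything in it is a stand-in: the "feature extractor" is one fixed random linear map rather than a multi-layer deep network, and the "decoded" features are computed from a hidden target image rather than from real fMRI recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pixels, n_features = 64, 16
W = rng.standard_normal((n_features, n_pixels))  # toy stand-in for a DNN layer

def features(img):
    """Hypothetical feature extractor (the real method uses DNN layers)."""
    return W @ img

# Pretend these feature values were decoded from a subject's brain activity;
# here they are simply the features of a hidden target image.
target_img = rng.standard_normal(n_pixels)
decoded = features(target_img)

# Optimize the pixel values so the image's features match the decoded ones:
# plain gradient descent on 0.5 * ||features(img) - decoded||^2.
img = np.zeros(n_pixels)
lr = 1e-3
for _ in range(2000):
    err = features(img) - decoded
    img -= lr * (W.T @ err)  # gradient of the squared feature error

loss = 0.5 * np.sum((features(img) - decoded) ** 2)
print(f"final feature-matching loss: {loss:.2e}")
```

The real method matches features across multiple network layers at once and, as the abstract notes, adds a natural-image prior from a second network so the result looks like a plausible photograph rather than feature-matching noise.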
What astonishes me is not just the accuracy -- the spots on the jaguar, the tilt of the stained glass window -- but the consistency from one human subject to the next. I realize that the results are still pretty rudimentary; no one would look at the image on the bottom right and guess it was an airplane. (A UFO, perhaps...) But the technique is only going to improve. The authors write:
Machine learning-based analysis of human functional magnetic resonance imaging (fMRI) patterns has enabled the visualization of perceptual content. However, it has been limited to the reconstruction with low-level image bases or to the matching to exemplars. Recent work showed that visual cortical activity can be decoded (translated) into hierarchical features of a deep neural network (DNN) for the same input image, providing a way to make use of the information from hierarchical visual features... [H]uman judgment of reconstructions suggests the effectiveness of combining multiple DNN layers to enhance visual quality of generated images. The results suggest that hierarchical visual information in the brain can be effectively combined to reconstruct perceptual and subjective images.

This is amazingly cool, but I have to admit that it's a little scary. The idea that we're approaching the point where a device can read people's minds will have some major impacts on issues of privacy. I mean, think about it; do you want someone able to tell what you're thinking -- or even what you're picturing in your mind -- without your consent? And if this technology eventually becomes sensitive enough to work with a hand-held device instead of an fMRI scanner, how could you stop them?
Maybe I'm being a little alarmist, here. I know I have Luddite tendencies, so I have to stop myself from yelling "Back in my day we wrote in cuneiform on clay tablets! And we didn't complain about it!" whenever someone starts telling me about new advances in technology. But this one... all I can say is the "wow" is tempered by a sense of "... but wait a moment."
As Michael Crichton put it in Jurassic Park: "[S]cience is starting not to fit the world any more. [S]cience cannot help us decide what to do with that world, or how to live. Science can make a nuclear reactor, but it cannot tell us not to build it. Science can make pesticide, but cannot tell us not to use it."
Put another way, science tells us what we can do, not what we should do. For the latter, we have to stop and think -- something humans as a whole are not very good at.