UW Model Shows Cortical Implants Like Elon Musk’s Blindsight Unlikely to ‘Exceed Normal Human Vision’
July 30, 2024
University of Washington researchers created a computational model that simulates the experience of a wide range of human cortical studies, including an extremely high-resolution implant like Blindsight. Credit: Pixabay
Elon Musk recently declared on X that Blindsight, a cortical implant to restore vision, would have low resolution at first “but may ultimately exceed normal human vision.”
That pronouncement is unrealistic at best, according to new research from the University of Washington.
Ione Fine, lead author and UW professor of psychology, said Musk’s projection for the latest Neuralink project rests on the flawed premise that implanting millions of tiny electrodes into the visual cortex, the region of the brain that processes information received from the eye, will result in high-resolution vision.
In a new University of Washington study, researchers created a computational model that simulated a wide range of human cortical studies. The image on the left was generated using 45,000 pixels. The one on the right — representative of high-resolution cortical implants like Elon Musk’s Blindsight — uses 45,000 electrodes. Credit: Ione Fine
For the study, published online July 29 in Scientific Reports, the researchers created a computational model that simulates the experience of a wide range of human cortical studies, including an extremely high-resolution implant like Blindsight. In one simulation, a movie of a cat rendered at a resolution of 45,000 pixels is crystal-clear, while the same movie, simulated as the experience of a patient with 45,000 electrodes implanted in the visual cortex, shows a cat that is blurry and barely recognizable.
That’s because a single electrode doesn’t represent a pixel, Fine said, but instead stimulates, at best, a single neuron.
On a computer screen, pixels are tiny “dots.” But that’s not the case in the visual cortex. Instead, each neuron tells the brain about images within a small region of space called the “receptive field,” and the receptive fields of neurons overlap. This means that a single spot of light stimulates a complex pool of neurons. Image sharpness is determined not by the size or number of individual electrodes, but by the way information is represented by thousands of neurons in the brain.
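To make that distinction concrete, the sketch below (in Python; the paper’s actual model is not reproduced here) renders a single point of light two ways: as one pixel, and as the summed activity of a small pool of neurons with overlapping Gaussian receptive fields. The grid size, neuron count and receptive-field widths are arbitrary assumptions chosen purely for illustration.

```python
import numpy as np

# Minimal, illustrative sketch (not the published UW model).
# A "screen" of 64 x 64 locations; all sizes below are arbitrary assumptions.
GRID = 64
y, x = np.mgrid[0:GRID, 0:GRID]

def gaussian_rf(cx, cy, sigma):
    """Gaussian receptive field centered at (cx, cy) with width sigma."""
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

# Pixel view: a single point of light is exactly one bright dot.
pixel_image = np.zeros((GRID, GRID))
pixel_image[32, 32] = 1.0

# Cortical view: the same point falls inside the overlapping receptive
# fields of many neurons, so the activity it evokes (or that an electrode
# evokes) is spread over a whole region of visual space.
rng = np.random.default_rng(0)
n_neurons = 200
centers = rng.uniform(24, 40, size=(n_neurons, 2))   # RF centers near the point
sigmas = rng.uniform(2.0, 6.0, size=n_neurons)       # RF widths vary across neurons

cortical_image = np.zeros((GRID, GRID))
for (cx, cy), sigma in zip(centers, sigmas):
    rf = gaussian_rf(cx, cy, sigma)
    drive = rf[32, 32]            # how strongly the point of light drives this neuron
    cortical_image += drive * rf  # each active neuron contributes its whole RF

cortical_image /= cortical_image.max()

# The pixel image has all its energy at one location; the cortical image
# spreads that energy over many locations, i.e. the percept is blurred.
print("pixel image: nonzero locations =", int((pixel_image > 0.01).sum()))
print("cortical image: nonzero locations =", int((cortical_image > 0.01).sum()))
```

In this toy picture the blur comes from the overlapping receptive fields themselves, so adding more electrodes does not, on its own, make the percept any sharper, which is the intuition the UW simulations are meant to convey.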
“Engineers often think of electrodes as producing pixels,” Fine said, “but that is simply not how biology works. We hope that our simulations based on a simple model of the visual system can give insight into how these implants are going to perform. These simulations are very different from the intuition an engineer might have if they are thinking in terms of pixels on a computer screen.”
The researchers’ approach was to use a wide range of animal and human data to generate computational “virtual patients” that show, for the first time, how electrical stimulation of the human visual cortex might be experienced. Even blurry vision would be a life-changing breakthrough for many people, Fine said, but these simulations — which represent the likely best-case scenario for visual implants — suggest that caution is appropriate.
While Fine said Musk is making important strides in the engineering challenge of visual implants, a big obstacle remains: Once the electrodes are implanted and stimulating single cells, you still need to recreate a neural code — a complex pattern of firing over many thousands of cells — that creates good vision.
“Even to get to typical human vision, you would not only have to align an electrode to each cell in the visual cortex, but you’d also have to stimulate it with the appropriate code,” Fine said. “That is incredibly complicated because each individual cell has its own code. You can’t stimulate 44,000 cells in a blind person and say, ‘Draw what you see when I stimulate this cell.’ It would literally take years to map out every single cell.”
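A rough back-of-the-envelope sketch illustrates why such cell-by-cell mapping would take years. Every number below is a hypothetical assumption chosen for illustration; none comes from the study or from Neuralink.

```python
# Hypothetical back-of-the-envelope estimate; none of these numbers
# come from the study or from Neuralink.
n_cells = 44_000              # cells/electrodes mentioned in the quote above
minutes_per_cell = 10         # assumed time to characterize one cell's percept
testing_hours_per_day = 4     # assumed daily limit for psychophysical testing

total_hours = n_cells * minutes_per_cell / 60
days = total_hours / testing_hours_per_day
print(f"~{days:,.0f} days of testing (~{days / 365:.1f} years)")
# With these assumptions: roughly 1,833 days, or about 5 years of daily sessions.
```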
So far, Fine said, scientists have no idea how to find the correct neural code in a blind individual.
“Somebody might one day have a conceptual breakthrough that gives us that Rosetta Stone,” Fine said. “It’s also possible that there can be some plasticity where people can learn to make better use of an incorrect code. But my own research and that of others shows that there’s currently no evidence that people have massive abilities to adapt to an incorrect code.”
Without that sort of development, the vision provided by Blindsight and similar projects will remain fuzzy and imperfect — no matter how sophisticated the electronic technology.
For now, the models developed in the study could be used by researchers and companies to aid in the placement of existing devices and the development of new technology, among other benefits. Entities like the Food and Drug Administration and Medicare could also gain insight into what sort of tests are important when evaluating devices. Further, the models provide realistic expectations for surgeons, patients and their families.
“Many people become blind late in life,” Fine said. “When you’re 70 years old, learning the new skills required to thrive as a blind individual is very difficult. There are high rates of depression. There can be desperation to regain sight. Blindness doesn’t make people vulnerable, but becoming blind late in life can make some people vulnerable. So, when Elon Musk says things like, ‘This is going to be better than human vision,’ that is a dangerous thing to say.”
Geoffrey Boynton, UW professor of psychology, was a co-author. The research was funded by the National Institutes of Health.
For more information, contact Fine at ionefine@uw.edu.