A bionic prosthetic eye that speaks the language of your brain
On the grand scale of things, we know so very little about the brain. Our thick-headedness isn't quite cosmological in scale (we really do know almost nothing about the universe beyond Earth), but when it comes down to it, the brain is virtually a black box. We know that stimuli go in, usually through one of our senses, and motor signals come out, but that's about it. One thing you can do with a black box, however, is derive some semblance of a working model through brute-force testing.
Take prosthetic arms, for example: We don't have a clue about the calculations the brain performs to trigger arm-muscle motor neurons, but that doesn't stop us from slapping some electrodes onto a subject's biceps and measuring the electrical pulses that occur when we tell them to 'think about moving your arm.' By the same logic, a brain-computer interface can measure what our general cranial activity looks like when we're thinking something and react accordingly, but it can only do this through training; it can't actually understand our thoughts. Taking this one step further, though, Sheila Nirenberg of Cornell University has been trying to work out how the retina in your eye communicates with your brain, and judging by a recent talk at TEDMED (embedded below), it seems she's actually cracked it.
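To get a feel for what "speaking the retina's language" means, here is a minimal Python sketch of a linear-nonlinear-Poisson (LNP) encoder, a standard textbook model of how a retinal ganglion cell turns light into spike trains. This is only an illustration of the general idea; the filter shape, gain, and firing rates below are all made-up assumptions, not Nirenberg's published encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def center_surround_filter(size=15, sigma_c=1.5, sigma_s=4.0):
    """Difference-of-Gaussians receptive field for a toy ON-center cell."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    center = np.exp(-r2 / (2 * sigma_c**2)) / (2 * np.pi * sigma_c**2)
    surround = np.exp(-r2 / (2 * sigma_s**2)) / (2 * np.pi * sigma_s**2)
    return center - surround

def encode(image_patch, dt=0.001, duration=0.2):
    """Map a grayscale patch to a spike train for one model cell."""
    rf = center_surround_filter(size=image_patch.shape[0])
    drive = 10.0 * np.sum(rf * image_patch)    # linear stage (gain is arbitrary)
    rate = 50.0 * np.log1p(np.exp(drive))      # nonlinear stage (softplus), in Hz
    n_bins = int(duration / dt)
    # Poisson-like spike generation: one Bernoulli draw per 1 ms bin
    spikes = rng.random(n_bins) < rate * dt
    return spikes

# A bright spot on the cell's center should drive more spikes than a blank field.
patch = np.zeros((15, 15))
patch[5:10, 5:10] = 1.0
print("spikes for bright spot:", encode(patch).sum())
print("spikes for blank field:", encode(np.zeros((15, 15))).sum())
```

The point of a model like this is the direction of translation: instead of reading brain activity out, it converts an image *into* the pulse patterns the brain expects to receive, which is the problem a retinal prosthesis has to solve.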