One of the things my lab is doing is building a vibratory vest so that we can feed in sensory information through the skin of your torso rather than through more typical sensory channels.
So, for example, we’re doing this for people who are deaf who want to be able to hear. We set up a microphone on the vest, and the auditory stream is turned into a matrix of vibrations on your skin. What that does is feed electrical signals into the brain that represent the auditory information.
And if it sounds crazy that you would ever be able to understand all these signals through your skin, remember that all the auditory system is doing is taking signals and turning them into electrical signals in your brain. It turns out what the brain is really good at doing is extracting information from streams of data, and it doesn’t matter how you get those data streams there. You can feed it in through the ear, or you can feed it in through an atypical sensory channel, like the skin. So we’re developing this right now so that deaf people will be able to hear through their skin.
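The idea described above, capturing sound, splitting it into frequency bands, and driving a grid of vibration motors with the energy in each band, can be sketched in a few lines of Python. This is a minimal illustrative sketch, not the lab's actual algorithm: the motor count, frequency range, and band-to-motor mapping are all assumptions made for the example.

```python
import math

def frame_to_motor_levels(samples, sample_rate, n_motors, max_freq=8000.0):
    """Map one audio frame to vibration intensities for a row of motors.

    Hypothetical sketch: divides the spectrum up to max_freq into
    n_motors equal bands and drives each motor with the normalized
    energy in its band.
    """
    n = len(samples)
    band_energy = [0.0] * n_motors
    # Naive DFT over the frame (fine for a sketch; a real device
    # would use an FFT or an analog filter bank).
    for k in range(1, n // 2):
        freq = k * sample_rate / n
        if freq >= max_freq:
            break
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = -sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        band = int(freq / max_freq * n_motors)
        band_energy[band] += re * re + im * im
    # Normalize so the strongest band drives its motor at full intensity.
    peak = max(band_energy) or 1.0
    return [e / peak for e in band_energy]  # 0.0-1.0 drive level per motor

# Example: a pure 1 kHz tone should light up mainly one motor.
sr = 16000
tone = [math.sin(2 * math.pi * 1000 * t / sr) for t in range(256)]
levels = frame_to_motor_levels(tone, sr, n_motors=8)
```

Running this frame-by-frame over a live microphone stream would give the skin a continuously updating spatial pattern, which is the raw material the brain then learns to interpret.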
But our next stage is to feed not just auditory information but other data streams into the vest, for example stock market data or weather data. People will be able to perceive these streams just by walking around all day, unconsciously taking in the information through their bodies, and it will expand their sensory world.
And so I think, at the moment, there’s really no limit to this. We will be able to enhance the natural sensory capabilities that humans have, and I think this is where technology and the brain have a very fertile meeting ground: in giving us a larger view of the reality out there.
In Their Own Words is recorded in Big Think’s studio.