Monday, January 30, 2006

Another Idea

So I had this idea a couple weeks ago, and it’s kinda freaked a couple people out. My roommate thinks I’m insane. I think it’s only about 20 years off.

Humans already possess the technology to interface directly with the brain through both sight and sound. That is, we can talk to the brain using implants that simulate sound (http://www.fda.gov/cdrh/cochlear/WhatAre.html) and sight (http://www.sciencentral.com/articles/view.php3?article_id=218392534). Interfacing with our ability to produce sound vocally has been possible for many years: that's simply a microphone embedded somewhere in the ear. Our voices travel through our heads and out our ear canals too, a phenomenon the audiology industry calls the Occlusion Effect.

So far, the research has been performed in the name of health and military advancement. If history is any indication, commerce isn’t all that far off.

Consider how developed nations have become culturally adapted to the little earbuds and microphones that let us talk on the phone without using our hands. Once the stuff of the Secret Service, they're now so common that it's no longer weird to see someone who appears to be talking to themselves, only to realize they're wearing a Bluetooth headset. So here's the idea, and I'd be surprised if engineers at Motorola, Samsung, HTC, Qualcomm, or other manufacturers and their vendors aren't already thinking about this.

Take the basic critical elements of a cell phone. The microphone. The speaker. The radio transceiver. The memory and processor. The software. The little LCD screen. The buttons can go away. We don’t really need to touch a device anymore to communicate with one another. Voice recognition is already in the phones we have today.

So we take those essential elements and embed their core technology into our own anatomy. The pieces can be parceled out and distributed throughout the skull. The microphone goes in the ear canal. The speaker can go away entirely, and the signals that would have made sound through that speaker can travel directly to a processor that interfaces with the brain through electrical impulses. The screen goes away, and the image processor interfaces with the same technology that is used to run artificial eyes. Information can be presented to us not unlike a dashboard in a car. Ever see Terminator? Yes, we'll look like we're talking to ourselves, but again, culture will adapt. Just like with cell phones when they came out in the '80s. Just like with wired headsets and Bluetooth headsets.

And how do we power this contraption? See here: http://sandia.gov/news-center/news-releases/2006/comp-soft-math/eye.html

I'd say we're 20 years away from giving humans the ability to communicate using anatomically embedded telephony.

I'd then give it another 30 years before we figure out how to convey thoughts and words without the use of the mouth, the hindrance of spoken words, or even language barriers. Once we can turn thoughts and words into electrical impulses, we're one step closer to anatomically embedded telepathy (http://dictionary.reference.com/search?q=telepathy).
