Keyboard-less Input
Last week I had plenty of time to think during long walks along the empty sandy beaches of the Hel peninsula. Walking is so relaxing. I can go for hours and hours, sometimes listening to music, sometimes to an audiobook, and sometimes doing nothing but walking. And thinking. Unfortunately, after a couple of days it gets somewhat boring. At the seashore you can go either eastwards or westwards. And the landscape, although beautiful and unique, does not change much as you walk.
So on the third or fourth day I felt like doing something more while walking along the beach. I had the Internet with me. Everyone does these days, in a mobile phone in a pocket. The trouble is, it is really hard to interactively operate a mobile Internet device while walking. One thing is the screen: there is no way to look at the screen and walk, and sooner or later you will hit a tree that way. The keyboard is equally difficult to use. Oh, how I missed the screen-less, keyboard-less system I envisioned in the previous post. Actually, I began to realize that the input part is more challenging than the output, because we already have a variety of HUD (Head-Up Display) solutions available. They are currently found mostly in military and other special applications, but even on-retina projection systems have been on the market for some time now.
Input devices connecting to the brain via various types of antennas and electrodes are far from maturity. From my personal experience with the OCZ NIA USB headband, it can be trained to recognize just a few very basic commands. I use mine most often to navigate Google Reader to the next unread item. This can be quite cool: sitting in an armchair with a screen in front of me, I can operate my computer completely hands-free and in silence (that is, without using any voice input either). But there are drawbacks. First, the NIA has to be trained before use. Then, there are only a few commands you can execute. And worst of all, the system breaks down on any brainwave storm, like when my phone suddenly rings; in that case I just watch a dozen items fly by. The NIA has no simple means of disengaging itself when it picks up above-normal brain activity and goes haywire. We have a long way to go.
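Here is a minimal sketch of the kind of safeguard I have in mind. Everything in it is hypothetical: the `read_sample`, `classify`, and `dispatch_command` callables stand in for whatever a real driver would expose (the actual NIA software has no such API that I know of), and the window sizes and thresholds are made up purely for illustration.

```python
import time

# Hypothetical sketch of the disengagement logic the NIA lacks.
# None of these names come from the actual OCZ NIA software; they
# are assumptions made for illustration only.

BASELINE_WINDOW = 50      # samples used to estimate "normal" activity
STORM_FACTOR = 3.0        # amplitude multiple that counts as a storm
COOLDOWN_SECONDS = 2.0    # how long to stay disengaged after a storm


def run_input_loop(read_sample, classify, dispatch_command):
    """Dispatch brain-signal commands, but disengage during signal storms.

    read_sample()       -> float amplitude of the current signal (assumed)
    classify(amplitude) -> command string or None (assumed trained model)
    dispatch_command(c) -> sends the command to the application (assumed)
    """
    history = []
    disengaged_until = 0.0

    while True:
        amplitude = read_sample()
        history.append(amplitude)
        if len(history) > BASELINE_WINDOW:
            history.pop(0)
        baseline = sum(history) / len(history)

        now = time.monotonic()
        if amplitude > STORM_FACTOR * baseline:
            # Above-normal activity (a ringing phone, a sudden startle):
            # stop issuing commands instead of firing a dozen of them.
            disengaged_until = now + COOLDOWN_SECONDS
            continue

        if now < disengaged_until:
            continue  # still cooling down after a storm

        command = classify(amplitude)
        if command is not None:
            dispatch_command(command)
```

A real device would of course work on raw EEG channels rather than a single amplitude number, but even a crude cutoff like this would have saved my dozen unread items.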
But we are getting there. Walking and thinking, I came to the conclusion that the keyboard-less input device will be the next big thing after Google, Facebook, and whatever else sits on top today. And I am not talking about voice recognition, as envisioned by Andy Kessler in Grumby, either. If we all start talking aloud to our pocket computers, everyday life will quickly become unbearable, not to mention the security and privacy implications. I am talking about personal machines reading our brains directly. Just imagine a company that finally delivers such an input device. Imagine the market it will open. A Direct Brain Human Interface Device may be the biggest thing of the upcoming decade. Time to start investment research in this area. The technology and its outcome may be more predictable (lower risk) than biotech, while the potential rewards can be huge. And this is a task for a great research team of interdisciplinary engineers.
The subject is, of course, considered important by the biggest players in the industry. Just last week I came across an article, "Computers that read minds are being developed by Intel", describing Intel's milestones in real-time mind reading:
"We are currently mapping out the activity that an average brain produces when thinking about different words. It means you'll be able to write letters, open emails or do Google searches just by thinking."
Exactly what we need. I can only agree with Justin Rattner: "Mind reading is the ultimate user interface." Period.