Act Naturally

So many of us have already spent years, measured in man-hours, sitting in front of a computer, clickety-clacking away at the keyboard, clicking a mouse and damaging tissue in our collective quest to develop the geek's disease: Carpal Tunnel Syndrome.

Think of how much the world of IT has changed: phones have gone from big, bulky things that sat on your desk and weighed as much as a bowling ball to tiny little gadgets that fit into the pockets of a pair of skin-tight jeans; CPUs have gone from glorified calculators to teraflop-measured powerhouses; RAM and hard drives have gone from being measured in MB to GB, not to mention becoming several orders of magnitude faster. Yet, sadly, the way we interface with our computers has remained unchanged…

The Mouse And Keyboard
This writer is plodding away at his keyboard writing this, and all of you do the same while using your computers. If you're not a keyboard warrior, you're most definitely a mouse wrestler. Ever since the GUI (graphical user interface) first appeared, the mouse and keyboard have been staples of our digital diets.

Sure, the mice and keyboards being made today are anything but ordinary (see box Devices Of Desire), but the fact of the matter is that this is just design, not anything groundbreaking.

Other Controllers
The first major shift in input devices was brought about by gaming: so many of us grew up playing games on our Atari and Nintendo gaming systems. Next we saw joysticks being connected to computers; then there were the wheel controllers for car racing games.

However, gaming has branched off in the direction of consoles, and seems to have taken the controllers with it: who really uses a joystick or wheel on a PC anyway? It's so much simpler, not to mention cheaper, to just use the default interface devices, the trusty keyboard and mouse, for gaming.

Perhaps the truth is that we just haven’t found anything better than these relics of technology to control our PCs. Well, hopefully, that’s about to change. Perhaps not as soon as we might wish, but it’ll change nonetheless.

Amidst all the hype over the thousands of new technologies and inventions that keep popping up, there's hope for those of us who want to be able to sit back instead of hunching over our keyboards and reaching out for that oddly placed mouse!

From all the news and prototypes we've come across, at least a few revolve around changing the way we interface with PCs. The research points in four basic directions: better touch-based devices (as in new designs for glorified keyboards and mice), speech control, eye control, and even thought control. Basically, we're looking for computers to develop the senses we have: touch (they already feel us typing and clicking), hearing (speech control), vision (eye control), and extra-sensory perception in the form of thought control. Smell and taste we might not want them to develop; the last thing you need is your PC telling you that you smell, or chomping down on your finger when it gets hungry!

Listen To Me
Speech recognition has been around for quite some time now, but hasn't really gone far. We've reached a point where computers can understand us when we talk, at least as far as understanding commands goes, but that's provided you talk in an expressionless monotone and are prepared to spend hours training the software. And don't forget that you always have to keep shushing people, because the mic cannot tolerate background noise.

The shortcomings of speech-to-text are well documented, and we don't need to dwell on them here. All we can hope for is adaptive or intelligent software that will be able to understand what we say, regardless of whether we have a cold, are breathless, or are talking to our PDAs in noisy railway stations!

Watch Me
The more interesting alternative input devices being researched are the vision-aided ones. Eye-control or gesture-control devices use a camera that interfaces with customised software to translate your motion into mouse control.

Interestingly, this was first conceived to help the physically disabled interface with PCs. However, as technology progresses, and as we all become a little disabled in our own way (basically, lazy!), such input techniques might come as a boon to us all.

Most devices actually track movements of your eyes or your entire head. Some of these setups let you left-click with a wink of your left eye and right-click with a wink of your right; double-clicking is taken care of by rapid blinking. The problem with such devices is that you're generally confined to your chair and cannot move your head more than a few inches to the left or right, because the camera needs to keep your face or head in focus. For the physically handicapped, such restrictive devices make perfect sense, but not for the majority.


TrackIR 4 Pro translates head movements into mouse movements on your screen

Take TrackIR 4 Pro (www.eyecontrol.com), for instance, a head-control system by NaturalPoint Inc. The system consists of a little headpiece with three reflective surfaces, which can be attached to almost any headwear, such as a baseball cap. A sensor system, with the help of software, tracks the way your head moves and translates it into movements in games. So instead of moving your mouse to look left, you can just look left!

"But doesn't that mean looking away from the monitor?" is the question everyone asks. The answer is simple: sensitivity! You can set the software to be more or less sensitive to your movements, so moving your head left by a degree could change your view in the game by, say, 10 degrees.
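The idea can be sketched in a few lines of code. This is an illustrative toy, not NaturalPoint's actual software: the function name, gain value, and clamping limit are all assumptions made up for the example.

```python
# Toy sketch of sensitivity scaling: a small head rotation is multiplied
# by a gain so a 1-degree turn produces a 10-degree turn in-game, clamped
# so extreme head movement can't spin the view past a sensible limit.

def head_to_view(head_yaw_deg, sensitivity=10.0, max_view_deg=180.0):
    """Scale a head yaw angle into an in-game view angle, clamped."""
    view = head_yaw_deg * sensitivity
    return max(-max_view_deg, min(max_view_deg, view))

print(head_to_view(1.0))    # 1 degree of head movement -> 10 degrees in-game
print(head_to_view(-30.0))  # a large turn clamps at the view limit
```

Real products typically use a non-linear sensitivity curve rather than a flat multiplier, so small movements near the centre are gentle while larger ones accelerate, but the principle is the same.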

What’s more, TrackIR 4 Pro looks for movements along all three axes, and also tracks movements in the form of pitching, yawing, and rolling. This is ideal for, say, a game like Microsoft’s Flight Simulator.

Reviewers from around the world claim that this form of control is very easy to get used to, and that it shouldn’t take you more than an hour to be in complete control!

While the TrackIR system is built specifically for gaming, there are other products built for regular computer use. An example is Quick Glance from www.iriscom.org, which is specifically targeted at people unable to use a mouse and keyboard. It's a completely custom-built solution that comes with training, so not too much detail can be provided here. However, it uses eye-tracking to move a pointer, which can click on a particular area either by lingering over an icon for a short while (by staring at it), or by blinking slowly. As you might expect, the system comprises software and a camera that can be mounted on your desktop or laptop monitor.
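The "stare to click" behaviour described above is usually called dwell-clicking. Here's a minimal sketch of how such logic might work; the class, thresholds, and method names are hypothetical, not Iriscom's actual implementation.

```python
# Hypothetical dwell-click logic: if successive gaze samples stay within a
# small radius of the same point for long enough, register a click there.
import math

class DwellClicker:
    def __init__(self, dwell_seconds=1.0, radius_px=30.0):
        self.dwell_seconds = dwell_seconds  # how long the stare must last
        self.radius_px = radius_px          # how steady the gaze must be
        self.anchor = None                  # where the stare began
        self.anchor_time = None

    def update(self, x, y, now):
        """Feed one gaze sample; return (x, y) of a click, or None."""
        moved_away = (self.anchor is None or
                      math.hypot(x - self.anchor[0],
                                 y - self.anchor[1]) > self.radius_px)
        if moved_away:
            self.anchor = (x, y)            # gaze jumped: restart the timer
            self.anchor_time = now
            return None
        if now - self.anchor_time >= self.dwell_seconds:
            click = self.anchor
            self.anchor = None              # reset so we don't click again
            return click
        return None
```

Feeding it samples at 100, 100 for just over a second would return a click at that point, while a wandering gaze keeps resetting the timer; the radius tolerance matters because human eyes jitter constantly even when "staring".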


Iriscom’s Quick Glance 2 follows your eyes and clicks on icons when you blink slowly

It’s clear that enough inroads have been made in this field to already have viable products. As for “tomorrow,” things can only get better and more accurate!

Read My Mind
Wouldn't it be great if PCs could read your mind? Well, they can! Back in March 2005, Wired magazine (www.wired.com) ran an article about Matthew Nagle, a victim of a violent crime that left him with a severed spine, a quadriplegic for life. Thanks to some innovative technology research, Nagle was able to regain a little control of his life. After his doctor suggested that he join a clinical trial being conducted by Cyberkinetics Neurotechnology Systems (www.cyberkineticsinc.com), Nagle received a brain implant. After intensive training, Nagle can now turn his TV set on and off, raise the volume, and change channels. He can also interface with his computer to play Pong, draw basic shapes, and more.

The chip in his brain connects to software in his PC or home appliances, and Matthew "wills" them to work by thinking about what he wants to do. Thanks to Cyberkinetics Neurotechnology Systems' BrainGate Neural Interface System, Nagle has hope. Though the clinical trials are only open to people suffering from conditions similar to Nagle's, the technology works! We now know that thought control is actually possible, and research will get us there soon enough!

A Brain-Computer Interface was recently showcased at CeBIT 2006. It allowed two people to put on some strange-looking headgear and play Pong, perhaps one of the simplest games ever made. Yes, the headgear is funny, and the sheer number of wires emanating from it is likely to have people cracking "Please don't hurt me, I'll take you to my leader!" jokes. But the fact is, we haven't reached the aesthetics stage yet. When (with emphasis on the word when) such technologies are perfected, we can start worrying about aesthetic appeal; for now, we're just happy that it works.

There are many research projects working on this new way of interfacing with computers, and that's heartening for those of us who are fed up with keyboards and mice! Wikipedia's entry on the Brain-Computer Interface should help you find more interesting news about this exciting technology: http://en.wikipedia.org/wiki/Brain-computer_interface.

We Don’t Want Silly Hats
Well, we don't! We don't want to have to sit stationary in front of our computers either, as head- or eye-tracking systems require. The same holds true for the gloves and other devices that attach to limbs and allow you to control motion. Most of us would not be willing to undergo surgery for implants either, for fear of short-circuiting our brains! So what do we do?

As the research, prototypes and products mentioned here suggest, researchers are looking for a non-intrusive, minimal-fuss interface that will limit you no more than a mouse and keyboard do.

What Happens Tomorrow?
In the near future, five years or less, we hope to see a coming together of all these technologies and methods of interfacing. This is the only way in which a truly complete solution might emerge.

Think about it: thought control will never be perfect, because human brains are unique enough to cause anomalies in the way software registers thought patterns. What is "Minimise" for me might be "Close" for you, and so on. What can help iron out the creases is speech and motion detection software. If you think "Minimise," and the PC is unsure whether you meant "Minimise" or "Close," it can use the eye-tracking camera and software to figure out which button you were looking at. Or perhaps it can use a little AI to understand that your body language does not suggest you're done with the window in front of you, and that you therefore would not want to close it.
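The tie-breaking idea above can be sketched in code. Everything here is a made-up illustration, not any real product's logic: the function, the 0.2 ambiguity threshold, and the command names are all assumptions.

```python
# Toy sketch of multimodal disambiguation: when the thought interface's
# confidence scores for two commands are too close to call, defer to the
# eye tracker's report of which button the user is looking at.

def fuse_commands(thought_scores, gaze_target, margin=0.2):
    """thought_scores: dict of command -> confidence from the thought interface.
    gaze_target: the on-screen button the eye tracker says is being looked at."""
    best = max(thought_scores, key=thought_scores.get)
    ranked = sorted(thought_scores.values(), reverse=True)
    ambiguous = len(ranked) > 1 and ranked[0] - ranked[1] < margin
    if ambiguous and gaze_target in thought_scores:
        return gaze_target  # too close to call: trust the user's eyes
    return best

print(fuse_commands({"minimise": 0.48, "close": 0.45}, "close"))  # close
```

When one modality is confident, it wins outright; only when the scores are within the margin does the second modality break the tie. That is the "creases ironed out by other senses" idea in miniature.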

If someone could integrate speech recognition (or natural language processing, preferably), motion detection, thought control and a little bit of AI, we might finally be rid of our traditional, boring input devices. Until that day, nothing is going to dethrone them-old habits die hard!  


Team Digit

Team Digit is made up of some of the most experienced and geekiest technology editors in India!