The Body Beautified

A largish step short of the cyborg, the Bionic Man is already a reality

Here we go again about man-machine convergence: two years ago, we spoke about cyborgs and the engineering of the human condition. This time it’s about bionics. So why are we repeating ourselves?

We aren’t. “Cyborgism” is about transcending human capabilities, about creating supermen. Bionics is about repairing injured or disabled humans so they can function normally, or as normally as possible, at any rate. Becoming a bionic man may not sound as cool as becoming a cyborg, but the stories are just as fascinating.

“Bionics” actually refers to the application of mechanisms found in nature to the design of artificial systems. However, the term is used so often in the context of designing body parts that “bionics” is now almost synonymous with “artificial organs”.

Humans are benefiting from bionics to a degree of sophistication you wouldn’t believe. Though expensive, there are bionic eyes, ears, arms, legs, knees, lungs, livers, you name it… a bionic anus has been developed, too, in case you were wondering. The following, then, is the story of how technology is repairing bodies and restoring lost bodily functions: a story of brain, mind, body, machine.

Sight And Sound
Research into the possibilities of bionic enhancements dates back a long time, but the first real breakthrough (at least, the best-remembered and best-known achievement of the early days) was the development of a bionic ear. The Australians are so (rightly) proud of it that it has been featured on a postage stamp: they call it the “Australian Bionic Ear.”

Developed by Prof Graeme Clark, the prototype was first implanted in an adult at The Royal Victorian Eye and Ear Hospital in 1978. It let the patient hear speech, for the most part: a small processor that the patient could wear analysed speech and converted it into electrical signals that directly stimulated the main hearing nerve.
The bionic ear has been brought much closer to perfection over the years. It is now more popularly known as the cochlear implant.
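The idea behind that processing step can be sketched in a few lines of code. This is an illustrative toy, not the real cochlear implant chain: it splits a short sound frame into a handful of frequency bands and maps each band’s energy to a stimulation level for one electrode. The band edges and the number of channels here are invented for the example.

```python
import math

# Toy sketch of the cochlear-implant principle: split incoming sound
# into frequency bands, one per electrode, and turn each band's energy
# into a stimulation level. Band edges are hypothetical.
BANDS = [(100, 500), (500, 1500), (1500, 4000)]  # Hz

def band_energies(samples, rate):
    """Crude DFT-based energy per band for one short frame."""
    n = len(samples)
    energies = []
    for lo, hi in BANDS:
        e = 0.0
        for k in range(1, n // 2):
            f = k * rate / n
            if lo <= f < hi:
                re = sum(s * math.cos(2 * math.pi * k * i / n)
                         for i, s in enumerate(samples))
                im = sum(s * math.sin(2 * math.pi * k * i / n)
                         for i, s in enumerate(samples))
                e += (re * re + im * im) / n
        energies.append(e)
    return energies

def stimulation_levels(samples, rate, max_current=1.0):
    """Normalise band energies to per-electrode 'currents' in [0, 1]."""
    e = band_energies(samples, rate)
    peak = max(e) or 1.0
    return [max_current * x / peak for x in e]

rate = 8000
# A 1 kHz tone should excite the middle (500-1500 Hz) electrode most:
tone = [math.sin(2 * math.pi * 1000 * i / rate) for i in range(256)]
levels = stimulation_levels(tone, rate)
print(levels.index(max(levels)))  # → 1
```

A real implant does this continuously, with many more channels and far subtler loudness mapping, but the filterbank-to-electrodes shape of the computation is the same.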

In the case of the sense organs, much the same principle applies across all implants: how do we see, and how do we hear? Information is converted into electrical impulses that speed along nerves. The same thing happens in the Argus II system, which consists of external glasses, a retinal implant studded with electrodes, and a wearable processor. The camera on the glasses captures the image and sends signals to the processor. These are converted into bio-friendly information, which is sent back to the glasses and then wirelessly to a receiver under the surface of the eye. From there it’s easy to imagine: the signal is conveyed to the electrodes in the retinal implant, which stimulate the retina appropriately. (The retina then does its job of sending signals to the optic nerve, which carries them on to the brain.)
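At its core, that pipeline reduces a camera frame to one brightness value per electrode. Argus II has a 6 x 10 grid of 60 electrodes; everything else in this sketch — the block-averaging scheme, the frame format — is our own simplification for illustration.

```python
# Illustrative sketch of the Argus II signal path: a grayscale camera
# frame is averaged down to a 6 x 10 grid, one stimulation amplitude
# per retinal electrode. The averaging scheme is invented.
GRID_ROWS, GRID_COLS = 6, 10

def frame_to_electrodes(frame):
    """frame: list of rows of pixel values 0-255.
    Returns a 6 x 10 grid of stimulation amplitudes in [0, 1]."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // GRID_ROWS, w // GRID_COLS
    grid = []
    for r in range(GRID_ROWS):
        row = []
        for c in range(GRID_COLS):
            block = [frame[y][x]
                     for y in range(r * bh, (r + 1) * bh)
                     for x in range(c * bw, (c + 1) * bw)]
            row.append(sum(block) / len(block) / 255.0)
        grid.append(row)
    return grid

# A frame that is dark on the left, bright on the right:
frame = [[0] * 30 + [255] * 30 for _ in range(60)]
grid = frame_to_electrodes(frame)
print(grid[0][0], grid[0][9])  # → 0.0 1.0
```

Sixty pixels’ worth of vision explains Byland’s “assembled dots”: the bottleneck is not the camera but how few electrodes the retina can host.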

Note that both systems mimic biology: they have been inspired by the appropriate biological mechanisms, and they do almost exactly the same thing their biological counterparts do.

Argus II has been given the go-ahead by the US Food and Drug Administration, and may be used in patient trials, initially with patients over the age of 50. If these are successful, the system will be commercialised at around $30,000 (Rs 13.5 lakh). It’s obviously not for everybody… yet.

It’s fine in theory, but what do patients experience? It’s encouraging, but far, very far, from perfect. Terry Byland, then 58, was fitted with an implant in 2004. When it was done, “It was like seeing assembled dots; now it’s much more than that,” Byland says. “When I am walking along the street I can avoid low-hanging branches; I can see the edges of the branches.” What about faces? “I can’t recognise faces, but I can see them like a dark shadow.” What this means is that we’re still a long way from “restoring vision”; it’s more like helping the blind live their lives without someone (or a dog) having to guide them all the time.

An Arm And A Leg-And A Mind
Jesse Sullivan can put on his socks, eat, and vacuum the floor, all without assistance. This is interesting not because he is a baby, but because both his arms were amputated. He now has artificial arms, and they seem to be serving him well.


Sullivan’s arms are a major advance: we’ve had externally-powered hands, wrists and more for many years, but the problem has been the control mechanism. Such prostheses interface with the body at the point where the natural limb was amputated, and are controlled by signals from a pair of muscles there. That allows only a single motion at a time: moving the wrist and the elbow simultaneously, for example, is easy for us, but impossible with those prostheses. The system Sullivan has works on a different principle: researchers at the Rehabilitation Institute of Chicago discovered that when a limb is lost, the nerves that carried the limb’s control signals can be grafted to other muscles. Those muscles then produce signals that can be used to control prostheses in a more natural way.

Such a “muscle reinnervation procedure” is used in Sullivan’s Bionic Arm. Nerves from the shoulder were transferred, in Sullivan’s case, to his chest muscles. The prosthetic arm uses the signals from the chest, activated by Sullivan’s thought-generated nerve impulses; as you can guess, surface electrodes carry the signals to the arm. Sullivan doesn’t need to think “somewhere in his chest” about moving his arm; the nerves have been transferred, so when he thinks about moving his elbow, a chest muscle registers the intent, and the arm moves. It probably takes a lot of getting used to, but like we said, Jesse Sullivan seems pleased. This is semi-mind-control, because nothing is connected directly to the brain; we’ll come to total mind control in a bit.
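The control loop described above can be sketched as follows. This is a hedged toy, not the Bionic Arm’s real controller: each reinnervated chest site yields a raw muscle (EMG) trace, the controller estimates each trace’s amplitude, and drives whichever motion’s muscle is most active. The channel names and threshold are invented.

```python
# Toy myoelectric controller for a reinnervated prosthesis.
# Each channel is raw EMG from one reinnervated chest site.

def envelope(emg):
    """Mean absolute value: a standard crude EMG amplitude estimate."""
    return sum(abs(x) for x in emg) / len(emg)

def decide_motion(channels, threshold=0.2):
    """channels: dict of motion name -> raw EMG samples.
    Returns the intended motion, or 'rest' if no site is active."""
    amps = {name: envelope(sig) for name, sig in channels.items()}
    name, amp = max(amps.items(), key=lambda kv: kv[1])
    return name if amp >= threshold else "rest"

channels = {
    "close_hand": [0.05, -0.04, 0.06, -0.05],   # quiet site
    "bend_elbow": [0.8, -0.7, 0.9, -0.6],       # active site
}
print(decide_motion(channels))  # → bend_elbow
```

With one site per motion, several joints can in principle be driven at once — exactly the ability the older two-muscle prostheses lacked.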

“Smarts” can be infused into anything these days, it seems. BBC reporter Stuart Hughes has a prosthetic leg: he lost his own to a landmine while covering the Iraq war. Receiving a prosthesis in such a situation is routine nowadays; what’s special here is that the foot, called the Proprio, is intelligent. It can detect what kind of terrain is being walked upon.


He’s actually repairing a bike! Make no mistake: prostheses today allow recipients to be near-normal

“Every type of terrain has a distinct ‘signature’ which the software controlling the foot is able to interpret. When the microprocessor recognises a change in terrain, it instructs a motor to automatically adjust the angle of the foot in preparation for the next step,” explains Richard Hirons of Ossur, the Iceland-based prosthetics company that developed the Proprio.
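A toy version of what Hirons describes might look like this: each terrain leaves a “signature” in the sensor stream, the foot’s microprocessor matches it against known prototypes, and picks an ankle angle for the next step. The features, prototype values and angles below are entirely invented; the Proprio’s real algorithm is not public.

```python
import math

# Invented two-number terrain 'signatures': (mean tilt deg, vibration)
PROTOTYPES = {
    "level":  (0.0, 0.1),
    "ramp":   (8.0, 0.1),
    "stairs": (15.0, 0.5),
}
# Hypothetical ankle adjustment (degrees) per terrain:
ANKLE_ANGLE = {"level": 0.0, "ramp": 5.0, "stairs": 12.0}

def classify(tilt, vibration):
    """Nearest-prototype match on the two-number signature."""
    return min(PROTOTYPES,
               key=lambda t: math.dist(PROTOTYPES[t], (tilt, vibration)))

def next_step_angle(tilt, vibration):
    """Angle the motor should set the foot to before the next step."""
    return ANKLE_ANGLE[classify(tilt, vibration)]

print(classify(7.5, 0.2))          # → ramp
print(next_step_angle(14.0, 0.6))  # → 12.0
```

Whatever the real feature set is, the shape of the computation — recognise the terrain, then command the motor — matches Hirons’s description.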

The idea that we’re quickly moving towards seamless integration of man and machine is getting more and more commonplace. Saeed Zahedi, visiting professor in prosthetics at the University of Surrey, is confident that we could soon create hybrid limbs that combine natural tissue with electromechanical devices. “What we’re seeing at the moment is just the tip of the iceberg.”

And then, there’s direct mind control. A company called Cyberkinetics developed a brain-computer interface called BrainGate, which was enough of a breakthrough to make news the world over. It is actually plugged right into the brain, and allows a paralysed patient to control a computer cursor or move a robotic hand by “thought control”: thinking about the action does it. Matthew Nagle, stabbed in 2002 and paralysed from the neck down, was the first to try BrainGate. He still had to sit in a wheelchair all the time, of course, but he wowed people by controlling a computer cursor and similar feats. (He could actually beat some people at Pong.)
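One classic idea behind decoding “thought control” of a cursor is the population vector: each recorded neuron has a preferred movement direction, and the decoder sums those directions, weighted by how far above baseline each neuron is firing. Real BrainGate decoding is more elaborate (calibrated linear and Kalman filters), and every number below is invented, but this conveys the principle.

```python
import math

# Hypothetical recorded neurons: (preferred direction in radians,
# baseline firing rate in spikes/s). Four neurons, one per direction.
NEURONS = [(0.0, 10.0), (math.pi / 2, 10.0),
           (math.pi, 10.0), (3 * math.pi / 2, 10.0)]

def decode_velocity(rates, gain=0.1):
    """rates: observed firing rate per neuron. Returns (vx, vy):
    a population-vector estimate of intended cursor velocity."""
    vx = vy = 0.0
    for (pref, base), rate in zip(NEURONS, rates):
        w = rate - base                  # modulation above baseline
        vx += gain * w * math.cos(pref)
        vy += gain * w * math.sin(pref)
    return vx, vy

# The 'rightward' neuron fires above baseline, 'leftward' below it,
# so the decoded velocity should point right:
vx, vy = decode_velocity([20.0, 10.0, 5.0, 10.0])
print(vx > 0, abs(vy) < 1e-9)  # → True True
```

Run at tens of updates per second, this is enough to chase a Pong ball — which is roughly what Nagle was doing.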

Space doesn’t permit talk of all the cool, groundbreaking stuff going on in prosthetics, their interfaces, and the signals that control them; see the Links box for more. We should remember that much of this borders on “cyborgism,” for lack of a better word.

We mentioned the performance artist and would-be cyborg Stelarc in our June 2005 issue, quoting him as saying, “The body is neither a very efficient nor very durable structure…” He now has an extra ear in his arm, and says it could be “connected to a modem and a wearable computer, and broadcast RealAudio sounds to augment the local sounds that the actual ears hear. This ‘extra ear’ becomes a kind of Internet antenna that telematically and acoustically scales up one of the body’s senses.” Stelarc is not very articulate, and what precedes and follows those sentences doesn’t make much sense to us. (As an aside, his Web page is, surprisingly, ghastly.) We mention Stelarc at all because he indicates that bionics, as a phenomenon, has excited the imagination of a new breed of humans that wants to be wedded to machines.

What’s Ahead?
Plenty. If you’ve been with us thus far, it might seem that whatever’s been done only needs to be improved upon. Not so. Consider one of the essential problems: how to directly interface devices, such as implants, with the nervous system. Brand-new technology needs to be developed for this to happen. One example is micro-mechanics, as we learn from the Strategic Workshop on Future Challenges in Bionics, by Giulio Sandini of the University of Genova. Micro-mechanics would have to come in if a direct interface between nerve cells and silicon is to be developed: imagine tiny connectors actually touching individual neurons. Beyond the sheer smallness of it all, Sandini explains that bio-compatible materials need to be explored, and a “micro-environment” suitable for the coexistence of flesh and wires needs to be devised.

Thinking from the ground up, then, we have more problems. Think of information coding. What kind of computer signal would a nerve be able to understand? How can, or should, we fire neurons? This calls for deep study of the language nerve cells use amongst themselves, and of how that relates to the impulses we can generate.
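One hedged guess at what such a coding scheme might look like is rate coding, where an intensity is represented by how often a neuron fires. The sketch below encodes a value as evenly spaced spike times in a one-second window and recovers it by counting spikes; real neural coding is far messier, but this is the basic currency the paragraph above is talking about.

```python
# Toy rate code: intensity in [0, 1] <-> number of spikes per second.

def encode_rate(intensity, max_rate=100):
    """Map intensity in [0, 1] to evenly spaced spike times in [0, 1)."""
    n = round(intensity * max_rate)
    return [i / n for i in range(n)] if n else []

def decode_rate(spikes, max_rate=100):
    """Recover intensity by counting spikes in the one-second window."""
    return len(spikes) / max_rate

spikes = encode_rate(0.42)
print(len(spikes), decode_rate(spikes))  # → 42 0.42
```

Even this toy shows the design questions: what window length, what maximum rate, and whether timing (not just count) carries information — all open issues in interfacing with real nerves.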

Now, if we’re going to have a new kind of coded information, we’re also going to need a new kind of information processing. Think about this: once an implant is in place, the fit between the nerve endings and the tips of the electrical interface cannot possibly be perfect. This means the processor would need to be tweaked a little for a better implant experience, and for that tweaking to happen, we need to make sense of the information actually flowing within the device.

We also need new and better types of batteries, as with everything digital today. Recipients of, say, artificial legs can take their legs off at night and recharge them from the mains, but what about those with heart implants? Forget to recharge, and you pay for it with your life… of course, it’s not quite that way, but the point is that we need devices that can be powered by such things as body heat and limb movement, like watches that don’t require batteries, running on the energy generated by the movement of your forearm.


“The extra ear becomes a kind of Internet antenna”
Stelarc
Principal Research Fellow
Performance Arts Digital Research Unit
Nottingham Trent University

Then there is the problem of creating new sensors. Optical and acoustic sensors are of reasonably acceptable quality today, but what about those for smell and taste? Smell, as you know, occurs when molecules of a substance strike sensors in the nose. We need to create devices that can translate molecular concentrations into computer-friendly signals.

Endpoint: The Brain
In the ultimate analysis, to make implants work the way we want them to, we need to understand how the brain works. Grand dream, naturally, but little step by little step, that’s the road ahead.

“Philosophers and psychologists have long noted that human perception has both analogue and digital characteristics,” says Dr Sebastian Seung, a professor at MIT and co-author of the research report we’ll soon talk about. “One neuron can make another active or inactive, but the intensity of the activity varies in a continuous way.” In other words, apart from turning each other on and off, nerve cells also have analogue interactions. This is reflected in, say, visual perception: we make spontaneous yes / no distinctions like whether it is light or dark outside, but we also see shades of grey when we need to.
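A standard way to get the hybrid behaviour Seung describes is a leaky integrate-and-fire neuron, sketched below. Its membrane potential is analogue — it varies continuously with input — but its output is digital: a spike either happens or it doesn’t. This is a textbook model, not the MIT circuit itself, and the constants are arbitrary.

```python
# Leaky integrate-and-fire neuron: analogue accumulation, digital spikes.

def simulate(inputs, leak=0.9, threshold=1.0):
    """Feed a sequence of input currents to one neuron.
    Returns (potentials, spikes): the analogue membrane trace and the
    digital all-or-nothing output at each time step."""
    v, potentials, spikes = 0.0, [], []
    for current in inputs:
        v = leak * v + current     # analogue: continuous integration
        fired = v >= threshold     # digital: all-or-nothing decision
        spikes.append(fired)
        potentials.append(v)
        if fired:
            v = 0.0                # reset after firing
    return potentials, spikes

# Three sub-threshold inputs accumulate until the neuron finally fires:
pots, spk = simulate([0.5, 0.5, 0.5, 0.0])
print(spk)  # → [False, False, True, False]
```

The spike train is the on/off signal Seung mentions; the membrane trace is the continuously varying intensity. Wire many of these together and you have the kind of hybrid analogue/digital circuit the MIT work describes.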

In what seems a breakthrough idea to us-if not a breakthrough implementation-researchers at MIT have, using this concept, built a circuit of artificial neurons with hybrid analogue / digital interconnections. Details are sketchy, but the creators believe such circuits can-after refinement-help process auditory and visual information for robots. They could also process feedback information from bionic implants: retinal chips-as in the artificial eye we talked about earlier-could at some point be pre-processors feeding into such circuits, for example.

What is special about the circuit, the way we see it, is that it mimics the brain. It’s a top-level idea, and it’s not been seen in action yet, but think about it: since the brain is the final processor of information from and to any source in the body-biological or artificial-it’s natural that the circuits that interface with it should behave like it.

Such is the road ahead: “Continued research in neuroscience and bioengineering will no doubt lead to improvements in man-machine interfaces and functional replacements, but it will be a long, hard road filled with many failures and a few successes.” That’s William Jenkins, Vice President for Development at Scientific Learning, which makes products that develop learning and communication skills.

Links
http://tinyurl.com/285zhx
A cool robotic hand with ultra-cool fingers. Videos, too.
http://tinyurl.com/yuv677
“Physical Enhancement Page.” Lots of links. Trans-humanism and stuff.
http://tinyurl.com/2xg2ge
Excellent, long Wired article on The Desire To Be Wired.
http://tinyurl.com/yqkv5v
A company page. They call it “healthcare,” and they talk about bionic stuff like store products. Don’t skip the part about the biker.
http://tinyurl.com/yovamq
About the bionic products of Iceland-based Ossur, whom we mentioned.
http://tinyurl.com/2crlnt
Web site of Victhom, which develops bionic devices. Videos here too.
http://tinyurl.com/yvdmon
Artificial everything: researchers make kidneys from a cow’s ear, and more.
http://tinyurl.com/27jdkl
The first completely artificial heart. Videos.


Endnote
So bionics helps the blind see, the wheelchaired walk, the deaf hear, and more. It’s interesting that some people look upon the field with a kinky sort of fascination: going beyond need, as in patients who need prostheses, they want implants. What if we could have more than two arms, Hindu-god-style? When will such things be socially acceptable?

We’re veering away again to the realm of cyborgs, androids, and other such interesting creatures, but then, bionics does straddle the middle ground between corrective surgery and all this.

Like we said, it’s not just a question of our current methods getting better: there are fundamental issues to be addressed. Again, like we said, there needs to be a better understanding of brain function for bionics to take off.

We cannot end on a predictive note, saying when those legs will be on the shelves: Jenkins is probably right. It will be a long, hard road.  

Ram Mohan Rao
Digit.in