My personal involvement with this project was small. However, I am delighted to have been a part of the action. This is a brilliant overview, written by Adrian and presented by Srini; it needs no extra comment from me!


Text of the After-Dinner Speech at the Conference on
“Insect Sensors and Robotics; a Personal Memoir”,
at Brisbane, on Tuesday, August 24th, 2004,
by Emeritus Professor Adrian Horridge, read by Professor Srinivasan.

Research on robot vision by copying the vision of honeybees has been bringing into ANU an average of about $1 million per year for the past 15 years. The robot vision story began when I was on the Alpha Helix Expedition to the island of Banda in Indonesia in 1975. There I discovered a number of insects (notably dragonflies) and crustaceans (an amazing Squilla, Odontodactylus) that had specialized regions of the eyes, but these foveas had nothing to do with binocular vision or the estimation of range. Since compound eyes have no accommodation, this intensified the problem of how range is measured as the animals move about, especially if they fly or swim freely. In 1977, I published a paper with coloured photographs in the magazine Endeavour (vol. 1, pp. 7-17), with the postulate that insects measure range by motion parallax as they move along or sideways. It was published just in time to win a footnote in Rüdiger Wehner's long Review of 1981. At the time, there were others who identified the apparent size of objects of known size as the key to the measurement of range (Land and Collett, on freely flying flies). The topic lay dormant in my mind while I looked at various other insect eyes.

Three events in the mid-eighties were coincidental. First, I started working on the peering behaviour of juvenile mantids, which actively measure range when they walk to the end of a twig and look out for the next foothold. The small Australian mantid Tenodera was particularly well behaved. It was clear that ordinary bodily movements such as swaying were as effective as lateral movements of the head in generating relative motion. Secondly, I was able to appoint Srini for the second time; he came back from Zürich to take up a tenured position. It was easy to get him interested in active vision for the measurement of range. Srini brought with him the technique of training honeybees, which could then be tested with a variety of questions, so that we could ask them what they saw. He also brought Miriam Lehrer on the first of many long visits to Canberra. They were working at the time on behavioural tests of the resolution of the photoreceptors of bees. The third event was the formation of the Centre for Visual Sciences, a collaboration between three Research Schools, with Profs Snyder and Levick. At the time, this was a purely political move to counter increasing pressure against blue-skies research in ANU. From about 1985 onwards, we were urged from on high to collaborate across disciplines and to direct our research towards useful ends. The University was convinced by our new name and notepaper. We got an extension to our building and some extra post-doc positions when we made a request for collaborative projects.

The first experiment was to train freely-flying bees to go for a reward to a black paper flower standing on a stalk of a certain height, in preference to several other similar ones of different sizes on stalks of different heights. The bees flew above the paper flowers which were at different ranges below them.

The positions of the flowers were shuffled at intervals so the only cue was the range. The bees learned this task easily. It was our first example of randomizing the cues that the bees should not learn, while keeping constant those that the bees should notice. Later we put the black flowers at different ranges on parallel sheets of Perspex. The demonstration, that bees measure range irrespective of anything else, was published in 1988. By that time we had a number of other projects on the way. Srini became interested in the judging of the "time-to-crash" when an obstacle was encountered.

It became obvious that an idea first outlined by Helmholtz would become significant for insect vision. This was the idea that the motion of the animal itself would generate sufficient information from relative motion and parallax to give a measure of the range of all contrasting objects in every direction. There were three degrees of freedom in the translational motion of the bee and three in its rotation. Srini wrote out the six equations that defined the system in terms of the angular velocity at each point on the eye. In one way it was an obvious postulate to make: that the animal used information about angular velocity in this way. However, at the time, one powerful if not fully accepted view was that insect motion perception takes a measure of contrast frequency, not angular velocity. That was the beginning of a long-running argument between the bee workers and the fly partisans, or if you like between the attractions of honey versus banana.
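The geometry behind those six equations can be sketched in modern optic-flow notation (the notation here is mine, not necessarily that of Srini's original derivation). For a contrasting point seen in unit direction $\mathbf{d}$ at range $R$, with the bee translating at velocity $\mathbf{T}$ and rotating at angular velocity $\boldsymbol{\Omega}$, the image motion on the eye is

```latex
\dot{\mathbf{d}} \;=\; -\,\boldsymbol{\Omega}\times\mathbf{d}
\;-\;\frac{\mathbf{T}-(\mathbf{T}\cdot\mathbf{d})\,\mathbf{d}}{R}
```

Only the translational term carries range, so once the rotational component is removed (or held near zero by flying straight), the measured angular velocity at each point on the eye gives $R$ directly.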

It was an exciting time in the study of flight control and motion perception. There were at least two other groups looking at similar topics at the time; Aloimonos at the University of Maryland was interested in the active visual recovery of the structure of the surroundings by self-motion of a spherical eye, and for some years Heisenberg at Würzburg had been revising our understanding of the way that flies use fast saccades with motion perception, to control their stability in flight.

In Canberra, we advanced by making very simple apparatus that had never been used before. We trained bees to come to a platform where we could arrange various kinds of parallax at the edge and observed how and where the bees landed. They responded to closing parallax but would not land where there was opening parallax. We made a channel between two moving belts of patterned paper. Bees were trained to fly along the channel while their flights were followed with a video camera. The bees flew along a line that kept the angular velocity of the walls the same on each side. We trained bees to fly inside a rotating drum and to ignore it while they selected a moving target inside the drum.
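The centering behaviour in the channel can be caricatured in a few lines (a toy model under my own assumptions about gain and geometry, not the lab's actual analysis):

```python
# Toy model of the centering response: a bee at distance x from the left
# wall of a channel of width W, flying forward at speed v, sees the walls
# slip past at angular velocities v/x and v/(W - x). Steering away from
# the faster-moving wall drives the bee toward the midline.

def centering_step(x, v=1.0, W=2.0, gain=0.05):
    """One steering update: drift away from the wall with faster image motion."""
    omega_left = v / x          # image speed of the left wall (rad/s)
    omega_right = v / (W - x)   # image speed of the right wall (rad/s)
    return x + gain * (omega_left - omega_right)

x = 0.5                          # start well off-centre
for _ in range(300):
    x = centering_step(x)
# x has converged close to the midline, W/2 = 1.0
```

Balancing the two angular velocities needs no knowledge of the channel's width or of the bee's own speed, which is what made it such an attractive control rule.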

Once we had broken into a new chapter of ideas there were endless new experiments that could be done. My work diverged into three directions. First, I had been to the Academia Sinica in Beijing where I met Zhang Shaowu, and invited him to Canberra on the new money from the Centre. Zhang did experiment after experiment, Miriam Lehrer provided the techniques of handling the bees, Srini did the conceptual part and progressively took over the management, and I provided the funds. It was hard to beat a team composed of a Survivor of the Cultural Revolution, a Swiss Israeli, a Tamil Brahmin and a Cambridge-educated Yorkshireman, in work with highly trained bees in the Australian sunshine.

From the start we realized that measurement of the angular velocity would be useful for measurements of range in robot vision, and Srini and I visited a number of manufacturers, but none of them were interested. One day a member of the Royal Society for Guide Dogs for the Blind visited me to find out what went on in our Centre for Visual Sciences. She was surprised to find that we worked on insects, but her visit led to me giving a talk at the Guide Dog Association in Melbourne in 1987, about the possibility of making visual aids for the blind out of silicon chips, copying the principles that we had learned from insect vision.

Almost at the same time, a notice in the Canberra newspaper announced that government funds would be made available to support collaborative research between academics and any company that could use new technology. The Guide Dogs had recently brought out from England a remarkable electronics engineer with a PhD in Psychology and also in Physics. This was Tony Heyes, who turned out to be a Cambridge graduate who was keen to work with us. Tony had been blind and had worked on technology of aids for the blind since he recovered partial sight. The Guide Dogs had the status of a Company, and they sent a letter to say that they would lend Tony to our project for a fraction of his time, with his travel expenses.

I simply went along to the Department of Education Employment and Training, in Canberra (GERD, Gov. Education, Research & Development), and found the man in charge of handing out the money. He seemed to have the stuff coming out of his ears but was unable to find any relevant files on the Department computer network, which had 29 drives but no explanations for them. So he had to telephone his aides for every bit of information, and eventually gave me a form to fill in. A couple of weeks later he rang me to say that a third of a million dollars had been sent to ANU for our project, to appoint two staff and two research students with expenses. That's how we acquired Peter Sobey and Martin Nagle. Peter had just finished a PhD in engineering at Adelaide on the inspection of sawn timbers by computer-vision, and Martin was a technical officer who was working on the CCD camera systems at the Mt. Stromlo Observatory. A Polish refugee, Jan Dalkzinski, from Sweden with a degree in Medical Technology applied for a technician job, but I gave him a PhD scholarship. Gert Stange also transferred into the group, but I think Behavioural Biology continued to pay his salary. So we had a gang of gadgeteers, and they made gadgets. Srini and Zhang also continued the work on the bee behaviour.

The general idea was to make handy little rangefinders for blind people to wear, putting into silicon circuits the equations for relative motion, as used by bees. Several models were made and steadily improved, eventually to be put on self-steering wheeled vehicles. The camera gave a picture in black and white, but when the camera was moved, the relative motion was converted into a colour code for range. We could not find any company interested in making these gadgets, because the market is very small and most blind people in the world are too poor and not interested in technology.
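A minimal sketch of the idea behind those rangefinders (my own illustration; the real devices did this in silicon across a whole image): with the camera translating at a known speed, range follows directly from the measured angular velocity and can then be painted as a colour.

```python
# For a camera translating at speed v (m/s) perpendicular to the line of
# sight, a stationary point slipping across the image at angular velocity
# omega (rad/s) lies at range R = v / omega.

def range_from_flow(v, omega):
    return v / omega

def colour_code(r, near=1.0, far=3.0):
    """Hypothetical colour bands for a display: hot colours for near objects."""
    if r < near:
        return "red"
    if r < far:
        return "yellow"
    return "green"

r = range_from_flow(v=0.5, omega=0.25)   # 2.0 metres
print(r, colour_code(r))                 # prints: 2.0 yellow
```

The thresholds and colours here are invented for the example; the point is only that one division turns image motion into range.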

At this point a strange coincidence happened. In 1986, the nuclear power plant at Chernobyl had blown up, and the Japanese Government was very concerned in case a similar catastrophe happened to one of their 30 reactors in Japan. (In 2004, they now have 73 of them.) So they offered tax breaks and cash to their big companies for the development of all kinds of equipment for emergencies, including seeing robots that would enter a hot and radioactive hell instead of men.

As we later found out, they had no idea how to make a mobile seeing system; they were trying to copy human vision, which is a hopeless task for a mobile robot on account of the vast number of permutations of combinations of pixels when the computer tries to make sense of a bitmap. They did what the Japanese have done before; they sent groups of engineers with pocket cameras around the world to look at developments in centres of excellence. Eventually, a group from Fujitsu Computer Company arrived in ANU and was told that we existed.

I happened to be away at the time, and had left Ian Morgan in charge. Ian worked on chicken retina, and had no idea what we were doing. He showed them our lab and gadgets (but not how they worked), and convinced them that we, and of course they also, had found the Holy Grail. A few weeks later, an invitation arrived for 4 Professors from ANU to visit the Fujitsu research labs in Japan and be entertained by the Fujitsu Company. I must admit that their supercomputer construction factory was impressive and they had a team of robots that played football, but they knew nothing about simple visual systems. We all gave talks about our work, but like Tar-Baby in the Brer Rabbit story, I said nothing about relative motion or how our system worked. The other 3 from ANU were never an item, really, just talking heads to avoid loss of face. They became unpleasantly ill after eating decayed lobster at a feast with Geisha ladies, but I didn't have any.

Some more weeks later, the Deputy Vice-Chancellor of ANU, Ian Ross, was offered $5 million for our know-how. He called me to his office and asked me if our gadget worked! When I assured him it did, he suggested that $5M was not enough and we should increase the stakes. At the time, nobody except ourselves realized that our whole story was known to any one-eyed cricket player: that relative motion of the eye provides an on-line measure of range. The simplest ideas are worth the most. I continued to say nothing. There was nothing we could patent that could not have been manufactured by a pirate company in Taiwan. In due course, Fujitsu offered $10 million and sent a couple of engineers to collect the good oil. They put a team of 20 engineers on the project and made a black box, called Ishtar, which simply speeded up the machine by parallel processing. I was close to retirement by this point. We got as much funding as we needed from Fujitsu, and ANU took $9.6 million, which I understand was not exactly wasted, but cannot now be recognized by solid achievements. The Research School of Biological Sciences got nothing except fame, or perhaps notoriety. Fujitsu put the vision of range into a few mobile robots and sold a few, but they recovered their money in another way. They realized that the equations of the processing system applied in reverse to virtual reality. In our vision by relative motion, the movement gave the correct ranges of surrounding objects; in virtual reality, the relative positions are known from the start and the apparent motion is calculated and presented on the screen. So we contributed to the computer games industry.
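The reversal that Fujitsu exploited can be stated in one line (again in my own notation, not theirs). Robot vision solves for range from measured image speed; virtual reality runs the same relation the other way, with range known and image speed to be rendered:

```latex
\underbrace{R=\frac{v_{\perp}}{\omega}}_{\text{robot vision: infer range}}
\qquad\Longleftrightarrow\qquad
\underbrace{\omega=\frac{v_{\perp}}{R}}_{\text{virtual reality: render motion}}
```

where $v_{\perp}$ is the component of self-motion perpendicular to the line of sight.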

To try to simulate the 360° vision of insects, Sobey and Nagle experimented with a video camera directed at the point of a cone of polished reflecting metal, to give an all-round view. The picture was transformed geometrically to fit a flat screen. The shape of the cone was improved and eventually patented. I think this was the only patent that came out of the project, and it was a side issue.
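The geometric transform from the conical-mirror image to a flat panorama can be sketched as a polar unwarping (a simplified stand-in for the patented optics; the exact radius-to-elevation mapping depends on the cone's profile):

```python
import math

# A camera looking down at a conical mirror sees the surroundings as an
# annulus. Each pixel's polar angle about the cone's axis is the azimuth
# of the scene point, and its radius maps (via the cone profile) to
# elevation, so the annulus unrolls into a rectangular panorama.

def unwarp_pixel(px, py, cx, cy):
    """Map an image pixel to (azimuth, radial distance) about the axis at (cx, cy)."""
    dx, dy = px - cx, py - cy
    azimuth = math.atan2(dy, dx)   # becomes the column of the panorama
    radius = math.hypot(dx, dy)    # becomes the row, monotonic in elevation
    return azimuth, radius

print(unwarp_pixel(110, 100, 100, 100))   # (0.0, 10.0): a point due "east" of the axis
```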

I retired at the end of 1992 and changed my topic so that I could discover new ideas about insect pattern vision, now being published 10 years later. We had talked about our work at IEEE and DFG conferences, Srini had published various articles in robotics journals, and he was well known. Soon, there were efforts in Sussex, San Francisco and elsewhere to copy insect vision into simple machines. Sandini called his the Beebot, though he never worked on bees. In 1993 Srini was invited to give a talk at the Australian Defence Research Establishment at Adelaide, and from them he secured funding to continue the robot vision work. Soon after, he was approached by DARPA in the USA and then by the US Air Force to put insect-type range-finding vision into small helicopters. In truth, Srini has difficulty putting cheese into a mousetrap, and many trained specialists have collaborated to bring about the complex achievements of the group. He was appointed as the new Professor, and a new team took over, eventually succeeding with helicopters and light aeroplanes. The next step is top-secret but may appear on Mars in A.D. 2007.

As a postscript, I would like you to realize that none of this research could have been planned or explained beforehand in a grant application. All the money was given on the understanding that trained researchers would spend it, as well as they could, on what appeared to be the best way forward. No questions were asked and no explanations of what we did were ever offered, and as time went by our critics just died of Envy, a wasting disease. It was done by bringing together a team of extremely well-prepared scientists of several different disciplines, with technicians with a variety of skills, in a well set-up research lab, and funding that was sufficient but never overflowing.


  Adrian Horridge




