Machine Understanding



posted July 11, 2012

A Physical Gesture Aping Machine

We are familiar with humanoid robots, if only from fiction and computer animations. Consider how a humanoid robot might mimic the physical gestures of an exemplar robot without being under the direct control of the exemplar, and without knowing its internal state.

The ape could use any number of methods to determine the spatial posture and movements of the exemplar: vision or sonar might serve. Analyzing this data would in itself be a complex task for a present-day artificial intelligence program. For now, presume we have an artificial neural network or feature-detection system up to the task. This subsystem would present the critical data to the master system responsible for aping.
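As a minimal sketch, the data such a perception subsystem hands to the master system might look like a timestamped map from body parts to coordinates. The names and structure here are invented for illustration, not taken from any particular robot:

```python
from dataclasses import dataclass

# A hypothetical snapshot of the exemplar's posture, as the perception
# subsystem might report it: each named part mapped to 3-D coordinates.
@dataclass
class PostureSnapshot:
    timestamp: float  # seconds since observation began
    parts: dict[str, tuple[float, float, float]]  # part name -> (x, y, z)

snapshot = PostureSnapshot(
    timestamp=0.0,
    parts={"left_hand": (0.3, 1.2, 0.1), "right_hand": (-0.3, 1.2, 0.1)},
)
print(snapshot.parts["left_hand"])  # (0.3, 1.2, 0.1)
```

A stream of such snapshots over time is all the master system would need to follow a gesture, regardless of whether the underlying sensor is a camera or sonar.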

The aping system must already have learned how to control its robotic body. It would know how to move its arms and legs to specific positions. This could be engineered directly, but the type of system we seek would do better if it started as a blank slate capable of learning from trial and error.
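The trial-and-error idea can be sketched with a toy feedback loop: the robot does not know how a motor command maps to a joint angle, so it repeatedly issues a command, measures the result, and nudges the command toward the target. The plant model and gains below are invented for illustration:

```python
# Toy trial-and-error learning of a single joint. The mapping from
# motor command to joint angle is hidden from the learner.

def actuate(command: float) -> float:
    """Unknown plant: the joint angle produced by a motor command."""
    return 0.8 * command + 5.0  # hidden gain and offset

def learn_to_reach(target: float, trials: int = 50) -> float:
    """Adjust the command by feedback until the joint reaches the target."""
    command = 0.0
    for _ in range(trials):
        error = target - actuate(command)
        command += 0.5 * error  # nudge the command toward the target
    return command

cmd = learn_to_reach(90.0)
print(round(actuate(cmd), 2))  # 90.0
```

The same principle scales up, in far more elaborate form, to a whole body: the learner needs only the ability to try a movement and observe the outcome, not an engineered model of its own mechanics.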

Knowing the coordinates of the exemplar's limbs or other parts, it needs to move its own corresponding parts to matching coordinates. For gestures it needs to remember the exemplar's original position, reproduce that, and then follow the gesture over a time interval. It should also be able to remember such gestures, so that it can perform them at other times. Thus an organic ape might reach up and pick a fruit. A baby ape might mimic that, but at a later time use the learned gesture to actually pick a piece of fruit.
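Recording a gesture as a sequence of postures and replaying it later can be sketched as follows. The class and data are hypothetical, assuming a `move_part` callback that drives the ape's own body:

```python
# A sketch of gesture memory: record the exemplar's posture frame by
# frame, then replay the sequence later on the ape's own body.

Posture = dict[str, tuple[float, float, float]]  # part name -> (x, y, z)

class GestureMemory:
    def __init__(self) -> None:
        self.gestures: dict[str, list[Posture]] = {}

    def record(self, name: str, frames: list[Posture]) -> None:
        """Store an observed gesture as an ordered list of postures."""
        self.gestures[name] = frames

    def replay(self, name: str, move_part) -> None:
        """Perform a remembered gesture by driving each part in turn."""
        for frame in self.gestures[name]:
            for part, coords in frame.items():
                move_part(part, coords)

memory = GestureMemory()
memory.record("reach_up", [
    {"right_hand": (0.0, 1.0, 0.2)},
    {"right_hand": (0.0, 1.6, 0.3)},  # hand raised toward the fruit
])
log = []
memory.replay("reach_up", lambda part, xyz: log.append((part, xyz)))
print(len(log))  # 2
```

This captures the fruit-picking example: the gesture learned by observation is stored under a name and can be invoked later for a purpose of the ape's own.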

Such an aping machine might also be constructed in a virtual world.

Next: ApeWorm, an initial project


copyright 2012 William P. Meyers