(Disney Meets Darwin)

5: Results

The Character Evolution Tool allows one to generate forms and behaviors in a number of graphical domains and to save them for later use. It offers a variety of levels at which to explore and control these forms and behaviors. By allowing behavior objects to be explored at varying levels, it educates even as it functions as a design tool - this duality is essential to the way it works. To test its viability as a tool to think with, a number of people were asked to interact with the system so that I could gather reactions and comments.

Subjects' Responses
In some cases subjects were told nothing about the purpose of the system or what they were supposed to do with it. This was done to see whether the interface alone communicated enough to give the subject a sense of what was going on, and to gauge how the information unfolded as the user explored. In most cases, as soon as the subject was told (or discovered) that the system had a genetic evolution basis, he/she began to understand things much more readily, thanks to the availability of a familiar metaphor.

Reactions to the animated figures are almost always positive - people find them amusing right from the start, even before they begin to manipulate them. This has served the system well: it establishes a backdrop of motivation for further exploration into the nature and control of the figures' behaviors.

Although there is an obvious element of chance in the creation of behaviors, there is also a significant degree of predictability over the course of the interactions, and people tend to develop a sense of ownership of the behaviors that have evolved.

Lag Time Learning
Implementing the gesture tool led to a realization about the rate of genetic learning in a population, and how incompatible this rate is with the notion of telling someone to "move like this." A choreographer demonstrating a movement to a dancer expects immediate results (whether or not the results are good the first time). This kind of immediacy is expected when one communicates non-verbally with another person, or even with an object in a graphical interface. We are conditioned, for instance, to expect an object in a typical graphical user interface to drag across the screen when we place the mouse cursor on it, click the mouse button, and move the mouse. Some actions, by their nature, demand immediate results. The idea of gesturing to a population - and expecting the population to genetically evolve over many generations in order to fit that gesture - is uncommon, and indeed counter-intuitive. For this reason, developing an interface that makes this activity meaningful has been a challenge. But, absurd biological analogies aside, the approach may be useful when seen as a spacetime constraint solution, in that it presents a new way of thinking about specifying motions for objects which have their own innate ways of achieving motion.
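The lag between gesture and response can be sketched as a toy genetic loop. This is a hypothetical illustration, not the tool's actual code: the one-joint phenotype and all function names here are invented for the sketch. A population of genotypes is scored by how closely its motion trace matches the drawn gesture, and only after many generations of selection and mutation does the best individual approximate it.

```python
import math
import random

def motion_sample(genotype, t):
    # Hypothetical one-joint phenotype: the genotype's three floats
    # act as amplitude, frequency, and phase of an oscillating joint.
    a, f, p = genotype
    return a * math.sin(f * t + p)

def gesture_distance(genotype, target, times):
    # Lower is better: squared error between the character's motion
    # trace and the user's drawn gesture, sampled at the same times.
    return sum((motion_sample(genotype, t) - y) ** 2
               for t, y in zip(times, target))

def evolve_toward_gesture(target, times, pop_size=30, generations=50):
    # Many generations pass before the population "obeys" the gesture -
    # the lag discussed above. Selection keeps the best half; the rest
    # are mutated copies of survivors.
    population = [[random.uniform(-2.0, 2.0) for _ in range(3)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda g: gesture_distance(g, target, times))
        survivors = population[:pop_size // 2]
        children = [[x + random.gauss(0.0, 0.1)
                     for x in random.choice(survivors)]
                    for _ in range(pop_size - len(survivors))]
        population = survivors + children
    return population[0]
```

Because the best individual is always retained, the match error never worsens from one generation to the next; it simply takes many generations to fall, which is exactly the delay that clashes with choreographic immediacy.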

Aperiodic Expressions
One drawback to the gesture tool is that the set of possible head motions an articulated figure can create is a very small subset of the gestures one can motion into the scene. The articulated characters are not sophisticated enough to generate aperiodic motions (a quality of many forms of linear expression). Their joint angle changes are created by a series of sine functions whose frequencies are identical (or at least related by whole-number ratios). Thus, any gestural action a character makes is repeated over and over again. This is of course useful for locomotion, which is primarily a periodic activity in most animals. But it becomes a disadvantage in that it is difficult for a character to approximate most gestures a user may draw. For this reason, I have constrained the gestures I draw to have some repetitive qualities, so that some match can be found in the characters' motions.
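To illustrate why the motion is necessarily periodic, here is a minimal sketch (hypothetical naming, not the tool's code) of a joint angle built from sines whose frequencies are whole-number multiples of a base rate:

```python
import math

def joint_angle(t, components, base_freq=1.0):
    # components: list of (amplitude, integer multiple, phase) triples.
    # Because every frequency is a whole-number multiple of base_freq,
    # the summed motion repeats with period 1 / base_freq.
    return sum(a * math.sin(2.0 * math.pi * n * base_freq * t + p)
               for a, n, p in components)
```

Since each multiple n is an integer, sin(2πn·base_freq·(t + 1/base_freq) + p) equals sin(2πn·base_freq·t + p), so the whole sum repeats every 1/base_freq time units - no choice of amplitudes or phases can make it aperiodic.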

This problem would be alleviated if my characters had the ability to change their joint angles according to more complex motor control algorithms, allowing for aperiodic motions. Stimulus/response models such as those cited in section 2 would offer this kind of flexibility.
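As a rough sketch of the kind of flexibility meant here (the unit below is a generic illustration, not one of the models cited in section 2), a joint update driven by a sensed stimulus rather than a clock phase is only as periodic as its input stream:

```python
import math

def stimulus_response_step(angle, stimulus, w_in=0.5, w_self=0.3, bias=0.1):
    # Hypothetical stimulus/response unit: the next joint-angle change
    # depends on a sensed stimulus and the current angle, not on a
    # clock phase, so an aperiodic stimulus yields aperiodic motion.
    return math.tanh(w_in * stimulus + w_self * angle + bias)
```

Feeding such a unit a non-repeating stimulus sequence produces a non-repeating angle sequence, which is precisely what the sine-sum controllers above cannot do.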

Prototyping and Phenotyping
The Character Evolution Tool is successful when seen as a research environment for a genotype/phenotype methodology of design. This is apparent in the accumulation of behavior objects which I have been able to prototype quickly, both for my own research and for demonstrating the concepts. A phenotype template was developed early in the research to give me (and potentially other graphics programmers) a quick way to prototype a new behavior object; Figure 22 illustrates this template. The collection of behavior objects I have created has not only increased my design repertoire but also offered many people a chance to see graphic behaviors demonstrated through a genetic lens.

Figure 22 A new species of behavior objects can be created using the phenotype template.
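The template itself appears in Figure 22 and is not reproduced here; the sketch below is a hypothetical rendering of the idea (all class and method names are invented for illustration). A base class fixes the genotype-handling contract, and a new species of behavior object fills in only how its genes are expressed and drawn:

```python
import math

class BehaviorPhenotype:
    # Hypothetical template: subclasses declare a gene count and define
    # how the genotype is expressed and drawn, so an evolution engine
    # can treat every species of behavior object uniformly.
    GENE_COUNT = 0

    def __init__(self, genotype):
        assert len(genotype) == self.GENE_COUNT
        self.genotype = genotype

    def express(self):
        # Decode genotype values into form/behavior parameters.
        raise NotImplementedError

    def draw(self, t):
        # Produce the behavior's output at time t.
        raise NotImplementedError

class Wiggler(BehaviorPhenotype):
    # A minimal example species: two genes, amplitude and rate.
    GENE_COUNT = 2

    def express(self):
        self.amplitude, self.rate = self.genotype

    def draw(self, t):
        return self.amplitude * math.sin(self.rate * t)
```

Defining a new species then amounts to writing one small subclass, which is consistent with the quick prototyping the text describes.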


Artificial Life
As a research tool for artificial life experiments, the Character Evolution Tool has proven useful. It has supplied an environment for a project in studying two interdependent emergent phenomena: morphology and locomotion behavior (Ventrella, 94).


Conclusions

