Thank you for watching the demo

We wish to close the demo with a few words about the animation and the goals we had set for it. For a start, the texts that came sliding in from the top of the screen during the animation are repeated here, in case you missed any of them. This is followed by a few words about the "making of" the Sisters demo. The document ends with a summary of the toolkits that we used and that make fluent, context-sensitive animation a reality.


The running text, again

If you were paying attention to the animation rather than reading the explanatory texts that scrolled in from the top of the window, here they are again.

(The storyline)

Fanni and her sister Tila are charming creatures that live in the world of fantasies.
By dancing, they try to seduce spectators into leaving the world of reality and joining theirs.
Rivalry between the sisters is not uncommon.
But then again, this is only fantasy.
Or is it?

(The implementation notes)

More seriously, this example shows frame animation inside sprite animation. The motions of the main characters of this demo are precompiled "movies". These movies (or "clips" as animators call them) are then put on top of a background, under the control of a sprite engine.

The sprite engine hooks the MIDI timer so that the sisters dance "on the beat". The synchronized sound effects are handled transparently by the frame animation engine. The sprite engine does the collision detection.

So actually, the real sisters at work here are EGI and AniSprite, the frame and sprite animation engines. United, they are a strong team. Although rivalry is not uncommon.


Animating "Sisters"

The Sisters demo is not interactive, in the sense that the animation does not respond to user input. Every animob (animob = "animated object") is set up with its own script, though, and these scripts allow for flexibility. The two main animobs, Fanni & Tila, for example, constantly monitor what the other is doing and where it is positioned. Their next move is then based on the situation and the environment. As such, the individual animations are context-sensitive. As another example of context-sensitivity, the "Lambada" song has a "bridge" where the tune and the rhythm change (this bridge actually appears twice in the piece). In response to the start and end of the musical bridge, of which the animobs are notified by the MIDI engine, Fanni and Tila start to perform an alternate "wiggle".

Sprite animation lends itself well to interactivity. Frame animation allows subtle and fluent animations. The Sisters demo shows a combination of both. The "clips" (short segments of a motion) of the characters are typically 6 to 12 frames long, and those clips are glued together, under the control of a script, to build a longer animation. The gestures of Fanni and Tila are more fluent than what you typically see in computer animation: there is slow-in and slow-out, anticipation, overlapping motion and secondary motion, a bit of staging, etc.

John Canemaker wrote that everything that animators do is ultimately tied to registration ("registration" refers to the proper alignment of the various layers or objects that make up a complete frame). All individual drawings must be lined up correctly in relation to the background image and in relation to each other. They must also be lined up correctly over time, as the next frame typically uses a different set of images on top of the same background. And finally, the drawings must be lined up below the camera in exactly the same way as on the drawing boards of the various artists. The other vital ingredient of animation is time. You must understand time, or else your animated character will not come to life. Context-sensitive animation, where the actions of the animobs are only partially planned ahead, brings its own challenge: synchronization. The animobs of the "Sisters" demo each run on a script; that is, the animobs are programmed to behave in certain ways. Much of the time, the scripts are searching for synchronization points: both synchronization points in the music (the beat) and synchronization points in the other animations. Registration, timing and synchronization, those are the magic words.

In hand-drawn animation, it is very common to animate an action, then slow into a pose and hold the drawing of that pose for several frames, then move into action again. The animation stays alive even with the use of held drawings. But in computer animation, as soon as you go into a held pose, the action dies immediately. In the Sisters demo, we chose to make all clips in which the animobs must be synchronized a multiple of 6 frames long (6 frames is approximately ½ second). So such a clip could be 6, 12 or 18 frames, but not 8 frames. Keeping to this 6-frame granularity avoided animobs having to "hold position" while waiting for the synchronization point of another animob.

The graphics were made with a variety of tools. We used POV-Ray to create the building blocks (the balls of the puppets, the checkerboard background). The clouds are from a stock photograph; the light effects in the background image were painted with Paint Shop Pro. Once the background was done, we created an optimized palette for the background plus all foreground material. The PaletteMaker utility allowed us to give more weight to the foreground colours than to the background. Then, with the palette done, the puppet animations were drawn with Pro Motion. Its "light-box" functionality and the ability to quickly test the animation make Pro Motion a valuable tool. Cast shadows were generated automatically by a series of scripts for the "PluginMaker" plug-in for Pro Motion. The shadows are drawn into each frame using two special palette entries; later, at run time, AniSprite converts these special colours to the semi-translucent shadow. Finally, the animations were assembled with the EGI compiler. The main program is written in C. We used the Watcom 11.0 compiler, but we tested with the Borland and Microsoft C/C++ compilers as well.


Technical aspects

As already mentioned in the implementation notes, the "Sisters" demo makes use of several toolkits. The demo primarily shows how they combine into something that goes well beyond each individual toolkit.

EGI is a toolkit for frame animation. You might compare it to Microsoft's Video for Windows, or to MPEG movies. EGI is optimized for short clips and "animation" content, rather than for longer recorded movies. EGI has some extensions towards interactive animations and hooks into other engines. EGI is good at chaining short (cyclic) animations together in a fluent manner, and it already stores the alpha channels, in one of several formats, that a sprite animation engine may need. You can find an extensive feature list of EGI at http://www.compuphase.com/software.htm.

AniSprite is a sprite animation toolkit, with support for alpha blending, shadow/shine effects and collision detection. In this demo, AniSprite is quite useful in painting four partially overlapping frame animations (from EGI) on a fixed background. The very first version of the Sisters demo used EGI alone. Getting two or more animations to run on the same background without flicker was a lot of work, and we never got it exactly right. More features of AniSprite are listed at http://www.compuphase.com/software.htm.

Maximum MIDI drives the animation. From the outset, it was clear that the principal animobs should dance on the beat. The Maximum MIDI toolkit makes this possible by exporting its timer to other applications/toolkits; it also passes key MIDI events to your application. MIDI is primarily concerned with performance, not music, and a MIDI player is typically built upon a stable, high-resolution timer. If you want to synchronize animation to MIDI music, you have two alternatives: use a single timer, or use multiple timers and synchronize these. To be blunt: synchronizing multiple timers under Microsoft Windows is not an option. Exporting the timer is a crucial feature that makes Maximum MIDI a valuable tool in computer animation. You can find out more about Maximum MIDI at http://www.maxmidi.com.

The "rivalry" between the toolkits that the implementation notes referred to is about overlapping features in the toolkits and alternative implementations for specific features. The first version of the Sisters demo did not use AniSprite at all. You can do simple sprite animation (with very mobile sprites) with EGI alone, plus a bit of work. Similarly, using film strips in AniSprite, you can do without EGI, at the cost of using quite a bit more memory. AniSprite can calculate alpha channels (transparency masks) on the fly, but EGI can also pre-compute them and store them in the animation file. AniSprite can fire an event on a collision and invoke a sound effect, but EGI can equally well set a label on a specific frame and attach the sound effect to that label. And so forth, and so forth...


This message was brought to you by...

...QHTM-light, a light-weight HTML control. When I wanted a simple control with formatted text and a few links to web pages, a RichText control proved overkill and a hassle to set up. QHTM-light allowed me to just create the control, load an HTML page, and be done with it. QHTM and the "Light" version used here live at http://www.gipsysoft.com.