Tuesday, December 15, 2009

OpenGL ES from the Ground Up Part 9a: Fundamentals of Animation and Keyframe Animation

This is not the article I was originally going to post as #9 in this series. That article will go up as #10. Before I get into OpenGL ES 2.0 and shaders, though, I want to talk about something more fundamental: animation.
Note: You can find the source code that accompanies this article here. A new version was uploaded at 10:14 Eastern time that fixed a problem with it not animating (see comments for details).
Now, you've already seen the most basic form of animation in OpenGL ES. By changing the rotate, translate, and scale transformations over time, we can animate objects. Our very first project — the spinning icosahedron — was an example of this form of animation, often called simple animation. Don't let the name fool you, though: you can do quite complex animations using nothing more than changing matrix transformations over time.

But, how do you handle more complex animations? Say you want to make a figure walk, or a ball squish as it bounces?

It's actually not that hard. There are two main approaches to animation in OpenGL: keyframe animations and skeletal (or bone) based animations. In this installment, we'll be talking about keyframe animations, and in the next article (#9b), we'll look at skeletal animation.

Interpolation & Keys


Animation is nothing more than change in the position of vertices over time. That's it. When you translate, rotate, or scale an entire object, you are moving all of the vertices that make up that object proportionally. If you want to animate an object in more complex and subtle ways, you need a way to move each vertex by a different amount over time.

The basic mechanism used in both types of animation is to store key positions for each vertex in an object. In keyframe animation, this is done by storing the individual position of every vertex for each key. For skeletal animation, it's done by storing the position of virtual bones, with some way to denote which bones affect the movement of which vertices.

So, what are keys? The easiest way to explain them is to go back to their origin, which is in traditional cel animation like the classic (pre-computer) Disney and Warner Brothers cartoons. In the earliest days of animation, small teams would do all the drawings that made up a short. But as the productions got larger, that became impossible, and they had to start specializing into different roles. More experienced animators took on the role of lead animator (sometimes called a key animator). These more experienced animators would not draw every cel in a scene; instead, they would draw the most important frames. These would usually be the extremes of motion, or poses that captured the essence of the scene. If they were animating a character throwing a ball, they might draw the frame where the arm was furthest back, then a frame where the arm was at the top of the arc, then a third frame where the character released the ball.

Then, the key animator would move on to a new scene and another animator called an in-betweener (sometimes called a rough in-betweener, since it would often be another completely different person's job to clean up the in-betweener's drawings) would then figure out how much time there was between these key frames, then do all the intermediate drawings. If the throw was a one-second throw, and they were animating at twelve frames per second, they would have to figure out how to add an additional nine frames between the existing keyframes drawn by the lead animator.

The concept in three dimensional keyframe animation is exactly the same. You will have vertex data for the key positions in a motion, and your rough in-betweener will be an algorithm called interpolation.

Interpolating is some of the simplest math that you'll do in three-dimensional graphics. For each of the Cartesian dimensions (x, y, z), you simply take the difference between the two keyframe values, figure out what fraction of the total animation time has elapsed, multiply the difference by that fraction, and add the result to the first keyframe's value.

It might make more sense if we do a practical example. Let's look at just one vertex. In our first keyframe, let's say it's at the origin (0, 0, 0). For the second keyframe, we'll assume it's at (5, 5, 5), and the duration between these two keyframes is five seconds (just to keep the math nice and simple).

If we're one second into the animation, we first figure out the difference between the two positions for each axis. In our case, the total movement between the two keyframes is five units on each of the x, y, and z axes (five minus zero equals five). One second into our five-second animation, we're 1/5th of the way through, so we add 1/5th of five to the first keyframe's x, y, and z values to come up with a position of (1, 1, 1). Now, the numbers won't usually work out that nicely, but the math is exactly the same: figure out the difference, figure out from the elapsed time what fraction of the way through this action we are, multiply the difference on each axis by that fraction, and then add the result to the first keyframe's value for that axis.
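The arithmetic above is usually wrapped up in a one-line helper, conventionally called lerp (short for linear interpolation). Here's a minimal C sketch; the function name is just the common convention, not something from the project code:

```c
#include <assert.h>

/* Linear interpolation: t is the fraction of the animation elapsed
   so far, from 0.0 (first keyframe) to 1.0 (second keyframe). */
static float lerp(float from, float to, float t)
{
    return from + (to - from) * t;
}
```

One second into the five-second move from the origin to (5, 5, 5), t is 0.2, and lerp(0.0f, 5.0f, 0.2f) returns 1.0f for each axis, giving exactly the (1, 1, 1) position worked out above.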

This is the simplest form of interpolation, called straight-line (or linear) interpolation, and it's just fine for most purposes. There are more complex algorithms that weight the interpolation based on how far into the animation you are. Core Animation, for example, provides the option to "ease in", "ease out", or "ease in/out" when performing an animation. Perhaps we'll cover non-straight-line interpolation in a future article, but for today, we're just going to keep things simple and work with straight-line interpolation. You can do the vast majority of what you want with this technique just by altering the number of keyframes and the duration between them.
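One classic non-linear option is to reshape the time fraction before interpolating: the well-known smoothstep polynomial gives an ease-in/ease-out feel. This is a general-purpose sketch of the idea, not how Core Animation implements its timing functions:

```c
#include <assert.h>

/* Remap a linear fraction t in [0, 1] onto an ease-in/ease-out
   curve: motion starts slowly, speeds up, then slows down again. */
static float easeInOut(float t)
{
    return t * t * (3.0f - 2.0f * t);
}
```

Feeding easeInOut(percentDone) into the same interpolation code instead of percentDone leaves the keyframes themselves untouched (it maps 0 to 0 and 1 to 1) but softens the start and end of each move.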

Keyframe Animation in OpenGL ES


Let's look at a really simple example of animation in OpenGL ES. When traditional hand-drawn animators are trained, the first thing they do is animate a bouncing ball that squishes as it bounces. It only seems fitting for us to do the same thing, so here's what our app is going to look like:
bouncy.png

Let's start by creating a ball in Blender (or any 3D program you want to use, if you've got a way to export the vertex and normal data in a usable manner). In this example, I'm going to use my Blender export script, which generates header files with the vertex data.

I start by creating an icosphere at the origin. I rename the mesh to Ball1, save this file as Ball1.blend, and export Ball1.h using my export script. You can find my Blender file for this keyframe here.
Screen shot 2009-12-15 at 2.45.21 PM.png

Now, I do a save-as (F2) and save a copy of the file as Ball2.blend. In this copy, I rename the mesh to Ball2 so that the export script uses different names for the data structures. Then I hit 'tab' to go into edit mode, press 'A' to select all the vertices, and move and scale them so that the ball is moved down and squished. I save the squished ball and export Ball2.h. You can find my Blender file for the second keyframe here.

Screen shot 2009-12-15 at 3.52.42 PM.png


At this point, I have two .h files, each containing the vertex positions for one keyframe in my animation. Working from my OpenGL ES template, I first define a few values in GLViewController.h to help me keep track of the animation:
#define kAnimationDuration  0.3

enum animationDirection {
    kAnimationDirectionForward = YES,
    kAnimationDirectionBackward = NO
};
typedef BOOL AnimationDirection;

Since I will be bouncing back and forth between the two keyframes, I need to keep track of whether it's traveling forward or backward. I also set a value to control how fast it moves between the two keyframes.

Then, in GLViewController.m, I interpolate between the two keyframes repeatedly, like so (don't worry, I'll explain):
- (void)drawView:(UIView *)theView
{
    static NSTimeInterval lastKeyframeTime = 0.0;
    if (lastKeyframeTime == 0.0)
        lastKeyframeTime = [NSDate timeIntervalSinceReferenceDate];
    static AnimationDirection direction = kAnimationDirectionForward;

    glClearColor(1.0, 1.0, 1.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glTranslatef(0.0f, 2.2f, -6.0f);
    glRotatef(-90.0, 1.0, 0.0, 0.0); // Blender uses Z-up, not Y-up like OpenGL ES

    static VertexData3D ballVertexData[kBall1NumberOfVertices];

    glColor4f(0.0, 0.3, 1.0, 1.0);
    glEnable(GL_COLOR_MATERIAL);

    NSTimeInterval timeSinceLastKeyFrame = [NSDate timeIntervalSinceReferenceDate] - lastKeyframeTime;
    if (timeSinceLastKeyFrame > kAnimationDuration) {
        direction = !direction;
        timeSinceLastKeyFrame = timeSinceLastKeyFrame - kAnimationDuration;
        lastKeyframeTime = [NSDate timeIntervalSinceReferenceDate];
    }

    NSTimeInterval percentDone = timeSinceLastKeyFrame / kAnimationDuration;

    VertexData3D *source, *dest;
    if (direction == kAnimationDirectionForward)
    {
        source = (VertexData3D *)Ball1VertexData;
        dest = (VertexData3D *)Ball2VertexData;
    }
    else
    {
        source = (VertexData3D *)Ball2VertexData;
        dest = (VertexData3D *)Ball1VertexData;
    }

    for (int i = 0; i < kBall1NumberOfVertices; i++)
    {
        GLfloat diffX = dest[i].vertex.x - source[i].vertex.x;
        GLfloat diffY = dest[i].vertex.y - source[i].vertex.y;
        GLfloat diffZ = dest[i].vertex.z - source[i].vertex.z;
        GLfloat diffNormalX = dest[i].normal.x - source[i].normal.x;
        GLfloat diffNormalY = dest[i].normal.y - source[i].normal.y;
        GLfloat diffNormalZ = dest[i].normal.z - source[i].normal.z;

        ballVertexData[i].vertex.x = source[i].vertex.x + (percentDone * diffX);
        ballVertexData[i].vertex.y = source[i].vertex.y + (percentDone * diffY);
        ballVertexData[i].vertex.z = source[i].vertex.z + (percentDone * diffZ);
        ballVertexData[i].normal.x = source[i].normal.x + (percentDone * diffNormalX);
        ballVertexData[i].normal.y = source[i].normal.y + (percentDone * diffNormalY);
        ballVertexData[i].normal.z = source[i].normal.z + (percentDone * diffNormalZ);
    }

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glVertexPointer(3, GL_FLOAT, sizeof(VertexData3D), &ballVertexData[0].vertex);
    glNormalPointer(GL_FLOAT, sizeof(VertexData3D), &ballVertexData[0].normal);
    glDrawArrays(GL_TRIANGLES, 0, kBall1NumberOfVertices);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);
}



First, I start out with some setup. I create a static variable to keep track of when we hit the last keyframe; this will be needed to determine how much time has elapsed. The first time through, we initialize it to the current time. Then we declare a static variable to keep track of whether we're animating forward or backward.

    static NSTimeInterval lastKeyframeTime = 0.0;
    if (lastKeyframeTime == 0.0)
        lastKeyframeTime = [NSDate timeIntervalSinceReferenceDate];
    static AnimationDirection direction = kAnimationDirectionForward;


After that, we do normal OpenGL ES stuff. The only thing of note here is that we rotate -90° on the X-axis. We're accounting for the fact that OpenGL ES uses a Y-up coordinate system while Blender uses Z-up. We could also have rotated the model in Blender instead.

    glClearColor(1.0, 1.0, 1.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glTranslatef(0.0f, 2.2f, -6.0f);
    glRotatef(-90.0, 1.0, 0.0, 0.0); // Blender uses Z-up, not Y-up like OpenGL ES
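The rotation bakes the coordinate-system difference into the modelview matrix, but it's worth seeing what it does to the data: rotating -90° about the X-axis sends a Z-up point (x, y, z) to the Y-up point (x, z, -y). Here's a quick sketch with a hypothetical helper (an exporter could apply this conversion to the vertex data instead of rotating at draw time):

```c
#include <assert.h>

typedef struct { float x, y, z; } Vec3;

/* Convert a Z-up (Blender) point to Y-up (OpenGL ES); this is
   what glRotatef(-90.0, 1.0, 0.0, 0.0) does to each vertex. */
static Vec3 zUpToYUp(Vec3 p)
{
    Vec3 out = { p.x, p.z, -p.y };
    return out;
}
```

A point one unit "up" in Blender, (0, 0, 1), comes out as (0, 1, 0): one unit up the Y-axis in OpenGL ES.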

Next, I declare a static array to hold the interpolated data:
    static VertexData3D ballVertexData[kBall1NumberOfVertices];


Just to keep things simple, I set a color and enable color materials. I didn't want to clutter this example up with texture or materials.

    glColor4f(0.0, 0.3, 1.0, 1.0);
    glEnable(GL_COLOR_MATERIAL);

Now I calculate how much time has elapsed since the last keyframe. If the time elapsed is greater than the animation duration, we flip the direction around so that we're going the other way.
    NSTimeInterval timeSinceLastKeyFrame = [NSDate timeIntervalSinceReferenceDate] - lastKeyframeTime;
    if (timeSinceLastKeyFrame > kAnimationDuration) {
        direction = !direction;
        timeSinceLastKeyFrame = timeSinceLastKeyFrame - kAnimationDuration;
        lastKeyframeTime = [NSDate timeIntervalSinceReferenceDate];
    }

    NSTimeInterval percentDone = timeSinceLastKeyFrame / kAnimationDuration;
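Note the subtraction: any overshoot past the keyframe is carried into the new leg rather than thrown away, which keeps the bounce from slowly drifting as frames arrive at uneven times. In isolation, that bookkeeping looks something like this (a sketch with hypothetical names, not code from the project):

```c
#include <assert.h>
#include <stdbool.h>

/* If the current leg of the animation has run past its duration,
   flip the travel direction and carry the overshoot into the new
   leg. Returns the adjusted elapsed time. */
static double stepAnimationClock(double elapsed, double duration, bool *forward)
{
    if (elapsed > duration) {
        *forward = !*forward;   /* bounce back the other way */
        elapsed -= duration;    /* overshoot counts toward the new leg */
    }
    return elapsed;
}
```

With a 0.3-second duration, an elapsed time of 0.4 seconds flips the direction and leaves 0.1 seconds already spent on the return trip.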


In order to accommodate bi-directional animation, I declare two pointers to the source keyframe data and destination keyframe data, and point each one to the appropriate data array based on the direction we're currently going.

    VertexData3D *source, *dest;
    if (direction == kAnimationDirectionForward)
    {
        source = (VertexData3D *)Ball1VertexData;
        dest = (VertexData3D *)Ball2VertexData;
    }
    else
    {
        source = (VertexData3D *)Ball2VertexData;
        dest = (VertexData3D *)Ball1VertexData;
    }


And, finally, the interpolation. Here's a fairly generic implementation of that linear interpolation we were talking about:
    for (int i = 0; i < kBall1NumberOfVertices; i++)
    {
        GLfloat diffX = dest[i].vertex.x - source[i].vertex.x;
        GLfloat diffY = dest[i].vertex.y - source[i].vertex.y;
        GLfloat diffZ = dest[i].vertex.z - source[i].vertex.z;
        GLfloat diffNormalX = dest[i].normal.x - source[i].normal.x;
        GLfloat diffNormalY = dest[i].normal.y - source[i].normal.y;
        GLfloat diffNormalZ = dest[i].normal.z - source[i].normal.z;

        ballVertexData[i].vertex.x = source[i].vertex.x + (percentDone * diffX);
        ballVertexData[i].vertex.y = source[i].vertex.y + (percentDone * diffY);
        ballVertexData[i].vertex.z = source[i].vertex.z + (percentDone * diffZ);
        ballVertexData[i].normal.x = source[i].normal.x + (percentDone * diffNormalX);
        ballVertexData[i].normal.y = source[i].normal.y + (percentDone * diffNormalY);
        ballVertexData[i].normal.z = source[i].normal.z + (percentDone * diffNormalZ);
    }

Then, all that's left to do is hand the interpolated array to OpenGL and draw it:
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glVertexPointer(3, GL_FLOAT, sizeof(VertexData3D), &ballVertexData[0].vertex);
    glNormalPointer(GL_FLOAT, sizeof(VertexData3D), &ballVertexData[0].normal);
    glDrawArrays(GL_TRIANGLES, 0, kBall1NumberOfVertices);
    glDisableClientState(GL_VERTEX_ARRAY);
    glDisableClientState(GL_NORMAL_ARRAY);

}

Not that hard, right? It's just division, multiplication, and addition. Compared to some of the stuff we've been through so far, this is nothing. This is the basic technique used, for example, in the MD2 file format used by id Software in their older games. Every animation in an MD2 file is performed using keyframe animation, just like I've done here. Later versions of Milkshape support other file formats, but you can do some pretty sophisticated animations using keyframes.

If you want to check out the bouncy ball, you can download my Xcode project and run it for yourself.

Not all 3D animation is done using keyframes, but interpolation is the basic mechanism that enables all complex animation. Stay tuned for the next installment, #9b, where we use interpolation to implement a far more complex form of animation called skeletal animation.



Comments:

mik said...

Very complete. Milkshape ascii format could handle skinning. And HL mdl format was derived from Quake MD2 format

Niklas said...

Thanks for a new great tutorial.

How about the POD format which seems to support animations as well, and as I understand it is the preferred optimal format for the POWERVR chip in iPhones since it's created by the chip producer.

http://www.imgtec.com/powervr/insider/powervr-utilities.asp

warmi said...

POD is a very efficient format for storing meshes but it sucks tremendously when it comes to storing animations. It supports skeletal and nodal animation, which is fine, but it has some severe limitations, like forcing all objects in the scene to have the same number of keyframes, requiring a constant keyframe step, as well as not supporting more than one animation per POD file (you can get around that by having an auxiliary file which lists animation ranges like "Walk" frame 0-12, "Run" frame 13-15, etc… but it still sucks)

I wrote an exporter for Lightwave as well as converter from Ogre Mesh files and Halflife Source SMD files to POD and it was a pain when it came to convert animation data ... I had to resample perfectly valid animations to make sure that all tracks have the same number of frames and are stepping at constant frame rate which works fine 99% of time but sometimes can cause subtle errors.

In other words, I am using POD for meshes and even animations in my engine but I am thinking about adopting some other format (like for instance Ogre Mesh/Skeleton )

Jeff LaMarche said...

Niklas:

What he said.

:)

I actually haven't looked at the POD format myself, but Warmi's comments are always insightful and informed, so I'd run with it :)

Jeff LaMarche said...

Mik:

Yes, I should have been clearer - I meant the original .md2 file format.

Tim said...

I've been following your OpenGL ES tutorials - thank you - and this one does not run.

It starts with the ball smashed against the bottom of the screen and does nothing from there :(

-t

Vineet said...

The ball doesn't animate for me as well.

Jeff LaMarche said...

Tim and Vineet - What version of the iPhone SDK are you guys working on? Are you trying to run on the simulator or on the device?

Jeff LaMarche said...

D'oh! My bad. I froze it there in code to take a screen shot, and must have zipped it up before I changed it back! Sorry, I'll repost momentarily.

You can fix your versions by changing the calls to glVertexPointer and glNormalPointer to look at the interpolated array rather than the Ball2 array:

glVertexPointer(3, GL_FLOAT, sizeof(VertexData3D), &ballVertexData[0].vertex);
glNormalPointer(GL_FLOAT, sizeof(VertexData3D), &ballVertexData[0].normal);

Jeff LaMarche said...

Okay, you should be able to re-download now and get the correct project.

Tim said...

Works great now!

Erik Price said...

Don't forget to update the table of contents page at http://iphonedevelopment.blogspot.com/2009/05/opengl-es-from-ground-up-table-of.html

Zero said...

Great work, thanks!

Tim said...

I tried to make a new Ball2 object, by modifying the Ball1.blend object by making it smaller, round and further down the screen.

When I export this .h file and replace it in the project, I was hoping to get a ball that bounced "deeper" into the screen and still down into the bottom.

All I ended up with was a ball in place that animates to a smaller ball and back, but no movement down the page.

Any idea what I missed in Blender? I moved the ball down as well as scaled it smaller - at least I thought I did!

Thanks,

-t

Prabakar said...

It is good, but doesn't explain anything about blender usage. I installed blender and don't know anything how to use it? Creating image is mandatory using blender? If i don't want to use blender, how would i proceed? It is definitely not suitable for new-bies who are entering into Opengl for iPhone development. Are there any articles for new-bie which is explaining with step by step procedure to start with Opengl for iPhone development? I don't find any..

Ryan said...

@Prabakar: He doesn't explain how to use Blender to create 3d models in the same way he doesn't explain how to type to write code.

Generating 3D models in Blender won't be any different for iPhone than any other platform. There are some great tutorials in Blender3D: Noob to Pro to help with that.

bengro said...

How do I run your .py script in blender to export the .blend file to .h format?

Jonathan said...

Hi,

This is an excellent tutorial from new-bie to experience devs. I want to create a 'under water' kind of effect in my game app screen(full screen with water flowing). How to achieve this? Any helps please?

Thanks.

Lunar Lion said...

I have been totally looking forward to your next tutorial discussing bone animation. Do you have a time frame of when you might post this?

Alex said...

Thank you for your tutorials. They're a great learning source for any newbie to Open GL.
I've looked over your script for exporting objects in .h from blender and I've seen that it can return TexturedVertexData3D. I've added a UV mapped texture over a cube and tried to export it in .h but instead of TexturedVertexData3D it exported VertexData3D. Is there any tricky UV mapping settings in Blender to work with the script?

Alex said...

I've figured out what I was missing. After adding the texture you have to go in Object Mode , press the U key and select Object & ObData & Materials + Tex.

mattfite said...

this is such a great tutorial and blog. thank you so much and please keep them coming.

richard said...

Very Nice guide, i was wondering if this will be effient for charachter animations, like walk cycles, jump crouch and such? or should i be looking into something else?

tsilber said...

I can't download your export script. Has innerloop.biz stopped hosting it, or ceased to exist? Could you offer an alternate download location?

Also, I am very interesting in the process of extracting mesh and material data in OpenGL. In DirectX, the SDK will do it for you if you phrase all of this data in .X files, but in OpenGL there is no such option to my knowledge. Could anyone offer a good resource for learning about this?

Kallewoof said...

The ball is still smudged to the bottom in the example code written in the post (i.e. if reader writes it on their own into their project, like I tend to when I try to learn new things from examples, the code's still off).

That aside, awesomely informative. Thank you so very much for all the work you've put into these.

Blake said...

I still don't understand how to export the ball from blender using your script...

Kallewoof said...

Blake: good way to get a response: be detailed, show that you tried, show that you tried hard.

Veronika Irvine said...

Hi Blake,

I have never used Blender either but I found the following useful reference:
http://en.wikibooks.org/wiki/Blender_3D:_Noob_to_Pro/Advanced_Tutorials/Python_Scripting/Export_scripts

Below is a quick summary of what it said there but there are more details in the tutorial.

To install the script:

You need to copy the provided python script into the blender script folder
(e.g. /Developer/Applications/blender-2.49b-OSX-10.5-py2.5-intel/blender.app/Contents/MacOS/.blender/scripts) Then in Blender you open the Scripts Window and in the toolbar of the Scripts Window choose "Scripts" the "Update Menus".

To run the script:

Under the "File"->"Export" menu you should see "Objective-C Header .h".
Select this and it will export Ball1.h into the same folder where you have Ball1.blend.
