We've seen how to make and export static meshes; now it's time to look at meshes you can play animations on. As in the previous article, I'll investigate how the pipeline expects the data to be set up, then take a look at how you export that data, and what happens in the process.
Note that I'm focusing on non-character objects (i.e. not humanoids, animals, etc.) because, well, characters aren't the only things that need animations, and because it's slightly easier for me to generate data for them. I will cover characters though. Someday.
Why should I care?
Pretty much the same as usual: to see how restrictive or flexible the pipeline is, what's done in the editor, and what's done in the modelling package.
In UDK
(Note that once again, only the FBX pipeline is covered.)
In Unreal, whenever you need an object to play an animation and/or have some of its parts procedurally controlled (e.g. cloth, vehicles), you will need a Skeletal Mesh, so called because it relies on the mesh having a skeleton (i.e. a hierarchy of bones/joints).
Authoring the model
As far as I'm aware, there are few requirements for the scene setup, and the limitations are the same as for static meshes. You will, however, need at least two things:
- A hierarchy of objects that represent the skeleton. These objects will usually be bones/joints, but they don't have to be (however, you might have a hard time skinning meshes on non-bones). For instance, in UDK's sample skeleton, the IK targets are meshes rather than joints; they get converted into joints at export time.
- At least one mesh that is bound to the skeleton. Your mesh can be split into several parts if needed. If you can, avoid parenting meshes you want to export to other meshes that also need to be exported, or the importer might get confused by the two hierarchies it receives (the skeleton and the mesh hierarchy).
Theoretically, it is possible to have no bones and just imply the hierarchy through mesh parenting. This is what Unreal calls rigid skeletons. It works, though I've never been able to properly export animations on that kind of skeleton: I ran into problems with the objects' pivots, which seem to follow the rules of static meshes. I'll admit I didn't try very hard to make it work, as skinned meshes work perfectly fine.
Exporting the model
It is pretty simple: select all the meshes you want to export, along with the root of the skeleton. Note that this last bit is somewhat optional. If you only select skinned meshes, the importer will still manage to retrieve the skeleton. However, it will only import the bare minimum of bones needed to support the imported geometry.
You might have noticed that the recurring word here is "select". You can't just export the scene wholesale; you always have to select objects before you export, meaning you have to be careful not to forget anything, or make selection sets and keep them up to date.
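A minimal sketch (Maya, Python) of the selection-set idea, so nothing gets forgotten at export time. The set name, the object names and the output path are all assumptions for illustration:

```python
import maya.cmds as cmds
import maya.mel as mel

cmds.loadPlugin("fbxmaya", quiet=True)  # make sure the FBX exporter is available

EXPORT_SET = "skelMeshExportSet"  # hypothetical selection set name

# Create the set once; afterwards, just add/remove members as the rig evolves.
if not cmds.objExists(EXPORT_SET):
    cmds.sets(["root_joint", "body_mesh"], name=EXPORT_SET)  # hypothetical nodes

# At export time: select the set's members, then run the FBX exporter.
cmds.select(cmds.sets(EXPORT_SET, query=True), replace=True)
mel.eval('FBXExport -f "C:/exports/skel_mesh.fbx" -s')  # -s = export selected only
```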
There's nothing special in the import process. Once you're done, you can open the AnimSet editor to see your mesh and modify some of its properties.
The AnimSet Editor is a huge beast, so I'll just highlight the few features that pertain to the mesh itself:
- This is where you set the materials.
- You can move or rotate bones in that 3D view. This is only for preview purposes, letting you check that the skeleton was imported properly.
- Mesh simplification and LOD generation are done here, thanks to Simplygon (though the results don't seem to be as good as with static meshes).
- This is also the place to set up cloth and soft bodies.
- You can set up sockets, which are attachment points for particle systems or additional meshes.
Collisions
Unlike static meshes, where you can define the collision mesh in your modelling software, collisions for skeletal meshes can only be defined within the editor, using a tool called PhAT (Physics Asset Tool).
This tool has two main uses: tying collision primitives to the object's joints, and defining the constraints between these primitives that rule physics-driven animation (e.g. ragdoll simulation).
For each joint, you can define one or more primitives that together make the collision hull tied to that joint. You can translate/rotate/scale those primitives however you want. By default, when you create a physics asset, PhAT automatically creates a primitive for each joint, matching the size of the bounding box of the meshes skinned to that joint.
Obviously, not every single joint has to have a primitive. The "problem" is that the only primitives available are boxes, spheres and capsules, which makes accurate collision a pain without resorting to per-polygon collision (which, if I'm not mistaken, is only used for decals and other raycasts anyway). That said, bear in mind that the main purpose of PhAT is physics simulation, hence the primitives having to be simple.
Exporting an animation
In the FBX pipeline, the "rule" is 1 animation = 1 file, which makes sense in some ways and is very annoying in others. However, the exact rule is 1 animation = 1 FBX file, which allows a bit more flexibility provided we work cleverly.
The FBX exporter bakes the animation and, by default, only exports the current frame range on the time slider. So with a bit of work, it shouldn't be too hard to come up with a script that holds a list of frame ranges and exports several animations from a single scene (see the sketch below). Once again, you need to select what you want to export; in the case of animations, only the root of the skeleton needs to be selected.
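Here's a rough sketch (Maya, Python) of that batching idea: one scene, several named frame ranges, one FBX file per animation. The clip list, the "root_joint" node and the output folder are assumptions:

```python
import maya.cmds as cmds
import maya.mel as mel

cmds.loadPlugin("fbxmaya", quiet=True)

# Hypothetical clip table: (name, start frame, end frame).
CLIPS = [
    ("idle",  1,  60),
    ("open",  61, 90),
    ("close", 91, 120),
]

for name, start, end in CLIPS:
    # Restrict the time slider to the clip: the exporter bakes and
    # exports only the current playback range by default.
    cmds.playbackOptions(minTime=start, maxTime=end,
                         animationStartTime=start, animationEndTime=end)
    # Only the root of the skeleton needs to be selected.
    cmds.select("root_joint", replace=True)
    mel.eval('FBXExport -f "C:/exports/%s.fbx" -s' % name)
```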
Importing the animation into the engine is a tiny bit different. In Unreal, animations (called sequences) belong to AnimSets, which are just collections of animations. I'll say more about the point of doing this in another article, but it means animations are imported from within the AnimSet Editor, not the Content Browser.
In the AnimSet Editor, you can preview your animations, alter the play rate, define the compression options and, more importantly, set notifies. These are events placed along the animation to trigger various responses (sounds, particles, script events, etc.). Each type of notify is its own class, with its own custom settings. A notify can be instant or have a duration.
You can also convert existing animations to additive animations.
Note the animation properties on the bottom left hand side, and the notifies along the timeline.
In CryEngine 3 SDK
If we exclude facial animation, there are two ways of animating things in CryEngine 3: either you animate hard body geometry (CGA, for Crytek Geometry Animation), or you animate skinned geometry, i.e. characters (note that any skeletal mesh is considered a character). Depending on your choice, the pipeline differs a bit. The CryEngine documentation seems to recommend CGA whenever the animation doesn't involve deforming the geometry, as the CGA pipeline is simpler (for instance, there's no need to worry about animation compression).
I said I was going to focus on non-character animated meshes, so I'll ignore that second pipeline for this article. Also, remember that I'm mostly talking about the Maya pipeline.
Authoring the model
Rules are the same as for static geometry: you put your meshes into groups that represent objects, parent these objects as you see fit, and put everything in a group called cryexportnode (see the sketch below). As with static meshes, LODs are authored "by hand".
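A minimal sketch (Maya, Python) of that grouping convention. The mesh and group names are made up; only the cryexportnode naming follows the rule stated above:

```python
import maya.cmds as cmds

# Each group represents one object that will animate as a rigid piece.
door = cmds.group("door_mesh", name="door")    # hypothetical meshes
frame = cmds.group("frame_mesh", name="frame")
cmds.parent(door, frame)  # parent the pieces however the rig requires

# Everything to export goes under a single cryexportnode group.
cmds.group(frame, name="cryexportnode")
```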
However, there is a significant difference: for a reason I don't understand, it seems Maya needs to be set to Y-up for a CGA to export in the correct orientation (unless I missed something). That's really weird, as this is not the case for static geometry and, more importantly, Z is up in Sandbox...
Because CGA doesn't use skeletons, you're completely free to do whatever you like with the hierarchy. There doesn't even need to be a hierarchy, as the animation will simply record the position and rotation of each object.
The other nice thing about not having to use a skeleton is that there's no need for skinning. As such, updating the rig is as easy as adding or removing meshes from the corresponding object groups.
Helpers (the equivalent of Unreal's sockets) are authored in the modelling package by simply creating a locator with a specific name, letting you take advantage of all of its placement tools.
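For illustration, a tiny sketch (Maya, Python); the locator name and its placement are assumptions, and the parent group matches the cryexportnode sketch above:

```python
import maya.cmds as cmds

# A helper is just a locator with a specific name.
helper = cmds.spaceLocator(name="muzzle_helper")[0]  # hypothetical name
cmds.parent(helper, "cryexportnode")           # export it with the rest
cmds.xform(helper, translation=(0, 12.5, 40))  # place it with the usual tools
```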
Exporting the model
This is exactly the same process as for static geometry. You just need to make sure that you're exporting to the correct file type (CGA). Once exported, the mesh can be previewed in the Character Editor.
There isn't much more to do here, though. Animations can be previewed if you've exported some. You can also display the skeleton through a debug option, but you can't check the hierarchy or the skinning. That said, I'm working with a CGA here, which by definition doesn't have a skeleton, even though each object is considered a bone when it comes to attachments.
Attachments are the raison d'être of the Character Editor. This tool has obviously been designed with modular characters in mind. In the skeletal geometry pipeline, the idea is to export a naked skeleton and each body part individually. Then, in the Character Editor, you create a number of attachments, allowing you to... attach an object to a bone. You can attach just about any kind of object (static, hard body, skeletal geometry). That skeleton, with all its attachments, can then be saved as a new Character Definition File (CDF).
But you can also add attachments to CGA objects, and thus create CDF files for them as well, so the two pipelines converge at some point. However, the CDF stuff is beyond the scope of this article.
Collisions
Collision meshes for animated geometry are authored in the modelling package in the same way as for static geometry. This allows you to create more accurate collision meshes quite easily.
Ragdoll constraints are also done there, but only work for skeletal geometry (as the extra attributes are only set on joints). The fact that they're not done in the editor means you can't live-tweak the values, though iteration time isn't bad as the export is pretty quick.
Exporting an animation
The first big thing about animation export in CryEngine is that, apparently, the exporter doesn't support assets from referenced scenes (in Maya, at least). This means it won't find the cryexportnode or the material groups, both mandatory components of a successful export.
The problem with that is that a CGA export always exports the mesh AND the animations together. As a result, you can't author your animations in different files, since you can't reference in the rig (note that at the time of writing, I haven't tested whether this is also true for skeletal animations; I'm hoping it's not). I find this extremely annoying, even though it's probably fine for objects that don't have many animations. My guess is that CGA objects were initially designed for decorative animated objects, not gameplay-relevant ones.
You can author your animations however you want, using controllers or expressions, for instance. You only need to make sure that you've set at least one key (somewhere, anywhere; it doesn't have to be on the exported groups/meshes).
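Since the key can live anywhere, one way to satisfy that requirement is a throwaway key on a spare locator; a hedged sketch (Maya, Python), with the locator name made up:

```python
import maya.cmds as cmds

# A single key on any node is enough for the exporter to pick up animation.
dummy = cmds.spaceLocator(name="key_dummy")[0]  # hypothetical node
cmds.setKeyframe(dummy, attribute="translateX", time=1)
```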
Before exporting, you need to define your animations using the Animation Manager.
You just need to specify a frame range, a name, a root and a destination folder. Then you export the geometry, and the animations get exported as well, in separate files.
Back in Sandbox, the Character Editor will pick up all the animations living in the same folder as the CGA file you exported. You can use it to preview animations and set up animation events (which are basically a string with extra information).
The red thing is the position of the event (the time is expressed as a percentage of the animation).
The bulk of the in-editor animation work is done in the Animation Graph Editor, which I'll pit against Unreal's AnimTree Editor in another article.
Conclusion
Thanks to FBX and the fact that all the configuration is done editor-side, Unreal's pipeline is very straightforward. That means it doesn't take much training to get artists up and running, which is always a good thing. I don't know what's going on with the rigid animation pipeline, though. The fact that you can't group related animations in one file is a bit annoying, but it can be worked around. The ability to verify the skinning in the AnimSet Editor without having to create a test animation is also quite valuable.
In CryEngine, hard body geometry works very well, eliminating a number of steps when modifying the rig. The Character Editor looks fantastic, as it means modular meshes with little to no code support required. Authoring collision in the modelling package is frankly simpler than having to play with primitives. Also, as usual in CryEngine, getting geometry in game is a one-step process (compared to the export/import dance in Unreal). The referencing issue is quite a big deal for me, but it's nothing that couldn't be fixed in the future.