I've been in the process of making some changes to my website, mainly switching to a WordPress-powered site and changing the skin as well. I'll publish these changes by the end of the week. While the main address will remain the same, existing bookmarks and RSS feed subscriptions will become invalid. I'll post here again, and on G+, once the new site is live.
I'm slightly changing my "posting policy" on this blog. From here on, I will only post articles I wrote on the blog. For links or random thoughts, I'll use public posts on Google+. So if you happen to be interested in what I have to share outside of my articles, you can follow/circle/add me on Google+. I'll also share the links to any new blog post there. Just click on my name on this post and it should send you to my G+ profile.
And no, I don't have Twitter and I don't like it :)
Last time I took a look at the startup sequence of a CryEngine 3 game. Now, it's time to investigate what happens when loading a level, up to the spawning of the player's character. This will eventually lead us to the Lua side of the engine.
The investigation will start from when the player selects a level from the main menu.
I'm starting to look at the programming side of the CryEngine 3 Free SDK. I've been quite surprised to see that the official documentation gives little to no information on the overarching concepts and systems that are present in the engine. I then decided to go down the brutal path to understanding the system: putting breakpoints (almost) everywhere and seeing what happens, and in which order.
As usual when I do that kind of self-teaching, I find that writing up the results of my investigation helps me make sure I know what I'm talking about. And it might be useful to other people as well.
We've seen how to make and export static meshes, now it's time to look at meshes you can play animations on. As in the previous article, I'll investigate how the pipeline expects the data to be set up, then take a look at how you export that data, and what happens in the process.
Note that I'm focussing on non-character objects (i.e. not humanoids, animals, etc.) because, well, characters aren't the only ones that need animations, and because it is slightly easier for me to generate data for them. I will cover characters though. Someday.
Why should I care?
Pretty much the same reasons as usual: to see how restrictive or flexible the pipeline is, what's done in the editor, and what's done in the modelling package.
I am now all set to animate my helicopter. I'll need a bunch of animations to make my tests, and I'll start with a simple one: a looping animation of the rotors turning. This will be the opportunity to talk about a few of Maya's animation-related features: referencing, playblasts, and the graph editor.
Preparing a new article for the UDK vs CryEngine 3 SDK series, I had to rig a non-organic object (read: not a character) in order to test the animated object pipeline with something simple. It also gave me the opportunity to make a skeleton from scratch and learn about skinning (or binding, in Maya language), which is the topic of this article.
For a number of reasons that don't need explaining, I chose to rig a vehicle, a helicopter to be precise. I took the Black Hawk from the CryEngine 3's sample assets. However, it wasn't rigged, so I had to do it myself, as I wanted to try and export this model into UDK. Good exercise.
Earlier this week, a co-worker let me know about a tool aimed at helping people write branching story lines, mainly for games. It's called Articy:draft, and it lets you define the flow of your story and its conversations (it even handles multiple-choice dialogue systems). You can also create annotated maps of the various locations. It's multi-user and integrates with version control systems. It looks really cool, but the benefits/price ratio will make it hard to convince anyone to buy licences.
Today, I'll take a look at the pipeline for getting your own meshes into the editor. I will focus on "normal" meshes (i.e. those that are not rigged).
I won't spend any time explaining the actual process, the documentation (UDK, CryEngine) does that well enough. I'll be pointing out the restrictions and peculiarities that rule the way the artists should work. I will then go over geometry export, collision export, and material export.
Note that as I'm a Maya user these days, I may talk more about Maya than about Max.
Why should I care?
I want to know how quick it is to get a new model into the game, and how quick it is to update it.
As I was testing things out for the article on whiteboxing, I realised that each editor has its own set of nice features that make your life easier when placing meshes and various entities in the world. I thought it deserved an article of its own.
Why should I care?
Well, because the easier it is to mess about with a level's objects, the faster I can work.
Today's post is focussed on the whiteboxing tools available in UnrealEd and Sandbox. And I've just realised I haven't written a glossary entry about whiteboxing, so I'll quickly explain what it is. Whiteboxing is the process of producing a very crude version of the level's layout. It focuses on the playable space, and its purpose is to evaluate things like scale, navigability and timings.
Why should I care?
As an evaluation tool, a whitebox level will be modified many times, and sometimes started over from scratch. This is why whitebox geometry must be cheap to produce and easy to modify, so people (especially artists) don't feel like their time is wasted. Ideally, the whiteboxing tools would be simple enough that you don't actually need modelling skills to create those basic shapes, allowing level designers to completely own that stage.