Alrighty then... I've reached the end of a milestone. It took a bit longer than I expected, but I got there. I set off to determine what I need in order to actually make an animated short (which turned into a series). I knew that I wouldn't be modelling my characters from scratch, so I wanted to use an open source character generator of some sort. The available options were MakeHuman and MB-Lab. After comparing both, I decided to generate most of my characters with MB-Lab. I'm not saying I'll never use MakeHuman; I will, especially for younger characters, since MB-Lab doesn't generate characters younger than 18. However, I decided to concentrate my effort on seeing how to modify MB-Lab to fit my needs. I ended up creating a few additions to MB-Lab that I needed:

  1. Face Rig to make it easier to animate facial expressions
  2. Phoneme Rig to make it easier to lip-sync

To do both of the above, I needed to do some coding, which took quite a bit of time. You can look through the history of this blog to see what I did. Suffice it to say, it required me to learn how to create Blender add-ons, how to create drivers for shape keys in Blender, and a bunch of extra mumbo jumbo to make the face and phoneme rigs work.
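To give a rough idea of what those drivers involve, here's a minimal sketch (not the actual add-on code) of hooking a shape key's value up to a control bone with Blender's Python API. The object, shape key, armature, and bone names are made up for illustration:

```python
# Sketch only: attach a driver to a shape key so a face-rig bone controls it.
# "MBLabCharacter", "smile", "FaceRig" and "smile_ctrl" are hypothetical names.
import bpy

obj = bpy.data.objects["MBLabCharacter"]               # mesh with shape keys
key_block = obj.data.shape_keys.key_blocks["smile"]    # shape key to drive

# Add a driver on the shape key's value.
fcurve = key_block.driver_add("value")
driver = fcurve.driver
driver.type = 'AVERAGE'

# Feed the driver from a control bone's local X location.
var = driver.variables.new()
var.name = "ctrl"
var.type = 'TRANSFORMS'
target = var.targets[0]
target.id = bpy.data.objects["FaceRig"]                # armature object
target.bone_target = "smile_ctrl"                      # control bone
target.transform_type = 'LOC_X'
target.transform_space = 'LOCAL_SPACE'
```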

I also wanted to automate the lip-syncing as much as possible. I looked at existing add-ons for that, but decided to create my own: Yet Another Speech Parser (YASP). It has two components. The first is a C program that uses the pocketsphinx library (I looked at a couple of speech parsing libraries out there and settled on pocketsphinx) to parse audio clips and generate a phoneme description in JSON format: basically a JSON file describing the phonemes and their timing. The second is a Python Blender add-on which takes the audio file and a text transcript of it as input and creates the animation. This actually works reasonably well. It creates a first-pass animation, which I then fine-tune. One of the weaknesses of MB-Lab is its lack of finer control over facial expressions. I think MakeHuman has a better face rig, but I digress. It is what it is.
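For a sense of how the add-on side can turn that JSON into a first-pass animation, here's an illustrative sketch. The JSON layout, field names, file path, and shape key naming below are assumptions for the example, not YASP's actual schema:

```python
# Sketch only: read an assumed phoneme/timing JSON and keyframe matching
# shape keys on a Blender mesh. Names and the JSON structure are hypothetical.
import json
import bpy

fps = bpy.context.scene.render.fps
obj = bpy.data.objects["MBLabCharacter"]               # hypothetical mesh
key_blocks = obj.data.shape_keys.key_blocks

with open("/tmp/clip_phonemes.json") as f:             # hypothetical path
    phonemes = json.load(f)["phonemes"]                # assumed layout

for ph in phonemes:
    name = "phoneme_" + ph["symbol"]                   # assumed shape key naming
    if name not in key_blocks:
        continue
    kb = key_blocks[name]
    start = int(ph["start"] * fps)                     # assumed times in seconds
    end = int(ph["end"] * fps)
    # Rough first-pass curve: closed -> open -> closed around the phoneme.
    kb.value = 0.0
    kb.keyframe_insert("value", frame=start - 1)
    kb.value = 1.0
    kb.keyframe_insert("value", frame=(start + end) // 2)
    kb.value = 0.0
    kb.keyframe_insert("value", frame=end + 1)
```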

Along the way I got sidetracked trying to create an automated facial expression tool, which uses an open source library called OpenFace to analyze footage and map the facial expressions from real footage onto the animated character. I did try to use it, but I found that I get better facial expressions by animating by hand. It's actually more fun too.

Once I started doing some simple animation, I quickly realized that the MB-Lab rig isn't very nice to animate with. Thankfully, there is an add-on which generates a Rigify rig from the MB-Lab rig. Of course I had to tweak it and change it to work the way I wanted it to, which took a while. But I really like Rigify. It's a nice rig.
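This isn't the MB-Lab-to-Rigify add-on itself, but for context: once a Rigify metarig exists, the final rig is generated with the stock Rigify operator, roughly like this (the metarig object name here is hypothetical):

```python
# Sketch only: generate a Rigify rig from an existing metarig.
import bpy

metarig = bpy.data.objects["MBLab_metarig"]       # hypothetical metarig name
bpy.context.view_layer.objects.active = metarig
bpy.ops.pose.rigify_generate()                    # stock Rigify generate operator
```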

It's worth mentioning that I adopted Blender 2.8 while it was still in the alpha stage. So I got involved in porting some of the add-ons to Blender 2.80, as well as opening tickets and submitting a couple of Python patches. If you fish for my name you'll find me buried somewhere in the commit log.

Once the tools I created were mature enough, I decided to apply them and my skills to create an animated scene. I flip-flopped a bit on this. I started off thinking I'd do an independent short, then changed my mind and decided to recreate an existing movie scene. After a bit of thinking I settled on the latter, but which scene should I do? I first thought of animating a Captain America: Civil War fight scene, and I actually started: I created a run cycle and connected it to a car jump, but then I stopped going down that path. I just wasn't into Marvel movies enough to justify spending a lot of time animating one of their fight scenes. I finally decided on a Star Trek: First Contact scene. I'm a Trekkie at heart and I'm not ashamed to say it. I love TNG (that's Star Trek: The Next Generation, for you non-Trekkies out there). Anyway, I settled on that and got down and dirty with setting up the scene, doing the character animation, and so on. Below is a video walkthrough of the final scene I ended up with.

The TNG scene I animated was a training ground to see how I could use the tools at hand to create my series. As far as I can tell, these tools are sufficient to create game-like animated scenes, sorta like this. The quality isn't gonna be extremely cinematic, but I really just want to get on with creating my story and doing what I love, which is storytelling and film-making.

The next step for me is to start creating the animatics for my script. I'm still toying with the idea of pushing the TNG scene a bit further and creating some proper locations, polishing the lighting and atmosphere. The jury is still out on that.