Post 9

    For this week, I got a bit sidetracked with what I want to call the “side quests” of my animation demo reel. Instead of my third idea, a quick animation with a medium shot of a female character from the waist up, looking determined as her familiar (a cat) hops onto her shoulder from above, I decided to do a facial expression test. I’m using the “Moisha” rig because her many face clusters let me manipulate her skin in a more organic way. I wanted a theme for the facial expression test to establish a personality for Moisha, so I referenced the flirty poses, eye rolls, and hair flips of a classic “mean girl.” For now, I turned off the visibility on the hair and extra accessories so I wouldn’t get overwhelmed by the details. I think I’ll work in segments until near completion, one segment per expression. This is what I have so far for the first expression.
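
    As a workflow side note: I hid the hair and accessories by hand in the outliner, but if I ever wanted to script that, a minimal Maya Python sketch like the one below would do the same thing. This assumes the rig is open in Maya, and the group names are just placeholders for whatever the Moisha rig actually calls them:

    # Minimal sketch (Maya Python) of hiding distraction geometry while
    # blocking facial poses. The group names are hypothetical placeholders.
    import maya.cmds as cmds

    distraction_groups = ["hair_grp", "accessories_grp"]

    for grp in distraction_groups:
        if cmds.objExists(grp):
            # Hiding the whole group keeps the face readable while
            # the clusters are being posed.
            cmds.setAttr(grp + ".visibility", 0)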

    The second “side quest,” which I began but wanted in my demo reel anyway, is creating a MetaHuman from a face scan and making an animation with it. I wanted to keep it simple while still showing that I could do what a future employer might ask of me, so I decided on a basketball scene with various dribbles, passes, and shots. I asked Dylan Robertson-Figaniak if I could scan his face using the free trial of an app called Polycam and mocap his likeness. I sat him down in a library study room and set up a video for him to watch so his eyes wouldn’t track me as I circled his face taking 200+ photos. I found that any fewer than 200 caused inaccurate face deformations, while more than 200 was unnecessary and just led to long processing times.

    In the motion capture room, I suited him up and gave him mocap gloves so I could also get finger tracking, which removes the guesswork of how fingers move when dribbling a ball. We also had to improvise certain movements because we didn’t use an actual basketball, for fear of it bouncing too hard and damaging equipment. Instead, we used a cheap plastic ball with no grip, so some takes included him letting the ball bounce away mid-dribble and continuing as if it were still in his hand. I’ll probably have to counter-animate those actions, along with other issues such as the thumb constantly flipping backwards. So far, I’ve cleaned up all the data so it doesn’t have any gaps or swaps.
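
    The mocap software handles the actual gap filling, but the idea behind the simplest kind of fill is easy to sketch: when a marker channel drops out for a few frames, you linearly interpolate between the last good frame and the next one. Here is a rough Python sketch of that concept, with made-up numbers rather than my actual data:

    # Concept sketch of linear gap filling on one marker channel.
    # Missing frames (NaNs) get interpolated from the surrounding
    # good frames. Data below is invented for illustration.
    import numpy as np

    def fill_gaps(channel: np.ndarray) -> np.ndarray:
        """Linearly interpolate NaN gaps in a 1D marker channel."""
        frames = np.arange(len(channel))
        good = ~np.isnan(channel)
        filled = channel.copy()
        filled[~good] = np.interp(frames[~good], frames[good], channel[good])
        return filled

    # Example: a marker's X position with a three-frame dropout.
    x = np.array([0.0, 1.0, np.nan, np.nan, np.nan, 5.0, 6.0])
    print(fill_gaps(x))  # [0. 1. 2. 3. 4. 5. 6.]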

    I had to troubleshoot how to import the face scan into Unreal Engine, but once it was in there, it was smooth sailing. Unreal did a pretty good job of tracing out his eyes, mouth, smile lines, and nose. I then brought that into MetaHuman Creator to customize the features to look as similar to Dylan as possible. I still have to retarget the mocap, set up cameras, animate the ball, record facial tracking from my own face onto his, and so much more. For next week, I'd like to dip back into my "main quest" that I've been avoiding while doing little things for these two "side quests".
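
    Since I expect to re-import mocap takes over and over while retargeting, one thing I might script is the FBX import itself through Unreal's Python scripting plugin. A hedged sketch of what that could look like, where the file path and destination folder are hypothetical stand-ins for my real ones:

    # Sketch of batch-importing a cleaned mocap FBX into Unreal via the
    # editor's Python plugin. Path and folder names are hypothetical.
    import unreal

    task = unreal.AssetImportTask()
    task.filename = "C:/mocap/dylan_dribble_clean.fbx"   # hypothetical path
    task.destination_path = "/Game/Mocap/Basketball"     # hypothetical folder
    task.automated = True   # skip the interactive import dialog
    task.save = True        # save the new assets to disk

    # Run the import through the editor's asset tools.
    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])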


