Lip Sync & Facial Deformers in Houdini!
Lately, I’ve been obsessed with Lip Sync in Houdini! You may have seen some of my recent tweets, and they’re all talking animals and insects. Honestly, I never knew Houdini had the power to do Lip Sync within the application itself! This totally blew my mind: I was looking for a solution to do text-to-speech animation for my characters, but instead found that Houdini can do procedural Lip Synchronization!
Houdini can do almost anything!!
Lip Sync LIVE during Stream!
I also tried running the Lip Sync workflow LIVE during a live stream. I had a bug character lip syncing in the corner of the screen while I hosted the Live Stream. The whole thing was a test to see whether it would be possible to do this live, with Houdini running in the background during the stream, and I was very excited to see it work out smoothly!
Walking Talking Teddy! #Houdini #sidefx #houdini #Animations #3dart #indiegamedev #indiedev #goprocedural #gameart #gamedevelopment #gamedev #indiegame #gametools #animation3d #cg #CGI #VFX #animation #3dartist pic.twitter.com/4ag8W6fHtj
— Blue Bursting Bubble (@BubblePins) June 19, 2021
This opens up a whole new set of possibilities for streamers, live performances, and maybe even webinars. What’s better than having a Teddy Bear teach physics?
How does Lip Sync work in Houdini?
Lip Sync in #Houdini using CHOPS! 8 Blendshapes, 15 phonemes, 2 filters #sidefx #houdini #Animations #3dart #indiegamedev #indiedev #goprocedural #gameart #gamedevelopment #gamedev #indiegame #gametools #animation3d #cg #CGI #VFX #animation #3dartist pic.twitter.com/gdtQoVRGq9
— Blue Bursting Bubble (@BubblePins) June 7, 2021
Houdini has a set of voice nodes in CHOPs that let us capture our voice from an audio recording, or even from a live feed through the microphone. The audio then goes through a phoneme library, which you create based on your own voice for better accuracy, and Houdini recognizes the phoneme values in the speech that you’re trying to animate onto your character.
Each red bar that rises upward represents a phoneme value being detected in the audio recording, which triggers a blendshape on the pig head. Over the course of the recording, all the phoneme sounds get detected and blendshapes fire off here and there, resulting in Lip Sync animation!
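For anyone curious what this looks like in code, here’s a minimal Python sketch (run in Houdini’s Python Shell) of the kind of CHOP chain described above: a File CHOP loads the dialogue audio into a Voice Sync style CHOP that outputs one channel per phoneme. The internal node type name "voicesync" and the audio file path are my assumptions, so check the Tab menu in your Houdini build for the exact voice node names.

```python
# A minimal sketch of the CHOP lip sync chain, assuming the internal node
# type names below; verify them against your Houdini build's Tab menu.
import hou

# Create a CHOP network to hold the lip sync chain.
obj = hou.node("/obj")
chopnet = obj.createNode("chopnet", "lipsync")

# File CHOP: load the recorded dialogue as an audio channel.
audio = chopnet.createNode("file", "dialogue")
audio.parm("file").set("$HIP/audio/dialogue.wav")  # placeholder path

# Voice Sync style CHOP: compare the audio against a phoneme library
# recorded in your own voice and output one channel per detected phoneme.
sync = chopnet.createNode("voicesync", "phonemes")  # assumed type name
sync.setInput(0, audio)

# Each output channel (one per phoneme) can then drive a blendshape
# weight on the character.
sync.setDisplayFlag(True)
```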
Talking Ladybug LOL .... what else should I generate lip sync with... #Houdini #sidefx #houdini #Animations #3dart #indiegamedev #indiedev #goprocedural #gameart #gamedevelopment #gamedev #indiegame #gametools #animation3d #cg #CGI #VFX #animation #3dartist pic.twitter.com/M98cMtM2jO
— Blue Bursting Bubble (@BubblePins) June 8, 2021
I’m not an expert at CHOPs and rarely use them, but all the Lip Sync is done in CHOPs inside Houdini, which was one of my major struggles while trying to get this all working.
I keep using the Pig Head from the list of test geometries available inside Houdini to test my proof-of-concept project. The Pig Head is perfect because it’s the only geometry available inside Houdini with an open mouth.
How do you get the Blendshapes in the first place?
I’m working on an HDA that will automate the process of creating the lip blendshapes, and it’ll all tie in with the Lip Sync workflow in Houdini. Hopefully this will solve the issue of generating Lip Sync blendshapes for the mouth, because with the help of the HDA everything can be done within Houdini, without using external applications.
Having said all that, I did start off the Lip Sync workflow using third-party applications to generate the lip blendshapes in the first place. This was to help me get the Lip Sync workflow working in Houdini before proceeding to create my Facial Deformer HDA.
You either sculpt them yourself or get a third-party program to generate blendshapes for a character. I used a Blender add-on called “Faceit” to map out the landmarks of my characters’ faces, and it spits out blendshapes. There are tons of programs out there that do this for you. Reallusion’s Character Creator 3, for example, generates characters that come with facial blendshapes, which can also be plugged into the Houdini Lip Sync workflow.
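Once you have the shapes, wiring them up in Houdini is straightforward. Here’s a minimal Python sketch, assuming the shapes were exported as OBJ files (all file names below are placeholders), that feeds a rest mesh and a few lip shapes into Houdini’s Blend Shapes SOP:

```python
# A minimal sketch of wiring imported lip shapes into the Blend Shapes SOP.
# The OBJ names are placeholders for whatever your blendshape generator
# (Faceit, Character Creator, etc.) exports.
import hou

geo = hou.node("/obj").createNode("geo", "talking_head")

# Base (rest) mesh goes into input 0 of the Blend Shapes SOP.
base = geo.createNode("file", "base_mesh")
base.parm("file").set("$HIP/geo/head_rest.obj")

# Blend Shapes SOP; newer builds may use a versioned type name.
blend = geo.createNode("blendshapes", "lip_shapes")
blend.setInput(0, base)

# Each additional input becomes a shape driven by its own weight slider.
for i, shape in enumerate(["AA", "OO", "EE", "FV", "MBP"]):
    f = geo.createNode("file", "shape_" + shape)
    f.parm("file").set("$HIP/geo/head_%s.obj" % shape)
    blend.setInput(i + 1, f)

blend.setDisplayFlag(True)
```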
My Facial Deformer HDA
I’m proud to announce that I just finished my HDA that deforms a character’s face into the phoneme lip shapes for Lip Sync! It’s still a work in progress, but it’s functional! I currently have 10 phoneme lip shapes procedurally mapped into the HDA, and I believe I can improve the quality of the Lip Sync by adding more phoneme lip shapes to the procedural asset.
The process is a lot simpler using my HDA: all you have to do is click points on the lips of the mesh to tell the HDA the geometry of the mouth, then adjust the scale and deformation thresholds to get the best result, and it’ll spit out the lip blendshapes! I also went even further and linked the blendshapes to the CHOPs channel values for each phoneme. You can have Lip Sync working in minutes!
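To give an idea of the linking step the HDA automates, here’s a rough sketch that drives blendshape weights from CHOP phoneme channels using the chop() expression function. The node paths and the "blend#" parameter naming are assumptions for illustration, not the HDA’s actual internals:

```python
# A sketch of hooking blendshape weights up to CHOP phoneme channels.
# Paths and the "blend#" parameter names are assumed; adjust to match
# your own network.
import hou

blend = hou.node("/obj/talking_head/lip_shapes")
phonemes = ["AA", "OO", "EE", "FV", "MBP"]

for i, name in enumerate(phonemes):
    # chop() samples the named CHOP channel at the current frame, so each
    # detected phoneme raises its matching blendshape weight.
    expr = 'chop("/obj/lipsync/phonemes/%s")' % name
    blend.parm("blend%d" % (i + 1)).setExpression(expr)
```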
I have an overview demo of a beta version of my Facial Deformer HDA in a Live Stream that I hosted on my channel. It’s the video link on the right.
The Facial Deformer HDA will be released soon for Members-Only!
Facial Character Animation
This is a topic I’m hugely interested in! Previously, I looked into using a webcam to do facial motion capture, and I even started learning OpenCV to build my own facial mocap using just a webcam.
Why not just get an iPhone and use the TrueDepth camera, like everyone else?
I am sticking with the webcam because I feel there’s huge potential here. Imagine working on a film or video done entirely remotely. You may not have the budget to rent a studio, or may not have the space at home for a home studio, and having tasks done remotely may be your only way to get the work done. With a webcam doing the facial motion capture, you could potentially hire remote actors and actresses, have them act out the scene, and have them send you the footage electronically. Then you could feed it into an application, like something based on OpenCV (which is what I’m banking on), and spit out motion capture data that you could plug into Houdini, Blender, or any other 3D animation software.
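As a taste of what that could look like, here’s a minimal sketch using OpenCV’s contrib face module (the opencv-contrib-python package) to track facial landmarks from a webcam. The LBF landmark model is a separately downloaded trained file; "lbfmodel.yaml" is a placeholder path:

```python
# A minimal webcam landmark-tracking sketch with OpenCV.
# Requires opencv-contrib-python and a trained LBF model file.
import cv2

# Haar cascade face detector bundled with OpenCV.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# LBF facemark model; "lbfmodel.yaml" is a placeholder for the trained
# model file you download separately.
facemark = cv2.face.createFacemarkLBF()
facemark.loadModel("lbfmodel.yaml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, 1.3, 5)
    if len(faces) > 0:
        # fit() returns 68 landmark points per detected face; these are
        # the raw values you would remap onto blendshape weights later.
        ok, landmarks = facemark.fit(gray, faces)
        for pts in landmarks:
            for (x, y) in pts[0]:
                cv2.circle(frame, (int(x), int(y)), 2, (0, 255, 0), -1)
    cv2.imshow("webcam mocap test", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

Turning those raw 2D landmarks into usable blendshape weights is the hard part, but even this little test shows how far a plain webcam can get you.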
Open Source Projects
I’m not the first to do this. There are tons of people on GitHub who have already been developing software that uses just a webcam to do motion tracking. I tried some of these open source projects and ran a simple test for myself on a very cheap 30fps webcam that cost me around $30, 10 years ago! It’s a very old webcam… The results were still very good! Have a look for yourself in the video on the right.