Animating Characters in Real Time With Your Face: A Guide to Adobe's Character Animator
Adobe unveiled a video previewing its Character Animator, and the software looks unreal! Poised to join the Adobe CC suite, the application is sure to be a game-changer for 2D animators, cutting production time and making room for playtime in front of the facial recognition software that animates characters in real time.
Character Animator is a motion capture and animation tool that allows users to bring digital characters to life in real time. It uses your expressions and movements, captured through a webcam, to animate characters with a process called performance capture. Perfect for animators, streamers, and content creators, it simplifies the animation process, making it more interactive and fun. This Adobe tool integrates seamlessly with other Adobe Creative Cloud apps, enhancing creative workflows.
Using advanced face tracking, the application syncs the movements of your face with those of any character created in Photoshop or Illustrator. When you speak, your character speaks. When you blink, so does your character. By naming image layers according to Character Animator's conventions, you tell the software which parts of your artwork correspond to which facial features, and it animates them in step with your movements. Simple movements can also be controlled with your mouse.
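To give a feel for how the naming works, here's a rough sketch of how a puppet's Photoshop layers might be organized. This is an illustrative outline, not a complete rig; check Adobe's Character Animator documentation for the full, current set of recognized layer names:

```
+Character            (the "+" marks a group that moves independently)
  +Head
    Left Blink        (shown when your left eye closes)
    Right Blink
    Left Eyebrow
    Right Eyebrow
    Mouth
      Neutral         (default mouth shape)
      Ah              (mouth shapes swapped in as you speak)
      Oh
      M
  Body
```

Because the software matches layers by name, following the convention is what lets your webcam performance drive the right piece of artwork automatically.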
Don't believe us? Have a look:
Though it's still in its preview phase, we're uber impressed and can't wait to play with the final product.
You can also check out the After Effects blog to learn more about the feature set planned for Character Animator.
Curious about how a digital designer can help you with your graphics and animations? Don't hesitate to reach out to Blue Ocean Interactive Marketing. Our team can't wait to work on your vision!