CopyCat in Nuke - How Rotoscoping Works with AI
It’s no secret that machine learning (ML) has been on the rise in the visual effects (VFX) industry for the past few years. With its increased popularity comes an opportunity for a more streamlined and efficient way of working.
This is especially important in the changing climate, with schedules getting tighter and projects becoming more complex. Artists are met with fresh challenges and need new ways of working and tools that can keep pace.
ML is a saving grace for many and is becoming a crucial part of VFX pipelines everywhere.
With this in mind, Foundry, as part of the recent Nuke 13.0 release, has integrated a new suite of machine learning tools including CopyCat—a plug-in that allows artists to train neural networks to create custom effects for their own image-based tasks.
CopyCat is designed to save artists huge amounts of time. If an artist has a complex or time-consuming effect, such as a garbage matte that needs to be applied across a sequence, they can feed the plug-in just a few example frames. CopyCat then trains a neural network to replicate the transformation from before to after, and that network can be used to apply the effect to the rest of the sequence.
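To make the idea concrete, here is a toy sketch of the same workflow, assuming nothing about Foundry's actual implementation: learn a per-pixel colour-to-matte mapping from a handful of "before/after" example frames, then apply it to an unseen frame. CopyCat generalises this idea with a full neural network; the tiny logistic-regression model, the synthetic frames, and all names below are illustrative stand-ins only.

```python
# Toy sketch (NOT Foundry's implementation): train on a few example
# frame/matte pairs, then apply the learned effect to unseen frames.
import numpy as np

rng = np.random.default_rng(0)

def make_frame(h=32, w=32):
    """Synthetic stand-in: an RGB frame plus a matte derived from colour."""
    frame = rng.random((h, w, 3))
    matte = (frame[..., 1] > 0.5).astype(float)  # "keep" where green dominates
    return frame, matte

train_pairs = [make_frame() for _ in range(4)]   # a few hand-made examples
test_frame, test_matte = make_frame()            # stands in for the rest of the sequence

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Tiny model: logistic regression on the 3 colour channels per pixel.
w = np.zeros(3)
b = 0.0
lr = 0.5

for epoch in range(300):
    for frame, matte in train_pairs:
        x = frame.reshape(-1, 3)
        y = matte.reshape(-1)
        pred = sigmoid(x @ w + b)
        grad = pred - y                          # gradient of cross-entropy loss
        w -= lr * (x.T @ grad) / len(y)
        b -= lr * grad.mean()

# "Inference": apply the learned transformation to an unseen frame.
pred_matte = sigmoid(test_frame.reshape(-1, 3) @ w + b) > 0.5
accuracy = (pred_matte == test_matte.reshape(-1).astype(bool)).mean()
print(f"matte accuracy on unseen frame: {accuracy:.2f}")
```

A real CopyCat network learns far richer, spatially-aware transformations than this per-pixel model, but the training loop has the same shape: a few paired examples in, a reusable effect out.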
We take a look at this exciting new toolset and chat with Ben Kent, Foundry's Research Engineering Manager and AI Research Team Lead, to find out more.
#nuke #visualeffects #vfxtamil #vfx #hollywoodmovies #animation #nuketutorial