How Framestore harnessed the power of machine learning on Long Live The Prince
Machine learning (ML) continues to make waves across a range of sectors and industries, not least visual effects (VFX) and animation, both of which stand to gain from the time savings that automation and AI promise.
As schedules tighten and projects grow in complexity, studios and artists face fresh challenges. Overcoming them calls for new ways of working, supported by the latest technologies and cutting-edge tools.
It was this realisation that underpinned the development of Foundry’s machine learning tools, launched in Nuke 13.0. Included among these was CopyCat, a plug-in that allows artists to train neural networks to create custom effects for their own image-based tasks.
Designed to save artists huge amounts of time, CopyCat can be used to tackle complex or time-consuming effects such as creating a garbage matte. Should this need to be applied across a sequence, an artist can feed the plug-in with just a few example frames. CopyCat will then train a neural network to replicate the transformation from before to after, and the trained network can apply the effect to the rest of the sequence.
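The workflow above can be sketched in miniature. The stand-in below is not CopyCat’s actual implementation or the Nuke API: it fits a simple global gain-and-offset by least squares instead of training a neural network, but it follows the same few-shot paradigm, learning a transform from a handful of before/after example frames and then applying it to unseen frames. All names and data are illustrative.

```python
# Toy sketch of CopyCat's few-shot paradigm (illustrative only, not Nuke API):
# learn a transform from a few before/after example frames, apply it everywhere.
# Frames are flat lists of pixel values; the "model" is a global gain/offset
# fitted by least squares rather than a trained neural network.

def fit_transform(before_frames, after_frames):
    """Least-squares fit of: after_pixel ~= gain * before_pixel + offset."""
    xs = [p for frame in before_frames for p in frame]
    ys = [p for frame in after_frames for p in frame]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    var_x = sum((x - mean_x) ** 2 for x in xs)
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    gain = cov_xy / var_x
    return gain, mean_y - gain * mean_x

def apply_transform(frame, gain, offset):
    """Apply the learned transform to a frame the model never saw."""
    return [gain * p + offset for p in frame]

# Four example pairs stand in for the frames an artist would paint by hand.
before = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6], [0.7, 0.8]]
after = [[2.0 * p + 0.1 for p in frame] for frame in before]  # "painted" result

gain, offset = fit_transform(before, after)   # recovers gain ~2.0, offset ~0.1
rest_of_sequence = [[0.25, 0.9], [0.15, 0.55]]
treated = [apply_transform(f, gain, offset) for f in rest_of_sequence]
```

In production the learned transform is a deep network and the frames are full-resolution plates; the point here is only the shape of the workflow, in which a few supervised examples generalise to the whole sequence.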
Since its launch, CopyCat has enjoyed rigorous testing by studios large and small. One such studio was Framestore, which recently used the tool on Long Live the Prince, a ground-breaking anti-knife campaign in which football prodigy Kiyan Prince is digitally brought back to life after he was tragically killed in a stabbing attack at the age of 15.
Under the care of Johannes Saam, Creative Technologist at Framestore, and Karl Wooley, Project Lead, Kiyan’s likeness was sensitively and delicately recreated digitally as part of the campaign’s short film.
We caught up with Johannes to find out a little more about the creative process behind the project, and how machine learning technology in the form of Nuke’s CopyCat node was deftly used to benefit the production.
Machine learning made accessible
Released commercially in March 2021 as part of Nuke 13.0, CopyCat has spent the past few months settling into pipelines as artists and studios explore its possibilities and potential. Asked about Framestore’s experience with CopyCat so far, Johannes reflects on how easy the tool has been to pick up and experiment with.
“CopyCat is a very well-documented node,” he tells us. “It was super easy to understand it and get going, especially as it was so intuitive. It’s a concept that we were discussing a lot internally before, and it helped that I’d spent a lot of time researching it beforehand and that there are super useful YouTube guides available.”
Yet CopyCat is not Framestore’s first foray into the ML space, as Johannes points out. “We have deployed many R&D efforts in machine learning, and I’m generating many images with generative art using machine learning. Generative Adversarial Networks (GANs) are constantly on our minds as a way to create new and exciting images. Having many labelled images ready to be used at a VFX house makes it very interesting to research those topics.”
Given Framestore’s ML efforts so far, how does CopyCat compare?
“CopyCat is a simple way to get started,” Johannes explains. “It allows for creative ideas to be driven by artists rather than engineers. This is where its true power comes from. Demystifying the dark arts of machine learning is enough to get artists to think about its concepts in a creative way.”
Putting CopyCat to work
At just under two minutes long, Long Live the Prince serves as an ode to late football prodigy Kiyan Prince, in which he signs for his former football club Queens Park Rangers, going on to wear the squad number ‘30’ to reflect the age he would have been today.
“Framestore took the entire project on a pro bono basis,” Johannes tells us. Speaking of how CopyCat was used specifically on Long Live The Prince, he continues: “We had to remove the label of an actor's football jersey and had no compositor available to do the job, so I painted four frames in Photoshop myself and fed them into the [CopyCat] node. After only two hours of training on my laptop, the logo was removed. This shows how simple it is to do tasks which would otherwise need an artist for a considerable amount of time.”
Ultimately, the use of machine learning on the project proved a boon, saving time that artists could instead spend being creative. “CopyCat enabled us to maximise artist time on areas that required the most attention from us,” Johannes explains.
CopyCat was used on two shots in total as part of the project. In these, the team cropped and tracked Kiyan’s shirt rather than using checkpoints from one shot as a basis for the next.
“I generated a random dataset from both shots,” Johannes tells us. “After the training was completed, we used the same model for both shots with success.”
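The approach Johannes describes, pooling example pairs from both shots into one randomised training set so a single model serves both, can be sketched like this. The file names and frame numbers are hypothetical, and this is not the Nuke API; it only illustrates the dataset-mixing step.

```python
import random

# Hypothetical file names: in practice these would be before/after plates
# cropped and tracked to the shirt, as described in the article.
shot_a_pairs = [(f"shotA/before.{i:04d}.exr", f"shotA/after.{i:04d}.exr")
                for i in (10, 42, 77)]
shot_b_pairs = [(f"shotB/before.{i:04d}.exr", f"shotB/after.{i:04d}.exr")
                for i in (5, 63)]

# Pool examples from both shots into a single randomised training set,
# so one trained model can then be applied to every frame of both shots.
dataset = shot_a_pairs + shot_b_pairs
random.seed(0)            # fixed seed keeps this sketch reproducible
random.shuffle(dataset)   # mix frames from both shots before training
```

The design point is that shuffling removes any per-shot ordering, so the single model sees both shots’ conditions during training rather than overfitting to one.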
Charting a course for CopyCat
Prior to Long Live The Prince, CopyCat had already enjoyed extensive testing in Framestore’s pipeline, most notably by training on pairs of a playblast/preview render from Maya and the corresponding final render.
“This gave us a model that could translate a preview render to a final quality render,” Johannes explains. “The results are not perfect, but good enough to turn many heads towards this workflow. Neural rendering, in general, is going to change the production workflow a lot over the next few years, in my opinion. And this is just a first little glimpse of its power in enhancing the rendering pipeline.”
Speaking of the future, we were keen to find out what Johannes and the Framestore team would like to see from CopyCat in the foreseeable future, and, more widely, from machine learning in Nuke.
“Specific networks for extraction, inpainting, and keying are only the start,” he tells us. “GANs should generate images for textures; Nuke’s power in image manipulation would be combined with the latest research and create a true playground for procedural and scalable image manipulation/creation.”
When it comes to content creation, the future is bright for machine learning. Its impact on workflows and scalability is undeniable—something Johannes is quick to point out.
“Machine learning will first be used to ‘bake’ otherwise complex or computationally intensive processes into one dataset,” he comments. “This is already used a lot in the 3D pipeline. Rigging, rendering, and FX are already taking vast datasets generated by slow processes and generating bespoke nets that perform the specific task faster and more scalably.”
“This is precisely what CopyCat brings to Nuke: identifying labor-intensive or computationally intensive steps in a production pipeline and automating them based on datasets rather than human labor. This frees up artists to focus on the actual creative work. It’s only the beginning, and once technical directors are also fluent in this new technology, we will see very innovative applications of the ideas behind neural rendering and the collapsing of several pipeline steps into ‘black boxes’.”
Discover what CopyCat can do for your pipeline