"The results were never very good, so I was extra keen to try some deep learning magic on this particular example to see if it could do better." "Back in 2011, before deep learning was such a big thing, I spent months and months working on the state of the art traditional algorithms for deblur," Ben tells us in his presentation. Using a scene from his own film, Ben wanted to see what could be done with the ML model and attempted to deblur a few out-of-focus shots using only eleven small crops. It wasn’t until Ben started to experiment with the template that things started to fall into place and the team found a better way of creating the ML toolset. This in-house tool would make it easy to train new neural networks and allow customers to easily experiment with machine learning inside of Nuke. The initial training template the team created was based on these conventional ML tools, mentioned above. But, it wasn’t until Ben’s unexpected moment of machine learning enlightenment that they hit the jackpot. With all this in mind, the Research team continued to work on creating ML tools that were the right fit for artists. Plus, there is the added chance that they won’t produce the desired effect at all, which can be a massive time-waster. While there is nothing explicitly wrong with this, they're not the perfect outputs needed by VFX artists. This often means hundreds or thousands of images are needed to train a neural network.Ī problem that arises with these tools is that they can produce what Ben describes as 'decent quality effects'. For these to work effectively, they need a large dataset of images. As it is impossible to know what footage an artist might want to use, these networks tend to be pre-trained and more generic. 
That's why Nuke 13.0 saw the first implementation of machine learning-a goal that Foundry's A.I Research team was created for.īen and the Research team looked into producing a fairly conventional image-to-image-based machine learning tool, the norm for most digital software. Part of that involves staying up-to-date and providing artists with the latest tech. Nuke is often seen as ubiquitous in the VFX industry. Scroll for a full breakdown of the presentation, and a step-by-step guide to Cop圜at-from its conception to how you can use it on your next project. While we’ve briefly discussed Cop圜at before, Ben’s presentation gave a whole new view into Nuke’s new ML toolset, as well as an in-depth look at machine learning in the VFX industry. Last week saw NVIDIA’s virtual GTC conference for 2021.Īmongst the speakers was Foundry’s own Ben Kent, Research Engineering Manager & AI Team Lead, who was joined by Thiago Porto, VFX Supervisor/Senior Comp at MPC New York, as they dived into machine learning and Cop圜at, new to Nuke 13.