By James DeRuvo (doddleNEWS)
Australia-based CoreMelt makes powerful and flexible editing plugins that simplify the post-production workflow. Roger Bolton, a visual effects guru with over 15 years of post-production experience spanning Lord of the Rings, Charlie and the Chocolate Factory, and Kingdom of Heaven, is the founder and CEO. For this round of 20 Questions, I sat down with him to talk about CoreMelt's latest plugins and what they bring to the post-production party.
Q: Who is CoreMelt and what do you do?
We make plugins! Specifically we focus on Final Cut Pro X, and our speciality is plugins that involve motion tracking and stabilisation in some way.
Q: Give us some backstory. What did you do before you started CoreMelt?
I’ve been an online editor and Flame artist, worked on feature films in Nuke and Shake, and performed live motion graphics (VJing) at events and festivals in Australia and Europe. I’ve been working with video in some way since 1997 – just over 20 years.
Q: How did you get started in this business?
I worked for the Australian reseller of Flame online software back in the late 90s, originally in tech support, then moved across to the artist side.
Q: Final Cut Pro X is pretty full-featured. Where does CoreMelt add value for editors?
FCP X has a great feature set, but there are always opportunities to improve the workflow. Using my experience as an online editor and feature film compositor, I am always trying to find ways to simplify complex tools and make them accessible to editors at all levels, no matter what experience they have with other software.
Q: What kind of solutions do you offer?
We offer a fairly comprehensive bundle, but I think our strongest tool is our stabiliser, Lock and Load, which is faster than the built-in stabiliser and often gives better results. Then we have a set of products for motion tracking, including screen replacements, tracking 3D text to the background, tracked masks for color grading, and a lot more.
Q: I understand you have a tight relationship with the team behind mocha; talk about that relationship and what it means for your products.
Boris FX and the guys at Imagineer Systems are great, and as you know, mocha is a very powerful planar tracker. About four years ago they developed an SDK to allow other developers to license the mocha tracker. We were the first plugin creator to add mocha tracking to a product, and since then we’ve expanded the range of uses for planar tracking and worked closely with them on improvements to the mocha SDK.
Q. Which products in your lineup take advantage of integrated mocha Planar Tracking?
At the moment we have our new grading plugin, Chromatic, which has tracked masks; then we have TrackX for screen replacements, DriveX for tracked 3D text and particles, and SliceX for simple tracked masks, blurs, etc. More coming soon!
Q. How has that integration benefited your customers?
One thing we’ve learnt is that editors hate having to round-trip to another software package; they want to do as much as possible on the timeline in their NLE. Using our plugins with mocha integration, they can do things like replace the screen on a handheld mobile phone in a few minutes, without having to go into After Effects or ask another artist to do the task.
Q. Which is your most popular plug-in and why?
Actually, our “Everything Bundle” with all our plugins is our best-selling product; among individual products, our Chromatic grading plugin is the most popular. FCP X 10.4 got some really strong new grading tools, but our plugin still has advantages: tracked masks, a better color keyer, and support for control surfaces like the Tangent Ripple.
Q. Which plug-ins do you think are the best kept secret?
DriveX lets you track 3D text or callouts onto video. It’s a really effective technique, and we have a lot of people using it in the real estate video industry. Still, it’s something we think more people should try in their projects.
Q: Who are some of your customers and why did they choose you?
We have everyone from editors cutting promos in Hollywood, to BBC editors, to one-man-shop shooter/editors. The diverse range of our tools, value for money, and solid support are the reasons our customers tell us they choose us.
Q: What’s the most interesting or complex project a customer has completed using your plug-ins? And how did your tools enable them to do things they might not have been able to do?
Our plugins were used by the editorial team on the feature films Whiskey Tango Foxtrot and Focus. They were able to quickly add rough comps of set extensions or tracked elements on the timeline, allowing the Director to more clearly see everything in context. In many cases these shots were then finished in Nuke, but some of them became finals in the delivery.
Q. You had some exciting announcements at NAB 2018. What was the industry reaction?
We were showing a test version of a new tracked paint tool, and the reaction was, “When can we have it?” We also showed a recent version of our auto-transcription tool, and we think people are very keen to get their hands on that as well.
Q. We heard about an exciting new Paint tool in beta. Can you tell us more about that?
Paint on a frame, hit the track button, and boom – the paint stroke follows the motion. Using the clone or heal tools, you can quickly fix blemishes on skin, remove or hide objects in the frame, or even perform tracked warps on sections of the image. It’s going to enable some common fixes in literally one or two minutes that would otherwise involve painting frames in Photoshop and then tracking them back in.
Q: How do you help a client match the right plug-ins with their applications and workflows?
We design our products to be task-based rather than tool-based. So if you want to track a new screen onto a phone, that’s TrackX, and so on. We have many video tutorials and sample videos to help customers understand what they need, and of course we will happily direct people to the right product via email or social media.
Q: Artificial Intelligence was a big buzzword at NAB 2018 – where do you see the opportunities for AI in post production?
One of the obvious areas is automatic generation of metadata. Our transcription tool, which performs speech-to-text, is an example, but AI-based computer vision will be able to generate tags about the content of shots as well as their settings. I think it will very soon be standard to have all your rushes analysed by cloud AI, so that when you start editing the shots have already been sorted and tagged for you in many different ways – in many ways replacing the assistant editor.
Q: What other trends are you watching closely?
I’m a big fan of using High Dynamic Range color at all stages of a project: capture, editing, and delivery. However, even if you are only delivering SDR, capturing and editing in HDR gives you a lot of additional creative freedom. It’s great to see affordable cameras like the Panasonic GH5S now on the market, and hopefully we’ll see a lot more in the next year.
Q: What do you think of VR? Where is it going?
I think 360 video has specific markets it’s going to be great for: tourism, real estate, immersive experiences. I don’t see narrative-based 360 video taking off. The real future of VR is games, in my opinion; the viewer really needs to be able to interact with VR and move where they want, and 360 video can’t achieve that.
Q: What does the future hold?
Cloud-based editing seems inevitable, but as someone who has travelled in a lot of countries with spotty broadband internet, I am somewhat skeptical of it. The other trend will be the use of deep learning techniques to make tools smarter, so hopefully editors can spend more time being creative instead of doing tedious work.
For more information about CoreMelt’s bundle of plugins, visit https://coremelt.com.