For day two of roaming the halls at NAB, I stopped by the Adobe booth and got a demo of the Content-Aware Fill for video that everyone has been talking about. It's nothing short of astonishing. I also sat down with my favorite video editing company, FXhome, to talk about three major announcements here in Las Vegas, and looked at version 2.0 of Luma Touch's LumaFusion. Can multitrack 4K video editing really happen on the iPad? Apparently so. Plus, I talked with Puget Systems about why it's so important to design a custom video editing rig from the ground up rather than just buy one off the shelf.
The Adobe demo of Content-Aware Fill for Video was very impressive. To set the scene: the demo clip showed a bunch of kids playing soccer in a mud pit after the rain. The ball was bright red, and some of the kids wore red shirts, so you can imagine how challenging it would be to remove the ball from the video image. But after masking around the soccer ball, with the press of a button the ball disappeared, as if wrapped in Harry Potter's cloak of invisibility.
Content-Aware Fill then "time travels" through every frame of the shot, looking for pixels that can be duplicated to fill the hole in the image. Adobe Sensei, the AI that drives the feature, replaces each pixel to make it look as if the soccer ball was never there, and it does so for every frame along the track of the moving ball, even as it passes the red-shirted players. Should the AI fail to find a similar pixel to copy, it simply synthesizes one from the details of the surrounding pixels.
It’s not perfect, mind you; with a critical eye you may spot some minute artifacting or skewed details, but nothing that can’t be cleaned up in post. The bottom line is that Content-Aware Fill for video is a serious tool that will save many shots otherwise lost to a boom mic dipping into frame or an unwanted passerby wandering through. In fact, in my chat with Victoria Nece of Adobe, she mentioned that many boom mic operators are excited because they no longer have to worry about hovering the mic just out of frame. They can get the mic closer to the talent to pick up even cleaner audio during a shot, knowing the editor can simply use the tool to remove the mic later.
We also talked about how Adobe’s latest version of Character Animator is going to be a valuable tool for creating pilots for animated series, potentially saving hundreds of thousands of dollars. It also means that lower-budget animated projects can create impressive proofs of concept on a shoestring.
Adobe also showcased several gamers who use Character Animator to create their own characters that can be overlaid on gameplay footage for Let's Play videos, and a new tool gives the audience a chance to manipulate those characters in real time. It works with both 2D and 3D characters. Fun.
Adobe Audition gets new frame-accurate dubbing and ducking features, letting editors polish every word of a dub or voiceover. Freeform View in Adobe Premiere Pro is another handy addition, enabling editors to organize their projects visually, in whatever arrangement suits their preferences.
You can listen to my entire interview with Victoria Nece of Adobe here –
On to Central Hall, where I visited Luma Touch and chatted with Terri Morgan about the latest features in version 2.0 of their LumaFusion video editor for the iPad. Terri told me that in version 2, LumaFusion users now have the ability to edit six tracks of 4K video and six tracks of audio, complete with graphics, titles, music, and more. LumaFusion works by accessing each clip separately, whether it's stored in the cloud (through Dropbox or another service) or on a GnarBox or other wireless storage device.
All users need is an iPad running iOS 11 or higher. And if you have a model that's a few years old, LumaFusion still works; you just end up editing downscaled 1080p video files instead. Audio editors can even use LumaFusion for audio-only projects, cutting up to 24 independent tracks. It's a very powerful app. My interview with Terri can be found here –
Just next door, I chatted with Kevin Buongurio of FiLMiC Pro, telling him I simply had to have FiLMiC Audio as soon as possible. Kevin mentioned that FiLMiC Audio is in limited beta and will be available this summer. We then talked about how FiLMiC has also worked with Freefly and its software development kit to map the buttons on the Freefly Movi Cinema Robot, that one-handed, game-controller-like gimbal, giving tactile control over all of FiLMiC Pro's features without ever having to touch the screen. Each button is assigned a function, from adjusting white balance to locking focus to changing aperture and shutter speed, and with enough practice, muscle memory makes setting up a shot faster than ever before. Clearly, the FiLMiC team is burning the midnight oil and coming up with some fabulous tools for filmmakers as a result.
From there, I sat down with the team at FXhome, who had three exciting announcements for HitFilm users. HitFilm is a great low-cost video editor that gives even the most entry-level editor professional-grade tools on a budget. This week, to celebrate reaching a user community of over 4 million filmmakers, FXhome announced that HitFilm Pro is getting the Foundry Camera Tracker for 2D and 3D motion graphics. HitFilm is also gaining the ability to use After Effects plugins, which will make its effects tools more powerful than ever. Lastly, FXhome is partnering with Sony to bring the effects tools found in HitFilm, along with Emerge Imaging, to Sony Vegas. A very exciting time, to be sure!
Listen to my chat with FXhome’s Kirstie Tostevin and HitFilm guru Javer Vallvarr:
Lastly, why is it so important to custom build an editing workstation when you can pick up a powerful off-the-shelf computer to cut your video? Well, Jon Bach, the CEO of Puget Systems, says it's all about squeezing every bit of performance out of your system by crafting a set of parts built with your workflow in mind. Puget assigns a guide who asks about your budget, your workflow, and where you plan to go in the future, then works with you to create a system that does everything you need and more. Puget then builds the machine, benchmarks it thoroughly against the apps you'll be using, and ships your system to you within a week. Then, when you're ready to upgrade, you send the system back, they do all the work, test it out, and get it back to you. It's a one-stop shop, and they use the best CPUs and GPUs on the market. Check out PugetSystems.com.
Day three promises some excitement as we head over to the North Hall for some eSports, look at how the internet is coming into your car, and we’ll get up close and personal with SmallHD/Teradek’s new wireless solutions for controlling your camera and lenses. Plus, I’ll sit down with Digital Anarchy to see what’s new in Transcriptive 2.
Time to hit the floor!