
A glimpse at what was new at NAB

By Lance Holte

I made the trek out to Las Vegas last week for the annual NAB show to take in the latest in post production technology, discuss new trends and products and get lost in a sea of exhibits. With over 1,700 exhibitors, it’s impossible to see everything (especially in the two days I was there), but here are a handful of notable things that caught my eye.

Blackmagic DaVinci Resolve Studio 14: While the “non-studio” version is still free, it’s hard to beat the $299 license for the full version of Resolve. As 4K and 3D media become increasingly prevalent, and with the release of Blackmagic’s new Micro and Mini control panels, Resolve can be a very affordable solution for editors, mobile colorists and DITs.

The new editorial and audio tools are particularly appealing to someone like me, who is often more hands-on with the editorial side than the grading side of post. In that regard, the new tracking features look to provide extra ease of use for quick and simple grades. I also love that Blackmagic has gotten rid of the dongles, which removes the hassle of tracking numerous dongles in a post environment where systems and rooms are swapped regularly. Oh, and there’s bin, clip and timeline locking for collaborative workflows, which easily pushes Resolve into the competition for an end-to-end post solution.

Adobe Premiere CC 2017 with After Effects and Audition: Adobe Premiere is typically my editorial application of choice, and the increased integration of AE and Audition promises to make an end-to-end Creative Cloud workflow even smoother. I’ve been hoping for a revamp of Premiere’s title tool for a while, and the Essential Graphics panel/new Title Tool appears to greatly expand and streamline Premiere’s motion graphics capabilities — especially for someone like me, who does almost all of my graphics work in After Effects and Photoshop. The more integrated the various applications can be, the better, and Adobe has been pushing that aspect for some time now.

On the audio side, Premiere’s Essential Sound panel offers tools for volume matching, organization, cleanup and other effects without going directly into Audition (or exporting for Pro Tools, etc.), which will be really helpful, especially for smaller projects and offline mixes. And as a last note, the new Camera Shake Deblur effect in After Effects is fantastic.

Dell UltraSharp 4K HDR Monitor — There were a lot of great-looking HDR monitors at the show, but I liked that this one fell in the middle of the pack in terms of price point ($2K), with solid specs (1,000 nits, 97.7% of P3 and 76.9% of Rec. 2020) and a reasonable size (27 inches). It seems like a good editorial or VFX display solution, though the price might push budgetary constraints for smaller post houses. I wish it were DCI 4K instead of UHD, and a little more affordable, but that will hopefully come with time.

On that note, I really like HP’s DreamColor Z31x Studio Display. It’s not HDR, but it covers 99% of the P3 color space, and it’s DCI 4K — as well as 2K, by multiplying every pixel at 2K resolution into exactly four pixels — so there’s no non-integer scaling or sharpening required. Also, I like working with large monitors, especially at high resolutions. It offers automated (and schedulable) color calibration, though I’d love to see a non-automated version in the future if it could bring the price down. I could see the HP monitor as a great alternative to more expensive HDR displays for the majority of workstations at many post houses.

As another side note, Flanders Scientific’s OLED 55-inch HDR display was among the most beautiful I’ve ever seen, but with numerous built-in interfaces and scaling capabilities, it’s likely to come at a higher price.

Canon 4K600STZ 4K HDR laser projector — This looks to be a great projection solution for small screening rooms or large editorial bays. It offers huge 4096×2400 resolution, is fairly small and compact, and apparently has very few restrictions when it comes to projection angle, which would be nice for a theatrical edit bay (or a really nice home theater). The laser light source is also attractive because it will be low maintenance. At $63K, it’s at the more affordable end of 4K projector pricing.

Mettle 360 Degree/VR Depth plug-ins: I haven’t worked with a ton of 360-degree media, but I have dealt with the challenges of doing depth-related effects in a traditional single-camera space, so the fact that Mettle is doing depth-of-field effects, dolly effects and depth volumetric effects with 360-degree/VR content is pretty incredible. Plus, their plug-ins are designed to integrate with Premiere and After Effects, which is good news for an Adobe power user. I believe they’re still going to be in beta for a while, but I’m very curious to see how their plug-ins play out.

Finally, in terms of purely interesting tech, Sony’s Bravia 4K acoustic surface TVs are pretty wild. Their displays are OLED, so they look great, and the fact that the screen vibrates to create sound instead of having separate speakers or an attached speaker bar is awfully cool. Even at very close viewing, the screen doesn’t appear to move, though it can clearly be felt vibrating when touched. A vibrating acoustic surface raises some questions about mounting, so it may not be perfect for every environment, but interesting nonetheless.


Lance Holte is an LA-based post production supervisor and producer. He has spoken and taught at such events as NAB, SMPTE, SIGGRAPH and Createasphere. You can email him at lance@lanceholte.com.


AJA brings on Steve Holyhead from Fox Broadcasting

Steve Holyhead has joined AJA as senior product manager for desktop products. He joins AJA from Fox Broadcasting Company where he was director of technical operations.

Holyhead recently moved to Grass Valley, where AJA is headquartered, from Los Angeles. In addition to working at Fox, his 20-plus years of industry experience includes developing professional digital video workflows with BloomCast, managing post operations at Discovery Communications and working as a technology evangelist, producer and technical marketing manager for both Discreet (now Autodesk) and Avid. He has also developed Avid and Adobe training courses for multiple partners, including Lynda.com.

“Steve brings a blend of real-world production and technology developer experience to AJA. His understanding of production, broadcast and post, together with his experience both designing enterprise scale workflows and as a master trainer for Adobe, Apple and Avid products, will make powerful contributions to the success of our customers,” says Nick Rashby, president of AJA.

Updates to Adobe Creative Cloud include project sharing, more

By Brady Betzel

Adobe has announced team project sharing! You read that right — the next Adobe Creative Cloud update, to be released later this year, will have the one thing I’ve always said kept Adobe from breaking into Avid’s NLE stronghold with episodic TV and film editors.

While “one thing” is a bit of hyperbole, Team Projects will be much more than just simple sharing within Adobe Premiere Pro. Team Projects, in its initial stage, will also work with Adobe After Effects, but not with Adobe Audition… at least not in the initial release. Technically speaking, sharing projects within Creative Cloud seems like it will follow a check-in/check-out workflow, allowing you to approve another person’s updates to override yours, or vice versa.

During a virtual press demo, I was shown how Team Projects will work. I asked if it would work “offline,” meaning without an Internet connection. Adobe’s representative said that Team Projects will tolerate intermittent Internet disconnections, but will not work fully offline. I asked because many companies do not allow their NLEs or their storage to be attached to any Internet-facing network connections. So if this is important to you, you may need to do a little more research once we can actually get our hands on this release.

My next question was whether Team Projects would be a paid service. The Adobe rep said they are not talking about the business side of this update yet. I took this as an immediate yes, which is fine, but officially they have no comment on pricing or payment structure, or on whether it will cost extra at all.

Immediately after I asked my last question, I realized that this will definitely tie in with the Creative Cloud service, which likely means a monthly fee. Then I wondered: where exactly will my projects live? In the cloud? I know the media can live locally on something like an Avid ISIS or Nexis, but will the projects be shared over the Internet? Will we be able to share individual sequences and/or bins, or just entire projects? There are so many questions and so many possibilities in my mind; it really could change the multi-editor NLE paradigm if Adobe can manage it properly. No pressure, Adobe.

Other Updates
Some other Premiere Pro updates include: improved caption and subtitling tools; updated Lumetri Color tools, including a much-needed improvement to the HSL secondaries color picker; automatic recognition of VR/360 video and the type of mapping it needs; an improved virtual reality workflow; destination publishing that now includes Behance (no Instagram export option?); improved Live Text Templates, including a simplified workflow that lets you share Live Text Templates with other users without the need for an After Effects license (it will even sync fonts from Typekit if they aren’t present); native DNxHD and DNxHR QuickTime export support; audio effects from Adobe Audition; Global FX Mute to toggle all video effects in a sequence on and off; and, best of all, a visual keyboard to map shortcuts! Finally, another prayer for Premiere Pro has been answered. Unfortunately, After Effects users will have to wait for a visual keyboard for shortcut assignment (bummer).

After Effects has some amazing updates in addition to Project Sharing, including a new 3D render engine! Wow! I know this has been an issue for anybody trying to do 3D inside of After Effects via Cineware. Most people purchase Video Copilot’s Element 3D to get around this, but for those who want to work directly with Maxon’s Cinema 4D, this may be the update that alleviates some of that Cineware-related 3D disdain. Adobe even mentioned that you do not need a GPU for this to work well. Oh, how I would love for this to come to fruition. Finally, there’s a new video preview architecture that will hopefully allow for a much more fluid and dynamic playback experience.

After Effects C4D Render

Adobe Character Animator has some updates too. If you haven’t played with Character Animator, you need to download it now and just watch the simple tutorials that come with the app — you will be amazed, or at least your kids will be. If you haven’t seen how The Simpsons used Character Animator, you should check it out with a YouTube search. It is pretty sweet. In terms of incoming updates, there will be faster and easier puppet creation, an improved round-trip workflow between Photoshop and Illustrator, and the ability to use grouped keyboard triggers.

Summing Up
In the end, the future is still looking up for the Adobe Creative Cloud video products, like Premiere Pro and After Effects. If there is one thing to jump out of your skin over in the forthcoming update, it is Team Projects. If Team Projects works, and works well, the NLE tide may be shifting. That is a big if, though, because there have been some issues with previous updates — like media management within Premiere Pro — that have yet to be completely ironed out.

Like I said, if Adobe does this right it will be game-changing for them in the shared editing environment. In my opinion, Adobe is beginning to get its head above water in the video department. I would love to see these latest updates come in guns blazing and working. From the demo I saw it looks promising, but really there is only one way to find out: hands-on experience.

Brady Betzel is an online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com, and follow him on Twitter @allbetzroff. Earlier this year, Brady was nominated for an Emmy for his work on Disney’s Unforgettable Christmas Celebration.

Experiencing autism in VR via Happy Finish

While people with autism might “appear” to be like the rest of us, the way they experience the world is decidedly different. Imagine sensory overload times 10. In an effort to help the public understand autism, the UK’s National Autistic Society and agency Don’t Panic have launched a campaign called “Too Much Information” (#autismTMI) that is set to challenge myths, misconceptions and stereotypes relating to this neurobiological disorder.

In order to help tell that story, the NAS called on London’s Happy Finish to help create a 360-degree VR film that puts viewers into the shoes of a child with autism during a visit to a store. A 2D film had previously been developed based on the experience of a 10-year-old autistic boy named Alexander. Happy Finish provided visual effects for that version, which, since March of last year, has gotten over 54 million views and over 850K shares. The new 360-degree VR experience takes the viewer into Alexander’s world in a more immersive way.

After interviewing several autistic adults as part of the research, Happy Finish worked on this idea, which aims to trigger viewers’ empathy and understanding. Working with Don’t Panic and The National Autistic Society, the studio shares Alexander’s experience in an immersive and moving way.

The piece was shot by DP Michael Hornbogen using a six-camera GoPro array in a 3D-printed housing. For stitching, Happy Finish called on Kolor’s Autopano, The Foundry’s Nuke and Adobe After Effects. Editing was done in Adobe Premiere, and color grading was via Blackmagic’s Resolve.

“It was a long process of compositing using various tools,” explains Jamie Mossahebi, director of the VR shooting at Happy Finish. “We created 18 versions and amended and tweaked based on initial feedback from autistic adults.”

He says that most of the studio’s VR experiences aim to create something comfortable and pleasant, but this one needed to be uncomfortable while remaining engaging. “The main challenge was to be as realistic as possible. For that, we focused a lot on the sound design, as well as testing a wide variety of visual effects, selecting the key ones that contributed to making it as immersive and as close to a sensory overload as possible,” explains Mossahebi.

“This is Don’t Panic’s first experience of creating a virtual reality campaign,” says Richard Beer, creative director of Don’t Panic. “The process of creating a virtual reality film has a whole different set of rules: it’s about creating a place for people to visit and a person for them to become, rather than simply telling a story. This interactivity of virtual reality gives it a unique sense of ‘presence’ — it has the power to take us somewhere else in time and space, to help us feel, just for a while, what it’s like to be someone else — which is why it was the perfect tool for the NAS to communicate exactly what a sensory overload feels like for someone with autism.”

Sponsored by Tangle Teaser and Intu, the film will tour shopping centers around the UK and will also be available through the Autism TMI Virtual Reality Experience app.

Learning about LTO and Premiere workflows

By Chelsea Taylor

In late March, I attended a workflow event by Facilis Technology and StorageDNA in New York City. I didn’t know much going in other than it would be about collaborative workflows and shared storage for Adobe Premiere. While this event was likely set up to sell some systems, I did end up learning some worthwhile information about archiving and backup.

Full disclosure: going into this event I knew very little about LTO archiving. Previously, I had been archiving all of my projects by throwing a hard drive into the corner of my edit bay. Well, not really, but close! It seems that a lot of companies out there don’t put too much importance on archiving until after it becomes a problem (“All of our edits are crashing and we don’t know why!”).

At my last editing job, where we edited short-form content on Avid, our media manager would consolidate projects in Avid, create a FileMaker database that cataloged footage, manually add metadata, then put the archived files onto different G-Tech G-RAID drives (which, of course, could die after a couple of years). In short, it wasn’t the best way to archive and back up media, especially when an editor wanted to find something. They would have to walk over to the computer where the database was, figure out how to use the UI, search for the project (if it had the right metadata), find the physical drive, plug the drive into their machine, go through different files/folders until they found what they were looking for, copy however many large files to the SAN, and then start working. Suffice it to say, I had a lot to learn about archiving and was very excited to attend this event.

I arrived at the event about 30 minutes early, which turned out to be a good thing because I was immediately greeted by some of the experts and presenters from Facilis and StorageDNA. Not fully realizing who I was talking to, I started asking tons of questions about their products. What does StorageDNA do? How can it integrate with Premiere? Why is LTO tape archiving better? Who adds the metadata? How fast can you access the backup? Before I knew it, I was in a heated discussion with Jeff Krueger, worldwide VP of sales at StorageDNA, and Doug Hynes, director of product and solution marketing, about their products and the importance of archiving. I was fully inspired to archive and had tons more questions, but our conversation got cut short as the event was about to begin.

While the Facilis offerings look cool (I want all of them!), I wasn’t at the event to buy things — I wanted to hear about the workflow and integration with Adobe Premiere (which is a language I better understand). As someone who would be actually using these products and not in charge of buying them, I didn’t care about the tech specs or new features. “Secure sharing with permissions. Low-level media management. Block-level virtualized storage pools.” It was hardware spec after hardware spec (which you can check out on their website). As the presenter spoke of the new features and specifications of their new models, I just kept thinking about what Jeff Krueger had told me right before the event about archiving, which I will share with you here.

StorageDNA presented on a product line called DNAevolution, which is an archive engine built on LTO tapes. Each model provides different levels of LTO automation, LTO drives and server hardware. As an editor, I was more concerned with the workflow.

The StorageDNA Workflow for Premiere
1. Card contents are ingested onto the SAN.
2. The high-res files are written to LTO/LTFS through DNAevolution and become permanent camera master files.
3. Low-res proxies are created and ingested onto the SAN for use in editorial. DNAevolution is pointed to the proxies, indexes them and links to the high-res clips on LTO.
4. Once the files are written to and verified on LTO, you can delete the high-res files from your spinning disk storage.
5. The editor works with the low-res proxies in Premiere Pro.
6. When complete, the editor exports an EDL, which DNAevolution parses to locate the high-res files on LTO via its database (sketched below).
7. DNAevolution restores high-res files to the finishing station or SAN storage.
8. The editor can relink the media and distribute in high-res/4K.
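
To make step 6 a little more concrete, here's a minimal Python sketch of that kind of lookup: pull the reel names out of a CMX3600-style EDL and match them against an archive index. The EDL field layout follows the common convention, but the index format is a made-up stand-in; this is not a description of DNAevolution's internals.

```python
# Rough sketch of an EDL-to-archive lookup (step 6 above). The event-line
# layout follows the common CMX3600 convention; the archive index is a
# hypothetical stand-in for DNAevolution's database.
import re

EVENT = re.compile(r"^\d{3}\s+(\S+)\s+\S+\s+\S+")  # e.g. "001  A001C002  V  C ..."

def reels_in_edl(edl_path):
    """Collect the reel/source names referenced by an EDL."""
    reels = set()
    with open(edl_path) as f:
        for line in f:
            match = EVENT.match(line)
            if match:
                reels.add(match.group(1))
    return reels

def locate_on_lto(reels, archive_index):
    """Map each reel to its (tape, path) entry; report anything unarchived."""
    found = {r: archive_index[r] for r in reels if r in archive_index}
    missing = reels - found.keys()
    return found, missing

# Hypothetical index: reel name -> (LTO tape ID, path on the LTFS volume)
index = {"A001C002": ("LTO_0007", "/mnt/ltfs/A001C002.mov")}
found, missing = locate_on_lto(reels_in_edl("final_cut.edl"), index)
print(found, missing)
```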

The StorageDNA Archive Workflow
1. In the DNAevolution Archive Console, select your Premiere Pro project file.
2. DNAevolution scans the project, and generates a list of files to be archived. It then writes all associated media files and the project itself to LTO tape(s).
3. Once the files are written to and verified on LTO, you can delete the high-res files from your spinning disk storage (see the verification sketch below).
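
The “written to and verified on LTO” step is what makes deleting from spinning disk safe. Since LTFS presents the tape as an ordinary filesystem, one simple way to picture verification is a checksum comparison between the source file and the tape copy before anything is removed. Whether DNAevolution verifies exactly this way is an assumption on my part, and the paths below are hypothetical.

```python
# Sketch of verify-before-delete. LTFS mounts the tape as a regular
# filesystem, so a checksum compare is possible; whether DNAevolution
# actually verifies this way is an assumption.
import hashlib
import os

def sha256(path, chunk=1 << 20):
    """Stream the file in 1MB chunks so large camera masters fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while data := f.read(chunk):
            digest.update(data)
    return digest.hexdigest()

def safe_delete(source, tape_copy):
    """Delete the high-res source only once the LTO copy checksums clean."""
    if sha256(source) == sha256(tape_copy):
        os.remove(source)
        return True
    return False

if safe_delete("/san/raw/A001C002.mov", "/mnt/ltfs/A001C002.mov"):
    print("verified on tape; source removed from SAN")
else:
    print("checksum mismatch; source kept")
```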

Why I Was Impressed
All of your media is immediately backed up, ensuring it is in a safe place and not taking up your local or shared storage. You can delete the high-res files from your SAN storage immediately and work with proxies, onlining later down the line. The problem I’ve had with SAN storage is that it fills up very quickly with large files, eventually slowing down your systems and leading to playback problems. Why have all of your raw, unused media just sitting there eating up your valuable space when you can free it up immediately?

DNAevolution works easily with Adobe’s Premiere, Prelude and Media Encoder. It uses the Adobe CC toolset to automate the process of creating LTO/LTFS camera masters while creating previews via Media Encoder.

DNAevolution archives all media from your Premiere projects with a single click and notifies you if files are missing. It also checks your files for existing camera and clip metadata, meaning that if you add all of that in at the start, archiving becomes much easier.

You have direct access to files on LTO tape, enabling third-party applications to access media directly on LTO for transcoding, partial restore and playout. DNAevolution’s Archive Asset Management toolset allows you to browse/search archived content and provides proxy playback. It even has drag-and-drop functionality with Premiere, where you literally drop a file straight from the archive into your Premiere timeline, with little rendering, and start editing.

I have never tested an LTO archive workflow and am curious what other people’s experiences have been like. Feel free to leave your thoughts on LTO vs. Cloud vs. Disk in the comments below.

Chelsea Taylor is a freelance editor who has worked on a wide range of content: from viral videos and sizzles to web series and short films. She also works as an assistant editor on feature films and documentaries. Check out her site at StillRenderingProductions.com.

Automatic Duck’s Wes Plate talks about building bridges

By Randi Altman

If you’ve worked in post production during the past 14 years, there is a very good chance you know Automatic Duck and its president, Wes Plate. Over their time in business, Wes and his father, Harry, have created a number of software tools designed to make different programs and formats work together… the ultimate facilitators.

In 2011, Automatic Duck licensed its technology to Adobe, and Wes joined the company as head of its Prelude team. While Adobe had acquired the technological assets of Automatic Duck, it did not acquire Automatic Duck, the company.

Fast forward a few years and the Plates and Automatic Duck are back with new products. As you might expect, Automatic Duck Ximport AE and Automatic Duck Media Copy 4.0 are designed to make post pros’ lives easier. Ximport AE transfers entire timelines, including cuts, third-party effects and transitions from Final Cut Pro X to Adobe After Effects. Media Copy 4.0 uses AAF and XML to simplify copying and moving media files from any Final Cut Pro 7, Final Cut Pro X or Avid Media Composer/Symphony project. Both products are being sold via Red Giant.

Ximport AE — After Effects

On the heels of this news, we reached out to Wes Plate, who, after working for Adobe for two years, is back at the family business.

When you joined Adobe, they bought your technology. How did that work with this AE plug-in?
We do have the ability to use some of what Adobe acquired from us, but we are also limited in some ways. We told them we had an idea for a product — translating Final Cut Pro X into After Effects — and they said, “Okay.” We used some of the After Effects code from the past, but we also had to add a whole bunch of new code for Final Cut Pro X. We are still really good working partners with Adobe; without their help and permission, we could not have made this product without rewriting everything from scratch.

Why now, and why this?
After leaving Adobe at the start of 2014, I was trying to figure out what was going to be next. At that same time, I had been hearing a lot of people on social media talking about how Final Cut Pro X had improved and become a great NLE, so I gave it a try. I really enjoy using FCP X as an editing tool, but while I am editing I want to take clips or a section of timeline and bring them into After Effects… it’s how I work.

Harry and I were looking for a project, Final Cut Pro X was growing in the marketplace, and I needed to get from Final Cut Pro X to After Effects if I was going to use it as an NLE. All of that together meant Automatic Duck should build a bridge from Final Cut Pro X to After Effects.

Ximport AE — Final Cut Pro X

Before your plug-in, how were people getting from FCP X to After Effects?
When I started down this path, there was a free utility on the market that would translate a Final Cut Pro X XML into a file format called JavaScript; After Effects would then run that JavaScript to create a comp. I tried it but I couldn’t make it work, which gave us even more reason to jump into this. That particular tool is now in version two and available for purchase through the App Store, but we still feel like what we are creating makes much more sense. Another option is to use Intelligent Assistance’s XtoCC app to convert FCPX XML to FCP7 XML and then import that into After Effects. But that workflow is also not as complete as what Ximport AE can do.

Makes more sense how?
Our solution makes it super easy. To get from Final Cut Pro X to After Effects, the first step is to export the XML; then you switch to After Effects and import it with our new product, Automatic Duck Ximport AE. You can change some settings and options, but essentially all you have to do is open the XML file and our plug-in brings it directly into After Effects.
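
For a sense of what a translator like Ximport AE has to read, here's a minimal Python sketch that lists the clips in a Final Cut Pro X XML (FCPXML) export. The element and attribute names follow the published FCPXML format, but treat this as an illustration only, not a map of how Ximport AE actually works.

```python
# Illustrative only: list the timeline clips in an FCPXML export.
# This is not how Ximport AE is implemented.
import xml.etree.ElementTree as ET

def list_clips(fcpxml_path):
    root = ET.parse(fcpxml_path).getroot()  # <fcpxml> root element
    # Media live in <resources> as <asset> elements; clips reference them by id.
    assets = {a.get("id"): a.get("name") for a in root.iter("asset")}
    clips = []
    for clip in root.iter("asset-clip"):  # clips laid into the timeline
        clips.append({
            "clip": clip.get("name"),
            "source": assets.get(clip.get("ref")),  # the asset it points at
            "offset": clip.get("offset"),           # timeline position, e.g. "3600s"
            "duration": clip.get("duration"),
        })
    return clips

for c in list_clips("project.fcpxml"):
    print(c)
```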

Red Giant is selling your new products. Can you explain the relationship?
There is an enormous amount of work that goes into selling a product. What we enjoy the most is making the product, interacting with users and making sure their problems are solved, but dealing with credit cards and that type of thing is less interesting to us and takes our attention away from what we want to do.

Our friend Stu Maschwitz, who designed Magic Bullet, and Peder Norrby from Trapcode have been very happy with their relationships with Red Giant, which essentially publishes their products and handles the sales, marketing and support. Another benefit of using Red Giant’s infrastructure for products and distribution is that we are now able to offer trial versions of Ximport AE and also Media Copy, our media-copying utility.

Media Copy

As we look toward future product development, we’ll be evaluating FCPX as an editing platform to invest in and spend considerable time developing solutions for. The great thing about our partnership with Red Giant is that it gives us access to their expertise and resources. I can foresee opportunities to partner up on products that, all by ourselves, we might not be able to execute or have what we need to make some workflows and solutions possible. I’m excited about what we’ll be able to do both with Red Giant and opportunities that we see coming forward from the FCPX landscape.

Can you talk about Automatic Duck Classic and how that came to be?
After I joined Adobe, Automatic Duck retained some products that we were allowed to sell, but we just didn’t have the time to properly support them. So instead we made the products available for free on our website. When we started to prepare for the relaunch and updated our website, the download links for the free stuff went away. We didn’t realize people still needed those tools, and I kept seeing posts on social media asking where the links went. We realized that there was still a need for people to get projects between FCP7 and Avid. The old products that we used to give away for free will be coming back on the website at no cost.


Mocha now plug-in for NLEs, BCC 10 integrates Mocha 5

The big news from Boris FX/Imagineer at IBC this year was that the soon-to-be-released Mocha Pro planar tracking and roto masking technology will be available as a plug-in for Avid, Adobe and OFX hosts. This brings all of the tools from Mocha Pro to these NLEs — no more workarounds needed. The Mocha Pro 5 plug-in, which will be available in a month, incorporates a new effects panel for integrated keying, grain, sharpening and skin smoothing, as well as new Python scripting support and more.

“Avid editors have always asked us for the Mocha planar tracking tools on their timeline. Now with the Imagineer/Boris FX collaboration, we are bringing the full Mocha Pro to Avid,” explains Ross Shain, CMO at BorisFX/Imagineer. “Media Composer and Symphony users will be able to handle more complex effects and finishing tasks, without importing/exporting footage. Just drop the Mocha Pro plug-in on your clip and you immediately have access to the same powerful tracking, stabilization and object removal tools used in high-end feature film visual effects.”

The availability of this plugin coincides with the Mocha Pro 5 release.

In other company news, Boris FX’s upcoming Boris Continuum Complete (BCC) 10 will have Mocha planar tracking and masking embedded in every BCC 10 plug-in, where it can be used to isolate areas of an effect with Mocha masks. The first version to ship, in a few weeks, will be BCC 10 for Avid.

Besides integrating Mocha Pro 5, BCC 10 will also offer a new Beauty Studio skin-retouching filter, new 3D titling and animation tools, import of Maxon Cinema 4D models, new image restoration filters, new transitions and more host support.

‘The Stanford Prison Experiment’: director/editor Kyle Patrick Alvarez

By Randi Altman

College students in a 1971 social experiment at Stanford University tried something new, with horrifying results. For writer/director/editor Kyle Patrick Alvarez, changing roles has been a much more positive experience. His third and most recent film as director is The Stanford Prison Experiment, released nationwide in mid-July but screened at Sundance in January.

The movie, based on the book The Lucifer Effect: Understanding How Good People Turn Evil, by the professor who ran the experiment, shows how power can corrupt. This is the third film that Alvarez has helmed (2013’s C.O.G. and 2009’s Easier With Practice), but the first he didn’t write. The screenplay by Tim Talbott, says the director, was one of those well-regarded but un-produced scripts that was known around town.

The Stanford Prison Experiment: Billy Crudup and cast. Photo by Jas Shelton.

“I had known of the experiment, but not to the great detail and exacting qualities featured in the script,” explains Alvarez (@kylealvarez). “I thought it was fascinating, this challenge of being able to make a film that stayed true to the real events and still worked and functioned as a piece of cinema.”

Alvarez, who also edited The Stanford Prison Experiment, spoke to us about directing and cutting the film.

How did you transition from editing to directing?
When I first moved to LA, I was picking up editing jobs, but during that time I was also trying to get my first film off the ground. So there wasn’t necessarily a time period where I stopped being an editor.

When you’re directing, are you also wearing your editor hat? Does it influence the way you direct?
Yes, one hundred percent. I’m usually shooting 10 to 15 pages a day. I love getting coverage and love to have more options in the editing room — but many times that luxury doesn’t exist. In a lot of cases it’s thinking ahead and knowing I need certain pieces.

Really what it comes down to is being conservative and mindful of how much time we have to shoot. There was a particular day on this film where I turned to the script supervisor and said, “I’m working as an editor today more than a director because we just need to get these scenes in the can, and we have very little time to do it.”

Even if I don’t cut my films in the future, which is a likely possibility, I think there’s some part of me that’s always going to be thinking that way.

The Stanford Prison Experiment: Michael Angarano, Tye Sheridan, Johnny Simmons, Ezra Miller and Chris Sheffield. Photo by Jas Shelton.

So it’s essentially muscle memory?
Yes. I also think about writing when I’m directing, because I wrote my first two films. Sometimes you have to say, “What part of this scene isn’t working? Is it the directing? Is it the editing? Is it the writing?” Then I try to gather which piece needs a little bit of work and figure out where that is.

It works the other way as well. I try to think about editing while I’m writing because I’m thinking, do I need that, do I need this piece, how are these scenes going to really fit together? I feel like that’s a large part of what I do.

What camera did you use?
The movie takes place in the ’70s, and we explored the possibility of using film, but it was not a financial option for us. I then chose the Red Dragon, for many reasons. Part of it is the post process, part of it is being able to cut on set and work with the raw footage. For a movie like this, where I knew we were going to have really tight timelines for shooting, I liked knowing that I would have a massive amount of data.

I wanted to shoot in 5.5K — I’ve always been happy with the latitude and how it works in color correction. It’s something I’m really comfortable with. So after that decision was made, it was just a question of lenses. We shot on some vintage Leitz lenses, and that ended up playing a big part in the look of the film, maybe even more so than the camera.

What was the look you wanted from the film?
We didn’t want to re-create what a movie from the ’70s looked like. We didn’t want that weird Grindhouse thing where you’re breaking down the image for no reason or putting in colors that shouldn’t be there or doing camera moves that are unmotivated. For me, it was more about the feeling of it. It’s like looking at Alan J. Pakula’s films. We associate that with the ’70s a lot. All the President’s Men and The Parallax View, to me that’s the kind of feeling I wanted.

We ended up with a combination… a movie that felt like it was from the ’70s but using techniques that were a bit more contemporary. It was a balance — looking at each scene and seeing what felt the most right.

Talk to me about being on set. What was the workflow like?
We had a DIT, and we had a guy who I’m close with running dailies. He was ingesting stuff through an Atomos Ninja, and I would go and watch playback there really quickly. The DIT was really working a little more closely with the cinematographer Jas Shelton.

There was also an assistant editor logging the footage. I was able to look at stuff and try some brief assemblies on lunch breaks to see what was working and what wasn’t. We were on the same location for two weeks of the shoot, so there was time to go back. Not a lot of time, but enough that I felt like if there was an insert needed to help bridge moments together we could get it. For me the goal is to overlap the post-production mentality with the production mentality and the pre-production mentality. I find the best stuff comes from when you’re able to get those things to collide as much as possible.

Let’s dig into the post. When did you start editing and on what system?
I started right away, and I used Adobe’s Premiere Pro on an iMac. We wrapped in October and had to show the film to the Sundance programming committee, so that gave us about a three-week turnaround. It premiered at Sundance in January. There are 25 characters in the film, and it was a challenging edit. Because I’m the director as well as the editor, usually the first cut is pretty close to the final edit, but I panicked because this came in at three hours. I had to lose an hour of movie. It was a totally different feeling.

Kyle Alvarez at Sundance this year picking up the Sloan Feature Film Prize for ‘Experiment.’ Photo: Calvin Knight.

What else was challenging about the cut?
Almost every scene had at least 12 people in it, and everyone had mics on them. We had an extraordinary number of audio tracks. My assistant editor, Susan Kim, would manage those as I started rough edits of scenes. If you saw the timeline, it was absurd: every track had massive, massive amounts of audio. Obviously we didn’t want to hear all of those in the final edit, so it was just about going through and narrowing down those lines. That played a big part in prepping the movie for post delivery too, which also moved incredibly quickly.

Why did you choose Premiere Pro for the edit?
I learned Final Cut in college and I cut my first film with it, but I hated the transcoding process you had to go through at that time. I was shooting digital, but had to wait to cut stuff! When preparing for my second film a couple of years later, I found Adobe Premiere. They were the first ones to offer native R3D editing. I tested it on my laptop, a standard consumer level laptop, and it worked. It was sort of a revelatory moment for me.

Can you talk about the creative process of editing?
We had scenes with a lot of people, so it was about narrowing in on the story, or narrowing the scenes down to the fundamentals of what they were about… and who they were about. You try to chip away at what’s there and see what’s working. Because we had to cut fast, I used line breakdowns, where every line of dialogue from every take is delivered to me in a timeline, placed next to each other.

For me, you start with the best performance of each line. You put that together, and sometimes it’s, “That line doesn’t work with that line, because even though those two are the best individual ones, they don’t work together in the right way.” Then you go through and start swapping some out until you get the pieces, the selections of the dialogue, right. Then you go through and start to shape it, put those pieces together and figure out when you’re going to cut to other people — when they’re not talking — and at a certain point it boils down to instincts. It has to feel right.

The Stanford Prison Experiment: Billy Crudup. Photo by Jas Shelton.

Can you point to an example?
There is a moment at the end of the film where a character walks ahead of the camera and goes totally out of focus for a good three or four seconds. As soon as I saw it I thought that really fits that moment. If you’re following some of the rules, that would have ended up in the trash bin, but for some reason as I was cutting, it captured something real. You don’t want to just follow those line breakdowns because you might miss that. It’s making sure no diamonds get lost in the rough of it all.

Any other moments like that?
Not exactly like that, but with this film — thanks specifically to the speed and power of the Adobe system — I did a lot more cropping and zooming in the edit. It wasn’t because we didn’t get what we wanted, but because it takes 10 minutes to swap a prime lens out for a zoom. We didn’t have 10 minutes on this movie. If we had primes on and I wanted a zoom, I knew I was going to have to build it in post.

Thanks to shooting in 5.5K, I was able to turn two-shots into singles and insert moments when I needed to. I was able to extend zooms, so there are a couple of times where it’s pushing in or zooming in on a character and the character’s emotions still went on a little bit longer. I was able to just keep that zoom going all the way through. I was doing a lot of that with no render times, and that was massive to me on this movie.

What about the color grade? Who did it and where?
We colored at Light Iron in Hollywood with Ian Vertovec using Quantel Pablo. We never transcoded — we cut straight from our R3Ds and those went straight to Light Iron and they colored straight from that.

What about the audio post?
We used Formosa Group’s Martyn Zub and Paul Carden, both of whom worked with me when I was doing C.O.G. and when they were at Wildfire. They really did an amazing amount of work in a very short period of time. My previous films were these very naturalistic dramedies. This is a movie where the sound was changing, and there’s this crowd and scenes with a lot of people creating chatter. It was a much heavier creative sound endeavor than I was used to. It was definitely an undertaking, but one they tackled head-on.

Photos by Jas Shelton

Releases & Updates: We are in this ecosystem together

By Sean Mullen

Just a few weeks ago, Adobe released a major upgrade to its Creative Cloud services. While these updates are welcomed by the community with excitement, there’s also a period of — for lack of a better term — stressful chaos as third-party software and plug-in developers scramble to ensure their products will be compatible.

When Adobe speaks, the community listens. When Adobe does something new, they listen even closer, because when they do something new, it’s usually some amazing leap forward that only makes our lives easier and our work look that much better. The latest updates to Adobe Creative Cloud are no different.

All of us at Rampant Design are big fans, and Adobe CC is a big part of what we do every day. It’s no mistake that our Style Effects complement Adobe CC so well. But we also understand — being part of this VFX community — that while change is great, those changes have an impact on the software and plug-in developers who make their living enhancing the Adobe CC workflow. But I’ll get to that in a minute.

Adobe After Effects CC

The Updates
Here are a couple of top-of-mind things that get us excited. We zeroed in on some of the applications and features within CC that impact us most on a daily basis, and those are the features in Premiere Pro and After Effects.

The Iridas acquisition of a couple of years ago is really showing its value, especially with this update. The Lumetri Color panel is amazing! You’re getting seriously powerful color tools built right into Premiere Pro. That’s pretty significant. Morph Cut is part voodoo and part rocket science — a very cool tool that smooths out jump cuts and pauses. There are some notable changes to After Effects too. While the AE Comp Scrollbar is now missing, the uninterrupted preview is a fantastic addition. The new Face Tracker is impressive as well.

The Adobe Ecosystem: Plug-Ins
There is most definitely an ecosystem around Adobe, an entire sub-segment of the post production software industry that makes tools to enhance the workflow — the plug-in developers.

Adobe Premiere

In any third-party plug-in environment, you have the host developer (in this case Adobe) and the third-party plug-in developers: companies like Red Giant, Video Copilot, GenArts and Boris FX, to name a few. While the host developers keep the third parties informed as much as possible, their main focus is on rolling out a solid product release.

So, inevitably, some things slip through the cracks — mainly the host’s ability to interact with the plug-in developers in a timely way, at least from the plug-in developers’ perspective. As a result, you’ll notice a slew of newsletters and social network posts from these third parties announcing whether their products currently do or do not work with the latest release.

I’m sure the weeks up to and following a major release are a hectic time for developers. Plug-in engineering isn’t free, so there is a small window within which the current build of any given third-party plug-in will work. Major releases come out every year, and dot releases happen quite often.

At Rampant, our situation is a little different. We make tools that enhance not just the CC workflow but also the plug-ins themselves. Style Effects aren’t an alternative to plug-ins; they are complementary. If we were bakers or chefs, Style Effects would be the spices or finishing touches. If we were carpenters, Style Effects would be the varnish. Style Effects work hand in hand with your favorite plug-ins.

Style Effects are QuickTime-based, so as long as you have QuickTime, these effects will work with any Adobe update. In our reality, artists and editors want instant gratification. Very few of us get the time to play. Most producers want to see something yesterday, and this is why the plug-in and Style Effects ecosystems are so critical. Major new host releases will always be challenging — and stressful — but the end product of all of us working together is what helps all of us create amazing content. We’re proud to be a part of it!

Sean Mullen is the founder/president of Rampant Design Tools. He is an award-winning VFX artist, but he’s also the creator of Rampant Style Effects, UHD visual effects and designs. Style Effects are packaged as QuickTime files, enabling artists to drag and drop them to any editing platform.


VFX bring wheatpaste poster to life in ‘Paper Heart’ music video

Each January, The Silver Sound Showdown music video festival and battle of the bands takes place at Brooklyn Bowl. It pairs the winning director and winning band together to make a music video with Silver Sound Studios in New York City. It was here that Paper Heart, the music video directed by Nick Snyder, produced by Silver Sound and featuring the band Blood and Glass, was born.

Paper Heart, “one of the most ambitious Showdown collaborations to date,” according to festival director and producer Cory Choy, features Blood and Glass lead singer Lisa Moore as a wheatpaste poster on walls across Brooklyn. It was shot on a Red Scarlet camera and features effects created in Adobe’s After Effects and Photoshop. It was edited in Adobe Premiere Pro.

Why the wheatpaste poster look? LA-based Snyder (@nickwsnyder) works in the arts district of downtown Los Angeles, where he sees inspiration in everything. He also liked the idea that the nature and lifespan of the wheatpaste poster seemed to play nicely into the “themes of isolation and fragility found in the song.”

Director Nick Snyder, right, in front of monitor.

Snyder’s Showdown-winning video Lost Boy Found also combined the techniques of live performance, compositing and animation — silhouettes of actors were composited into a fantasy shadow puppet world — so this was a realm he was comfortable in.

The Production
After several months of prep, Snyder and the band made their way to New York City for the two-day shoot. The first day was dedicated to shooting plates. Locations around Brooklyn had been scouted by Silver Sound, reviewed by Snyder on Google Street View prior to arrival and then scouted in person. So by the time production began, specific moments had been planned to take place in a handful of selected locations. The remaining moments were narrowed down to areas where the filmmakers anticipated chance discoveries. Snyder, DP J. Andrés Cardona and a skeleton crew set out onto the streets of New York to shoot with their Red Scarlet.

Going Green
The second day was shot at Parlay Studios in Jersey City and was dedicated exclusively to greenscreen shots. During a brief break between days, Snyder analyzed the plates. While he had shot-listed and storyboarded, he also left room for improvisation and collaboration.

Greenscreen shoot with the Red Scarlet.

To aid lead singer Lisa Moore in her characterization, extra attention was given to wardrobe, makeup and props. “For example, it was decided beforehand that her prop cane would become a matchstick and that after using it, the matchstick would shrivel and blacken,” explains Snyder. “The art director constructed a practical burnt matchstick prop, but rather than swapping it out during Lisa’s performance, the prop was shot suspended in front of the greenscreen. Then, using an LED light on Lisa’s un-burnt cane, I tracked the movement of the matchstick in After Effects. I then replaced it with the burnt matchstick seen at the end of the video.”

The same technique was used for the origami birds that interact with Moore throughout the video. Practical birds were made, shot against the greenscreen and keyed out in post. The intention was that they could be keyframed in After Effects, but their natural movement would allow for a slightly more organic feel. It was a good time saver. “Green apple boxes, chroma key gloves and even crew members wrapped in green blankets were used to achieve the effect of tactile contact within the video,” explains Snyder. “The performance moments were shot from start to finish in various sizes, and shooting in 4K allowed for any Lisa/plate size-relationship miscalculations,” he adds.

The Post
The next step was assembly. This involved mapping Moore to the building surface plates. Premiere Pro was used to assemble performance shots in raw R3D and narrow down her best takes. For performance takes, a six-panel export was made to quickly compare her gestures from the narrowed down shots. From there, a preliminary pass was made on pairing Moore with the plates by adding the chroma key effect in Premiere. “This simplified version of After Effects Keylight allowed us to see what was working without having to check all the shots in the much more sluggish After Effects video playback,” says Snyder. “Additionally, once the assembled shots were ready for AE, the greenscreen clips with this chroma key effect would stay in the metadata of the shot.” Another time saver, he says, was that once the Moore/plate relationships were locked and a cut was close to locked, the compositing could begin.


To save space and make for faster save times, Snyder chose to create separate After Effects files for each shot. The first step was to finalize the look for “Wheatpaste Lisa.” After some trial and error, a look was established and a master file was created that could be imported into each After Effects file, but the process for creating the look wasn’t as easy as copying and pasting a LUT. In some cases, upwards of 20 pre-comps were used.

According to Snyder, the basic process went like this. “The greenscreen shot was keyed out using Keylight, adjusting for spill and greenscreen inconsistencies. Luckily, the DP did an excellent job at lighting Lisa, so this was a breeze. If there was an issue, a simple matte choker was used. Then, this was precomped and a minimal texture was brought in to dirty it up a bit. The overlay blending mode was often used as well as an image mask. It was precomped again; an off-white stroke was added using a layer style stroke. This effect was used to create the white-edge poster look. The stroke size and precomp level varied from shot to shot, depending on the size of shot Lisa was in and also the texture of the plate onto which she was to be composited. At this point the look started to emerge a bit, but a few steps remained in order to completely bring Wheatpaste Lisa to life.”

For Paper Heart, a combination of Adobe CC’s Glass and Texturize effects was used to give Moore a convincing paper texture as well as authentic surface imperfections, explains Snyder. Most often, two bump maps were used — one for generic surface texture and lighting, and a second to pick up the surface of the wall behind her. For the second, a high-contrast grayscale image was created in Photoshop to bring out the important parts. Using Dynamic Link, Snyder was able to paint over parts of the bump map that were less important, save, and view the results in After Effects.
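
As an illustration of that bump-map step, here's a small Python sketch using Pillow: take the wall-plate photo, convert it to grayscale and push the contrast so the surface detail dominates. Snyder did this in Photoshop; the library choice, filenames and settings here are my own, for demonstration only.

```python
# Hypothetical sketch (Pillow): build a high-contrast grayscale bump map
# from a wall photo, similar in spirit to the Photoshop step described above.
from PIL import Image, ImageFilter, ImageOps

def make_bump_map(wall_photo, out_path, blur_radius=1.5):
    img = Image.open(wall_photo).convert("L")   # grayscale
    img = ImageOps.autocontrast(img, cutoff=2)  # exaggerate surface detail
    # A touch of blur suppresses sensor noise that would read as paper grain.
    img = img.filter(ImageFilter.GaussianBlur(blur_radius))
    img.save(out_path)

make_bump_map("wall_plate.jpg", "wall_bump.png")
```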


Lastly, two layers needed to be created to mimic ink on paper and human error. This would also come into play later in the video as the iterations of Wheatpaste Lisa start to erode away. “For this effect, the comp had to be duplicated. Unfortunately, After Effects comp duplication only duplicates the top comp,” explains Snyder. “So in order to duplicate all of the nested comps, a purchased script called True Comp Duplicator had to be used. The newly duplicated comp was then brought into the original comp and placed below. Using the Fill effect, this comp was colored off-white. Then, to add the finishing touches, some final grungifying had to be done to the top layer. Using Photoshop, 5K resolution brush strokes and alpha channel grunge effects were created on multiple layers. Once imported into AE, these could be used in the top Lisa comp. Using the Silhouette Alpha blending mode, the grungy paintbrush strokes subtracted bits of Wheatpaste Lisa, creating imperfections and rough edges that exposed the off-white layer beneath it.

“Finally, back in the master comp with the two Lisa layers, those were precomped once more. At this point, the look was more or less complete,” he continues. “But from shot to shot, additional work was sometimes required to successfully composite Lisa onto the plates.” Some additional tools used were Roughen Edges, another Matte Choker and occasionally another round of Silhouette Alpha grungy paintbrush strokes.

For lighting, Snyder used either the 4-Color Gradient or Gradient Ramp on an Adjustment Layer or on a Solid set to the Hard Light Blending Mode. Opacity was usually in the 10-20 percent range.


During the process, Snyder and Silver Sound discovered that Wheatpaste Lisa’s movement looked best at 12fps. “We wanted to underscore the fact that Wheatpaste Lisa was an actual wheatpaste entity existing in her own little universe, not just a video projection,” explains Silver Sound’s Choy. “So the choppier feel of 12fps was used to make Lisa’s motions a little less fluid, a little more animation-y and otherworldly feeling.” For this effect, the Posterize Time effect was used.

Throughout the compositing process, Snyder created H.264 proxy files from the transcoded R3D footage. This was especially helpful with the origami birds. To save space, the birds were rendered out on their own at a much smaller file size and then re-imported.
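
The piece doesn't name the tool used for those proxies, but as a rough sketch, here's how a batch of H.264 proxies might be generated with ffmpeg driven from Python. The paths, proxy size and encoder settings are assumptions for illustration.

```python
# Hypothetical proxy pass: batch H.264 proxies with ffmpeg. The article
# doesn't say which tool Snyder used; paths and settings are illustrative.
import pathlib
import subprocess

SRC = pathlib.Path("transcodes")  # e.g. masters transcoded from R3D
DST = pathlib.Path("proxies")
DST.mkdir(exist_ok=True)

for clip in SRC.glob("*.mov"):
    out = DST / (clip.stem + "_proxy.mp4")
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=1280:-2",              # small frame; -2 keeps height even
        "-c:v", "libx264", "-crf", "23", "-preset", "fast",
        "-c:a", "aac",
        str(out),
    ], check=True)
```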

The Death of Wheatpaste Lisa
Finally, Wheatpaste Lisa had to die. To achieve the effect of wheatpaste poster weathering, both layers of Wheatpaste Lisa had to erode. “Back inside the top layer — the double-layer Lisa comp — individual brush strokes and grunge effects were animated with Silhouette Alpha as their blending mode,” describes Snyder. “Once the weathering looked satisfactory, these animated layers were copied, pasted into the bottom-layer Lisa comp and adjusted in movement and timing. This allowed the top layer to erode just before the bottom layer, pushing the compositing one step closer toward realism. Occasionally, one final matte choker and/or an animated mask was used on the final precomp to eliminate any stray particles or to ensure that she dissolved away completely.”

The crew.

Once complete, the shots were rendered at 4K in ProRes 4444. The final shots were delivered to Silver Sound colorist Vlad Kucherov with Moore separated from the building surface plates. Using DaVinci Resolve, Kucherov worked with Snyder to achieve a satisfactory look that served the video concept while also helping sell the compositing realism. Having the layers separated gave Silver Sound more control during this process, allowing them to adjust the levels independently. The goal was to find a look that played to the feel of the song but also gave the video a confident, personalized look of its own.

“In the end, Paper Heart is the result of careful planning, post experimentation, lots of hair pulling and creating a concept that exists within a strict set of limitations,” concludes Snyder.