Tag Archives: Academy Awards

A Conversation: Jungle Book’s Oscar-Winner Rob Legato

By Randi Altman

Rob Legato’s resume includes some titles that might be considered among the best visual effects films of all time: Titanic, Avatar, Hugo, Harry Potter and the Sorcerer’s Stone, Apollo 13 and, most recently, The Jungle Book. He has three Oscars to his credit (Titanic, Hugo, The Jungle Book) along with one other nomination (Apollo 13). And while Martin Scorsese’s The Wolf of Wall Street and The Aviator don’t scream effects, he worked on those as well.

While Legato might be one of the most prodigious visual effects supervisors of all time, he never intended for this to be his path. “The magic of movies, in general, was my fascination more than anything else,” he says, and that led him to study cinematography and directing at Santa Barbara’s Brooks Institute, which offered intensive courses on the intricacies of working with cameras and film.

Rob Legato worked closely with Technicolor and MPC to realize Jon Favreau’s vision for The Jungle Book, which is nominated for a VFX Oscar this year.

It was this technical knowledge that came in handy at his first job, working as a producer at a commercials house. “I knew that bizarre, esoteric end of the business, and that became known among my colleagues.” So when a spot came in that had a visual effect in it, Legato stepped up. “No one knew how to do it, and this was before on-set visual effects supervisors worked on commercials. I grabbed the camera and I figured out a way of doing it.”

After working on commercials, Legato transitioned to longer-form work, specifically television. He started on the second season of The Twilight Zone series, where he got the opportunity to shoot some footage. He was hoping to direct an episode, but the show got cancelled before he had a chance.

Legato then took his experience to Star Trek at a time when they were switching from opticals to a digital post workflow. “There were very few people then who had any kind of visual effects and live-action experience in television. I became second-unit director and ultimately directed a few shows. It was while working on Next Generation and Deep Space Nine that I learned how to mass produce visual effects on as big a scale as television allows, and that led me to Digital Domain.”

It was at Digital Domain where Legato transitioned to films, starting with Interview With the Vampire, on which he served as visual effects supervisor. “Director Neil Jordan asked me to do the second unit. I got along really well with DP Philippe Rousselot and was able to direct live-action scenes and personally direct and photograph anything that was not live-action related — including the Tom Cruise puppet that looked like he’s bleeding to death.” This led to Apollo 13, on which he was VFX supervisor.

On set for Hugo (L-R): Martin Scorsese, DP Bob Richardson and Rob Legato.

“I thought as a director did, and I thought as a cameraman, so I was able to answer my own questions. This made it easy to communicate with directors and cameramen, and that was my interest. I attacked everything from the perspective of, ‘If I were directing this scene, what would I do?’ It then became easy for me to work with directors who weren’t very fluent in the visual effects side. And because I shot second unit too, especially on Marty Scorsese’s movies, I could determine what the best way of getting that image was. I actually became quite a decent cameraman with all this practice emulating Bob Richardson’s extraordinary work, and I studied the masters (Marty and Bob) and learned how to emulate their work to blend into their sequences seamlessly. I was also able to maximize the smaller dollar amount I was given by designing both second unit direction and cinematography together to maximize my day.”

OK, let’s dig in a bit deeper with Legato, a card-carrying member of the ASC, and find out how he works with directors, what his workflow looks like, and why he loves trying out, and helping to create, new technology in service of the story.

Over the years you started to embrace virtual production. How has that technology evolved over the years?
When I was working on Harry Potter, I had to previs a sequence for time purposes, and we used a computer. I would tell the CG animators where to put the camera and lights, but there was something missing — a lot of times you get inspired by what’s literally in front of you, which is ever-changing in realtime. We were able to click the mouse and move it where we needed, but it was still missing this other sense of life.

For example, when I did Aviator, I had to shoot the plane crash; something I’d never done before, and I was nervous. It was a Scorsese film, so it was a given that it was to be beautifully designed and photographed. I didn’t have a lot of money, and I didn’t want to blow my opportunity. On Harry Potter and Titanic we had a lot of resources, so we could fix a mistake pretty easily. Here, I had one crack at it, and it had to be a home run.

So I prevised it, but added a realtime live-action pan and tilt wheels so we could operate and react in realtime — so instead of using a mouse, I was basically using what we use on a stage. It was a great way of working. I was doing the entire scene from one vantage point. I then re-staged it, put a different lens on it and shot the same exact scene from another angle. Then I could edit it as you would a real sequence, just as if I had all the same angles I would have if I had photographed it conventionally and produced a full set of multi-angle live-action dailies.

You edit as well?
I love editing. I would operate the shot and then cut it in the Avid, instantly. All of a sudden I was able to build a sequence that had a certain photographic and editorial personality to it — it felt like there was someone quite specific shooting it.

Is that what you did for Avatar?
Yes. Cameron loves to shoot, operate and edit. He has no fear of technology. I told him what I did on Aviator and that I couldn’t afford to add the more expensive, but extremely flexible, motion capture to it. So on Avatar, instead of only having live pan and tilt wheels, the camera could also be hand-held — you could do Steadicam shots, you could do running shots, you could do hand-held things, anything you wanted, including adding a live motion capture performance by an actor. You could easily stage them, or a representation of that character, in any place or scale in the scene, because in Avatar the characters were nine feet tall. You could preview the entire movie in a very free-form and analog way. Jim loved the fact that he could impart his personality — the way he moves the camera, the way he frames, the way he cuts — and that the CG-created film would bear the unmistakable stamp of his distinctive live-action movies.

You used the “Avatar-way” on Jungle Book, yes?
Yes. It wasn’t until Jungle Book that I could afford the Avatar way — a full-on stage with a lot of people to man it. I was able to take what I gave to Jim on Avatar and do it myself, with the bells and whistles and some improvements, and give a lifelike sensibility to what could have been an animated film. Instead it became a live film, because we used a live-action analog methodology of acquiring images and choosing the right, exact moment per the cut.

The idea behind virtual cinematography is that you shoot it like you would a regular movie. All the editors, cameramen or directors who’ve never done this before are now operating the way they would if it were real. That flavor and personality start to rub off on the patina of the film, and it begins to feel like a real movie, not an animated or computer-generated one.

Our philosophy on Jungle Book was we would not make the computer camera do anything that a real camera could not do, so we limited the way we could move it and how fast we could move it, so it wouldn’t defy any kind of gravity. That went part and parcel with the animation and movement of the animals and the actor performing stunts that only a human can accomplish.

So you are in a sense limiting what you can do with the technology?
There was an operator behind the camera and behind the wheels, massaging and creating the various compositional choices that generally are not made in a computer. They’re not just setting keyframes, and because somebody’s behind the camera, this sense of live-action-derived movement is consistent from shot to shot to shot. It’s one person doing it, whereas normally on a CG film, there are as many as 50 people who are placing cameras on different characters within the same scene.

You have to come up with these analog methodologies that are all tied together without even really knowing it. Your choices at the end of the day end up being strictly artistic choices. We tapped into that for Jungle Book, and it’s what Jim tapped into when he did Avatar. The only difference between Avatar and our film is that we set our film in an instantly recognizable place, so everybody can judge whether it’s photorealistic or not.

When you start a film, do you create your own system or use something off the shelf?
With every film there is a technology advance. I typically take whatever is off the shelf and glue it together with things not necessarily designed to work in unison. Each year you perfect it. The only way to really keep on top of technology is by being on the forefront of it, as opposed to waiting for it to come out. Usually we’re doing things that haven’t been done before, and invariably that produces something new and innovative.

We’re totally revamping what we did on Jungle Book to achieve the same end on my next film for Disney, but we hope to make it that much better, faster and more intuitive. We are also taking advantage of VR tools to make our job easier, more creative and faster. The faster you can create options, the more iterations you get. More iterations get you a better product sooner and help you elevate the art form by taking it to the next level.

Technology is always driven by the story. We ask ourselves what we want to achieve. What kind of shot do we want to create that creates a mood and a tone? Then once we decide what that is, we figure out what technology we need to invent, or coerce into being, to actually produce it. It’s always driven that way. For example, on Titanic, the only way I could tell that story and make these magic transitions from the Titanic to the wreck and from the wreck back to the Titanic, was by controlling the water, which was impossible. We needed to make computer-generated water that looked realistic, so we did.

THE JUNGLE BOOK (Pictured) BAGHEERA and MOWGLI. ©2016 Disney Enterprises, Inc. All Rights Reserved.

CG water was a big problem back then.
But now that’s very commonplace. The water work in Jungle Book is extraordinary compared to the crudeness of what we did on Titanic, but we started on that path, and then over the years other people took over and developed it further.

Getting back to Marty Scorsese, and how you work with him. How does having his complete trust make you better at what you do?
Marty is not as interested in the technical side as Jim is. Jim loves all this stuff, and he likes to tinker and invent. Marty’s not like that. Marty likes to tinker with emotions and explore a performance editorially. His relationship with me is, “I’m not going to micro-manage you. I’m going to tell you what feeling I want to get.” It’s very much like how he would talk to an actor about what a particular scene is about. You then start using your own creativity to come up with the idea he wants, and you call on your own experience and interpretation to realize it. You are totally engaged, and the more engaged you are, the more creative you become in terms of what the director wants to tell his story. Tell me what you want, or even don’t want, and then I’ll fill in the blanks for you.

Marty is an incredible cinema master — it’s not just the performance, it’s not just the camera, it’s not just the edit, it’s all those things working in concert to create something new. His encouragement for somebody like me is to do the same and then only show him something that’s working. He can then put his own creative stamp on it as well once he sees the possibilities properly presented. If it’s good, he’s going to use it. If it’s not good, he’ll tell you why, but he won’t tell you how to fix it. He’ll tell you why it doesn’t feel right for the scene or what would make it more eloquent. It’s a very soft, artistic push in his direction of the film. I love working with him for this very reason.

You too surround yourself with people you can trust. Can you talk about this for just a second?
I learned early on to surround myself with geniuses. You can’t be afraid of hiring people who are smarter than you are, because they bring more to the party. I want to be the lowest common denominator, not the highest. I’ll start with my idea, but if someone else can do it better, I want it to be better. I can show them what I did and tell them to make it better, and they’ll go off and come up with something that maybe I wouldn’t have thought of, or the collaboration between us creates a new gem.

When I was doing Titanic someone asked me how I did what I did. My answer was that I hired geniuses and told them what I wanted to accomplish creatively. I hire the best I can find, the smartest, and I listen. Sometimes I use it, sometimes I don’t. Sometimes the mistake of somebody literally misunderstanding what you meant delivers something that you never thought of. It’s like, “Wow, you completely misunderstood what I said, but I like that better, so we’re going to do that.”

Part and parcel of doing this is that you’re a little fearless. It’s like, “Well, that sounds good. There’s no proof to it, but we’re going to go for it,” as opposed to saying, “Well, no one has done it before, so we better not try it.” That’s what I learned from Cameron, Marty and Bob Zemeckis. They’re fearless.

Can you mention what you’re working on now, or no?
I’m working on Lion King.

The A-List: Manchester by the Sea director Kenneth Lonergan

By Iain Blair

It’s been 16 years since filmmaker and playwright Kenneth Lonergan made his prize-winning debut at Sundance with You Can Count on Me, which he wrote and directed. The film won the Sundance Grand Jury Prize and was an Academy Award and Golden Globe nominee for Best Screenplay.

Lonergan’s most recent film is also garnering award attention. Directed by one of the most distinctive writing talents on the American indie scene today, Manchester by the Sea fulfills that earlier promise and extends Lonergan’s artistic vision.

Kenneth Lonergan

Both an ensemble piece and an intense character study, Manchester by the Sea tells the story of how the life of Lee Chandler (Casey Affleck), a grieving and solitary Boston janitor, is transformed when he reluctantly returns to his hometown to take care of his teenage nephew Patrick (Lucas Hedges) after the sudden death of his older brother Joe (Kyle Chandler). It’s also the story of the Chandlers, a working-class family living in a Massachusetts fishing village for generations, and a deeply poignant, unexpectedly funny exploration of the power of familial love, community, sacrifice and hope.

Co-produced by Matt Damon, the film from Roadside Attractions and Amazon Studios — which received four SAG nominations, a crucial Oscars barometer — has a stellar behind-the-scenes list of collaborators, including DP Jody Lee Lipes (Trainwreck, Martha Marcy May Marlene), editor Jennifer Lame (Mistress America, Paper Towns), composer Lesley Barber (You Can Count on Me) and production designer Ruth De Jong (The Master, The Tree of Life).

I recently spoke with Lonergan about making the film and his workflow.

I heard Matt Damon was very involved in the genesis of this. How did this project come about?
Matt, his producer Chris Moore and John Krasinski were talking on the set of this film they were shooting about ideas for Matt’s directing debut. Matt and John brought me the basic idea and asked me to write it. So, I took some of their suggestions and went off and spent a couple of years working on it and expanding it. I don’t really start off with themes when I write. I always start with characters and stories that seem compelling, and then let the themes emerge as I go, and with this it became about people dealing with terrible loss, with the story of this man who’s carrying a weight that’s just too much to bear. It’s about loss, family and how people cope.

Is it true that Damon was going to star in it originally?
Yes, but what actually happened was that John was going to star and Matt was going to direct. Then John’s schedule got too busy, so Matt was going to star and direct, and then he also got too busy, so I came onboard to direct as well.

You ended up with a terrific cast. What did Casey Affleck, Michelle Williams and Lucas Hedges bring to their roles?
Casey’s a truly wonderful actor who brings tremendous emotional depth even without saying much in a scene. He’s very hard-working, never has a false moment and really has the ability to navigate the complicated relationships and the way his character deals with people.

Michelle has a tremendous sense of character and is just brilliant, I think. She brings a beautiful characterization to the film and has to go through some pretty intense emotions. They’re both very generous actors, as there are a lot of people they have to interact with. They’re not show-boaters who just want to get up there and emote. And Lucas is this real find, a very talented young actor just starting out who really captured this character.

You shot this on location all over Cape Ann. How tough was it?
It was a bit grueling, as we shot from March until April and it was pretty cold a lot of the time, especially during prep and scouting in February. We had some schedule and budget pressures, but nothing out of the ordinary. I loved shooting around Cape Ann — the locals were great, and the place really seeped into the film in a way that I’m very happy about.

Do you like the post process?
I love post because of the quiet and the chance to really concentrate on making the film. I also like the lack of administrative duties and the sudden drop in the large number of people I’m responsible for on a set. It’s just you, the editor and editorial staff. Some of the technical finishing procedures can be a bit tiring after you’ve seen the film so many times, but overall post is very enjoyable for me.

I loved my editor, and doing all the sound mixing; it was so much fun putting it all together and seeing the story work, all without the stress of the shoot. You still have pressures, but not on the same scale. We did all the post in New York at Technicolor Postworks, and we worked from May through September so it was a pretty relaxed schedule. We had our basic template done by October, and then we did a bunch of little fixes from that point on so it would be ready for Sundance. Then we did a bit more work on it, but didn’t change much — we added four minutes.

Talk about working with editor Jennifer Lame. Was she on the set?
No, we sent her dailies to New York, and we never actually met face-to-face until after the shoot. I interviewed her on the phone when she was in LA working on another job, and we got along right away. She’s a wonderful editor. We began cutting on Avid Media Composer at Technicolor Postworks and then did some editing over the summer at my rental house on Long Island, where she’d come over and set up. Then we finished up back in New York.

The flashbacks are quite abrupt. How challenging were they to cut?
All the flashbacks were very interesting to put together, but they didn’t really present more of a challenge than anything else because they’re such an intrinsic part of the whole story. We didn’t want to telegraph them and warn the audience by doing them differently. We discussed them a lot. Should they be color-timed differently? Should they be shot differently? Look and sound different?

In the end, we decided they should be indistinguishable from the rest, and it’s mainly only because of the content and behavior that you know they’re flashbacks. They were fun to weave into the story, and the more seamless they were the better we liked it. Jennifer actually pointed out that it was almost like telling two stories, not just one, because that’s how Lee experiences the world. He’s always dealing with memories which pop up when they’re least wanted, and when he returns home to Manchester he’s flooded by memories — for him the past and present are almost the same.

You shot in early spring, but there are a lot of winter scenes, so you must have needed some visual effects?
Some, but not that much. Hectic Electric in Amsterdam did them all. We had some snow enhancement, we added some smoke, clean-up and did some adjustments for light and weather, but scenes like the house fire were all real.

How important is sound and music to you?
It’s hard to overstate. For me, music has the biggest influence on the feeling of a scene after the acting — even more than the cinematography in how it can instantly change the tone and feeling. You can make it cheerful or sad or ominous or peaceful just with the right music, and it adds all these new layers to the story and goes right to your emotions. So I love working with my composer and finding the right music.

Then I asked [supervising sound editor/re-recording mixer] Jacob Ribicoff to record sounds up in Cape Ann at all our locations — the particular sound of the marina, the woods, the bars — so it was all grounded in reality. The whole idea of post sound, which we did at Technicolor Postworks with Jacob, was to support that verisimilitude. He used Avid Pro Tools. There’s no stylization, and it was also about the ocean and that feeling of never being far from water. So the sound design was all about placing you in this specific environment.

Where did you do the DI?
We did the color correction with Jack Lewars, also at Technicolor Postworks. He did the final grade on Autodesk Flame. We shot digitally but I think the film looks very filmic. They did a great job.

Did it turn out the way you first envisioned it?
Pretty much, but it always changes from the script to the screen, and once you bring in your team and all their contributions and the locations and so on, it just expands in every direction. That’s the magic of movies.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Academy honors 18 Scientific and Technical achievements

The Academy of Motion Picture Arts and Sciences has announced that 18 scientific and technical achievements represented by 34 individual award recipients, as well as five organizations, will be honored at its annual Scientific and Technical Awards Presentation on February 11.

“This year we are particularly pleased to be able to honor not only a wide range of new technologies, but also the pioneering digital cinema cameras that helped facilitate the widespread conversion to electronic image capture for motion picture production,” says Ray Feeney, Academy Award recipient and chair of the Scientific and Technical Awards Committee. “With their outstanding, innovative work, these technologists, engineers and inventors have significantly expanded filmmakers’ creative choices for moving image storytelling.” 

Unlike other Academy Awards to be presented this year, achievements receiving Scientific and Technical Awards need not have been developed and introduced during 2016. Rather, the achievements must demonstrate a proven record of contributing significant value to the process of making motion pictures.

The Academy Awards for scientific and technical achievements are: 

 Technical Achievement Awards (Academy Certificates)

Thomson Grass Valley for the design and engineering of the pioneering Viper FilmStream digital camera system. The Viper camera enables frame-based logarithmic encoding, which provides uncompressed camera output suitable for importing into existing digital intermediate workflows.

Larry Gritz for the design, implementation and dissemination of Open Shading Language (OSL). OSL is a highly-optimized runtime architecture and language for programmable shading and texturing that has become a de facto industry standard. It enables artists at all levels of technical proficiency to create physically plausible materials for efficient production rendering.

Carl Ludwig, Eugene Troubetzkoy and Maurice van Swaaij for the pioneering development of the CGI Studio renderer at Blue Sky Studios. CGI Studio’s groundbreaking ray-tracing and adaptive sampling techniques, coupled with streamlined artist controls, demonstrated the feasibility of ray-traced rendering for feature film production.

Brian Whited for the design and development of the Meander drawing system at Walt Disney Animation Studios. Meander’s innovative curve-rendering method faithfully captures the artist’s intent, resulting in a significant improvement in creative communication throughout the production pipeline.

Mark Rappaport for the concept, design and development, Scott Oshita for the motion analysis and CAD design, Jeff Cruts for the development of the faux-hair finish techniques, and Todd Minobe for the character articulation and drive-train mechanisms of the Creature Effects Animatronic Horse Puppet. The Animatronic Horse Puppet provides increased actor safety, close integration with live action, and improved realism for filmmakers.

Glenn Sanders and Howard Stark for the design and engineering of the Zaxcom Digital Wireless Microphone System. The Zaxcom system has advanced the state of wireless microphone technology by creating a fully digital modulation system with a rich feature set, which includes local recording capability within the belt pack and a wireless control scheme providing realtime transmitter control and timecode distribution.

David Thomas, Lawrence E. Fisher and David Bundy for the design, development and engineering of the Lectrosonics Digital Hybrid Wireless Microphone System. The Lectrosonics system has advanced the state of wireless microphone technology by developing a method to digitally transmit full-range audio over a conventional analog FM radio link, reducing transmitter size, and increasing power efficiency.

Parag Havaldar for the development of expression-based facial performance-capture technology at Sony Pictures Imageworks. This pioneering system enables large-scale use of animation rig-based facial performance-capture for motion pictures, combining solutions for tracking, stabilization, solving and animator-controllable curve editing.

Nicholas Apostoloff and Geoff Wedig for the design and development of animation rig-based facial performance-capture systems at ImageMovers Digital and Digital Domain. These systems evolved through independent, then combined, efforts at two different studios, resulting in an artist-controllable, editable, scalable solution for the high-fidelity transfer of facial performances to convincing digital characters.

Kiran Bhat, Michael Koperwas, Brian Cantwell and Paige Warner for the design and development of the ILM facial performance-capture solving system. This system enables high-fidelity facial performance transfer from actors to digital characters in large-scale productions while retaining full artistic control, and integrates stable, rig-based solving and the resolution of secondary detail in a controllable pipeline.

Scientific and Engineering Awards (Academy Plaques)

Arri for the pioneering design and engineering of the Super 35 format Alexa digital camera system. With an intuitive design and appealing image reproduction achieved through close collaboration with filmmakers, Arri’s Alexa cameras were among the first digital cameras widely adopted by cinematographers.

Red Digital Cinema for the pioneering design and evolution of the Red Epic digital cinema cameras with upgradeable full-frame image sensors. Red’s design and innovative manufacturing process have helped facilitate the wide adoption of digital image capture in the motion picture industry.

Sony for the development of the F65 CineAlta camera with its pioneering high-resolution imaging sensor, excellent dynamic range and full 4K output. Sony’s photosite orientation and true RAW recording deliver exceptional image quality.             

Panavision and Sony for the conception and development of the Genesis digital motion picture camera. Using a familiar form factor and accessories, the design features of the Genesis allowed it to become one of the first digital cameras to be adopted by cinematographers.

Marcos Fajardo for the creative vision and original implementation of the Arnold Renderer, and to Chris Kulla, Alan King, Thiago Ize and Clifford Stein for their highly-optimized geometry engine and novel ray-tracing algorithms which unify the rendering of curves, surfaces, volumetrics and subsurface scattering as developed at Sony Pictures Imageworks and Solid Angle SL. Arnold’s scalable and memory-efficient single-pass architecture for path tracing, its authors’ publication of the underlying techniques, and its broad industry acceptance were instrumental in the widespread adoption of fully ray-traced rendering for motion pictures.

Vladimir Koylazov for the original concept, design and implementation of V-Ray from Chaos Group. V-Ray’s efficient production-ready approach to raytracing and global illumination, its support for a wide variety of workflows, and its broad industry acceptance were instrumental in the widespread adoption of fully ray-traced rendering for motion pictures.

Luca Fascione, J.P. Lewis and Iain Matthews for the design, engineering and development of the FACETS facial performance capture and solving system at Weta Digital. FACETS was one of the first reliable systems to demonstrate accurate facial tracking from an actor-mounted camera, combined with rig-based solving, in large-scale productions. This system enables animators to bring the nuance of the original live performances to a new level of fidelity for animated characters.

Steven Rosenbluth, Joshua Barratt, Robert Nolty and Archie Te for the engineering and development of the Concept Overdrive motion control system. This user-friendly hardware and software system creates and controls complex interactions of real and virtual motion in hard realtime, while safely adapting to the needs of on-set filmmakers. 

Academy names winners of Scientific and Technical awards

We in the industry know that the Academy of Motion Picture Arts and Sciences doesn’t just give out gold statues; it also celebrates technology that has changed the way films are made. With that in mind, the Academy has announced that 10 scientific and technical achievements represented by 33 individual award recipients will be honored at its annual Scientific and Technical Awards Presentation on February 13 at the Beverly Wilshire in Beverly Hills. In addition, the Society of Motion Picture and Television Engineers (SMPTE) will receive a special award recognizing “a century of fundamental contributions to the advancement of motion picture standards and technology.”

“This year’s honorees represent a wide range of new tech, including a modular inflatable airwall system for composited visual effects, a ubiquitous 3D digital paint system and a 3D printing technique for animation,” said Richard Edlund, Academy Award-winning visual effects artist and chair of the Scientific and Technical Awards Committee. “With their outstanding, innovative work, these technologists, engineers and inventors have further expanded filmmakers’ creative opportunities on the big screen.”

Unlike other Academy Awards to be presented this year, achievements receiving Scientific and Technical Awards need not have been developed and introduced during 2015. Rather, the achievements must demonstrate a proven record of contributing significant value to the process of making motion pictures.

The Academy Awards for scientific and technical achievements are:

Technical Achievement Awards (Academy Certificates)
– Michael John Keesling for the design and development of Image Shaker, an optical system that convincingly creates the illusion of the camera shaking in a variable and repeatable manner. The Image Shaker was unique and superior to alternatives in use when it was invented two decades ago, and it continues to be used today.

– David McIntosh, Steve Marshall Smith, Mike Branham and Mike Kirilenko for the engineering and development of the Aircover Inflatables Airwall. This system of modular inflatable panels can be erected on location, at lengths reaching hundreds of feet, with exceptional speed and safety. When used to support blue or green screens, the Airwall permits composite shots of unprecedented scale.

– Trevor Davies, Thomas Wan, Jon Scott Miller, Jared Smith and Matthew Robinson for the development of the Dolby Laboratories PRM Series Reference Color Monitors. The PRM’s design allows the stable, accurate representation of images with the entire luminance range and color gamut used in contemporary theatrical feature presentation.

– Ronald Mallet and Christoph Bregler for the design and engineering of the Industrial Light & Magic Geometry Tracker, a novel, general-purpose tracker and solver. Geometry Tracker facilitates convincing interaction of digital and live-action elements within a scene. Its precise results and tight integration with other ILM animation technologies solve a wider range of match-animation challenges than was previously possible.

– Jim Hourihan, Alan Trombla and Seth Rosenthal for the design and development of the Tweak Software RV system, a highly extensible media player system. RV’s multi-platform toolset for review and playback, with comprehensive APIs, has allowed studios of all sizes to take advantage of a state-of-the-art workflow and has achieved widespread adoption in the motion picture industry.

– Richard Chuang and Rahul Thakkar for the groundbreaking design, and Andrew Pilgrim, Stewart Birnam and Mark Kirk for the review workflows and advanced playback features, of the DreamWorks Animation Media Review System. Over its nearly two decades of development, this pioneering system enabled desktop and digital theater review. It continues to provide artist-driven, integrated, consistent and highly scalable studio-wide playback and interactive reviews.

– Keith Goldfarb, Steve Linn, Brian Green and Raymond Chih for the development of the Rhythm & Hues Global DDR System. This consistent, integrated, production database-backed review system enables a recordable workflow and an efficient, collaborative content review process across multiple sites and time zones.

– J Robert Ray, Cottalango Leon and Sam Richards for the design, engineering and continuous development of Sony Pictures Imageworks Itview. With an extensive plug-in API and comprehensive facility integration including editorial functions, Itview provides an intuitive and flexible creative review environment that can be deployed globally for highly efficient collaboration.

Scientific and Engineering Awards (Academy Plaques)
– Brian McLean and Martin Meunier for pioneering the use of rapid prototyping for character animation in stop-motion film production. Laika’s inventive use of rapid prototyping has enabled artistic leaps in character expressiveness, facial animation, motion blur and effects animation. Through highly specialized pipelines and techniques, 3D printing capabilities have been harnessed with color uniformity, mechanical repeatability, and the scale required to significantly enhance stop-motion animated feature films.

– Jack Greasley, Kiyoyuki Nakagaki, Duncan Hopkins and Carl Rand for the design and engineering of the Mari 3D texture painting system. Combining multilayer painting tools and a unique texture-management system, Mari simplifies working with large, high-resolution texture sets. It has achieved broad adoption in the visual effects industry, often supplanting long-term in-house systems.

The A-List: ‘The Danish Girl’ director Tom Hooper

This relatively low-budget film is generating a ton of Oscar buzz

By Iain Blair

British director Tom Hooper and The King’s Speech — his film about the true-life story of the stuttering King George VI and his Aussie speech therapist — swept the Oscars in 2011, with the film winning him Best Director, along with Best Picture and a Best Actor Oscar for Colin Firth. Now the Oxford-educated Hooper, who got his start shooting commercials and such hit TV shows as Prime Suspect, EastEnders, Elizabeth I and John Adams, and whose film credits include Les Misérables and Red Dust, is getting more Oscar buzz for his latest movie, The Danish Girl.

Tom Hooper and Iain Blair


Starring Oscar-winner Eddie Redmayne (The Theory of Everything), The Danish Girl, from Focus Features, tells another real-life story, this time of Lili Elbe, a Danish man who transitioned to a woman in the 1920s with the help of his artist wife, played by Alicia Vikander.

I recently spoke with Hooper over lunch about making the film, which was shot with a Red Epic Dragon, and edited on a Media Composer.

Transgender issues are suddenly very much in the zeitgeist, but you must have started working on this quite a while ago?
Yes, and it’s been a long journey, and a real labor of love. I fell in love with the script seven years ago, but it was very hard to finance and risky to do. In fact, a lot of people close to me advised me not to try and make it. But here we are, and I’ll be speaking at The White House on a panel about the film and these issues. So a lot has changed since 2008, and now it’s being embraced, so it’s pretty amazing.

Is it true you first gave Eddie the script over the barricades while you were shooting Les Misérables?
It’s true. I slipped it to him and he called the next day and said, “Yes, let’s go,” and I had to tell him to hold on, that it’s not that easy… we’ve still got to pull it all together.

What did Eddie and Alicia bring to the mix?
They’re both actors with a lot of unusual qualities. Eddie has this emotional openness, an emotional translucency, that allows audiences to find it very easy to identify with him, which was key for the role. I first saw that in him when I cast him in Elizabeth I.

Alicia has so much heart, and is so kind and compassionate, and she brings this inner strength to the role. You never think of her character as the victim. She also makes goodness very interesting, which is incredibly hard to do. It’s far easier to play a villain.

Once again you worked with DP Danny Cohen, who shot The King’s Speech for you. He’s quite a maverick, isn’t he?
That’s exactly the right word. He’s a bit of a rebel — he doesn’t mind breaking the rules — and he doesn’t get attached to a certain formula or way of doing things. This is our fifth film together, and I really feel he’s helped loosen up my style and not feel so bound by all the usual rules. He used some beautiful old lenses and very soft lighting inspired by several Danish artists, and I think the film looks perfect for the story and the period. He’s also the nicest guy in the world.

Tell us about working with editor Melanie Oliver, who cut Les Misérables, The Damned United and several of your TV series. How does that relationship work?
This is our sixth film together, and she did amazing work on the John Adams miniseries. She is a really gifted editor. Basically, she’s my greatest secret weapon, my great collaborator, and she functions almost like a co-director. I really feel that editors are often under-appreciated in that regard, and I rarely change a performance take once she’s selected it and cut it in, as she’s completely right about 90 percent of the time in her first assembly.


Performance and editing are like choreography, and they have to go together perfectly like a very intricate dance. If the editor picks the right moments, it can really elevate an actor’s performance, and then the whole film, and she does that all the time.

Where did you do the post? How long was the process?
It was all done in London at Goldcrest, where I usually do post, and then we did all the sound mixing at Halo, all over a period of several months. I love post; it’s the calm after the storm of the shoot, where you feel the meter ticking away every minute on set.

For me, the most interesting aspect of it is that, however clear the vision you had of the film before you began, once you sit down in the editing room and start working on it, it begins to change into something different. You have to let go of that vision and just look at what’s in front of you, and then it’s all about servicing what the film’s become, and how you can get the best out of what you shot. I love getting the structure right and then the pacing and all the rhythms right — on this we ultimately ended up losing over an hour of material. You hate to cut scenes, but that really tightened it all up.

What about the music and sound? Were they more crucial than usual?
I think so. The next big thing for me in post after editing is adding all the music and sound, and composer Alexandre Desplat responded so well to all the changes. In this film the music acts like the narrator, so we had to be very careful in how we used it. Alexandre went through so many drafts of key scenes where you need to balance the pain and the joy that Eddie’s character is feeling.

Music and sound are always huge for me, but they were even more crucial in this, and a very big challenge, because it’s such a quiet movie. So all the sound effects had to be really effective and just right. I had a great sound team, including Mike Prestwood Smith, our re-recording mixer who did Captain Phillips and Mission Impossible, and we worked hard at getting stuff like the sound of the canals and boats rubbing together exactly right.

Then I love starting to show the film to get a feeling about all that. I always start off showing it to my family first, and then to close friends, and you see how it plays and you learn about it and gradually shape it, and hopefully all the pieces of the puzzle fall into shape by the end of post, and your movie emerges.


Any period piece needs some VFX work. Who did the shots?
There were very few, and they were all done by Double Negative, who I worked with on Les Mis, and they did an amazing job considering the circumstances. This was a very low-budget movie — just about $15 million — and that meant we only had $100,000 for the entire VFX, which isn’t very much. We did need some key VFX shots, such as the shot of the trees at the beginning, and the train on the bridge, which was all CGI. Then there’s a lot of subtle stuff and clean-up work, and colorist Adam Glasman did the DI.

You’ve won an Oscar. How important are they and all the other awards?
I think they really can help with a small film like this. Look what happened with The King’s Speech. I still pinch myself that it did so well and that I won an Oscar.

Where do you keep it?
I know a lot of Brits keep it in the loo, but I keep mine on the mantel by the fireplace.

Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

 

The A-List: Director George Miller talks ‘Fury Road,’ Oscar season

By Iain Blair

The Oscar-winning writer/director/producer George Miller was instrumental in introducing the new wave of revved-up Aussie cinema to the world stage thanks to his seminal and highly influential apocalyptic road trilogy, Mad Max. But when the first in the series roared onto screens in the late seventies, it wasn’t just a fresh blast of non-stop action reeking of hot engines and even hotter desert winds from down under. Miller’s assured debut, a bleak vision of the future, essentially rewrote the book on how to make a successful low-budget indie action film (for 20 years it held the record for the highest profit-to-cost ratio of any film).

He then went on to create two more much-beloved franchises — Babe and Happy Feet — which did for talking animals what Mad Max did for young up-and-comer Mel Gibson.

Miller, whose diverse credits include directing The Witches of Eastwick, Twilight Zone: The Movie and Lorenzo’s Oil, and producing Dead Calm, the thriller that jump-started Nicole Kidman’s career, was in LA recently to talk about Warner Bros.’ Mad Max: Fury Road. The $375 million-grossing smash is the fourth in the blockbuster series, which left off with Mad Max: Beyond Thunderdome, released exactly 30 years ago.

George Miller and writer Iain Blair


Starring Tom Hardy in the iconic Gibson role and Charlize Theron as Imperator Furiosa, the film was shot by John Seale, the acclaimed Aussie cinematographer who won the Oscar for The English Patient and whose credits include Cold Mountain, The Perfect Storm, Rain Man, Harry Potter and the Sorcerer’s Stone and Lorenzo’s Oil with Miller. It was the DP’s first digital film and first time shooting with Arri Alexas (See our coverage on Seale shooting Fury Road here).

Over a nice meal, I spoke with Miller about making the film, posting Fury Road and the Oscars.

We’re heading into awards season. You’ve been nominated for three Oscars and you’ve won once, for Happy Feet. How important are they to you?
It’s a mistake to give it too much thought. It’s enough to just make a film that resonates with audiences, and I used to feel awards just aren’t important, but I’ve come to realize that culturally successful people — whether they’re directors or artists or musicians and so on — have the ability to analyze and reinforce what works. It’s always easy to see why something doesn’t work, but it’s far harder to pin down exactly why it works.

Is it true that you spent three years building 3D rigs from scratch to shoot Fury Road, and then at the last moment decided that the film would be shot in 2D instead?
We started off shooting native 3D with them, but suddenly we began to doubt that they’d hold up in all the heat and dust where we were shooting — the Namibian desert — as we only had six. And by then, stereo conversion was getting really good, so we decided to go digital with the Alexas, which were also supplemented with a Canon EOS 5D Mark II and an Olympus P5 used as “crash” cameras in some action sequences.


Seale told me that — amazingly, given the non-stop action — the film was predominantly a one-camera shoot.
Yeah. Roman Polanski said, “At any given moment, there’s only one perfect camera position,” and I agree. So when I went into animation with the Happy Feet movies, it became really obvious, as you can take exactly the same performance, same set and so on, and by shifting the camera, the perspective and cutting pattern, you can change a scene completely. So yes, I’m a one-camera filmmaker in that sense.

Do you like the post process?
Very much. It’s where you confront your mistakes and where you can work around them, provided you have a good editor. We posted in Sydney in my offices, in this deco theatre, The Metro, and it took over a year. Then we did some extra shooting and the bookends to the movie, back in Australia.

The film was edited by your wife, Margaret Sixel, who also cut Happy Feet. Tell us about that relationship and how it worked.
We shot for nine months and she was back in Sydney, getting massive amounts of footage. Initially she didn’t want to do it because she’d never cut an action film before. I told her that was a great reason to do it since she wouldn’t be following all the clichés and tropes and style of those movies.

Charlize Theron and George Miller on set


I’d seen her work on documentaries, where she’d taken some very bland footage and shaped it into a very strong narrative. She has this great sense of structure and causality of one shot to the next, either spatially or thematically — there’s some connection.

It struck me that this is essentially a silent movie, but with great sound.
That’s exactly what I set out to make, and we did the sound mix, the DI and post-viz as part of editorial, and did a lot of early sound work in Sydney, but then we ended up on the lot at Warners here in LA, and did the final Atmos mix here, too, with a great team: re-recording mixers Chris Jenkins and Gregg Rudloff.

Obviously, a lot of the action effects were shot in-camera, but there’s also a huge number of visual effects shots in the film. How many are there?
There are over 2,800 shots in the movie — which is a lot — and I’d say over 2,000 have some VFX elements. Andrew Jackson, who did 300, was the VFX supervisor. A lot of that was done as post-viz — so the team did simple comps or simple animation, erasures and so on, and if they were good enough, we didn’t pass them on to the VFX houses… Method or Iloura in Sydney.

You also brought Eric Whipp, a DI colorist from Toronto who did the Happy Feet films, down to work on it full-time?
Yeah, and in the DI he was really pushing the Baselights to do stuff like sky replacements. The problem was, we shot for nearly 140 days but the story happens over three days, so you needed consistency in the skies, and he was able to do all that very quickly and cheaply in the DI. We did a preliminary DI on the set and were grading our dailies, and we also had our own Baselights in the editing suites in Sydney.

All that was so important — having post-viz, editorial and Baselight all working together. And often Margaret or her assistants would comp performances in editorial, so there’s a lot of plasticity between the cuts now that we didn’t have in the past when it was all celluloid. (See our interview with colorist Eric Whipp here.)

Digital, especially in post, must really suit your style of filmmaking.
Completely. I learned so much from doing animation in the Babe and Happy Feet movies, and now nearly every film involves animation and CGI to some extent. The biggest advantage of digital on this film was safety — you just erase harnesses, wires and so on. And also being able to erase tire marks from previous takes. That was huge for us!


The film has a very gritty and over-saturated look. Was that all done in post or was it a combination?
It was a combination of design and post. We designed it to be pretty monochrome. In a way it’s all variations of reds, browns, yellows and very little color.

There are all these rumors you’re going to shoot Mad Max: Wasteland next. True?
(Laughs) All I can say is it’s not even the real title, but we are definitely talking about it.

Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

George Miller photos credit: Jasin Boland

RSP’s Tim Crosbie gets VFX Oscar nom for ‘X-Men: Days of Future Past’

VFX supervisor Tim Crosbie, from Australia’s Rising Sun Pictures (RSP), is part of the team nominated for an Academy Award for Best Achievement in Visual Effects for the film X-Men: Days of Future Past.

Crosbie supervised the team of RSP artists involved in creating the film’s “Kitchen” scene where time appears to stand still as the super-fast Quicksilver moves around the Pentagon cafeteria distracting guards who are attacking a group of mutants. Also named in the nomination are the film’s VFX supervisor Richard Stammers, Digital Domain’s Lou Pecora and Object Inc.’s Cameron Waldbauer.

The X-Men “Kitchen” scene involved a blend of live-action, CG objects and visual effects. RSP collaborated with Stammers and director Bryan Singer to bring the scene to life through the production of scores of CG props, including frying pans, knives, pots of boiling soup, carrots and bullets, as well as the cascades of water droplets. Each of these elements needed to be rendered in near microscopic detail, placed precisely within the geometry of the kitchen and choreographed to move and react realistically to lighting, other objects and characters.

RSP also integrated Quicksilver into the near frozen environment. That illusion was accomplished through a combination of live action, a stunt double, greenscreen photography, a partial CG body replacement and a shimmering “rain tunnel” that forms around Quicksilver (caused by his swift passage through the near motionless falling water). All of this had to work properly in 2D and stereo 3D. The studio’s main tool is Houdini.

Crosbie reports that pulling off a sequence like Quicksilver’s visit to the Pentagon is more than a matter of managing data. “This work is grounded in realism,” he says. “Even though it’s a fantastical event you still want to feel as though you are there. The biggest challenge is to find that balance between an exciting, magical event and one that looks real.”


“All of us at RSP are extremely proud of our work on X-Men: Days of Future Past and to have it recognized by our peers is very gratifying,” says RSP executive producer Tony Clark. “We are very grateful to director Bryan Singer, visual effects supervisor Richard Stammers and 20th Century Fox for giving us the opportunity to work on such a fun, exciting and challenging project.”

RSP’s X-Men work has been recognized by other organizations as well. Crosbie is also named in a BAFTA Award nomination for Best Achievement in Special Visual Effects. RSP artists are named in two VES Award nominations: Outstanding Virtual Cinematography in a Photoreal/Live Action Motion Media Project (Dennis Jones) and Outstanding Effects Simulations in a Photoreal/Live Action Feature Motion Picture (Adam Paschke, Premamurti Paetsch, Sam Hancock and Timmy Lundin). X-Men: Days of Future Past is also nominated for the VES Award for Outstanding Visual Effects in a Visual Effects-Driven Photoreal/Live Action Feature Motion Picture.

Catching up with some Oscar nominees

By Randi Altman

On the heels of the recent Oscar nominations, postPerspective decided to reach out to a few of those chosen and gather their reactions.

Ben Grossmann was nominated for his work (alongside Roger Guyett, Patrick Tubach, and Burt Dalton) on Star Trek Into Darkness.  He is already the owner of a VFX Oscar statue for his contribution to Hugo (2011). Now a partner in Magnopus, a visual solutions companybased in downtown LA, Grossmann was at Pixomondo while working Star Trek Into Continue reading