
Netflix’s Lost in Space: mastering for Dolby Vision HDR, Rec.709

There is a world of difference between Netflix’s ambitious science-fiction series Lost in Space (recently renewed for another 10 episodes) and the beloved but rather low-tech, tongue-in-cheek 1960s show most fondly remembered for the repartee between persnickety Dr. Smith and the rather tinny-looking Robot. This series, starring Molly Parker, Toby Stephens and Parker Posey (in a very different take on Dr. Smith), is a very modern, VFX-intensive adventure show with more deeply wrought characters and elaborate action sequences.

Siggy Ferstl

Colorist Siggy Ferstl of Company 3 devoted a significant amount of his time and creative energy to the 10-episode season over the five and a half months it was in the facility. While Netflix’s approach of dropping all 10 episodes at once, rather than the traditional series schedule of an episode a week, fuels excitement and binge-watching among viewers, it also requires a different kind of workflow, with cross-boarded shoots spanning multiple episodes and different parts of episodes coming out of editorial for color grading throughout the story arc. “We started on episode one,” Ferstl explains, “but then we’d get three and portions of six and back to four, and so on.”

Additionally, the series was mastered for both Dolby Vision HDR and Rec.709, which added facets to the grading process beyond those of shows delivered exclusively in Rec.709.

Ferstl’s grading theater also served as a hub where the filmmakers, including co-producer Scott Schofield, executive producer Zack Estrin and VFX supervisor Jabbar Raisani could see iterations of the many effects sequences as they came in from vendors (Cinesite, Important Looking Pirates and Image Engine, among others).

Ferstl himself made use of some new tools within Resolve to create a number of effects that might once have been sent out of house or completed during the online conform. “The process was layered and very collaborative,” says Ferstl. “That is always a positive thing when it happens but it was particularly important because of this series’ complexity.”

The Look
Shot by cinematographer Sam McCurdy, the show was designed “to have a richness and realness to the look,” Ferstl explains. “It’s a family show but it doesn’t have that vibrant and saturated style you might associate with that. It has a more sophisticated kind of look.”

One significant alteration involves the environment of the planet on which the characters crash-land. The filmmakers wanted the exteriors to look less Earthlike, with foliage a bit reddish and less verdant than the actual locations. The visual effects companies handled some of the more pronounced changes, especially as the look becomes more extreme in later episodes, but for a significant amount of this work Ferstl was able to achieve the look in his grading sessions — something that until recently would likely not have been achievable.

Ferstl, who has always sought out and embraced new technology to help him do his job, made use of some features that were then brand new to Resolve 14. In the case of the planet’s foliage, he used the Color Compressor tool within the OpenFX tab of the color corrector. “This allowed me to take a range of colors and collapse that into a single vector of color,” he explains. “This lets you take your selected range of colors, say yellows and greens in this case, and compress them in terms of hue, saturation and luminance.” Though the tool is sometimes touted as a way to give colorists more ability to even out flesh tones, Ferstl applied it to the foliage, compressing the many shades of green into a narrower range before shifting the resulting colors to the more orange look.

“With foliage you have light greens and darker greens and many different ranges within the color green,” Ferstl explains. “If we’d just isolated those ranges and turned them orange individually, it wouldn’t give us the same feel. But by limiting the range and latitude of those greens in the Color Compressor and then changing the hue we were able to get much more desirable results.” Of course, Ferstl also used multiple keys and windows to isolate the foliage that needed to change from the elements of the scenes that didn’t.
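For readers curious about the mechanics, here is a minimal sketch of the idea, not Resolve’s actual implementation: key a band of yellow-green hues, collapse their spread toward a single hue vector, then rotate the result toward orange. The real tool also works on saturation and luminance, and Ferstl combined it with keys and windows; the function name and parameter values below are purely illustrative.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

def compress_and_shift(rgb, hue_lo=0.16, hue_hi=0.45, hue_center=0.30,
                       amount=0.6, hue_shift=-0.22):
    """rgb: float array of shape (..., 3) in 0..1. Hue runs 0..1 (yellow ~0.17, green ~0.33)."""
    hsv = rgb_to_hsv(np.clip(rgb, 0.0, 1.0))
    h = hsv[..., 0]
    mask = (h >= hue_lo) & (h <= hue_hi)                           # crude "key" on yellow-green hues
    narrowed = hue_center + (h - hue_center) * (1.0 - amount)      # collapse the hue spread
    hsv[..., 0] = np.where(mask, (narrowed + hue_shift) % 1.0, h)  # then rotate toward orange
    return hsv_to_rgb(hsv)
```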

He also made use of the Camera Shake function, which was particularly useful in a scene in the second episode in which an extremely heavy storm of sharp hail-like objects hits the planet, endangering many characters. The storm itself was created at the VFX houses, but the additional effect of camera shake on top of that was introduced and fine-tuned in the grade. “I suggested that we could add the vibration, and it worked very well,” he recalls. By doing the work during color grading sessions, Ferstl and the filmmakers in the session could see that effect as it was being created, in context and on the big screen, and could fine-tune the “camera movement” right then and there.
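Resolve’s Camera Shake is a built-in, parameter-driven effect; purely to illustrate the general idea of adding shake in post, the hypothetical sketch below applies small, low-passed random translations to each frame. The function names, amplitude and smoothing values are assumptions, not Resolve’s parameters.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def shake_offsets(n_frames, amplitude=6.0, smoothing=5, seed=0):
    """Per-frame x/y offsets in pixels: random jitter, low-passed so the motion feels physical."""
    rng = np.random.default_rng(seed)
    raw = rng.normal(0.0, amplitude, size=(n_frames, 2))
    kernel = np.ones(smoothing) / smoothing
    return np.column_stack([np.convolve(raw[:, i], kernel, mode="same") for i in range(2)])

def apply_shake(frames, offsets):
    """frames: float array of shape (n, H, W, 3); edges are padded with the nearest pixel."""
    return np.stack([nd_shift(frame, (dy, dx, 0), order=1, mode="nearest")
                     for frame, (dx, dy) in zip(frames, offsets)])
```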

Fortunately, the colorist notes, the production afforded the time to go back and revise color decisions as more episodes came into Company 3. “The environment of the planet changes throughout. But we weren’t coloring episodes one after the other. It was really like working on a 10-hour feature.

“If we start at episode one and jump to episode six,” Ferstl notes, “exactly how much should the environment have changed in-between? So it was a process of estimating where the look should land but knowing we could go back and refine those decisions if it proved necessary once we had the surrounding episodes for context.”

Dolby Vision Workflow
As most people reading this know, mastering in high dynamic range (Dolby Vision in this case) opens up a significantly expanded contrast range and wider color gamut compared with the Rec.709 standard used for traditional HD. Lost in Space was mastered concurrently for both, which required Ferstl to use Dolby’s workflow: all corrections are made for the HDR version, and then the Dolby hardware/software analyzes the images to bring them into the Rec.709 space, where the colorist does a standard dynamic range pass.
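Dolby Vision’s shot-by-shot analysis and trim metadata are proprietary, so the sketch below is only a toy illustration of the general concept that analysis addresses: mapping luminance graded against an HDR peak down to a 100-nit range with a highlight roll-off. The curve and numbers are illustrative, not Dolby’s math.

```python
import numpy as np

def tone_map_nits(hdr_nits, sdr_peak=100.0, hdr_peak=1000.0):
    """Reinhard-style roll-off: near-linear in shadows and midtones,
    compressing highlights so that hdr_peak lands exactly at sdr_peak."""
    x = np.asarray(hdr_nits, dtype=float)
    k = (sdr_peak + hdr_peak) / hdr_peak     # scale factor so x = hdr_peak maps to sdr_peak
    return k * x / (1.0 + x / sdr_peak)
```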

Ferstl, who worked with two Sony BVM-X300 monitors, one calibrated for Rec.709 and the other for HDR, explains, “Everyone is used to looking at Rec.709. Most viewers today will see the show in Rec.709 and that’s really what the clients are most concerned with. At some point, if HDR becomes the dominant way people watch television, then that will probably change. But we had to make corrections in HDR and then wait for the analysis to show us what the revised image looked like in standard dynamic range.”

He elaborates that while the Dolby Vision spec allows the brightest whites to reach 4000 nits, he and the filmmakers preferred to limit that to 1000 nits. “If you let highlights go much further than we did, some things can become hard to watch. They become so bright that visual fatigue sets in after too long. So we’d sometimes take the brightest portions of the frame and slightly clamp them,” he says of the technique of holding the brightest areas of the frame to levels below the maximum the spec allows.
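The “soft clamp” he describes can be pictured as a knee that rolls highlights off toward a chosen ceiling rather than hard-clipping them. Below is a minimal sketch of that shape; the knee and ceiling values are illustrative, not the show’s actual levels.

```python
import numpy as np

def soft_clamp(nits, knee=800.0, ceiling=1000.0):
    """Leave values below the knee untouched; roll everything above it off
    asymptotically toward the ceiling, with a continuous slope at the knee."""
    x = np.asarray(nits, dtype=float)
    span = ceiling - knee
    over = np.maximum(x - knee, 0.0)
    return np.where(x <= knee, x, knee + span * (1.0 - np.exp(-over / span)))
```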

“Sometimes HDR can be challenging to work with and sometimes it can be amazing,” he allows. Take the vast vistas and snowcapped mountains we first see when the family starts exploring the planet. “You have so much more detail in the snow and an amazing range in the highlights that you could never display in Rec.709,” he says.

“In HDR, the show conveys the power and majesty of these vast spaces beyond what viewers are used to seeing. There are quite a few sections that lend themselves to HDR,” he continues. But as with all such tools, it’s not always appropriate to the story to use the extremes of that dynamic range. Some highlights in HDR can pull the viewer’s attention to a portion of the frame in a way that simply can’t be replicated in Rec.709 and, likewise, a bright highlight from a practical or a reflection in HDR can completely overpower an image that tells the story perfectly in standard dynamic range. “The tools can re-map an image mathematically,” Ferstl notes, “but it still requires artists to interpret an image’s meaning and feel from one space to the other.”

That brings up another question: How close do you want the HDR and the Rec.709 to look to each other when they can look very different? Overall, the conclusion of all involved on the series was to constrain the levels in the HDR pass a bit in order to keep the two versions in the same ballpark aesthetically. “The more you let the highlights go in HDR,” he explains, “the harder it is to compress all that information for the 100-nit version. If you look at scenes with the characters in space suits, for example, they have these small lights that are part of their helmets and if you just let those go in HDR, those lights become so distracting that it becomes hard to look at the people’s faces.”

Such decisions were made in the grading theater on a case-by-case basis. “It’s not like we looked at a waveform monitor and just said, ‘let’s clamp everything above this level,’” he explains. “It was ultimately about the feeling we’d get from each shot.”

Netflix’s Lost in Space: New sounds for a classic series

By Jennifer Walden

Netflix’s Lost in Space series, a remake of the 1965 television show, is a playground for sound. In the first two episodes alone, the series introduces at least five unique environments, including an alien planet; a whole world of new tech, from wristband communication systems to medical analysis devices; new modes of transportation; an organic-based robot lifeform and its correlating technologies; a massive explosion in space; and much more.

It was a mission not easily undertaken, but if anyone could manage it, it was four-time Emmy Award-winning supervising sound editor Benjamin Cook of 424 Post in Culver City. He’s led the sound teams on series like Starz’s Black Sails, Counterpart and Magic City, as well as HBO’s The Pacific, Rome and Deadwood, to name a few.

Benjamin Cook

Lost in Space was a reunion of sorts for members of the Black Sails post sound team. Making the jump from pirate ships to spaceships were sound effects editors Jeffrey Pitts, Shaughnessy Hare, Charles Maynes, Hector Gika and Trevor Metz; Foley artists Jeffrey Wilhoit and Dylan Tuomy-Wilhoit; Foley mixer Brett Voss; and re-recording mixers Onnalee Blank and Mathew Waters.

“I really enjoyed the crew on Lost in Space. I had great editors and mixers — really super-creative, top-notch people,” says Cook, who also had help from co-supervising sound editor Branden Spencer. “Sound effects-wise there was an enormous amount of elements to create and record. Everyone involved contributed. You’re establishing a lot of sounds in those first two episodes that are carried on throughout the rest of the season.”

Soundscapes
So where does one begin on such a sound-intensive show? The initial focus was on the soundscapes, such as the sound of the alien planet’s different biomes, and the sound of different areas on the ships. “Before I saw any visuals, the showrunners wanted me to send them some ‘alien planet sounds,’ but there is a huge difference between Mars and Dagobah,” explains Cook. “After talking with them for a bit, we narrowed down some areas to focus on, like the glacier, the badlands and the forest area.”

For the forest area, Cook began by finding interesting snippets of animal, bird and insect recordings, like a single chirp or little song phrase that he could treat with pitching or other processing to create something new. Then he took those new sounds and positioned them in the sound field to build up beds of creatures to populate the alien forest. In that initial creation phase, Cook designed several tracks, which he could use for the rest of the season. “The show itself was shot in Canada, so that was one of the things they were fighting against — the showrunners were pretty conscious of not making the crash planet sound too Earthly. They really wanted it to sound alien.”

Another huge aspect of the series’ sound is the communication systems. The characters talk to each other through the headsets in their spacesuit helmets and through wristband communicators. Each family has its own personal ship, called a Jupiter, which can contact other Jupiter ships through shortwave radios; Cook notes these radios had an intentionally retro feel. The same radios are used to communicate with the all-terrain vehicles, called rovers. The Jupiters can send and receive long-distance transmissions between the planet’s surface and the main ship, the Resolute, in space. The families can also communicate with their Jupiters’ onboard systems.

Each mode of communication sounds different and was handled differently in post. Some processing was handled by the re-recording mixers, and some was created by the sound editorial team. For example, in Episode 1 Judy Robinson (Taylor Russell) is frozen underwater in a glacial lake. Whenever the shot cuts to Judy’s face inside her helmet, the sound is very close and claustrophobic.

Judy’s voice bounces off the helmet’s face-shield. She hears her sister through the headset and it’s a small, slightly futzed speaker sound. The processing on both Judy’s voice and her sister’s voice sounds very distinct, yet natural. “That was all Onnalee Blank and Mathew Waters,” says Cook. “They mixed this show, and they both bring so much to the table creatively. They’ll do additional futzing and treatments, like on the helmets. That was something that Onna wanted to do, to make it really sound like an ‘inside a helmet’ sound. It has that special quality to it.”

On the flip side, the processing for the ship’s voice was something Cook created himself. Co-supervisor Spencer recorded the voice actor’s lines in ADR, and then Cook added vocoding, EQ futz and reverb to sell the idea that the voice was coming through the ship’s speakers. “Sometimes we worldized the lines by playing them through a speaker and recording them. I really tried to avoid too much reverb or heavy futzing, knowing that on the stage the mixers may do additional processing,” he says.
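Cook’s actual treatment used vocoding, EQ and reverb, plus worldizing and further processing on the mix stage; purely as a sketch of what a basic speaker “futz” involves, band-limiting plus light saturation along the lines below is a common starting point. The sample rate, band edges and drive amount are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def futz(voice, sr=48000, lo=300.0, hi=3400.0, drive=4.0):
    """voice: mono float array in -1..1. Returns a band-limited, lightly saturated copy."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=sr, output="sos")
    band = sosfilt(sos, voice)                     # small-speaker/comms bandwidth
    return np.tanh(drive * band) / np.tanh(drive)  # gentle saturation, renormalized
```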

In Episode 1, Will Robinson (Maxwell Jenkins) finds himself alone in the forest. He tries to call his father, John Robinson (Toby Stephens, also a Black Sails alumnus), via his wristband comm system, but the transmission is interrupted by a strange, undulating, vocal-like sound: interference from an alien ship that crashed nearby. Cook notes that the interference sound required thorough experimentation. “That was a difficult one. The showrunners wanted something organic and very eerie, but it also needed to be jarring. We did quite a few versions of that.”

For the main element in that sound, Cook chose whale sounds for their innate pitchy quality. He manipulated and processed the whale recordings using Symbolic Sound’s Kyma sound design workstation.

The Robot
Another challenging set of sounds was the one created for Will Robinson’s Robot (Brian Steele). The Robot makes dying sounds, movement sounds and face-light sounds when it’s processing information. It can transform its body to look more human. It can use its hands to fire energy blasts or as a tool to create heat. It says, “Danger, Will Robinson,” and “Danger, Dr. Smith.” The Robot is sometimes a good guy and sometimes a bad guy, and the sound needed to cover all of that. “The Robot was a job in itself,” says Cook. “One thing we had to do was to sell emotion, especially for his dying sounds and his interactions with Will and the family.”

One of Cook’s trickiest feats was to create the proper sense of weight and movement for the Robot, and to portray the idea that the Robot was alive and organic but still metallic. “It couldn’t be earthly technology. Traditionally for robot movement you will hear people use servo sounds, but I didn’t want to use any kind of servos. So, we had to create a sound with a similar aesthetic to a servo,” says Cook. He turned to the Robot’s Foley sounds, and devised a processing chain to heavily treat those movement tracks. “That generated the basic body movement for the Robot and then we sweetened its feet with heavier sound effects, like heavy metal clanking and deeper impact booms. We had a lot of textures for the different surfaces like rock and foliage that we used for its feet.”

The Robot’s face lights change color to let everyone know if it’s in good-mode or bad-mode. But there isn’t any overt sound to emphasize the lights as they move and change. If the camera is extremely close-up on the lights, then there’s a faint chiming or tinkling sound that accentuates their movement. Overall though, there is a “presence” sound for the Robot, an undulating tone that’s reminiscent of purring when it’s in good-mode. “The showrunners wanted a kind of purring sound, so I used my cat purring as one of the building block elements for that,” says Cook. When the Robot is in bad-mode, the sound is anxious, like a pulsing heartbeat, to set the audience on edge.

It wouldn’t be Lost in Space without the Robot’s iconic line, “Danger, Will Robinson.” Initially, the showrunners wanted that line to sound as close to the original 1960s delivery as possible. “But then they wanted it to sound unique too,” says Cook. “One comment was that they wanted it to sound like the Robot had metallic vocal cords. So we had to figure out ways to incorporate that into the treatment.” The vocal processing chain used several tools, from EQ, pitching and filtering to modulation plug-ins like Waves Morphoder and Dehumaniser by Krotos. “It was an extensive chain. It wasn’t just one particular tool; there were several of them,” he notes.
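The chain Cook describes was built from commercial tools; as a toy illustration of one classic way to push a voice toward a metallic quality (not a recreation of Morphoder or Dehumaniser), the sketch below ring-modulates the voice against a sine carrier and blends it with the dry signal. The carrier frequency and mix amount are arbitrary.

```python
import numpy as np

def ring_mod(voice, sr=48000, carrier_hz=180.0, mix=0.5):
    """Multiply the voice by a sine carrier and blend the result back against the dry signal."""
    t = np.arange(len(voice)) / sr
    wet = voice * np.sin(2.0 * np.pi * carrier_hz * t)   # classic ring modulation
    return (1.0 - mix) * voice + mix * wet
```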

There are other sound elements that tie into the original 1960s series. For example, when Maureen Robinson (Molly Parker) and husband John are exploring the wreckage of the alien ship, they discover a virtual map room that lets them see into the solar system where they’ve crashed and into the galaxy beyond. The sound design during that sequence features sound material from the original show. “We treated and processed those original elements until they’re virtually unrecognizable, but they’re in there. We tried to pay tribute to the original when we could, when it was possible,” says Cook.

Other sound highlights include the Resolute exploding in space, which caused massive sections of the ship to break apart and collide. For that, Cook says contact microphones were used to capture the sound of tin cans being ripped apart. “There were so many fun things in the show for sound. From the first episode with the ship crash and it sinking into the glacier to the black hole sequence and the Robot fight in the season finale. The show had a lot of different challenges and a lot of opportunities for sound.”

Lost in Space was mixed in the Anthony Quinn Theater at Sony Pictures in 7.1 surround. Interestingly, the show was delivered in Dolby’s Home Atmos format. Cook explains, “When they booked the stage, the producers weren’t sure if we were going to do the show in Atmos or not. That was something they decided to do later, so we had to figure out a way to do it.”

They mixed the show in Atmos while referencing the 7.1 mix and then played those mixes back in a Dolby Home Atmos room to check them, making any necessary adjustments and creating the Atmos deliverables. “Between updates for visual effects and music as well as the Atmos mixes, we spent roughly 80 days on the dub stage for the 10 episodes,” concludes Cook.