
Mixing and sound design for NatGeo’s Cosmos: Possible Worlds

By Patrick Birk

National Geographic’s Cosmos returned for 2020 with Possible Worlds, writer/director/producer Ann Druyan’s reimagining of the house that Carl Sagan built. Through cutting-edge visuals combined with the earnest, insightful narration of astrophysicist Neil deGrasse Tyson, the series aims to show audiences how brilliant the future could be… if we learn to better understand the natural phenomena of which we are a part.

I recently spoke with supervising sound editor/founder Greg King and sound designer Jon Greasley of LA’s King Soundworks about how they tackled the challenges of bringing the worlds of forests and bees to life in Episode 6, “The Search for Intelligent Life on Earth.”

L-R: Greg King and Jon Greasley

In this episode, Neil deGrasse Tyson talks about ripples in space time. It sounds like drops of water, but it also sounds a little synthesized to me. Was that part of the process?
Jon Greasley: Sometimes we do use synthesized sound, but it depends on the project. For example, we use the synthesizer a great deal when we’re doing science-fiction work, like The Orville, to create user interface beeps, spaceship noises and things. But for this show, we stayed away from that because it’s about the organic universe around us, and how we fit into it.

We tried to stick with recordings of real things for this show, and then we did a lot of processing and manipulation, but we tried to do it in a way where everything still sounded grounded and organic and natural. So if there was an instance where we might perhaps want to use some sort of synth bass thing, we would instead, for example, use a real bass guitar or stringed instrument — things that provided the show with an organic feel.

Did you guys provide the score as well?
Greasley: No, Alan Silvestri did the score, but there’s just so much we can do. Everybody who works at King Soundworks, almost without exception, is a musician. We’ve got drummers, guitarists, bass players and keyboard players. Having a sense of musicality really helps with the work we do, so those are honestly tools in our tool kit that we can reach for easily because it’s second nature to us. There’s a bunch of guitars on the wall at our main office, and everybody’s pulling guitars and basses out and playing throughout the day.

Greg King: We even use a didgeridoo as one of the elements for the Imagination — the ship that Neil deGrasse Tyson flies around in — because we like the low, throbbing, oscillating tone and the pitch range we can get out of it.

Sometimes I wasn’t sure where the sound design and score intersected. How do you balance those two, and what was the creative process like between yourselves and Silvestri?
King: Alan is one of the top composers in Hollywood. Probably the biggest recent things he did were the Avengers movies. He’s a super-pro, so he knows the score, he understands what territory the sound design is going to take and when each element is going to take center stage. More often than not, when we’re working with composers, that tends to be where things bump or don’t bump, but when you’re dealing with a pro like Alan, it’s innate with him — when score or design takes over.

Due to the show’s production schedule, we were often getting VFX while we were mixing, which required some improvisation. We’d get input from executive producers Brannon Braga and Ann Druyan, and once we had the VFX, if we needed to move Neil’s VO by a second, we could do that. We could start the music five seconds later, or maybe sound design would need to take over and we’d get out of the music for 30 seconds. And conversely, for an intense moment better served by score, we could pull back our sound effects and sound design and let the music carry the scene.

You pre-plan as much as you can, but because of the nature of this show, there was a lot of improvisation happening on the stage as we were mixing. We would very often just try things, and we were given the latitude by Ann and Brannon to try that stuff and experiment. The only rule was to tell the story better.

I heard that sense of musicality you’d mentioned, even in things like the backgrounds of the show. For example, Neil deGrasse Tyson’s walking through the forest, and you have it punctuated with woodpeckers.
Greasley: That was a good layer. There’s a sense of rhythm in nature anyway. We talk about this a lot… not necessarily being able to identify a constant or consistent beat or rhythm, but just the fact that the natural world has all of these ebbs and flows and rhythms and beats.

In music theory classes, they’ll talk about how there’s a reason 4/4 is the most common time signature, and it’s because so many things we do in life are in fours: walking or your heartbeat, anything like that. That’s the theory, anyway.

King: Exactly, because one of the overarching messages of this series is that we’re all connected, everything’s connected. We don’t live in isolation. So from the cellular level in our planet to within our bodies to this big macro level through the universe, things have a natural rhythm in a sense and communicate consciously or unconsciously. So we try to tie things together by building rhythmic beats and hits so they feel connected in some way.

Did you use all of the elements of sound design for that? Backgrounds? Effects?
King: Absolutely. Yeah, we’ll do that in the backgrounds, like when Neil deGrasse Tyson is walking across the calendar, we’ll be infusing that kind of thing. So as you go from scene to scene and episode to episode, there’s a natural feel to things. It doesn’t feel like individual events happening; they’re somehow, even subconsciously, tied together.

It definitely contributed to an emotional experience by the end of the episode. For the mycelium network, what sound effects or recordings did you start off with? Sounds from nature, and then you process them?
King: Yes — and sparking. We had a whole bunch of recordings of different levels of sparking, and we took those electrical arcs and manipulated and processed them to give them that lighter, more organic feeling. When we saw the mycelium, we were thinking of connecting the communication of the bees, brain waves and the mycelium sending information among the different plants. That’s an example of the things we’re trying to tie together on that organic level.

It’s all natural, so we wanted to keep it feeling that way so that the mycelium sound would then tie into brain wave sounds or bees communicating.

Greasley: Some of the specific elements in the mycelium include layers made from chimes — metallic or wooden chimes — that are processed and manipulated. Then we used the sound of gas: pressure-release-type sounds of air escaping. That gives you a delicate, almost white-noise quality, but in a really specific way. We used a lot of layers of those sorts of things to create the idea of those particles moving around and communicating with each other as well.

You stress the organic nature of the sound design, but at times elements sounded bitcrushed or digitized a bit, and that made sense to me. The way I understand things like neural networks is almost already in a digital context. Did you distort the sounds, mix them together to glue them?
Greasley: There’s definitely a ton of layers, and sometimes, yeah, it can help to run everything through one process to make the elements stick together. I don’t specifically bitcrush, although we did do a lot of time stretching, so sometimes you end up with artifacting — sometimes it’s desirable and sometimes it isn’t. There’s also lots of reverb, because reverb is one of the more naturalistic-sounding processes you can apply.

With reverbs in mind, how much of the reverbs on Tyson’s voice were recorded on set, and how much was added in post?
King: That’s a great question because the show’s production period was long. In one shot, Mr. Tyson may be standing on cliffs next to the ocean, and then the next time you see him he’s in this very lush forest. Not only are those filmed at different times, but because they’re traveling around so much, they often hire a local sound recordist. So not only is his voice recorded at different times and in different locations, but by different sets of equipment.

There’s also a bunch of in-studio narration, and that came in multiple parts as well. As they were editing, they’d discover, “We need to flesh out this line more,” or, “Now that we’ve cut it this way, we have to add this information or change this cadence.”

So now you had old studio recordings, new studio recordings and all the various different location recordings. And we’re trying to make it sound like it’s one continuous piece, so you don’t hear all those differences. We used a combination of reverbs so that when you went from one location, you didn’t have a jarring reverb change.

A lot of it was our ADR mixer Laird Fryer, who really took it upon himself to research those original production recordings so when Neil came into the studio here, he could match the microphones as much as possible. Then our ADR supervisor Elliot Thompson would go through and find the best takes that matched. It was actually one of the bigger tasks of the show.

Do you use automated EQ match tools as a starting point?
King: Absolutely. I use iZotope EQ matching all the time. That’s the starting point. Sometimes you get lucky and it matches great right away, and you go, “Wow, that was awesome. Fantastic.” But usually it’s just a starting point, and then it’s a combination of additional EQ by ear, plus reverb matching and EQing the reverb. Then I’ll use a multiband compressor — I like the FabFilter multiband compressor — to further shape the EQ of the dialogue in a gentler way.

I’ve used all those different tools to try getting it as close as I could. And sometimes there will be a shift in the quality of his dialogue, but we decided that was a better way to go because maybe there was just a performance element of the way he delivered a line. So essentially the trade-off was to go with a minor mismatch to keep the performance.
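The matching King describes as a starting point can be sketched in code. The following is a minimal NumPy illustration of the general idea behind EQ-matching tools — derive a gain curve from the ratio of two recordings’ average spectra — and is not iZotope’s actual algorithm; the synthetic signals, window size and ±12 dB limit are all assumptions for the demo.

```python
import numpy as np

def match_eq_curve(reference, target, n_fft=4096, max_db=12.0):
    """Derive a rough EQ gain curve that pushes the target recording's
    average spectrum toward the reference's. Inputs must be at least
    n_fft samples long."""
    def avg_mag(x):
        # Average magnitude spectrum over overlapping Hann windows.
        win = np.hanning(n_fft)
        hops = range(0, max(len(x) - n_fft, 1), n_fft // 2)
        mags = [np.abs(np.fft.rfft(x[i:i + n_fft] * win)) for i in hops]
        return np.mean(mags, axis=0) + 1e-9

    gain = avg_mag(reference) / avg_mag(target)
    # Clamp boosts/cuts to +/- max_db so the match stays gentle.
    lim = 10 ** (max_db / 20)
    return np.clip(gain, 1 / lim, lim)

# Demo with stand-in signals: a "location" take and a quieter "ADR" take.
rng = np.random.default_rng(1)
sr = 48000
on_set = rng.standard_normal(sr)
studio = 0.5 * rng.standard_normal(sr)
curve = match_eq_curve(on_set, studio)  # per-bin linear gains
```

A real tool would smooth this curve across bins and apply it with a linear-phase filter; this sketch only shows where the match curve comes from.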

What would your desert island EQ be?
King: We have different opinions. We both do sound effects, and we both do dialogue, so it’s a personal taste. On dialogue, right now I’m a real fan of the FabFilter suite of EQs. For music, I tend to use the McDSP EQs.

Greasley: We’re on the same page for the music EQs. When I’m mixing music, I love McDSP Channel G. Not only the EQ; the compressor is also fantastic on that particular plugin. I use that on all of my sound effects and sound design tracks too. Obviously, before you get to the mix, there’s a whole bunch of other stuff you could use from the design stage, but once I’m actually mixing it, the Channel G is my go-to.

VFX play a heavy role in both the mycelium network and the bee dances. Can you talk about how that affected your workflow/process?
Greasley: When we started prepping the design, some of the visuals were actually not very far along. It’s fun to watch the early cuts of the episodes, because what’s ultimately going to end up being Neil standing there with a DNA strand floating above the palm of his hand begins with him standing in front of a greenscreen with a light bulb on a C-stand in his hand.

Sometimes we had to start working on our sound concepts based almost purely on a description of what we were eventually going to be seeing. Based on that, and on the conversations we had with Ann Druyan and Brannon Braga in the spotting sessions, the sound concepts would have to develop in tandem with the visual concepts — both grow out of the intellectual concepts. Then on the mix stage, we would get some of these visual elements in, and we would have to tweak what we had done, and what the rest of that team had done, right up until the 11th hour.

Were your early sound sketches shown to the VFX department so they could take inspiration from that?
Greasley: That’s a good question. We did provide some stuff, not necessarily to the VFX department, but to the picture editing team. They would ask us to send things not to help so much with conceptualization of things, but with timings. So one of the things they asked us for early on was sounds for the Ship of the Imagination. They would lay those sounds in, and that helped them to get the rhythm of the show and to get a feel for where certain sounds are going to align.

I’m surprised to hear how early in the production process you began working on your sound design, based on how well the bee dance sounds match the light tracer along the back of the bee.
King: That was a lot of improvisation Jon and I were doing on the mix stage. We’re both sound designers, sound editors and mixers, so while we were mixing, we would be getting updates, because part of the bee dance sequence is pure hand-drawn animation, and some of it is actual beehive material, where they show you in a graphical way how the bees communicate with their wiggles and their waggles.

We then figured out a way to grab a bee sound and make it sound like it’s doing those movements, rhythms and wiggles. There’s a big satellite dish in the show, and at the end, you hear these communications coming through the computer panel that are suggested as alien transmissions. We actually took the communication methods we had developed for the bee wiggles and waggles and turned that into alien communication.

What did you process it with to achieve that?
King: Initially, we recorded actual bee sounds. We’re lucky that I live about an hour outside of LA in Santa Paula, which has beehives everywhere. We took constant bee sounds, edited them and used LFO filters to get the rhythms, and then we’d do sound editing for the starts and stops.

For the extraterrestrial communication at the end, we took the bee sounds, really thinned them out and processed them to make them sound a little more radio frequency/telecommunication-like. Then we also took shortwave radio sounds and ran those through the exact same LFO-filter and editing process so they had the same rhythm. So while the sound is different, it reads as a similar form of communication.
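The LFO-filter rhythm King describes — turning a constant drone into a pulsing, speech-like pattern — can be approximated with simple amplitude modulation. This is a hedged sketch of the concept, not their actual chain: NumPy only, a synthesized tone standing in for a real bee recording, and the 4 Hz LFO rate and modulation depth chosen arbitrarily.

```python
import numpy as np

SR = 48000  # sample rate, Hz

def lfo_gate(audio, sr, lfo_hz=4.0, depth=0.9):
    """Impose a rhythmic pulse on a steady sound by modulating its
    amplitude with a low-frequency oscillator (LFO)."""
    t = np.arange(len(audio)) / sr
    # Raised-sine LFO swinging between (1 - depth) and 1:
    # deeper depth = stronger rhythm.
    lfo = 1.0 - depth * 0.5 * (1.0 + np.sin(2 * np.pi * lfo_hz * t))
    return audio * lfo

# Stand-in for a constant bee buzz: a 220 Hz tone with a little noise.
t = np.arange(SR) / SR
buzz = 0.5 * np.sin(2 * np.pi * 220 * t) + 0.05 * np.random.randn(SR)

rhythmic = lfo_gate(buzz, SR, lfo_hz=4.0, depth=0.9)
```

Running the shortwave layer through the same LFO curve would give it the same rhythm while keeping its own timbre, which matches the effect described above.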

What I really learned from the series is that there’s all this communication going on that we aren’t aware of, and the mycelium is a great example of that. I didn’t know that different trees and plants communicate with each other — about the condition of the soil, root supply and pest invasions. It makes you see a forest in a different way.

It’s the same with the bees. I knew bees were intelligent insects, but I had no idea that a bee could pinpoint an exact location two or three miles away by a sophisticated form of communication. So that gave us the inspiration that runs through the whole series. We’re all dependent on each other; we’re all communicating with each other. In our sound design process, we wanted there to be a thread between all those forms of communication, whatever they are — that they’re basically all coming from the same place.

There’s a scene where a bee goes out to scout a new hive and ends up in a hollowed-out tree. It’s a single bee floating and moving up, down, left, right, front, back. I imagine you’d achieve that movement through panning and the depth would be through volume. Is there any mixing trick that you’re using to do the up and down?
Greasley: That’s such a level of detail. That’s cool that you even asked the question. Yes — left and right, obviously; we’re in 5.1, so panning covers left-right and front-back. As with most things, it’s the simplest things that get you the best results: EQ and reverb, essentially. You can create the illusion of height with EQ. Say you put a notch at a certain frequency, and then as the bee flies up, you roll the center of that notch higher. So you track the up-and-down movement of the bee with a little notch in the EQ, and it gives you this extra sense of movement. Since the frequency is moving up and down, you can trick the ear and the brain into perceiving it as height, because that’s what you’re looking at. It’s that great symbiosis of audio and video working together.

Then you can use a little bit of a woody-sounding reverb — say, a convolution reverb recorded in a tight wooden room — and let that stand in for the inside of the hollowed-out tree.
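Greasley’s moving-notch trick can be sketched as a notch filter whose center frequency ramps upward over the clip. This is an illustrative approximation only — a real implementation would interpolate filter coefficients per sample or crossfade blocks to avoid clicks — and the frequencies, Q and block size here are assumptions, with broadband noise standing in for the bee recording.

```python
import numpy as np
from scipy.signal import iirnotch, lfilter

SR = 48000  # sample rate, Hz

def height_notch(audio, sr, f_start=800.0, f_end=3000.0, q=8.0, block=1024):
    """Sweep a narrow EQ notch upward in frequency across the clip,
    block by block, to suggest rising height."""
    out = np.empty_like(audio)
    n_blocks = int(np.ceil(len(audio) / block))
    for i in range(n_blocks):
        # Interpolate the notch centre from f_start up to f_end.
        frac = i / max(n_blocks - 1, 1)
        f0 = f_start + frac * (f_end - f_start)
        b, a = iirnotch(f0, q, fs=sr)
        seg = audio[i * block:(i + 1) * block]
        out[i * block:i * block + len(seg)] = lfilter(b, a, seg)
    return out

# Stand-in bee: broadband noise, so the moving notch has something to carve.
rng = np.random.default_rng(0)
bee = 0.3 * rng.standard_normal(SR)
ascending = height_notch(bee, SR)
```

Driving `f_start`/`f_end` from the animated bee’s vertical position each frame would track the picture the way the mix described above does.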

King: A lot of pitch work was done with the bees too, because when you record a bee, they’re so quiet; it basically goes “bzz” and it’s gone. So you actually end up using a lot of, let’s call them static bees, where the bee is just buzzing. Then you’re having to pitch that and build fake Dopplers to give the sense of movement. You pitch it down as it gets further away and add more reverb, then do an EQ layer on top of that — and the same as one approaches or flies by. So you’re actually spending a lot of time creating what feel like very natural sounds but that aren’t really possible to record.

A plugin like Serato Pitch ’n Time is great for variable pitch too, because if you want something to sound like it’s moving away from you, you can have the pitch drop over the course of the sound, and the reverse for something approaching you.
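The fake Doppler King describes can be sketched with variable-rate resampling plus a level fade: pitch falls and volume drops as the virtual source recedes. A minimal NumPy sketch, not the Pitch ’n Time workflow itself — the synthesized tone stands in for a static bee, and the rate and gain ramps are arbitrary illustration values.

```python
import numpy as np

SR = 48000  # sample rate, Hz

def fake_doppler(audio, rate_start=1.0, rate_end=0.85, gain_end=0.4):
    """Fake a receding source: ramp the playback rate down (lowering
    pitch over time) and fade the level as the virtual distance grows."""
    n = len(audio)
    rates = np.linspace(rate_start, rate_end, n)
    positions = np.cumsum(rates)           # time-varying read positions
    positions = positions[positions < n - 1]
    # Linear-interpolation resampling at the ramped rate.
    src = np.interp(positions, np.arange(n), audio)
    gain = np.linspace(1.0, gain_end, len(src))
    return src * gain

# Stand-in static bee: a steady 300 Hz tone.
t = np.arange(SR) / SR
buzz = 0.5 * np.sin(2 * np.pi * 300 * t)
moved = fake_doppler(buzz)
```

Reversing the two ramps gives the approach version; adding reverb and the EQ layer mentioned above would finish the illusion.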

Greg King playing guitar

How do you get a single bee sound?
King: You gather a few bees and do a few different things. The easiest way is to get a few bees, bring them into your house, and release them at a brightly lit window. Then the bees are buzzing away like crazy to try to get out the window. You can just track it with the microphone. You’ll then have to go through and edit out any of the louder window knocks.

I’ve tried all different things through the years, like having them in jars and all that kind of stuff, but there’s too much container acoustic in those recordings. I’ve found that with flies, grasshoppers, or any of the larger winged insects that actually make a noise, doing it in the daytime against the window is the best way, because they’ll go for a long time.

What was your biggest takeaway, as an artist, as a sound designer, from working on this project?
Greasley: It was so mind-blowing how much we learned from the people on Cosmos. The people that put the show together can accurately be described as geniuses, particularly Ann. She’s just so unbelievably smart.

Each episode had its individual challenges and taught us things in terms of the craft, but I think for me, the biggest takeaway on a personal and intellectual level is the interconnectedness of everything in the observable world. And the further we get with science, the more we’re able to observe, whether it’s at the subatomic quantum level or billions of light-years away.

Just the level to which all life and matter is interconnected and interdependent.

I also think we’re seeing practical examples of that right now with the coronavirus, in terms of unexpected consequences. It’s like a microcosm for what could happen in the future.
King: On a whole philosophical level, we’re at this particular point in time globally, where we seem to be going down a path of ignoring science, or denying science is there. And when you get to watch a series like Cosmos, you can see science is how we’re going to survive. If we learn to interact with nature, and use nature as a technology, as opposed to using nature as a resource, what we could eventually do is mind-blowing. So I think the timing of this is ideal.

Patrick Birk is a musician, sound engineer and post pro at Silver Sound, a boutique sound house based in New York City.