

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, actors in The Mandalorian performed inside a massive, immersive LED volume: a 20-foot-high, 270-degree semicircular video wall and ceiling enclosing a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens.
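The published dimensions give a sense of the stage's scale. As a back-of-the-envelope check (using only the figures quoted above), the length of LED wall wrapping 270 degrees around a 75-foot-diameter floor works out to roughly 177 feet:

```python
import math

def arc_length_ft(diameter_ft: float, degrees: float) -> float:
    """Length of a circular arc spanning the given angular sweep."""
    circumference = math.pi * diameter_ft
    return circumference * (degrees / 360.0)

# 270-degree wall around a 75-ft-diameter stage: ~176.7 ft of LED wall
wall_length = arc_length_ft(75, 270)
```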

Digital 3D environments created by ILM played back interactively on the LED walls, edited in realtime during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment, with accurate interactive light on the actors and practical sets. This gave showrunner Favreau; executive producer/director Dave Filoni; visual effects supervisor Richard Bluff; cinematographers Greig Fraser and Barry “Baz” Idoine; and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and to achieve realtime in-camera composites on set.
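StageCraft's internals are not public, but the core idea described here — rendering the wall's imagery from the tracked camera's viewpoint so parallax looks correct through the lens — is a standard off-axis (asymmetric) projection. A minimal sketch, assuming a flat screen in the z=0 plane and a camera tracked at position `eye` (all names illustrative, not ILM's or Epic's API):

```python
def off_axis_frustum(eye, screen_min, screen_max, near):
    """Near-plane frustum extents for a camera facing an axis-aligned
    screen rectangle in the z=0 plane (eye z must be > 0).

    Recomputing this every frame from the tracked camera position is
    what produces correct parallax: as the camera moves, the frustum
    skews so the rendered imagery stays aligned with the physical
    screen from the camera's point of view.
    """
    ex, ey, ez = eye
    scale = near / ez  # similar triangles: project screen edges onto near plane
    left   = (screen_min[0] - ex) * scale
    right  = (screen_max[0] - ex) * scale
    bottom = (screen_min[1] - ey) * scale
    top    = (screen_max[1] - ey) * scale
    return left, right, bottom, top
```

These extents would then feed a standard asymmetric projection matrix (e.g. glFrustum-style); a centered camera yields a symmetric frustum, while any lateral move skews it.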

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau, adding, “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools combine traditional filmmaking equipment and methodologies with all the advantages of a fully digital workflow. With ILM StageCraft, a production can acquire many in-camera finals, giving filmmakers immediate and complete creative control of work typically handed off and reinterpreted in post. This improves the quality of visual effects shots with perfectly integrated elements and reduces visual effects requirements in post, a major benefit considering today’s compressed schedules.

Amazon’s The Expanse Season 4 gets HDR finish

The fourth season of the sci-fi series The Expanse, streaming via Amazon Prime Video, is the first to be finished in HDR. Deluxe Toronto handled end-to-end post services, including online editorial, sound remixing and color grading. The series was shot on ARRI Alexa Minis.

In preparation for production, cinematographer Jeremy Benning, CSC, shot anamorphic test footage at a quarry that would serve as the filming stand-in for the season’s new alien planet, Ilus. Deluxe Toronto senior colorist Joanne Rourke then worked with Benning, VFX supervisor Bret Culp, showrunner Naren Shankar and series regular Breck Eisner to develop looks that would convey the location’s uninviting and forlorn nature, keeping the overall look desaturated and removing color from the vegetation. To further distinguish Ilus from other environments, production chose to display scenes set on or above Ilus in a 2.39:1 aspect ratio, while those featuring Earth and Mars remained in 16:9.
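The per-location aspect-ratio split has a simple practical consequence: the 2.39:1 scenes are letterboxed inside the 16:9 delivery raster. A quick sketch of the arithmetic, assuming a UHD (3840×2160) frame for illustration (the article does not state the delivery resolution):

```python
def letterbox(frame_w: int, frame_h: int, target_aspect: float):
    """Active picture height and bar height when a wider aspect
    ratio is letterboxed inside a 16:9-style frame."""
    active_h = round(frame_w / target_aspect)
    bar = (frame_h - active_h) // 2
    return active_h, bar

# 2.39:1 inside 3840x2160 -> 1607 active lines, ~276-line bars top and bottom
active, bar = letterbox(3840, 2160, 2.39)
```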

“Moving into HDR for Season 4 of our show was something Naren and I have wanted to do for a couple of years,” says Benning. “We did test HDR grading a couple seasons ago with Joanne at Deluxe, but it was not mandated by the broadcaster at the time, so we didn’t move forward. But Naren and I were very excited by those tests and hoped that one day we would go HDR. With Amazon as our new home [after airing on Syfy], HDR was part of their delivery spec, so those tests we had done previously had prepared us for how to think in HDR.

“Watching Season 4 come to life with such new depth, range and the dimension that HDR provides was like seeing our world with new eyes,” continues Benning. “It became even more immersive. I am very much looking forward to doing Season 5, which we are shooting now, in HDR with Joanne.”

Rourke, who has worked on every season of The Expanse, explains, “Jeremy likes to set scene looks on set so everyone becomes married to the look throughout editorial. He is fastidious about sending stills each week, and the intended directive of each scene is clear long before it reaches my suite. This was our first foray into HDR with this show, which was exciting, as it is well suited for the format. Getting that extra bit of detail in the highlights made such a huge visual impact overall. It allowed us to see the comm units, monitors, and plumes on spaceships as intended by the VFX department and accentuate the hologram games.”

After making adjustments and ensuring initial footage was even, Rourke then refined the image by lifting faces and story points and incorporating VFX. This was done with input provided by producer Lewin Webb; Benning; cinematographer Ray Dumas, CSC; Culp or VFX supervisor Robert Crowther.

To manage the show’s high volume of VFX shots, Rourke relied on Deluxe Toronto senior online editor Motassem Younes and assistant editor James Yazbeck to keep everything in meticulous order. (For that they used the Grass Valley Rio online editing and finishing system.) The pair’s work was also essential to Deluxe Toronto re-recording mixers Steve Foster and Kirk Lynds, who have both worked on The Expanse since Season 2. Once ready, scenes were sent in HDR via Streambox to Shankar for review at Alcon Entertainment in Los Angeles.

“Much of the science behind The Expanse is quite accurate thanks to Naren, and that attention to detail makes the show a lot of fun to work on and more engaging for fans,” notes Foster. “Ilus is a bit like the wild west, so the technology of its settlers is partially reflected in communication transmissions. Their comms have a dirty quality, whereas the ship comms are cleaner-sounding and more closely emulate NASA transmissions.”

Adds Lynds, “One of my big challenges for this season was figuring out how to make Ilus seem habitable and sonically interesting without familiar sounds like rustling trees or bird and insect noises. There are also a lot of amazing VFX moments, and we wanted to make sure the sound, visuals and score always came together in a way that was balanced and hit the right emotions story-wise.”

Foster and Lynds worked side by side on the season’s 5.1 surround mix, with Foster focusing on dialogue and music and Lynds on sound effects and design elements. When each had completed his respective passes using Avid Pro Tools workstations, they came together for the final mix, spending time on fine details, ensuring the dialogue was clear and making adjustments as VFX shots were dropped in. Final mix playbacks were streamed to Deluxe’s Hollywood facility, where Shankar could hear adjustments completed in real time.

In addition to color finishing Season 4 in HDR, Rourke also remastered the three previous seasons of The Expanse in HDR, using her work on Season 4 as a guide and finishing with Blackmagic DaVinci Resolve 15. Throughout the process, she was mindful to pull out additional detail in highlights without altering the original grade.

“I felt a great responsibility to be faithful to the show for the creators and its fans,” concludes Rourke. “I was excited to revisit the episodes and could appreciate the wonderful performances and visuals all over again.”


Telestream’s Wirecast now integrated in BoxCast platform

BoxCast has completed the integration of Telestream Wirecast with its BoxCast platform. Telestream Wirecast is live video production software for Mac and Windows that helps create high-quality live video webcasts from multiple sources, from webcams and screen shares to multiple cameras, graphics and media for live events.

As a result of the BoxCast/Wirecast integration, users can now easily stream high-quality video using BoxCast’s advanced, cloud-based platform. With unlimited streaming, viewership and destinations, BoxCast manages the challenging part of live video streaming.

The BoxCast Live Streaming Platform provides Wirecast users access to a number of features, including:
• Single Source Simulcasting
• Ticketed Monetization
• Password Protection
• Video Embedding
• Cloud Transcoding
• Live Support

How does it work? Using BoxCast’s RTMP video ingestion option, users can select BoxCast as a streaming destination from within Wirecast, allowing Wirecast to stream directly to BoxCast. In this configuration, the computer handles video and audio encoding, and the stream is transmitted over RTMP.

The setup can be used with either a single-use RTMP channel or a static RTMP channel. In both cases, however, the setup must be done within 10 minutes of a scheduled broadcast.
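A small sketch of the two constraints just described: building an ingest URL and enforcing the 10-minute setup window. The URL shape and function names here are hypothetical — BoxCast's actual endpoint format is not documented in this article — and the window check takes one reading of the rule (setup completed in the 10 minutes before the scheduled start):

```python
from datetime import datetime, timedelta

SETUP_WINDOW = timedelta(minutes=10)

def can_configure(now: datetime, scheduled_start: datetime) -> bool:
    """True if we are inside the 10-minute window before the
    scheduled broadcast start (one interpretation of the rule)."""
    return timedelta(0) <= scheduled_start - now <= SETUP_WINDOW

def rtmp_url(server: str, stream_key: str) -> str:
    """Hypothetical RTMP ingest URL shape -- illustrative only."""
    return f"rtmp://{server}/live/{stream_key}"
```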

Another way to stream from Wirecast is to send the Wirecast program output to a secondary HDMI or SDI output connected to a BoxCaster or BoxCaster Pro. The BoxCaster’s hardware encoding relieves the computer of encoding video and audio, while also taking advantage of specially designed communication protocols that optimize available network connectivity.

BoxCast integration with Telestream Wirecast is available immediately.


BoxCast offers end-to-end live streaming

By Jonathan Abrams

My interest in BoxCast originated with their social media publishing capabilities (Facebook Live, YouTube Live, Twitter). I met with Gordon Daily (CEO/co-founder) and Sam Brenner (VP, marketing) during this year’s NAB Show.

BoxCast’s focus is on end-to-end live streaming and simplifying the process through automation. At the originating, or transmit (XMT), end is either a physical encoder or a software encoder. The two physical encoders are BoxCaster and BoxCaster Pro. The software encoders are Broadcaster and Switcher (for iDevices). The BoxCaster can accept either a 1080p60 (HDMI) or CVBS video input. Separate audio can be connected using two RCA inputs. The BoxCaster Pro ($990, shipping Q3) can accept a 4Kp60 input (12G-SDI or HDMI 2.0a) with High Dynamic Range (HDR10). If you are not using embedded audio, there are two combination XLR/TRS inputs.

Both the BoxCaster and BoxCaster Pro use the H.264 (AVC) codec, while the BoxCaster Pro can also use the H.265 (HEVC) codec, which provides approximately a 2x compression-efficiency improvement over H.264 (AVC). BoxCast uses Amazon Web Services (AWS) as its cloud. The encoder output is uploaded to the cloud using the BoxCast Flow protocol (patent pending), which mitigates lost packets using content-aware forward error correction (FEC), protocol diversity (UDP and/or TCP), adaptive recovery, encryption and link-quality adjustment for bandwidth flow control. Their FEC implementation does not have an impact on latency. Upload takes place via either Ethernet or Wi-Fi (802.11ac, 2×2 MIMO). The cloud is where distribution and transcoding take place, using BoxCast’s proprietary transcoding architecture. It is also where you can record your event and keep it for either a month or a year, depending on which monthly cloud offering you subscribe to. Both recordings and streams can be encrypted using their custom, proprietary solution.
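BoxCast Flow itself is proprietary, but the general idea behind packet-level FEC — sending redundancy so a lost packet can be rebuilt at the receiver without a retransmission round-trip (hence no added latency) — can be illustrated with a single XOR parity packet per group. This is a toy sketch of the technique, not BoxCast's scheme:

```python
def xor_parity(packets):
    """Build a parity packet: byte-wise XOR of equal-length packets."""
    parity = bytearray(len(packets[0]))
    for pkt in packets:
        for i, b in enumerate(pkt):
            parity[i] ^= b
    return bytes(parity)

def recover(received, parity):
    """Rebuild the single missing packet in a group (None marks the
    loss) by XOR-ing the parity with every packet that did arrive."""
    missing = bytearray(parity)
    for pkt in received:
        if pkt is not None:
            for i, b in enumerate(pkt):
                missing[i] ^= b
    return bytes(missing)
```

One parity packet per group recovers exactly one loss; production schemes trade more redundancy for tolerance of burst losses, which is presumably what "content-aware" tuning adjusts.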

At the receiving end (RCV) is an embedded player if you are not using Facebook Live or YouTube Live.


Jonathan Abrams is Chief Technical Engineer at NYC’s Nutmeg Creative.


Switcher Studio intros Go app for mobile video creators

Switcher Studio’s flagship product allows users to create TV-style multi-camera productions using iPhones and iPads. Over the past year, the company has seen a trend developing: a new type of live video that gained adoption with the integration of live streaming on platforms like Twitter (via Periscope) and Facebook.

Many of these broadcasts tend to be more spontaneous and less pre-planned, so Switcher Studio set out to find a way to make these types of productions a better experience while enhancing the creation process for existing users.

The result, Switcher Go, is designed specifically for mobile video creators. It includes advanced video features that let you go beyond “point-and-shoot” to create more engaging live and recorded video.

Switcher Go allows users to:
– Wirelessly connect to another iPhone or iPad to remotely control the camera from your pocket.
– Sync directly to Facebook Live or YouTube Now to quickly go live with one touch.
– Dial in advanced camera controls such as focus, exposure, white balance and more.
– Personalize video content by adding photos or video from your device’s camera roll while recording or streaming.
– Mark moments during broadcasts, then easily trim and share clips on social media (this feature is coming soon).

In the next few months, Switcher Go users will have the option to upgrade their free account to add unlimited photos and videos from their camera roll, as well as Switcher’s cloud services and desktop tools, currently available only in the pro Switcher Studio product.