
Timecode’s new firmware paves the way for VR

Timecode Systems, which makes wireless technologies for sharing timecode and metadata, has launched a firmware upgrade that enhances the accuracy of its wireless genlock.

Promising sub-line-accurate synchronization, the upgrade allows Timecode Systems products to stay locked in sync more tightly, setting the scene for a wireless sensor sync solution able to meet the requirements of VR/AR and motion capture.

“The industry benchmark for synchronization has always been ‘frame-accurate’, but as we started exploring the absolutely mission-critical sync requirements of virtual reality, augmented reality and motion capture, we realized sync had to be even tighter,” said Ashok Savdharia, chief technical officer at Timecode Systems. “With the new firmware and FPGA algorithms released in our latest update, we’ve created a system offering wireless genlock to sub-line accuracy. We now have a solid foundation on which to build a robust and immensely accurate genlock, HSYNC and VSYNC solution that will meet the demands of VR and motion capture.”
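To put the difference between frame-accurate and sub-line-accurate sync in perspective, a quick back-of-the-envelope calculation helps. The figures below assume a 1080p25 signal with 1125 total lines per frame and are purely illustrative, not a product specification:

```python
# Illustrative arithmetic only: frame vs. line sync windows for 1080p25.
# Assumes 1125 total lines per frame (including blanking); these are
# generic broadcast timing figures, not Timecode Systems specifications.

FRAME_RATE = 25       # frames per second
TOTAL_LINES = 1125    # total lines per frame, including blanking

frame_duration = 1 / FRAME_RATE               # 0.04 s  -> 40 ms
line_duration = frame_duration / TOTAL_LINES  # ~35.6 microseconds

print(f"Frame-accurate window:    {frame_duration * 1e3:.1f} ms")
print(f"Sub-line-accurate window: under {line_duration * 1e6:.1f} µs")
```

In other words, sub-line accuracy demands a sync window more than a thousand times tighter than frame accuracy.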

A veteran in camera and image sensor technology, Savdharia joined Timecode Systems last year. In addition to building up the company’s multi-camera range of solutions, he is leading a development team pioneering a wireless sync system for the VR and motion capture market.

Timecode’s :Pulse for multicamera sync and control now available

Timecode Systems, which makes wireless technologies for sharing timecode and metadata, has made its :Pulse multicamera sync and control product available for purchase.

Powered by the company’s robust Blink RF protocol, the :Pulse offers wireless sync and remote device control capability in one product. Used in its simplest form, the :Pulse is a highly accurate timecode, genlock and word clock generator with an integrated RF transceiver to ensure solid synchronization with zero drift between timecode sources.
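As a rough illustration of what “zero drift between timecode sources” means in practice, timecode can be treated as an absolute frame count, so two sources read at the same instant should report identical values. The sketch below is a generic example, not Timecode Systems code, and assumes non-drop-frame timecode at an integer frame rate:

```python
# Minimal sketch (generic, not Timecode Systems code): treating SMPTE
# timecode as an absolute frame count makes it easy to see whether two
# sources are locked. Assumes non-drop-frame timecode at an integer rate.

def timecode_to_frames(tc: str, fps: int = 25) -> int:
    """Convert 'HH:MM:SS:FF' to a total frame count."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_offset(tc_a: str, tc_b: str, fps: int = 25) -> int:
    """Offset in frames between two readings taken at the same instant."""
    return timecode_to_frames(tc_a, fps) - timecode_to_frames(tc_b, fps)

# Two devices read at the same moment; a zero offset means they are locked.
print(frames_offset("10:15:30:12", "10:15:30:12"))  # 0 -> in sync
print(frames_offset("10:15:30:13", "10:15:30:12"))  # 1 -> one frame apart
```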

As well as being a hub for timecode and metadata exchange, it’s also a center for wirelessly controlling devices on multicamera shoots. With a :Pulse set as the timecode master unit, users can activate the device’s integral Wi-Fi or add a wired connection to the Ethernet port to open the free, multiplatform Blink Hub app on their smartphones, tablets or laptops.

Enabled by the Blink RF protocol, the Blink Hub app allows users to centrally monitor and control not only all Timecode Systems timecode sources on set, but also any compatible camera and audio equipment to which they are connected.

Timecode Systems has already developed a bespoke remote device control solution for Sound Devices 6-Series mixer/recorders and is working on adding to the :Pulse the capability to control GoPro, Arri and Red cameras remotely via the Blink Hub app.

“With the production of the SyncBac Pro, our embedded timecode sync accessory for GoPro cameras, now in full flow, we’re very close to launching remote control of Hero4 Silver and Black cameras,” says CEO Paul Scurrell. “Using either the :Pulse’s Wi-Fi or a wired Ethernet connection into the :Pulse, SyncBac Pro users will be able to connect their GoPro Hero4 Black and Silver cameras to the Blink Hub app. This, among other things, unlocks the capability to put a GoPro to sleep remotely and then start recording again from the app when the action starts again. It’s a great way to save the camera’s battery life when it’s gear-mounted or rigged somewhere inaccessible.”

This DIT talks synchronization and action scenes

By Peter Welch

As a digital imaging technician on feature films, I work closely with the director of photography, camera crew, editorial and visual effects teams to make sure the right data is collected for post production to transform raw digital footage into a polished movie as efficiently as possible.

A big part of this role is to anticipate how things could go wrong during a shoot, so I can line up the optimal combination of technical kit to minimize the impact should the worst happen. With all the inherent unpredictability that comes with high-speed chases and pyrotechnics, feature film action sequences can offer up some particularly tough challenges. These pivotal scenes are incredibly costly to get wrong, so every technological requirement is amplified. This includes the need to generate robust and reliable timecode.

I take timecode very seriously, seeing it as an essential part of the camera rather than a nice-to-have add-on. Although a break in timecode doesn’t really affect us on set, further down the line it can cause other departments a whole world of problems. In the case of last year’s Avengers: Age of Ultron, creating the spectacular scenes and VFX we’ve come to expect from Marvel involved developing solid workflows for some very large multi-camera set-ups. For some shots, as many as 12 cameras were rolling with a total camera package of 27 cameras, including Arri Alexa XTs, Canon C500s with Codex recorders, Red Epics and Blackmagic cameras. The huge amounts of data generated made embedding accurate, perfectly synced timecode into every piece of footage an important technical requirement.

One of the largest action sequences for Age of Ultron was filmed in Korea with eight cameras rigged to capture footage — four Arri Alexas and four Canon C500s — and huge volumes of RAW output going to Codex recorders. With this shoot, there was a chance that cameras could be taken out while filming, putting footage at risk of being lost. As a result, while the Alexas were strategically rigged a safe distance from the main action, the less costly C500s were placed in and around the explosion, putting them at an increased risk of being caught in the line of fire.

As an added complication, once the set was built, and definitely once it was hot with explosives, we couldn’t go back in to adjust camera settings. So while I was able to manually jam-sync the Alexas, the C500s had to be set to record with timecode running at the point of rigging. There wasn’t an opportunity to go back later and re-jam midway through the day — they had to stay in sync throughout, whatever twists and turns the filming process took.

With the C500 cameras placed in strategic positions to maximize the action, the Codex recorders, Preston MDRs and power were built into recording and camera control boxes (or ‘safe boxes’) and positioned at a distance from the cameras and then connected via a bespoke set of cables. Within each C500’s “safe box,” I also placed a Timecode Systems Minitrx+ set in receive mode. This was synced over RF to a master unit back outside of the “hot” zone.

With internal Li-Polymer batteries powering them for up to 12 hours, the Minitrx+ units in the C500 “safe boxes” could be left running throughout a long shooting day with complete confidence and no requirement for manual jamming or resetting. This set-up ensured all footage captured by the C500s in the “hot” zone was stamped with the same frame-accurate timecode as the Alexas. The timecode could also be monitored via the embedded feed in the SDI return video signals.

But it’s not just the pyrotechnics that inject unpredictability into shooting this kind of scene — the sheer scale of the locations can be as much of a challenge. The ability to synchronize timecode over RF definitely helps, but even with long-range RF it’s good to have a backup. For example, for one scene in 2015’s Spectre, 007 piloted a motorboat down a sizeable stretch of the Thames in London. For this scene, I rigged one camera with a Minitrx+ on a boat in Putney, powered it up and left it onboard filming James Bond. I then got in my car and raced down the Embankment to Westminster to set up the main technical base with the camera crews, with a Timecode Systems unit set to the same timecode as that on the boat.

Even though the boat started its journey out of range of its paired unit, the crystal inside the Minitrx+ continued to feed accurate timecode to that camera. As soon as the boat came back into range, the unit re-synced to the master with zero drift. There was no need to reset or re-jam.
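A rough calculation shows why a free-running unit with a high-quality crystal can hold frame-accurate timecode across a long day without re-jamming. The oscillator accuracy used below is an assumed, illustrative figure rather than a quoted specification for the Minitrx+:

```python
# Rough drift estimate for a free-running timecode generator.
# The 0.2 ppm oscillator accuracy is an assumed, illustrative figure,
# not a quoted specification for the Minitrx+.

PPM = 0.2                   # assumed oscillator accuracy (parts per million)
HOURS_FREE_RUNNING = 12     # a long shooting day with no re-jam
FPS = 25                    # frames per second
FRAME_MS = 1000 / FPS       # 40 ms per frame

drift_ms = HOURS_FREE_RUNNING * 3600 * 1000 * (PPM / 1e6)
print(f"Worst-case drift after {HOURS_FREE_RUNNING} h: {drift_ms:.1f} ms "
      f"({drift_ms / FRAME_MS:.2f} frames)")
# ~8.6 ms, or roughly a fifth of a frame -- still frame-accurate, whereas a
# camera's internal clock at tens of ppm would drift by several frames
# over the same period.
```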

Action sequences are certainly getting more ambitious, with footage captured from an ever-greater number and variety of camera sources. Although it’s possible to fix sync problems in post, it’s time-consuming and expensive. Getting it right at the point of shooting offers considerable efficiencies to the production process, something that every production demands — even those working with superhero budgets.

Peter Welch is a London-based digital imaging technician (DIT) with Camera Facilities.