Category Archives: IBC

Winners: IBC2017 Impact Awards

postPerspective has announced the winners of our postPerspective Impact Awards from IBC2017. All winning products reflect the latest version of the product, as shown at IBC.

The postPerspective Impact Award winners from IBC2017 are:

• Adobe for Creative Cloud
• Avid for Avid Nexis Pro
• Colorfront for Transkoder 2017
• Sony Electronics for Venice CineAlta camera

Seeking to recognize debut products and key upgrades with real-world applications, the postPerspective Impact Awards are determined by an anonymous judging body made up of industry pros. The awards honor innovative products and technologies for the post production and production industries that will influence the way people work.

“All four of these technologies are very worthy recipients of our first postPerspective Impact Awards from IBC,” said Randi Altman, postPerspective’s founder and editor-in-chief. “These awards celebrate companies that push the boundaries of technology to produce tools that actually make users’ working lives easier and projects better, and our winners certainly fall into that category. You’ll notice that our awards from IBC span the entire pro pipeline, from acquisition to on-set dailies to editing/compositing to storage.

“As IBC falls later in the year, we are able to see where companies are driving refinements to really elevate workflow and enhance production. So we’ve tapped real-world users to vote for the Impact Awards, and they have determined what could be most impactful to their day-to-day work. We’re very proud of that fact, and it makes our awards quite special.”

IBC2017 took place September 15-19 in Amsterdam. postPerspective Impact Awards are next scheduled to celebrate innovative product and technology launches at the 2018 NAB Show.

Xytech launches MediaPulse Managed Cloud at IBC

Facility management software provider Xytech has introduced a cloud and managed services offering, MediaPulse Managed Cloud. Hosted in Microsoft Azure, MediaPulse Managed Cloud is a secure platform offering full system management.

MediaPulse Managed Cloud is available through any web browser and compatible with iOS, Android and Windows mobile devices. The new managed services handle most administrative functions, including daily backups, user permissions and screen layouts. The offering is available with several options, including a variety of language packs, allowing for customization and localization.

Slated for shipping in October, MediaPulse Managed Cloud is compliant with European privacy laws and enables secure data transmission across multiple geographies.

Xytech debuted MediaPulse Managed Cloud at IBC2017. The show was the company’s first as a member of the Advanced Media Workflow Association, a community-driven forum focused on advancing business-driven solutions for networked media workflows.


Blackmagic’s new Ultimatte 12 keyer with one-touch keying

Building on the 40-year heritage of its Ultimatte keyer, Blackmagic Design has introduced the Ultimatte 12 realtime hardware compositing processor for broadcast-quality keying, adding augmented reality elements into shots, working with virtual sets and more. The Ultimatte 12 features new algorithms and color science, enhanced edge handling, greater color separation and color fidelity and better spill suppression.

The 12G-SDI design gives Ultimatte 12 users the flexibility to work in HD and switch to Ultra HD when they are ready. Sub-pixel processing is said to boost image quality and textures in both HD and Ultra HD. The Ultimatte 12 is also compatible with most SD, HD and Ultra HD equipment, so it can be used with existing cameras.

With Ultimatte 12, users can create lifelike composites and place talent into any scene, working with both fixed cameras and static backgrounds or automated virtual set systems. It also enables on-set previs in television and film production, letting actors and directors see the virtual sets they’re interacting with while shooting against a green screen.

Here are a few more Ultimatte 12 features:

  • For augmented reality, on-air talent typically interacts with glass-like computer-generated charts, graphs, displays and other objects with colored translucency. Adding tinted, translucent objects is very difficult with a traditional keyer, and the results don’t look realistic. Ultimatte 12 addresses this with a new “realistic” layer compositing mode that can add tinted objects on top of the foreground image and key them correctly.
  • One-touch keying technology analyzes a scene and automatically sets more than 100 parameters, simplifying keying as long as the scene is well-lit and the cameras are properly white-balanced. With one-touch keying, operators can pull a key accurately and with minimum effort, freeing them to focus on the program with fewer distractions.
  • Ultimatte 12’s new image processing algorithms, large internal color space and automatic internal matte generation let users work on different parts of the image separately with a single keyer.
  • For color handling, Ultimatte 12 has new flare, edge and transition processing to remove backgrounds without affecting other colors. The improved flare algorithms can remove green tinting and spill from any object — even dark shadow areas or through transparent objects.
  • Ultimatte 12 is controlled via Ultimatte Smart Remote 4, a touch-screen remote device that connects via Ethernet. Up to eight Ultimatte 12 units can be daisy-chained together and connected to the same Smart Remote, with physical buttons for switching and controlling any attached Ultimatte 12.

Ultimatte 12 is now available from Blackmagic Design resellers.


Adobe intros updates to Creative Cloud, including Team Projects

Later this year, Adobe will be offering new capabilities within its Adobe Creative Cloud video tools and services. This includes updates for VR/360, animation, motion graphics, editing, collaboration and Adobe Stock. Many of these features are powered by Adobe Sensei, the company’s artificial intelligence and machine learning framework. Adobe will preview these advancements at IBC.

The new capabilities coming later this year to Adobe Creative Cloud for video include:
• Access to motion graphics templates in Adobe Stock and through Creative Cloud Libraries, as well as usability improvements to the Essential Graphics panel in Premiere Pro, including responsive design options for preserving spatial and temporal relationships.
• Character Animator 1.0 with changes to core and custom animation functions, such as pose-to-pose blending, new physics behaviors and visual puppet controls. Adobe Sensei will help improve lip-sync capability by accurately matching mouth shape with spoken sounds.
• Virtual reality video creation with a dedicated viewing environment in Premiere Pro. Editors can experience the deeply engaging qualities of their content, review their timeline and use keyboard-driven editing for trimming and markers while wearing the same VR head-mounted displays as their audience. In addition, audio can be determined by orientation or position and exported as ambisonics audio for VR-enabled platforms such as YouTube and Facebook. VR effects and transitions are now native and accelerated via the Mercury playback engine.
• Improved collaborative workflows with Team Projects on the local area network, including managed access features that allow users to lock bins and provide read-only access to others. Formerly in beta, the release of Team Projects will offer smoother workflows hosted in Creative Cloud and the ability to more easily manage versions with auto-save history.
• Flexible session organization, multi-take workflows and continuous playback while editing in Adobe Audition. Powered by Adobe Sensei, auto-ducking has been added to the Essential Sound panel, automatically adjusting levels by type: dialogue, background sound or music.

Integration with Adobe Stock
Adobe Stock is now offering over 90 million assets, including photos, illustrations and vectors. Customers now have access to over 4 million HD and 4K video clips from Adobe Stock directly within their Creative Cloud video workflows, and can search and scrub assets in Premiere Pro.

Hundreds of professionally created motion graphics templates are coming to Adobe Stock later this year. Additionally, motion graphics artists will be able to sell their own motion graphics templates for Premiere Pro through Adobe Stock. Earlier this year, Adobe added editorial and premium collections from Reuters, USA Today Sports, Stocksy and 500px.


LumaForge offering support for shared projects in Adobe Premiere

LumaForge, which designs and sells high-performance servers and shared storage appliances for video workflows, will be at IBC this year showing full support for new collaboration features in Adobe Premiere Pro CC. When combined with LumaForge’s Jellyfish or ShareStation post production servers, the new Adobe features — including multiple open projects and project locking — allow production groups and video editors to work more effectively with shared projects and assets. This is something that feature film and TV editors have been asking for from Adobe.

Project locking allows multiple users to work with the same content. In a narrative workflow, an editing team can divide their film into shared projects per reel or scene. An assistant editor can get to work synchronizing and logging one scene, while the editor begins assembling another. Once the assistant editor is finished with their scene, the editor can refresh their copy of the scene’s Shared Project and immediately see the changes.

An added benefit of using Shared Projects on productions with large amounts of footage is the significantly reduced load time of master projects. When a master project is broken into multiple shared project bins, footage from those shared projects is only loaded once that shared project is opened.

“Adobe Premiere Pro facilitates a broad range of editorial collaboration scenarios,” says Sue Skidmore, partner relations for Adobe Professional Video. “The LumaForge Jellyfish shared storage solution complements and supports them well.”

All LumaForge Jellyfish and LumaForge ShareStation servers will support the Premiere Pro CC collaboration features for both Mac OS and Windows users, connecting over 10Gb Ethernet.

Check out their video on the collaboration here.


Chatting up IBC’s Michael Crimp about this year’s show

Every year, many from our industry head to Amsterdam for the International Broadcasting Convention. With IBC’s start date coming fast, what better time for the organization’s CEO, Michael Crimp, to answer questions about the show, which runs from September 15-19.

IBC is celebrating its 50th anniversary this year. How will you celebrate?
In addition to producing a commemorative book, and our annual party, IBC is starting a new charitable venture, supporting an Amsterdam group that provides support through sport for disadvantaged and disabled children. If you want to play against former Ajax players in our Saturday night match, bid now to join the IBC All-Stars.

It’s also about keeping the conversation going. We are 50 years on and have a huge amount to talk about — from Ultra HD to 5G connectivity, from IP to cyber security.

How has IBC evolved over the past 10 years?
The simple answer is that IBC has evolved along with the industry, or rather IBC has strived to identify the key trends which will transform the industry and ensure that we are ahead of the curve.

Looking back 10 years, digital cinema was still a work in progress: the total transition we have now seen was just beginning. We had dedicated areas focused on mobile video and digital signage, things that we take for granted today. You can see the equivalents in IBC2017, like the IP Showcase and all the work done on interoperability.

Five years ago we started our Leaders’ Summit, the behind-closed-doors conference for CEOs from the top broadcasters and media organizations, and it has proved hugely successful. This year we are adding two more similar, invitation-only events, this time aimed at CTOs. We have a day focusing on cyber security and another looking at the potential for 5G.

We are also trying a new business matchmaking venue this year, the IBC Startup Forum. Working with Media Honeypot, we are aiming to bring startups and scale-ups together with the media companies that might want to use their talents and the investors who might back the deals.

Will IBC and annual trade shows still be relevant in another 50 years?
Yes, I firmly believe they will. Of course, you will be able to research basic information online — and you can do that now. We have added to the online resources available with our IBC365 year-round online presence. But it is much harder to exchange opinions and experiences that way. Human nature dictates that we learn best from direct contact, from friendly discussions, from chance conversations. You cannot do that online. It is why we regard the opportunity to meet old friends and new peers as one of the key parts of the IBC experience.

What are some of the most important decisions you face in your job on a daily basis?
IBC is an interesting business to head. In some ways, of course, my job as CEO is the same as the head of any other company: making sure the staff are all pulling in the same direction, the customers are happy and the finances are secure. But IBC is unlike any other business because our focus is on spreading and sharing knowledge, and because our shareholders are our customers. IBC is organized by the industry for the industry, and at the top of our organization is the Partnership Board, which contains representatives of the six leading professional and trade bodies in the industry: IABM, IEEE, IET, RTS, SCTE and SMPTE.

Can you talk a bit about the conference?
One significant development from that first IBC 50 years ago is the nature of the conference. The founders were insistent that an exhibition needed a technical conference, and in 1967 it was based solely on papers outlining the latest research.

Today, the technical papers program still forms the centerpiece of the conference. But the conference is now much broader, speaking to the creative and commercial people in our community as well as the engineering and operational.

This year’s conference is subtitled “Truth, Trust and Transformation,” and has five tracks running over five days. Session topics range from the deeply technical, like new codec design, to fake news and alternative facts. Speakers range from Alberto Duenas, the principal video architect at chipmaker ARM, to Dan Danker, the product director at Facebook.

How are the attendees and companies participating in IBC changing?
The industry is so much broader than it once was. Consumers used to watch television, because that was all that the technology could achieve. Today, they expect to choose what they want to watch, when and where they want to watch it, and on the device and platform which happen to be convenient at the time.

As the industry expands, so does the IBC community. This year, for example, we have the biggest temporary structure we have ever built for an IBC, to house Hall 14, dedicated to content everywhere.

Given that international travel can be painful, what should those outside the EU consider?
Amsterdam is, in truth, a very easy place for visitors in any part of the world to reach. Its airport is a global hub. The EU maintains an open attitude and a practical approach to visas when required, so there should be no barriers to anyone wanting to visit IBC.

The IBC Innovation Awards are always a draw. Can you comment on the calibre of entries this year?
When we decided to add the IBC Innovation Awards to our program, our aim was to reflect the real nature of the industry. We wanted to reward the real-world projects, where users and technology partners got together to tackle a real challenge and come up with a solution that was much more than the sum of its parts.

Our finalists range from a small French-language service based in Canada to Google Earth; from a new approach to transmitters in the USA to an online service in India; and from Asia’s biggest broadcaster to the Spanish national railway company.

The Awards Ceremony on Sunday night is always one of my highlights. This year there is a special guest presenter: the academic and broadcaster Dr. Helen Czerski. The show lasts about an hour and is free to all IBC visitors.

What are the latest developments in adding capacity at IBC?
There is always talk of the need to move to another venue, and of course as a responsible business we keep this continually under review. But where would we move to? There is nowhere that offers the same combination of exhibition space, conference facilities and catering and networking under one roof. There is nowhere that can provide the range of hotels at all prices that Amsterdam offers, nor its friendly and welcoming atmosphere.

Talking of hotels, visitors this year may notice a large building site between hall 12 and the station. This will be a large on-site hotel, scheduled to be open in time for IBC in 2019.

And regulars who have resigned themselves to walking around the hoardings covering up the now not-so-new underground station will be pleased to hear that the North-South metro line is due to open in July 2018. Test trains are already running, and visitors to IBC next year will be able to speed from the centre of the city in under 10 minutes.

As you mentioned earlier, the theme for IBC2017 is “Truth, Trust and Transformation.” What is the rationale behind this?
Everyone has noticed that the terms “fake news” and “alternative facts” are ubiquitous these days. Broadcasters have traditionally been the trusted brand for news: is the era of social media and universal Internet access changing that?

It is a critical topic to debate at IBC, because the industry’s response to it is central to its future, commercially, as well as technically. Providing true, accurate and honest access to news (and related genres like sport) is expensive and demanding. How do we address this key issue? Also, one of the challenges of the transition to IP connectivity is the risk that the media industry will become a major target for malware and hackers. As the transport platform becomes more open, the more we need to focus on cyber security and the intrinsic design of safe, secure systems.

OTT and social media delivery is sometimes seen as “disruptive,” but I think that “transformative” is the better word. It brings new challenges for creativity and business, and it is right that IBC looks at them.

Will VR and AR be addressed at this year’s conference?
Yes, in the Future Zone, and no doubt on the show floor. Technologies in this area are tumbling out, but the business and creative case seems to be lagging behind. We know what VR can do, but how can we tell stories with it? How can we monetize it? IBC can bring all the sides of the industry together to dig into all the issues. And not just in debate, but by seeing and experiencing the state of the art.

Cyber security and security breaches are becoming more frequent. How will IBC address these challenges?
Cyber security is such a critical issue that we have devoted a day to it in our new C-Tech Forum. Beyond that, we have an important session on cyber security on Friday in the main conference with experts from around the world and around the industry debating what can and should be done to protect content and operations.

Incidentally, we are also looking at artificial intelligence and machine learning, with conference sessions in both the technology and business transformation strands.

What is the Platform Futures — Sport conference aiming to address?
Platform Futures is one of the strands running through the conference. It looks at how the latest delivery and engagement technologies are opening new opportunities for the presentation of content.

Sport has always been a major driver – perhaps the major driver – of innovation in television and media. For many years now we have had a sport day as part of the conference. This year, we are dedicating the Platform Futures strand to sport on Sunday.

The stream looks at how new technology is pushing boundaries for live sports coverage; the increasing importance of fan engagement; and the phenomenon of “alternative sports formats” like Twenty20 cricket and Rugby 7s, which provide lucrative alternatives to traditional competitions. It will also examine the unprecedented growth of eSports, and the exponential opportunities for broadcasters in a market that is now pushing towards the half-billion-dollar size.

 


IBC 2016: VR and 8K will drive M&E storage demand

By Tom Coughlin

While attending the 2016 IBC show, I noticed some interesting trends, cool demos and new offerings. For example, while flying drones were missing, VR goggles were everywhere; IBM was showing 8K video editing using flash memory and magnetic tape; the IBC itself featured a fully IP-based video studio showing the path to future media production using lower-cost commodity hardware with software management; and, it became clear that digital technology is driving new entertainment experiences and will dictate the next generation of content distribution, including the growing trend to OTT channels.

In general, IBC 2016 featured the move to higher resolution and more immersive content. On display throughout the show was 360-degree video for virtual reality, as well as 4K and 8K workflows. Virtual reality and 8K are driving new levels of performance and storage demand, and these are just some of the ways that media and entertainment pros are increasing the size of video files. Nokia’s Ozo was just one of several multi-camera content capture devices on display for 360-degree video.

Besides multi-camera capture technology and VR editing, the Future Tech Zone at IBC included even larger 360-degree video display spheres than at the 2015 event. These were from Puffer Fish (pictured right). The smaller-sized spherical display was touch-sensitive so you could move your hand across the surface and cause the display to move (sadly, I didn’t get to try the big sphere).

IBM had a demonstration of a 4K/8K video editing workflow using the IBM FlashSystem and IBM Enterprise tape storage technology, a collaboration between the IBM Tokyo Laboratory and IBM’s Storage Systems division. This work was done to support Japan’s move to 4K/8K satellite broadcasts by 2018 and the delivery of 8K video streams of the 2020 Tokyo Olympic Games. The combination of flash memory storage for working content and tape for inactive content is referred to as FLAPE (flash and tAPE).

The graphic below shows a schematic of the 8K video workflow demonstration.

The argument for FLAPE appears to be that flash performance is needed for editing 8K content, while magnetic tape provides low-cost storage of the 8K content, which may require greater than 18TB for an hour of raw footage (depending upon the sampling and frame rate). Note that magnetic tape is usually reserved for archiving video content, so this is a rather unusual application. The IBM demonstration, plus discussions with media and entertainment professionals at IBC, indicates that with the declining cost of flash memory and the performance demands of 8K, these workflows may finally drive increased demand for flash memory in post production.
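To put that 18TB figure in perspective, here is a quick back-of-the-envelope calculation. The frame size, 4:2:2 10-bit sampling and 60fps rate below are assumptions for illustration, not parameters quoted from the IBM demo:

```python
# Rough sanity check of the ">18TB per hour" figure for raw 8K video.
# Assumed parameters (not from the IBM demo): 7680x4320 frames,
# 4:2:2 chroma subsampling at 10 bits per component, 60 frames per second.

def raw_tb_per_hour(width=7680, height=4320, bits_per_pixel=20, fps=60):
    """Return the raw video data volume in decimal terabytes per hour."""
    bytes_per_frame = width * height * bits_per_pixel / 8
    bytes_per_hour = bytes_per_frame * fps * 3600
    return bytes_per_hour / 1e12

# 4:2:2 10-bit averages 20 bits per pixel, which works out to about
# 17.9 TB/hour at 60fps; higher bit depths or 4:4:4 sampling push the
# total well past 18TB.
print(f"{raw_tb_per_hour():.1f} TB/hour")
```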

Avid was promoting their Nexis file system, the successor to ISIS. The company uses SSDs for metadata, but generally flash isn’t used for actual editing yet. They agreed that as flash costs drop, flash could find a role for higher resolution and richer media. Avid has embraced open source for their code and provides free APIs for their storage. The company sees a hybrid of on-site and cloud storage for many media and entertainment applications.

EditShare announced a significant update to its XStream EFS Shared Storage Platform (our main image). The update provides non-disruptive scaling to over 5PB with millions of assets in a single namespace. The system provides a distributed file system with multiple levels of hardware redundancy and reduced downtime. An EFS cluster can be configured with a mix of capacity and performance, using SSDs for high-data-rate content and SATA HDDs for cost-efficient higher-capacity storage — 8TB HDDs have been qualified for the system. The latest release expands optimization support for file-per-frame media.

The IBC IP Interoperability Zone showed a complete IP-based studio (pictured right), put together with the cooperation of AIMS and the IABM. The zone brings to life the work of the JT-NM (the Joint Task Force on Networked Media, a combined initiative of AMWA, EBU, SMPTE and VSF) and the AES on a common roadmap for IP interoperability. Central to the IBC Feature Area was a live production studio, based on the technologies of the JT-NM roadmap, that Belgian broadcaster VRT has been using daily on-air all summer as part of the LiveIP Project, a collaboration between VRT, the European Broadcasting Union (EBU) and LiveIP’s 12 technology partners.

Summing Up
IBC 2016 showed some clear trends to more immersive, richer content with the numerous displays of 360-degree and VR content and many demonstrations of 4K and even 8K workflows. Clearly, the trend is for higher-capacity, higher-performance workflows and storage systems that support this workflow. This is leading to a gradual move to use flash memory to support these workflows as the costs for flash go down. At the same time, the move to IP-based equipment will lead to lower-cost commodity hardware with software control.

Storage analyst Tom Coughlin is president of Coughlin Associates. He has over 30 years in the data storage industry and is the author of Digital Storage in Consumer Electronics: The Essential Guide. He also publishes the Digital Storage Technology Newsletter and the Digital Storage in Media and Entertainment Report.


My first trip to IBC

By Sophia Kyriacou

When I was asked by the team at Maxon to present my work at their IBC stand this year, I jumped at the chance. I’m a London-based working professional with 20 years of experience as a designer and 3D artist, but I had never been to an IBC. My first impression of the RAI convention center in Amsterdam was that it’s super huge and easy to get lost in for days. But once I found the halls relevant to my interests, the creative and technical buzz hit me like heat in the face when disembarking from a plane in a hot humid summer. It was immediate, and it felt so good!

The sounds and lights were intense. I was surrounded by booths with basslines of audio vibrating through the floor, changing as you walked along. It was a great atmosphere; so warm and friendly.

My first Maxon presentation was on day two of IBC — it was a show-and-tell of three award-winning and nominated sequences I created for the BBC in London and one for Noon Visual Creatives. As a Cinema 4D user, it was great to see the audience at the stand captivated by my work, and knowing it was streamed live to a large global audience made it even more exciting.

The great thing about IBC is that it’s not only about companies shouting about their new toys. I also saw how it brings passionate pros from all over the world together — people you would never meet in your usual day-to-day work life. I met people from all over the globe and made new friends. Everyone appeared to share the same or similar experience, which was wonderful.

Having the first presentation of the day at Maxon meant I could take a breather and look around the show. I also sat in on a Dell Precision/Radeon Technologies roundtable event one afternoon. That was a really interesting meeting. We were a group of pros from varied disciplines within the industry. It was great to talk about what hardware works, what doesn’t work and how it could all get better. I don’t work in a realtime area, but I do know what I would like to see as someone who works in 3D. It was incredibly interesting, and everyone was so welcoming. I thoroughly enjoyed it.

Sunday evening, I went over to the SuperMeet — such an energetic and friendly vibe. The stage demos were very interesting. I was particularly taken with the fayIN tracker plug-in for Adobe After Effects. It appears to be a very effective tool, and I will certainly look into purchasing it. The new Adobe Premiere features look fantastic as well.

Everything about my time at IBC was so enjoyable. I went back to London buzzing, and am already looking forward to next year’s IBC show.

Sophia Kyriacou is a London-based broadcast designer and 3D artist who splits her time working as a freelancer and for the BBC.


IBC: Surrounded by sound

By Simon Ray

I came to the 2016 IBC Show in Amsterdam at the start of a period of consolidation at Goldcrest in London. We had just gone through three years of expansion, upgrading, building and installing. Our flagship Dolby Atmos sound mixing theatre finished its first feature, Jason Bourne, and the DI department recently upgraded to offer 4K and HDR.

I didn’t have a particular area to research at the show, but there were two things that struck me almost immediately on arrival: the lack of drones and the abundance of VR headsets.

Goldcrest’s Atmos mixing stage.

360 audio is an area I knew a little about, and we did provide a binaural DTS Headphone X mix at the end of Jason Bourne, but there was so much more to learn.

Happily, my first IBC meeting was with Fraunhofer, where I was updated on some of the developments they have made in production, delivery and playback of immersive and 360 sound. Of particular interest was their Cingo technology. This is a playback solution that lives in devices such as phones and tablets and can already be found in products from Google, Samsung and LG. This technology renders 3D audio content onto headphones and can incorporate head movements. That means a binaural render that gives spatial information to make the sound appear to be originating outside the head rather than inside, as can be the case when listening to traditionally mixed stereo material.

For feature films, for example, this might mean taking the 5.1 home theatrical mix and rendering it into a binaural signal to be played back on headphones, giving the listener the experience of always sitting in the sweet spot of a surround sound speaker set-up. Cingo can also support content with a height component, such as 9.1 and 11.1 formats, and add that into the headphone stream as well to make it truly 3D. I had a great demo of this and it worked very well.
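The core idea behind this kind of sweet-spot virtualization is straightforward to sketch, even though products like Cingo layer proprietary room modeling and filtering on top of it. In the hypothetical example below (the function, channel names and HRIR data are illustrative, not Fraunhofer code), each speaker feed is convolved with the head-related impulse responses measured for that speaker position, and the results are summed per ear:

```python
import numpy as np
from scipy.signal import fftconvolve

def virtualize_to_binaural(channels, hrirs):
    """Fold a speaker-format mix (e.g. 5.1) down to binaural headphones.

    channels: dict of channel name -> 1-D array of samples ('L', 'R', 'C', ...).
    hrirs:    dict of channel name -> (left_ear_ir, right_ear_ir), the
              head-related impulse responses for that speaker's position,
              taken from an HRTF measurement set.
    """
    n_sig = max(len(sig) for sig in channels.values())
    n_ir = max(len(ir) for pair in hrirs.values() for ir in pair)
    out = np.zeros((2, n_sig + n_ir - 1))
    for name, sig in channels.items():
        ir_left, ir_right = hrirs[name]
        y_left = fftconvolve(sig, ir_left)    # this speaker as heard at the left ear
        y_right = fftconvolve(sig, ir_right)  # ... and at the right ear
        out[0, :len(y_left)] += y_left
        out[1, :len(y_right)] += y_right
    return out
```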

I was impressed that Fraunhofer had also created a tool for creating immersive content: a plug-in called Cingo Composer, available in both VST and AAX formats. It could run in Pro Tools, Nuendo and other DAWs to aid the creation of 3D content. For example, content could be mixed and automated in an immersive soundscape and then rendered into a four-channel FOA (First Order Ambisonics, or B-format) file to accompany a 360 video for playback on VR headsets with headtracking.
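For the curious, the first-order encoding underneath a tool like Cingo Composer is essentially a set of textbook gain equations; the plug-in wraps them with automation, monitoring and rendering. A minimal sketch (traditional FuMa-style B-format; normalization conventions vary, and this is not Fraunhofer’s implementation) that pans a mono source into a four-channel W, X, Y, Z stream:

```python
import numpy as np

def encode_foa(mono, azimuth_deg, elevation_deg):
    """Pan a mono signal into traditional four-channel B-format (W, X, Y, Z)."""
    az = np.radians(azimuth_deg)   # 0 = front, positive to the left
    el = np.radians(elevation_deg)
    w = mono / np.sqrt(2.0)             # omnidirectional component (FuMa weighting)
    x = mono * np.cos(az) * np.cos(el)  # front/back figure-of-eight
    y = mono * np.sin(az) * np.cos(el)  # left/right figure-of-eight
    z = mono * np.sin(el)               # up/down figure-of-eight
    return np.stack([w, x, y, z])

# Example: place a 1kHz tone 90 degrees to the left, slightly elevated.
sr = 48000
tone = np.sin(2 * np.pi * 1000 * np.arange(sr) / sr)
bformat = encode_foa(tone, azimuth_deg=90, elevation_deg=20)  # shape (4, sr)
```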

After Fraunhofer, I went straight to DTS to catch up with what they were doing. We had recently completed some immersive DTS:X theatrical, home theatrical and, as mentioned above, headphone mixes using the DTS tools, so I wanted to see what was new. There were some nice updates to the content creation tools, players and renderers and a great demo of the DTS decoder doing some live binaural decoding and headtracking.

With immersive and 3D audio being the exciting new things, there were other interesting products on display that related to this area. In the Future Zone, Sennheiser was showing their Ambeo VR mic (see picture, right). This is an ambisonic microphone with four capsules arranged in a tetrahedron, whose signals make up the A-format. They also provide a proprietary A-to-B format encoder that can run as a VST or AAX plug-in on Mac and Windows to convert the outputs of the four capsules into the W, X, Y, Z signals (the B-format).

From the B-Format it is possible to recreate the 3D soundfield, but you can also derive any number of first-order microphones pointing in any direction in post! The demo (with headtracking and 360 video) of a man speaking by the fireplace was recorded just using this mic and was the most convincing of all the binaural demos I saw (heard!).
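The basic A-format to B-format matrix for a tetrahedral mic is well documented, assuming the common front-left-up / front-right-down / back-left-down / back-right-up capsule layout. A real encoder such as Sennheiser’s also applies frequency-dependent correction for capsule spacing and orientation, which this simplified sketch leaves out:

```python
import numpy as np

def a_to_b_format(flu, frd, bld, bru):
    """Naive A-format to B-format conversion for a tetrahedral microphone.

    flu/frd/bld/bru are the capsule signals (front-left-up, front-right-down,
    back-left-down, back-right-up). Overall gain/normalization conventions
    vary, and production encoders add spacing-compensation filters.
    """
    w = flu + frd + bld + bru   # omni pressure component
    x = flu + frd - bld - bru   # front minus back
    y = flu - frd + bld - bru   # left minus right
    z = flu - frd - bld + bru   # up minus down
    return np.stack([w, x, y, z])
```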

Still in the Future Zone, for creating brand-new content I visited the makers of the Spatial Audio Toolbox, which is similar to Fraunhofer’s Cingo Composer tool. B-Com’s Spatial Audio Toolbox contains VST plug-ins (soon to be AAX) that enable you to create an HOA (higher order ambisonics) encoded 3D sound scene from standard mono, stereo or surround sources (using HOA Pan) and then listen to that sound scene on headphones (using Render Spk2Bin).

The demo we saw at the stand was impressive and included headtracking. The plug-ins themselves were running on a Pyramix on the Merging Technologies stand in Hall 8. It was great to get my hands on some “live” material and play with the 3D panning and hear the effect. It was generally quite effective, particularly in the horizontal plane.

I found all this binaural and VR stuff exciting. I am not sure exactly how and if it might fit into a film workflow, but it was a lot of fun playing! The idea of rendering a 3D soundfield into a binaural signal has been around for a long time (I even dedicated months of my final year at university to writing a project on that very subject quite a long time ago) but with mixed success. It is exciting to see now that today’s mobile devices contain the processing power to render the binaural signal on the fly. Combine that with VR video and headtracking, and the ability to add that information into the rendering process, and you have an offering that is very impressive when demonstrated.
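That on-the-fly headtracking essentially amounts to counter-rotating the ambisonic scene before the binaural render. For first-order material a yaw rotation only touches the X and Y components; a rough sketch follows (sign conventions differ between implementations):

```python
import numpy as np

def rotate_bformat_yaw(w, x, y, z, head_yaw_deg):
    """Counter-rotate a first-order B-format scene by the listener's head yaw.

    Only yaw (rotation about the vertical axis) is handled; a full headtracking
    chain would also apply pitch and roll. W and Z are unchanged by a yaw turn.
    Here positive azimuth/yaw is taken as counterclockwise (to the left).
    """
    phi = np.radians(-head_yaw_deg)   # rotate the scene opposite to the head turn
    x_rot = np.cos(phi) * x - np.sin(phi) * y
    y_rot = np.sin(phi) * x + np.cos(phi) * y
    return w, x_rot, y_rot, z
```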

I will be interested to see how content creators, specifically in the film area, use this (or don’t). The recreation of the 3D surround sound mix over 2-channel headphones works well, but whether headtracking gets added to this or not remains to be seen. If the sound is matched to video that’s designed for an immersive experience, then it makes sense to track the head movements with the sound. If not, then I think it would be off-putting. Exciting times ahead anyway.

Simon Ray is head of operations and engineering at Goldcrest Post Production in London.

IBC: Blackmagic buys Fairlight and Ultimatte

Before every major trade show, we at postPerspective play a little game. Who is Blackmagic going to buy this time? Well, we didn’t see this coming, but it’s cool. Ultimatte and Fairlight are now owned by Blackmagic.

Ultimatte makes broadcast-quality, realtime blue- and greenscreen removal hardware that is used in studios to seamlessly composite reporters and talk show hosts into virtual sets.

Ultimatte was founded in 1976 and has won an Emmy for their realtime compositing technology and a Lifetime Achievement Award from the Academy of Motion Picture Arts and Sciences, as well as an Oscar.

“Ultimatte’s realtime blue- and greenscreen compositing solutions have been the standard for 40 years,” says Blackmagic CEO Grant Petty. “Ultimatte has been used by virtually every major broadcast network in the world. We are thrilled to bring Ultimatte and Blackmagic Design together, and are excited about continuing to build innovative products for our customers.”

Fairlight creates professional digital audio products for live broadcast event production, film and television post, as well as immersive 3D audio mixing and finishing. “The exciting part about this acquisition is that it will add incredibly high-end professional audio technology to Blackmagic Design’s video products,” says Petty.

New Products
Teranex AV: A new broadcast-quality standards converter designed specifically for AV professionals. Teranex AV features 12G-SDI and HDMI 2.0 inputs, outputs and loop-through, along with AV-specific features such as low latency, a still store, freeze frame and HiFi audio inputs for professionals working on live, staged presentations and conferences. Teranex AV will be available in September for $1,695 from Blackmagic resellers.

New Video Assist 4K update: A major new update for Blackmagic Video Assist 4K customers that improves DNxHD and DNxHR support, adds false color monitoring, expanded focus options and new screen rotation features. It is available for download from the Blackmagic website next week, free of charge, for all Blackmagic Video Assist 4K customers.

DeckLink Mini Monitor 4K and Mini Recorder 4K: New DeckLink Mini Monitor 4K and DeckLink Mini Recorder 4K PCIe capture cards that include all the features of the HD DeckLink models but now have Ultra HD and HDR (high dynamic range) features. Both models support all SD, HD and Ultra HD formats up to 2160p30. DeckLink Mini 4K models are available now from Blackmagic resellers for $195 each.

DaVinci Resolve 12.5.2: The latest version of Resolve is available free for download from Blackmagic’s site. It adds support for additional URSA Mini camera metadata, color space tags on QuickTime export, Fusion Connect for Linux, advanced filtering options and more.