Tag Archives: IBC 2016

IBC 2016: VR and 8K will drive M&E storage demand

By Tom Coughlin

While attending the 2016 IBC show, I noticed some interesting trends, cool demos and new offerings. For example, while flying drones were missing, VR goggles were everywhere. IBM was showing 8K video editing using flash memory and magnetic tape, and IBC itself featured a fully IP-based video studio showing the path to future media production using lower-cost commodity hardware with software management. It became clear that digital technology is driving new entertainment experiences and will dictate the next generation of content distribution, including the growing trend toward OTT channels.

In general, IBC 2016 featured the move to higher-resolution and more immersive content. On display throughout the show was 360-degree video for virtual reality, as well as 4K and 8K workflows. Virtual reality and 8K are driving new levels of performance and storage demand, and these are just some of the ways that media and entertainment pros are increasing the size of video files. Nokia’s Ozo was just one of several multi-camera content capture devices on display for 360-degree video.

Besides multi-camera capture technology and VR editing, the Future Tech Zone at IBC included even larger 360-degree video display spheres than at the 2015 event. These were from Pufferfish (pictured right). The smaller spherical display was touch-sensitive, so you could move your hand across the surface and cause the display to move (sadly, I didn’t get to try the big sphere).

IBM had a demonstration of a 4K/8K video editing workflow using IBM FlashSystem and IBM Enterprise tape storage technology, a collaboration between the IBM Tokyo Laboratory and IBM’s Storage Systems division. This work was done to support the move to 4K/8K satellite broadcasts in Japan by 2018, as well as the delivery of 8K video streams of the 2020 Tokyo Olympic Games. The combination of flash memory storage for working content and tape for inactive content is referred to as FLAPE (flash and tAPE).

The graphic below shows a schematic of the 8K video workflow demonstration.

The argument for FLAPE appears to be that flash performance is needed for editing 8K content, while magnetic tape provides low-cost storage for that content, which may require greater than 18TB for an hour of raw footage (depending upon the sampling and frame rate). Note that magnetic tape is typically used for archiving video content, so this is a rather unusual application. The IBM demonstration, plus discussions with media and entertainment professionals at IBC, indicate that with the declining cost of flash memory and the performance demands of 8K, 8K workflows may finally drive increased demand for flash memory in post production.
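That 18TB figure is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below uses my own illustrative assumptions (8K = 7680×4320, 4:2:2 subsampling averaging two samples per pixel, no container overhead), not IBM’s published specs:

```python
# Rough uncompressed-video storage calculator. Assumptions are
# illustrative: 8K = 7680x4320, 4:2:2 averages 2 samples per pixel
# (4:4:4 would be 3), and container overhead is ignored.

def raw_tb_per_hour(width=7680, height=4320, bits=10, samples=2.0, fps=60):
    """Terabytes of uncompressed video per hour of recording."""
    bytes_per_frame = width * height * samples * bits / 8
    return bytes_per_frame * fps * 3600 / 1e12

print(f"8K 4:2:2 10-bit @ 60fps:  {raw_tb_per_hour():.1f} TB/hour")  # ~17.9
print(f"8K 4:4:4 12-bit @ 120fps: "
      f"{raw_tb_per_hour(bits=12, samples=3.0, fps=120):.1f} TB/hour")  # ~64.5
```

At roughly 5GB/s sustained for the 60fps case, it is easy to see why flash sits in the editing tier while tape holds the inactive copies.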

Avid was promoting their Nexis file system, the successor to ISIS. The company uses SSDs for metadata, but generally flash isn’t used for actual editing yet. They agreed that as flash costs drop, flash could find a role for higher resolution and richer media. Avid has embraced open source for their code and provides free APIs for their storage. The company sees a hybrid of on-site and cloud storage for many media and entertainment applications.

EditShare announced a significant update to its XStream EFS Shared Storage Platform (our main image). The update provides non-disruptive scaling to over 5PB with millions of assets in a single namespace. The system provides a distributed file system with multiple levels of hardware redundancy and reduced downtime. An EFS cluster can be configured with a mix of capacity and performance, with SSDs for high-data-rate content and SATA HDDs for cost-efficient, higher-capacity storage — 8TB HDDs have been qualified for the system. The latest release expands optimization support for file-per-frame media.

The IBC IP Interoperability Zone showed a complete IP-based studio (pictured right), created with the cooperation of AIMS and the IABM. The zone brings to life the work of the JT-NM (the Joint Task Force on Networked Media, a combined initiative of AMWA, EBU, SMPTE and VSF) and the AES on a common roadmap for IP interoperability. Central to the IBC Feature Area was a live production studio, based on the technologies of the JT-NM roadmap, that Belgian broadcaster VRT has been using daily on-air all summer as part of the LiveIP Project, a collaboration between VRT, the European Broadcasting Union (EBU) and LiveIP’s 12 technology partners.

Summing Up
IBC 2016 showed clear trends toward more immersive, richer content, with numerous displays of 360-degree and VR content and many demonstrations of 4K and even 8K workflows. Clearly, the trend is toward higher-capacity, higher-performance workflows and the storage systems that support them. This is leading to a gradual move to flash memory as flash costs come down. At the same time, the move to IP-based equipment will lead to lower-cost commodity hardware with software control.

Storage analyst Tom Coughlin is president of Coughlin Associates. He has over 30 years in the data storage industry and is the author of Digital Storage in Consumer Electronics: The Essential Guide. He also publishes the Digital Storage Technology Newsletter and the Digital Storage in Media and Entertainment Report.

My first trip to IBC

By Sophia Kyriacou

When I was asked by the team at Maxon to present my work at their IBC stand this year, I jumped at the chance. I’m a London-based working professional with 20 years of experience as a designer and 3D artist, but I had never been to IBC. My first impression of the RAI convention center in Amsterdam was that it’s super huge and easy to get lost in for days. But once I found the halls relevant to my interests, the creative and technical buzz hit me like the heat in the face when disembarking from a plane in a hot, humid summer. It was immediate, and it felt so good!

The sounds and lights were intense. I was surrounded by booths with basslines of audio vibrating through the floor, changing as you walked along. It was a great atmosphere, so warm and friendly.

My first Maxon presentation was on day two of IBC — it was a show-and-tell of three award-winning and nominated sequences I created for the BBC in London and one for Noon Visual Creatives. As a Cinema 4D user, it was great to see the audience at the stand captivated by my work, and knowing it was streamed live to a large global audience made it even more exciting.

The great thing about IBC is that it’s not only about companies shouting about their new toys. I also saw how it brings passionate pros from all over the world together — people you would never meet in your usual day-to-day work life. I met people from all over the globe and made new friends. Everyone appeared to share the same or similar experiences, which was wonderful.

The great thing about having the first presentation of the day at Maxon was that I could then take a breather and look around the show. I also sat in on a Dell Precision/Radeon Technologies roundtable event one afternoon, which was a really interesting meeting. We were a group of pros from varied disciplines within the industry, and it was great to talk about what hardware works, what doesn’t and how it could all get better. I don’t work in a realtime area, but I do know what I would like to see as someone who works in 3D. It was incredibly interesting, and everyone was so welcoming. I thoroughly enjoyed it.

Sunday evening, I went over to the SuperMeet — such an energetic and friendly vibe. The stage demos were very interesting. I was particularly taken with the fayIN tracker plug-in for Adobe After Effects. It appears to be a very effective tool, and I will certainly look into purchasing it. The new Adobe Premiere features look fantastic as well.

Everything about my time at IBC was so enjoyable. I went back to London buzzing, and I’m already looking forward to next year’s show.

Sophia Kyriacou is a London-based broadcast designer and 3D artist who splits her time working as a freelancer and for the BBC.

IBC: Surrounded by sound

By Simon Ray

I came to the 2016 IBC Show in Amsterdam at the start of a period of consolidation at Goldcrest in London. We had just gone through three years of expansion, upgrading, building and installing. Our flagship Dolby Atmos sound mixing theatre had finished its first feature, Jason Bourne, and the DI department had recently upgraded to offer 4K and HDR.

I didn’t have a particular area to research at the show, but there were two things that struck me almost immediately on arrival: the lack of drones and the abundance of VR headsets.

Goldcrest’s Atmos mixing stage.

360 audio is an area I knew a little about, and we did provide a binaural DTS Headphone X mix at the end of Jason Bourne, but there was so much more to learn.

Happily, my first IBC meeting was with Fraunhofer, where I was updated on some of the developments they have made in the production, delivery and playback of immersive and 360 sound. Of particular interest was their Cingo technology, a playback solution that lives in devices such as phones and tablets and can already be found in products from Google, Samsung and LG. It renders 3D audio content onto headphones and can incorporate head movements, producing a binaural render whose spatial cues make the sound appear to originate outside the head rather than inside it, as can be the case when listening to traditionally mixed stereo material.

For feature films, for example, this might mean taking the 5.1 home theatrical mix and rendering it into a binaural signal to be played back on headphones, giving the listener the experience of always sitting in the sweet spot of a surround sound speaker set-up. Cingo can also support content with a height component, such as 9.1 and 11.1 formats, and add that into the headphone stream as well to make it truly 3D. I had a great demo of this and it worked very well.
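Cingo itself is proprietary, but the underlying technique — convolving each speaker feed with a pair of head-related impulse responses (HRIRs) measured at that speaker’s position, then summing into the left and right ears — is a textbook one. Here is a minimal sketch of that general approach (all names are mine, assuming equal-length input channels), not Fraunhofer’s implementation:

```python
import numpy as np
from scipy.signal import fftconvolve

def binaural_downmix(channels, hrirs):
    """Fold a speaker-format mix (e.g. 5.1) down to binaural stereo.

    channels: dict of speaker name -> mono numpy array
    hrirs:    dict of speaker name -> (left_ear_ir, right_ear_ir) pair,
              impulse responses measured at that speaker's position
              around the head
    """
    out_l, out_r = None, None
    for name, signal in channels.items():
        ir_l, ir_r = hrirs[name]
        ear_l = fftconvolve(signal, ir_l)   # what the left ear hears
        ear_r = fftconvolve(signal, ir_r)   # what the right ear hears
        out_l = ear_l if out_l is None else out_l + ear_l
        out_r = ear_r if out_r is None else out_r + ear_r
    return np.stack([out_l, out_r])         # shape: (2, samples)
```

Because every speaker position gets its own HRIR pair, height channels in 9.1 or 11.1 layouts fold in the same way, which is how the render stays truly 3D.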

I was impressed that Fraunhofer had also created a tool for making immersive content: a plug-in called Cingo Composer, available in both VST and AAX formats. It runs in Pro Tools, Nuendo and other DAWs and aids the creation of 3D content. For example, content can be mixed and automated in an immersive soundscape and then rendered into an FOA (First Order Ambisonics, or B-format) four-channel file that can accompany 360 video played back on VR headsets with headtracking.
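For a flavor of what such a render stage produces, here is how a mono source panned to a given direction lands in the four B-format channels, using the traditional FuMa convention. This is a sketch of the standard encoding math, not Cingo Composer’s internals:

```python
import numpy as np

def foa_pan(mono, azimuth_deg, elevation_deg=0.0):
    """Encode a mono signal into first-order B-format (W, X, Y, Z).

    FuMa convention: W is the omni component scaled by 1/sqrt(2);
    X, Y, Z are figure-of-eight components along front, left and up.
    """
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    w = mono / np.sqrt(2)
    x = mono * np.cos(az) * np.cos(el)
    y = mono * np.sin(az) * np.cos(el)
    z = mono * np.sin(el)
    return np.stack([w, x, y, z])
```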

After Fraunhofer, I went straight to DTS to catch up with what they were doing. We had recently completed some immersive DTS:X theatrical, home theatrical and, as mentioned above, headphone mixes using the DTS tools, so I wanted to see what was new. There were some nice updates to the content creation tools, players and renderers and a great demo of the DTS decoder doing some live binaural decoding and headtracking.

With immersive and 3D audio being the exciting new things, there were other interesting products on display in this area. In the Future Zone, Sennheiser was showing their Ambeo VR mic (see picture, right). This is an ambisonic microphone with four capsules arranged in a tetrahedron, whose outputs make up the A-format. Sennheiser also provides a proprietary A-to-B format encoder, running as a VST or AAX plug-in on Mac and Windows, that processes the outputs of the four microphones into the W, X, Y and Z signals (the B-format).
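The classic A-to-B conversion for a tetrahedral array is a simple sum-and-difference matrix over the four capsules. The sketch below shows that matrix only; a production encoder like Sennheiser’s also applies compensation filters for capsule spacing and response, which I have deliberately omitted:

```python
import numpy as np

def a_to_b_format(lfu, rfd, lbd, rbu):
    """Convert tetrahedral A-format capsule signals to B-format.

    lfu/rfd/lbd/rbu: numpy arrays from the left-front-up,
    right-front-down, left-back-down and right-back-up capsules.
    Capsule-spacing compensation filters are omitted here.
    """
    w = lfu + rfd + lbd + rbu   # omnidirectional pressure
    x = lfu + rfd - lbd - rbu   # front-back
    y = lfu - rfd + lbd - rbu   # left-right
    z = lfu - rfd - lbd + rbu   # up-down
    return w, x, y, z
```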

From the B-format it is possible to recreate the 3D soundfield, but you can also derive any number of first-order microphones pointing in any direction in post! The demo (with headtracking and 360 video) of a man speaking by a fireplace was recorded using just this mic, and it was the most convincing of all the binaural demos I saw (heard!).
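Deriving one of those virtual microphones is just a weighted sum of the four B-format channels. As a sketch, a steerable virtual cardioid (my own helper, following the standard FuMa-scaled math) looks like this:

```python
import numpy as np

def virtual_cardioid(w, x, y, z, azimuth_deg, elevation_deg=0.0):
    """Point a virtual cardioid mic anywhere in a B-format recording.

    Assumes FuMa scaling (W = pressure / sqrt(2)). A cardioid is
    0.5 * (pressure + velocity component along the look direction).
    """
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    pressure = w * np.sqrt(2)
    velocity = (x * np.cos(az) * np.cos(el) +
                y * np.sin(az) * np.cos(el) +
                z * np.sin(el))
    return 0.5 * (pressure + velocity)
```

Change the weighting between pressure and velocity and you get other polar patterns, which is why the direction and pattern can be chosen freely after the shoot.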

Still in the Future Zone, for creating brand-new content, I visited the makers of the Spatial Audio Toolbox, which is similar to Fraunhofer’s Cingo Composer tool. B-Com’s Spatial Audio Toolbox contains VST plug-ins (soon to be AAX) that enable you to create an HOA (higher order ambisonics) encoded 3D sound scene from standard mono, stereo or surround sources (using HOA Pan) and then listen to this sound scene on headphones (using Render Spk2Bin).

The demo we saw at the stand was impressive and included headtracking. The plug-ins themselves were running on a Pyramix on the Merging Technologies stand in Hall 8. It was great to get my hands on some “live” material and play with the 3D panning and hear the effect. It was generally quite effective, particularly in the horizontal plane.

I found all this binaural and VR stuff exciting. I am not sure exactly how, or if, it might fit into a film workflow, but it was a lot of fun playing! The idea of rendering a 3D soundfield into a binaural signal has been around for a long time (I even dedicated months of my final year at university to a project on that very subject) but with mixed success. It is exciting to see that today’s mobile devices contain the processing power to render the binaural signal on the fly. Combine that with VR video and headtracking, and the ability to feed that information into the rendering process, and you have an offering that is very impressive when demonstrated.
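Part of why phones can do this on the fly is that headtracking is cheap in the ambisonic domain: compensating for a head turn is just a rotation of the B-format channels before the binaural render. A sketch for yaw only (my own function name; real renderers handle full pitch/roll/yaw):

```python
import numpy as np

def counter_rotate_yaw(w, x, y, z, head_yaw_deg):
    """Rotate a B-format scene opposite to the listener's head yaw so
    sources stay fixed in the world as the head turns. W is
    rotation-invariant, X and Y mix, and Z (up) is unaffected by yaw."""
    th = np.radians(-head_yaw_deg)          # counter-rotation
    x_rot = x * np.cos(th) - y * np.sin(th)
    y_rot = x * np.sin(th) + y * np.cos(th)
    return w, x_rot, y_rot, z
```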

I will be interested to see how content creators, specifically in the film area, use this (or don’t). The recreation of the 3D surround sound mix over 2-channel headphones works well, but whether headtracking gets added to this or not remains to be seen. If the sound is matched to video that’s designed for an immersive experience, then it makes sense to track the head movements with the sound. If not, then I think it would be off-putting. Exciting times ahead anyway.

Simon Ray is head of operations and engineering at Goldcrest Post Production in London.

IBC: Blackmagic buys Fairlight and Ultimatte

Before every major trade show, we at postPerspective play a little game. Who is Blackmagic going to buy this time? Well, we didn’t see this coming, but it’s cool. Ultimatte and Fairlight are now owned by Blackmagic.

Ultimatte makes broadcast-quality, realtime blue- and greenscreen removal hardware that is used in studios to seamlessly composite reporters and talk show hosts into virtual sets.

Ultimatte was founded in 1976 and has won an Emmy for their realtime compositing technology and a Lifetime Achievement Award from the Academy of Motion Picture Arts and Sciences, as well as an Oscar.

“Ultimatte’s realtime blue- and greenscreen compositing solutions have been the standard for 40 years,” says Blackmagic CEO Grant Petty. “Ultimatte has been used by virtually every major broadcast network in the world. We are thrilled to bring Ultimatte and Blackmagic Design together, and are excited about continuing to build innovative products for our customers.”

Fairlight creates professional digital audio products for live broadcast event production, film and television post, as well as immersive 3D audio mixing and finishing. “The exciting part about this acquisition is that it will add incredibly high-end professional audio technology to Blackmagic Design’s video products,” says Petty.

New Products
Teranex AV: A new broadcast-quality standards converter designed specifically for AV professionals. Teranex AV features 12G-SDI and HDMI 2.0 inputs, outputs and loop-through, along with AV-specific features such as low latency, a still store, freeze frame and HiFi audio inputs for professionals working on live, staged presentations and conferences. Teranex AV will be available in September for $1,695 from Blackmagic resellers.

New Video Assist 4K update: A major new update for Blackmagic Video Assist 4K customers that improves DNxHD and DNxHR support, adds false color monitoring, and offers expanded focus options and new screen rotation features. It will be available for download from the Blackmagic website next week, free of charge, for all Video Assist 4K customers.

DeckLink Mini Monitor 4K and Mini Recorder 4K: New DeckLink Mini Monitor 4K and DeckLink Mini Recorder 4K PCIe capture cards that include all the features of the HD DeckLink models but now have Ultra HD and HDR (high dynamic range) features. Both models support all SD, HD and Ultra HD formats up to 2160p30. DeckLink Mini 4K models are available now from Blackmagic resellers for $195 each.

DaVinci Resolve 12.5.2: The latest version of Resolve is available as a free download from Blackmagic’s site. It adds support for additional URSA Mini camera metadata, color space tags on QuickTime export, Fusion Connect for Linux, advanced filtering options and more.

Boris FX merges with GenArts

Boris FX, maker of Boris Continuum Complete, has inked a deal to acquire visual effects plug-in developer GenArts, whose high-end plug-in line includes Sapphire. Sapphire has been used in at least one VFX Oscar-nominated film every year since 1996. This acquisition follows the 2015 addition of Imagineer Systems, developer of the Academy Award-winning planar tracking tool Mocha. Sapphire will continue to be developed and sold in its current form alongside Boris Continuum Complete (BCC) and Mocha Pro.

“We are excited to announce this strategic merger and welcome the Sapphire team to the Boris FX/Imagineer group,” says owner Boris Yamnitsky. “This acquisition makes Boris FX uniquely positioned to serve editors and effects artists with the industry’s leading tools for motion graphics, broadcast design, visual effects, image restoration, motion tracking and finishing — all under one roof. Sapphire’s suite of creative plug-ins has been used to design many of the last decades’ most memorable film images. Sapphire perfectly complements BCC and mocha as essential tools for professional VFX and we look forward to serving Sapphire’s extremely accomplished users.”

“Equally impressive is the team behind the technology,” continues Yamnitsky. “Key GenArts staff from engineering, sales, marketing and support will join our Boston office to ensure the smoothest transition for customers. Our shared goal is to serve our combined customer base with useful new tools and the highest quality training and technical support.”

EditShare launches Flow Story at IBC, promotes Peter Lambert

At IBC, EditShare is launching its new Flow Story, a professional remote editing application. A module of the EditShare Flow media asset management solution, Flow Story offers advanced proxy editing and roundtrip workflow support with professional editing features and functions courtesy of the Lightworks NLE engine.

Flow Story allows users to work remotely with secure access to on-premises storage and media assets via an Internet connection. Flow Story lets users assemble content, add voiceovers and collaborate with other NLEs for finishing, delivery or playout of packages. Direct access to on-premises storage accelerates content exchange within the safety of a secure network.

Feature Highlights
• Wide Format Support — Flow Story supports hundreds of formats, including ProRes, Avid DNxHD, AVC-Intra and XDCAM, through to 4K and beyond with Red R3D, XAVC, CinemaDNG and DPX. As well as working with low-resolution proxy files, users can import and publish many popular formats to the EditShare storage server.
• Voiceover — Simple-to-use VO tools let users finalize packages at their desk or out in the field. Flow Story auto-detects and enables any connected audio input device. Users can upload newly created voiceover files and clips they have created locally.
• Edit While Capture — Flow Story’s Edit While Capture feature allows any format (including Long GOP) to be accessed during recording using EditShare Flow MAM or Geevs Ingest servers. This is ideal for fast turnaround environments such as live events and sports highlights.
• Realtime Collaboration — When connected to any EditShare Flow Database, Flow Story offers realtime collaboration with other Flow users, such as Flow Browse and AirFlow. Projects, clips, sequences, markers and metadata are all updated and synchronized in realtime.
• NLE Integration — Flow Story supports industry-standard NLEs (and DAWs) such as Avid Media Composer, Adobe Premiere, Blackmagic DaVinci Resolve and Avid Pro Tools. A creative hub, Flow Story facilitates collaboration among editors through AAF, an interchange file format that advances round-trip workflows.
• Work Offline — Flow Story is purpose-built with remote editing in mind. While you only need a regular Internet connection to access your content, even that is not always possible. Flow Story can also work in a standalone mode, giving access to existing Flow projects while on the move; Flow Story projects are then synchronized over the Internet.
• Advanced realtime effects, including Color, Titles and DVEs — Using the power of the graphics card, all the realtime effects can be played back remotely or locally without the need for rendering or flattening.
• Third-party integration with Audio Network — Browse the Audio Network music selection directly from within Flow Story. Stream MP3 audio files directly over sequences, add search criteria that best suit requirements, then register or sign in to purchase directly. The full-quality track is then downloaded and available within the project.

In other EditShare news, Peter Lambert has been named worldwide sales director. An industry business development executive with more than 25 years of experience, including his start at the BBC as an audio engineer, Lambert most recently held the director of sales position for EditShare’s APAC region.

“Since coming on board to manage our Asia Pacific regional business, Peter has been instrumental in rebuilding the channel and has been a steady advocate for building up our technical, sales and administrative staff in the region. Peter has brought order and stability to our business in the region, and largely as a result of his efforts we have seen substantial growth and stronger client relations,” says Andy Liebman, CEO, EditShare. Lambert, who will be responsible for the company’s overall sales strategy and reseller partner program, assumes the role effective immediately.

Panasonic and Codex team on VariCam Pure targeting episodic TV, features

At IBC in Amsterdam, Panasonic is showing its new cinema-ready version of the VariCam 35, featuring a jointly developed Codex recorder capable of uncompressed 4K RAW acquisition.

The VariCam Pure is the latest addition to the company’s family of pro cinematography products. A co-production between Panasonic and Codex, it couples the existing VariCam 35 camera head with a new Codex V-RAW 2.0 recorder, suited for episodic television shows and feature films.

The V-RAW 2.0 recorder attaches directly to the back of the VariCam 35 camera head. As a result, the camera retains the same Super 35 sensor, 14+ stops of latitude and dual native 800/5000 ISO as the original VariCam 35.

“The new VariCam Pure camera system records pure, uncompressed RAW up to 120 fps onto the industry-standard Codex Capture Drive 2.0 media, already widely used by many camera systems, post facilities and studios,” said Panasonic senior product manager Steven Cooperman. “There is significant demand for uncompressed RAW recording in the high-end market. The modular concept of the VariCam has enabled us to meet this demand. We’ve also listened to feedback from cinematographers and camera operators and ensured that the VariCam Pure is rugged, compact and lightweight, weighing just 11 pounds.”

Codex will provide a dailies and archiving workflow through its Production Suite. In addition, the Codex Virtual File System means users can transfer many file formats, including Panasonic V-RAW, Apple ProRes and Avid DNxHR.

Along with the original camera negative, frame-accurate metadata (such as lens and CDL data) can also be captured, streamlining production and post, and delivering time and cost savings.

The V-RAW 2.0 recorder for VariCam Pure is scheduled for release in December 2016 with a suggested list price of $30,000.