
Embracing production in the cloud

By Igor Boshoer

Cloud technology is set to revolutionize film production, if studios can be convinced. But because this century-old industry is reluctant to change, these new technologies and promising innovations are being integrated at a slower pace.

Tried-and-true production methods are steadily becoming outdated. A cloud platform brings innovation and accessibility to small and large studios alike. In the not-so-distant future, what is now merely a competitive edge will become standard industry practice. Until then, some studios remain apprehensive, and the reasons are mostly myth.

The Need for Transition
Core video production applications, computing, storage and other IT services are moving to the cloud at a rapid pace. A variety of industries and businesses — not just film — are being challenged by new customer expectations, which are heavily influenced by consumer applications powered by the cloud.

In visual effects, film and XR, application vendors such as Autodesk, Avere and Aspera are all updating their software to support these cloud workflows. Studios are recognizing that more focus should be placed on creating high-quality content, and far less on in-house software development and infrastructure maintenance. But to grow the top line and stand apart from the competition, it’s imperative for our industry to be proactive and re-imagine the workflow. Cloud providers innovate at a much faster pace than any studio can match internally.

In the grand scheme of things, the industry wants to make studio operations more efficient, cost-effective and quantifiable to better serve their customers. And by taking advantage of cloud-based services, studios can increase agility, while decreasing their cost and risk.

Common Misconceptions of the Cloud
Many believe the cloud to be insecure. Yet many successful, thriving startups run on it, even in the finance and healthcare industries. Our industry’s MPAA guidelines are far less stringent than healthcare’s HIPAA requirements. On the contrary, cloud providers offer vastly stronger security than a studio’s own internal measures.

Some studios are reluctant because transferring massive amounts of data into a cloud platform can prove challenging. But there are ways to speed up these transfers, including caching and custom UDP-based transport protocols. While the concern is valid, the challenge is entirely manageable.
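To see why plain TCP transfers (FTP included) struggle on long-haul links, and why caching and UDP-based transport protocols help, it is worth running the bandwidth-delay arithmetic. The sketch below is illustrative only; the window size and round-trip times are hypothetical examples, not measurements of any particular studio link.

```python
# Why studios reach for UDP-based accelerators: a single TCP stream's
# throughput is roughly capped at window_size / round_trip_time.

def max_tcp_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Approximate throughput ceiling of one TCP stream, in Mbps."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

window = 64 * 1024          # a common default TCP window: 64 KB
for rtt in (10, 80, 200):   # LAN, cross-country, intercontinental (ms)
    print(f"RTT {rtt:>3} ms -> ~{max_tcp_throughput_mbps(window, rtt):.1f} Mbps")

# Prints roughly 52.4, 6.6 and 2.6 Mbps: a 1Gbps pipe sits mostly idle
# at high latency, which is the gap UDP-based protocols and caching close.
```

The ceiling falls as latency rises, which is exactly the regime of studio-to-cloud transfers; UDP-based accelerators sidestep the per-window acknowledgment round trips that cause it.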

Studios also assume that cloud technology is expensive. It can be. But when you truly break down the cost of maintaining infrastructure — adding internal storage, hardware setup, multi-year equipment leases, not to mention an ongoing support team — the in-house model, in fact, proves more expensive. While the cloud appears costly, it actually saves money and lets you quantify the cost of production. Moreover, studios can scale resources as production demands fluctuate, instead of relying on the typical static, in-house model.
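That claim is easy to sanity-check with back-of-envelope arithmetic. Every figure in the sketch below is a hypothetical placeholder, not a real quote; the structural point is that a static farm is paid for at peak capacity around the clock, while cloud capacity is paid for only when used.

```python
# Hypothetical comparison of a static in-house farm vs. cloud capacity
# scaled to demand. All numbers are illustrative placeholders.

PEAK_NODES = 400        # render nodes needed during crunch
AVG_UTILIZATION = 0.35  # fraction of peak capacity used over the year
HOURS_PER_YEAR = 8760

inhouse_rate = 0.30     # $/node-hour: amortized hardware, power, support
cloud_rate = 0.55       # $/node-hour: on-demand pricing, no idle capacity

# In-house: peak capacity is paid for around the clock, used or not.
inhouse = PEAK_NODES * HOURS_PER_YEAR * inhouse_rate

# Cloud: only the node-hours production actually consumes are billed.
cloud = PEAK_NODES * AVG_UTILIZATION * HOURS_PER_YEAR * cloud_rate

print(f"In-house (static peak):   ${inhouse:,.0f}/yr")
print(f"Cloud (scaled to demand): ${cloud:,.0f}/yr")
```

With these placeholder rates, the static farm costs roughly $1.05M a year against about $675K in the cloud, and, just as important, every dollar of the cloud figure is attributable to a specific job.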

How the Cloud Better Serves Customers
While some are still apprehensive about cloud integration, studios that have shifted production pipelines to cloud-based platforms — and embraced them — are finding success. The cloud can serve customers in a variety of ways. It can deliver a richer, more consistent and personalized experience for a studio’s content creators, as well as offer a collaborative community.

The latest digital technologies are certain to reshape the economics, production and distribution of the entertainment industry. To stay on their game and remain competitive, studios must adapt to these new Internet and computing technologies.

If our industry is willing to push past these myths and preconceptions, cloud technology can indeed revolutionize film production. As that begins to happen, more and more studios will adopt this competitive edge, and it will make for an exciting shift.


Igor Boshoer is a media technologist with feature film VFX credits, including The Revenant and The Wolf of Wall Street. His experience building studio technology inspired his company, Linc, a studio platform as a service. He also hosts the monthly media technology meetup Filmologic in the Bay Area.

Public, Private, Hybrid Cloud: the basics and benefits

By Alex Grossman

The cloud is everywhere, and media facilities are constantly being inundated with messages about the benefits the cloud offers in every area, from production to delivery. While it is easy to locate information on how the cloud can be used for ingest, storage, post operations, transcoding, rendering, archive and, of course, delivery, many media facilities still have questions about the public, private and hybrid clouds, and how each of these cloud models can relate to their business. The following is a brief guide intended to answer these questions.

Public
Public cloud is the cloud as most people see it: a set of services hosted outside a facility and accessed through the Web, either securely through a gateway appliance or simply through a browser. The public nature of this cloud model does not mean that content from one person or one company can be accessed by another. It simply means that the same physical hardware is being shared by multiple users — a “multi-tenant” arrangement in which data from different users resides on one system. Through this approach, users get to take advantage of the scale of many servers and storage systems at the cloud facility. This scale can also improve accessibility and performance, which can be key considerations for many content creators.

Public cloud is the most versatile type of cloud, and it can offer a range of services, from hosted applications to “compute” capabilities. For media companies these services range from transcoding, rendering and animation to business services such as project tracking, billing and, in some cases, file sharing. (Box and Dropbox are good examples of file sharing enabled by public cloud.) Services may be generic or branded, and they are most often offered by a software vendor using a public cloud, or by the public cloud vendor itself. Public clouds are popular for content and asset storage, both for short-term transcode to delivery or project-in-process storage and for longer-term “third copy” off-site archive.

Public clouds can be very appealing due to the OPEX, pay-as-you-go nature of billing and the absence of capital expense for ongoing hardware purchases and refreshes, but the downside is reduced control over the workflow. While most public cloud vendors today are large and financially stable, it remains important to choose carefully.

Moreover, taking advantage of public cloud is rarely easy. This path involves dealing with new vendors, and possibly with unfamiliar applications and hardware gateways, and there can be unexpected charges for simple operations such as retrieving data. Although content security concerns are mostly overblown, they nevertheless are a source of apprehension for many potential public cloud users. These uncertainties have a lot of media companies looking to private cloud.

Private
Private cloud can most simply be defined for media companies as a walled machine room environment with workflow compute and storage capabilities offering outside connectivity, while at the same time preventing outside intrusions into the facility.

A well-designed private cloud will allow facilities to extend most of their production and archive capabilities to remote users. The main difference between this approach and most current (non-cloud) storage and compute operations in a facility today is simply that a private cloud can isolate the current workflow from the outside world while extending a portion of it to remote users based on preferences and policies.

The idea of remote access is not confined to private cloud. It is possible to provide facility access to external users through normal networking protocols, but the private cloud takes this a step further, with easier access for authorized users and stronger barriers against everyone else. The media facility remains in complete control of its content and assets. Furthermore, it can host its own applications, and its content remains in its on-site storage, safe and secure, with no retrieval fees.
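As a concrete illustration of those preferences and policies, the access layer of a private cloud can be as simple as a rule table deciding what quality of media each role may reach from each location. A minimal sketch, with hypothetical roles and rules:

```python
# Minimal sketch of a private-cloud access policy: remote users get
# proxies only, full-resolution media stays inside the facility.
# The roles and rules are hypothetical.

POLICY = {
    "editor":   {"on_site": "full_res", "remote": "proxy"},
    "producer": {"on_site": "proxy",    "remote": "proxy"},
    "guest":    {"on_site": None,       "remote": None},
}

def allowed_quality(role: str, location: str):
    """Return the media quality a user may access, or None if denied."""
    return POLICY.get(role, {}).get(location)

assert allowed_quality("editor", "on_site") == "full_res"
assert allowed_quality("editor", "remote") == "proxy"
assert allowed_quality("guest", "remote") is None
```

Real implementations layer authentication and auditing on top, but the principle is the same: the facility, not an outside vendor, writes the rules.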

A facility that embraces private cloud cannot take advantage of the scale or pay-as-you-go benefits of public cloud. Even so, to provide greater accessibility and flexibility, some media companies have adopted a private cloud model as an extension of their online operations. Private cloud can effectively replace much of the hardware used today in post and archive operations, making it a cost-effective option for many weighing cloud benefits.

Hybrid
Hybrid cloud is an interesting proposition. In the enterprise IT world, hybrid cloud implementations are seen as a way to bridge private and public and realize the best of both worlds: lower OPEX for certain functions such as SaaS (software as a service), and the security of keeping valuable data in the company’s own data centers.

For media professionals, hybrid cloud may have even greater benefits. Considering the changing delivery requirements facing the industry and the sheer volume of content being created and reviewed — and, of course, keeping in mind the value of re-monetization — hybrid cloud has exciting potential. A well-designed hybrid cloud can provide the benefits of both public and private cloud without the cost and complexity of maintaining end-to-end hardware on-premises. By sharing the load between the hardware at a facility and the massive scale of a public cloud, a media company can extend its workflow easily while controlling every stage — even on a project-by-project basis.
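In practice, that per-project control often reduces to a simple dispatch rule: fill the facility’s own nodes first, and burst to public cloud only when a project’s policy allows it. The following is a minimal sketch of the idea with hypothetical inputs, not any vendor’s actual scheduler:

```python
# Sketch of a hybrid-cloud dispatch rule: local farm first, cloud burst
# second, decided per project. Inputs and names are hypothetical.

def dispatch(job: str, local_free_nodes: int, project_allows_cloud: bool) -> str:
    """Decide where a render job runs in a hybrid setup."""
    if local_free_nodes > 0:
        return "local"        # on-premises capacity is already paid for
    if project_allows_cloud:
        return "cloud"        # burst out when the local farm is saturated
    return "queue_local"      # sensitive projects wait for on-site nodes

print(dispatch("shot_042", local_free_nodes=12, project_allows_cloud=True))   # local
print(dispatch("shot_042", local_free_nodes=0,  project_allows_cloud=True))   # cloud
print(dispatch("shot_042", local_free_nodes=0,  project_allows_cloud=False))  # queue_local
```

Everything else in a hybrid deployment, from data placement to licensing, hangs off a decision like this one.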

Choosing among public, private and hybrid cloud can be a daunting task. The decision must start with understanding the unique needs and goals of the media company and its operations, and then involve careful mapping of the solutions the various vendors offer — with cost considerations always in mind. In the end, a facility may choose none of the three, but it may then miss out on the many and growing benefits the cloud enables.

Alex Grossman, a cloud workflow expert, is VP, Media and Entertainment at Quantum. You can follow him on Twitter @activeguy.

Forbidden to demo Forscene’s virtualized post workflow at IBC

Forbidden Technologies, makers of the editing software Forscene, will be at IBC in Amsterdam showing its end-to-end virtualized workflow for posting and distributing video content. The hardware-independent solution is enabled by Forscene’s integration with the Microsoft Azure cloud-computing platform.

The workflow sees the Forscene ingest server running as a virtual machine on the Microsoft Azure platform to transcode and ingest live broadcast streams into Forscene accounts seconds behind the live feed. Video editors can then create subclips or full highlights packages using Forscene’s NLE from anywhere. Once the edit is complete, they can drop the sequence back onto Azure for faster-than-realtime conforming and distribution.

IBC attendees can experience the virtualized workflow by competing in a simulated car race, editing the race footage in Forscene and then sharing the video on social media — without needing any Forscene hardware.

Thinkbox at SIGGRAPH with cloud-enabled pipelines

Thinkbox Software showed off some new features from the upcoming release of Deadline, a high-volume compute management solution that enables studios to link any combination of local and cloud-based resources into one logical pipeline. Currently entering beta testing, Deadline 8 will introduce on-demand metered licensing that works with both local and cloud-based farms alongside existing permanent and temporary licenses.

Deadline 8 also adds a Proxy Server application that lets users securely connect to and interact with remote and cloud-based render farms over the public Internet without a virtual private network (VPN). The new version includes an updated user interface with enhanced interactivity and a new sandboxed Python environment for improved rendering and event stability.

Currently, Deadline supports cloud rendering on Amazon EC2, Google Cloud Platform, Microsoft Azure and OpenStack, among others.

Leading up to SIGGRAPH 2015, Thinkbox released Deadline Cloud Wizard for Google Cloud Platform and a beta version for Amazon’s Elastic Compute Cloud (EC2). These wizards turn the configuration of a cloud-based render farm running Deadline into a one-time automated task completed in less than 30 minutes. Similarly, Deadline was made available on the Microsoft Azure Marketplace for one-click access to the software.

Thinkbox also recently launched the beta for Deadline 7.2. Features include new Quick Draft options to create movies from rendered images or perform file conversions without having to create template scripts. Integration with NIM Labs’ NIM pipeline management tool enables the creation of NIM renders and automates the uploading of thumbnails and movies to NIM when jobs complete. Enhanced support for SideFX Software’s Houdini introduces Jigsaw region rendering, HServer interactive rendering and HQueue simulation node slicing.

Answering questions about ‘the cloud’

By Randi Altman

The cloud is everywhere. Workflows, as well as companies making tools for those workflows, are popping up all over, but still some post pros are dubious. What exactly is the cloud? How will it help beyond regular workflows? How does it keep my assets secure? Those are just some questions being thrown around by those who have yet to make the transition.

We thought reaching out to a company that capitalized on the cloud early and from a post production perspective might be a good way to get some of these questions answered.

David Peto owned London-based post house Unit Post Production until 2009, when he started Aframe, a cloud platform that enables teams and organizations to collaborate on, organize and move media. He designed the product from a user’s perspective. Let’s find out more about the cloud and its benefits for post pros…

Some people don’t have a clear understanding of the cloud. Can you help them out?
In its simplest definition, the cloud is a combination of software and services that run on Internet-accessible servers rather than on local computers.

Can you describe how your company uses the cloud?
Aframe was built as a cloud platform from the very beginning. We recognized that people were shipping hard drives all around the world, making unnecessary copies and versions of their media, losing comments and other metadata, and generally spending too much time just waiting to work with media. Using the cloud gives organizations a central repository — a one-to-many point of distribution — that enables more people to access media and do their work regardless of where they may be located and what time of day it is.

There are private and public clouds. Can you describe the differences? What are the benefits of each?
Public clouds are generally owned and operated by a third party such as Amazon Web Services, Rackspace or Microsoft Azure. They are provisioned for use by many different types of users — banks, pharmaceutical companies or content distribution networks featuring the latest grumpy cat video!

Private clouds are owned by one company for the specific use of its employees and partners, and generally have very high security standards and limited accessibility.

Which does Aframe use, and why?
Aframe sits somewhere between a public and a private cloud, which we think offers benefits of both types. What we offer is sometimes referred to as a Vendor Cloud. Like a public cloud offering, Aframe is globally available and accessible to all, but it has been purpose-built for a specific task: handling large and complex media files.

Like private clouds, we offer very tight security, greater flexibility and features tailored to users. We also own and operate all of the equipment in the datacenters and do not outsource any portion of our infrastructure to third parties.

For content creators and owners dealing with large, often complex, high-resolution media files, dedicated processes and services are required to enable post workflows. Far beyond simply storing content, there is the need to automatically transcode to different formats and to extract and add descriptive metadata, while also giving all stakeholders a way to review and approve assets on any device.

How do you educate people who have concerns about data security?
The best way is to explain the different security areas that must be considered. First, there is file-transfer security: anytime your media is moving to or from the cloud, it should be encrypted in transit. Our media in transit is protected at all times by 256-bit SSL encryption under an extended validation certificate, which means our corporate identity and place of business have been verified through SSL certification.

Second is application security, where we use 256-bit SSL encryption in the browser at all times. As with online banking, your browser displays a green box confirming a verified connection when you are logged in. Third is server security: inside our datacenters, access is protected by 2048-bit RSA encryption keys that only senior members of the Aframe team can use.

Finally, for physical security and backups of the data itself, we firmly believe data is not safe unless it is verifiably stored, redundantly, in at least two geographically separate locations. Our customers rest easy knowing their files are backed up hundreds of miles apart, at opposite ends of the country.
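To make those layers concrete, here is a minimal client-side sketch that enforces certificate verification on the TLS connection and records a checksum that can later be compared against each stored replica. The endpoint URL is hypothetical, and this is a generic illustration (using the Python requests package), not Aframe’s actual client.

```python
# Generic sketch of a verified, integrity-checked upload.
# Hypothetical endpoint; requires the third-party 'requests' package.

import hashlib
import requests

def upload_with_verification(path: str, url: str) -> str:
    # Fingerprint the local file before sending, so every stored replica
    # can be checked against the same SHA-256 digest later.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)

    # verify=True (the default) rejects endpoints whose TLS certificate
    # fails validation -- the browser's "green box" check, done in code.
    with open(path, "rb") as f:
        resp = requests.put(url, data=f, verify=True, timeout=300)
    resp.raise_for_status()
    return digest.hexdigest()

# Example (hypothetical endpoint):
# sha = upload_with_verification("dailies_reel1.mov",
#                                "https://upload.example.com/reel1")
```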

Why should a post organization consider a cloud-based solution? What are the advantages?
Most post houses, regardless of size, experience the headaches I’ve mentioned: teams working in different locations, having to FedEx media, and siloed workflows where not everyone has access to the files they need. Only certain people need high-resolution files; others are happy to view proxies.

Being able to add timecode-accurate notes and comments for the editor or producer is critical, but so is automatically transcoding, uploading and downloading files as your workflow requires. All of this is necessary for maximum productivity against ever-shrinking deadlines and budgets. In the end, people should be concentrating on the creative aspects of their jobs, not the mundane moving of files between departments and stakeholders.

Can you give any examples of how post facilities are using Aframe?
There are many workflows being used by our post customers today. Some users upload dailies to the cloud so that stakeholders can view and log comments and even embed complete transcriptions. Others are using Aframe as a central repository where team members in offices across town or across countries can collaborate and get access to the latest footage. All metadata is indexed and preserved so that searching for just the right shot is effortless. Avid, Final Cut or Adobe editors benefit by seeing all comments and feedback when they transfer the metadata into the edit.

How is Aframe different from something like Amazon S3’s cloud offering?
That’s a great question. Amazon is a true cloud solution. It’s big, and in use by countless organizations every day. Unfortunately, that’s exactly where it can fall short for companies working with high-resolution files and broadcast masters. Amazon was built to serve many masters, and as such it has a different business model that can be quite costly for content creators.

Amazon charges for uploads and downloads; Aframe does not. Those fees can make moving a 500GB show master very expensive. With Amazon, it’s also less clear where your media is stored, and where the backup of that media is stored. Finally, Amazon shares its bandwidth across a huge cross-section of customers in a way that Aframe does not. It’s just not built for the media and entertainment industry.

A lot of people say the Internet is not fast enough to support upload/download of full-res video for any meaningful post workflows. How do you answer that?
That’s a completely legitimate question, because it is true that your experience in the cloud is only as good as your Internet connection. However, we have optimized our file transfer protocols to be the fastest in the industry; our transfer speeds are 15x faster than FTP, for example. Many users working from home or from coffee shops are quite successfully viewing media and making comments over 3G or 4G connections with as little as 5Mbps.

Obviously, you’ll want more than that if you are uploading dailies, but the good news is that Internet speeds keep increasing every year, and most post organizations have very capable connections in place today.
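The arithmetic bears this out: a proxy stream fits comfortably in 5Mbps, but bulk uploads want a much bigger pipe. A quick worked calculation, with illustrative sizes:

```python
# How long does a bulk upload take at a given link speed?
# Sizes and speeds here are illustrative examples.

def hours_to_upload(size_gb: float, link_mbps: float) -> float:
    # GB -> megabits (decimal), then divide by link speed and seconds/hour.
    return (size_gb * 8 * 1000) / link_mbps / 3600

for link_mbps in (5, 50, 500):
    h = hours_to_upload(500, link_mbps)
    print(f"{link_mbps:>3} Mbps: 500 GB of dailies in ~{h:,.1f} h")

# ~222 h at 5 Mbps, ~22 h at 50 Mbps, ~2.2 h at 500 Mbps.
```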

At NAB, Aframe showed a collaboration between Aframe and Adobe Anywhere. Can you talk about that?
Since the beginning of Aframe, I’ve dreamed of true, no-download cloud editing, and I’ve seen a lot of attempts fail for various reasons. Four years ago, however, Adobe showed me their Anywhere product, which allowed full-resolution material to be streamed down a standard Internet connection so you could edit in Adobe Premiere Pro on your laptop just as if you were back at the facility. I was blown away by the possibility, because I could imagine hosting Anywhere on Aframe’s cloud platform, allowing full broadcast-quality, no-download editing in the cloud using Premiere on your laptop.

That’s exactly what we showed at this year’s NAB. It’s pretty amazing to see someone upload material into the cloud and never download it again, through the entire post process, until someone actually sees it.

This is significant for our industry because it means that you could now be editing from the office, home, the beach, the local café… anywhere really. It brings editing into the modern world and unchains the editor from all the storage and big iron workstations when you need to be someplace else!


Quantum intros Q-Cloud storage solutions

Quantum has been an early adopter when it comes to cloud-based storage. Recently it took that to a new level, offering three new products that integrate the cloud into multi-tier, hybrid storage architectures for data workloads.

The Quantum Q-Cloud solutions allow users to leverage Quantum’s intelligent data management software to store data in the cloud when it makes the most sense for a given workflow or application – essentially tailoring storage to each user’s needs.

The new Q-Cloud Archive and Q-Cloud Vault incorporate the power of the public cloud as an off-site tier within a Quantum StorNext 5 workflow environment, while Q-Cloud Protect for AWS enables customers using Quantum’s DXi deduplication appliances to replicate data to the Amazon Web Services (AWS) cloud. With all three offerings, users can get the benefits of the cloud without making changes to existing applications or processes.

Taking into account both the benefits and challenges of the cloud, Quantum says their approach is to enable customers to combine public cloud storage with on-premises storage in a multi-tier, intelligently managed, application-centric architecture. So instead of treating the cloud as a passive repository, they integrate the cloud as an active tier in a hybrid storage infrastructure driven by application requirements.
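The tiering logic behind such an architecture is straightforward to sketch. What follows is a generic illustration of policy-driven data movement, not StorNext’s actual policy engine or syntax:

```python
# Generic sketch of policy-based tiering: assets migrate to off-site
# cloud tiers once a policy (here, days since last access) triggers.
# This is an illustration, not StorNext's policy engine or syntax.

from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    days_since_access: int
    tier: str = "primary"

def apply_tiering_policy(assets, archive_after=30, vault_after=365):
    for a in assets:
        if a.days_since_access >= vault_after:
            a.tier = "cloud_vault"      # coldest, lowest-cost tier
        elif a.days_since_access >= archive_after:
            a.tier = "cloud_archive"    # off-site but readily retrievable
    return assets

catalog = [Asset("promo_cut", 3),
           Asset("s01e04_conform", 90),
           Asset("s01_masters", 400)]
for a in apply_tiering_policy(catalog):
    print(f"{a.name}: {a.tier}")
```

The application never changes; the policy engine moves data underneath it, which is the "active tier" idea in a nutshell.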

Key features include:
• Full integration as a tier in StorNext 5-managed workflows, with automated, policy-based data movement
• No additional hardware, separate applications or programming needed
• Agility in addressing changing workload demands
• Reliability of global public cloud infrastructure
• Straightforward pricing and billing directly from Quantum
• Secure, encrypted data in transit and at rest

StorNext users can also employ Q-Cloud Archive and Q-Cloud Vault in conjunction with Quantum Lattus extended online storage, moving data from a Lattus object storage technology-based private cloud on their premises to Q-Cloud.

Offered as a subscription service through the Amazon Marketplace, Q-Cloud Protect for AWS enables users to replicate data from either a physical or virtual DXi appliance on premises to a virtual DXi instance in the AWS cloud.

Key features include:
• Easy access to the Amazon public cloud and its benefits, with no changes to applications or processes
• Cost savings enabled by efficient data deduplication

Q-Cloud Archive is available immediately in the Americas, EMEA and APAC, while Q-Cloud Vault will be available in the second half of this year. Q-Cloud Protect for AWS will be available next quarter.

Nvidia next-gen GPUs focus on speed, the cloud and mobile

At the SIGGRAPH show in Vancouver, Nvidia showed its next generation of Quadro GPUs, which it describes as “an enterprise-grade visual computing platform” offering 40 percent faster performance. With these cards, Nvidia is focusing on 4K, the cloud, mobile and collaboration, including remote rendering. Pricing is expected to remain the same.

“The next generation of Quadro GPUs not only dramatically increases graphics and compute performance to handle huge data sets, it extends the concept of visual computing from a graphics card in a workstation to a connected environment,” said Jeff Brown, VP of Professional Visualization at Nvidia. “The new Quadro line-up lets users interact with their designs or data locally on a workstation, remotely on a mobile device or in tandem with cloud-based services.”

A shot from Nvidia’s New York press event on the set of ABC’s The Chew.

Framestore CTO Steve MacPherson, who spoke at a recent Nvidia press event in New York City, says, “From increased efficiency to new workflow models, the results we’ve achieved with the latest Quadro GPUs are fundamental to our future.”

Framestore, which provides VFX and graphics for major feature films such as Guardians of the Galaxy and Edge of Tomorrow, has been doing a lot of work recently in integrated advertising, including virtual reality, something that MacPherson calls “a blending between the physical and CG worlds.”

He points to Nvidia’s Quadro cards as a key part of the studio’s workflow. “Nvidia Quadro is an essential component to helping us keep our edge, and gives us the reassurance of knowing the graphics technology our artists rely on has been developed specifically for professional users with the highest standards of reliability and compatibility.”

Adobe’s Dennis Radke was also at the New York event, talking about how the new cards accelerate Creative Cloud and allow Premiere Pro to ingest 4K Red raw files without a Red Rocket card.

Also in New York, Nvidia showed remote collaboration workflows using a Google tablet running Autodesk Maya. Check out the company’s blog on the topic.

To sum up, the new generation of Quadro GPUs — the K5200, K4200, K2200, K620 and K420 — allows users to:
• Interact with data sets or designs up to twice the size handled by previous generations
• Remotely interact with graphics applications on a Quadro-based workstation from essentially any device, including PCs, Macs and tablets
• Run major applications, such as Adobe Creative Cloud and the Autodesk Design Suite, on average 40 percent faster than with previous Quadro cards
• Switch easily from local GPU rendering to cloud-based offerings using Nvidia Iray rendering


Review: MediaSilo with Premiere/Prelude integration

By Brady Betzel

A few months ago I reviewed a cloud-based asset management and collaboration platform called Aframe. After that review hit, many people emailed asking how I “really” felt about it and whether it would work in different production scenarios.

Competitors also reached out, asking if I could review their products too. My Forbidden Technologies review is already up on postPerspective.com, and I recently took MediaSilo up on its offer and ran some of my own tests. I discovered that we really are going to be working in the cloud in the near future — this is not just a fad.

While these products do differ in their offerings, it’s clear there is a race for supremacy in the cloud wars. Forbidden Technologies offers Forscene, a Web-based NLE, cloud storage and…

New Scratch 8 supports native ProRes encoding on Windows, cloud workflows

Santa Clara, California — Assimilate has three bits of news for users today, but let’s start off with this: Scratch and Scratch Lab customers can encode Apple ProRes files on Microsoft Windows 7/8-based PCs.

So that means Windows users get native ProRes 422, HQ, LT, proxy and 4444 encoding…

Aframe names David Frasco as VP of sales, North America

New York — Aframe, maker of a cloud video production system with collaboration, review-and-approval, archive and tagging capabilities, has named longtime media and entertainment industry veteran David Frasco as its new VP of North American sales, based in New York City.

Formerly with Avid, Frasco has over 30 years of experience in editing, post production and building out marketing and sales teams for solutions that have transformed content creation and production workflows. For the past 14 years he served as Avid’s director of enterprise accounts, helping clients deploy file-based workflow and production systems. He was instrumental in the debut of game-changing solutions for the past four Olympic Games.

In earlier positions Frasco also was a product manager with expertise in the company’s graphics and editorial solutions. Prior to joining Avid, he was director of marketing for Chyron, a post production product manager for Sony Broadcast, and a freelance video editor handling both online and offline finishing for broadcast TV, corporate video and advertising spots.

As more broadcasters look at cloud solutions, Aframe’s (www.aframe.com) appointment of Frasco will help the company tap into the innovative professional video communities in New York and Los Angeles that are eager to leverage new and smarter ways to create and distribute content.

Aframe will be at NAB next month with the next generation of its cloud video platform, Aframe 3.0, including a new desktop app offering automated media movement workflows, custom transcoding technology, an HTML5 player for precision control ahead of the edit, and asset management features that save immense time and hassle in broadcast video production. Aframe 3.0 makes news, sports, and media and entertainment professionals more productive, and helps them leverage existing on-premise assets more efficiently with the peace of mind that assets are fully under control and secure.

New features include:
– A desktop app with smart upload technology that saves Aframe users time and hassle in uploading and distributing files. The app automatically detects the best connectivity settings before sending clips to Aframe, where they are automatically transcoded to a viewing copy. Users can then configure the app to automatically push the clips to other users’ desktops for maximum convenience.

– Automatic transcoding to a house format, or custom transcoding per clip, meaning less effort spent transcoding everything before use and less expense on transcoding tools (a sketch of such a step follows this list).

– Expanded access controls and refined permission features that provide a “mission control” administrator view, enabling easier management across cost centers and more fluid asset management. With more adaptable user settings and a granular permission structure, Aframe provides added peace of mind against potentially damaging incidents.

– A new HTML5 player that is faster, frame-accurate and universally compatible, speeding footage review even on smartphones and tablets and expediting the organization of vast amounts of video.
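As noted above, the house-format transcode step can be pictured as a thin wrapper around an encoder. The sketch below assumes ffmpeg is installed and on the PATH; the output settings are placeholder choices for a viewing copy, not Aframe’s actual house format.

```python
# Sketch of an automated house-format transcode, assuming ffmpeg is on
# PATH. Output settings are placeholders, not Aframe's actual format.

import subprocess
from pathlib import Path

def transcode_to_house_format(src: Path, out_dir: Path) -> Path:
    out = out_dir / (src.stem + "_view.mp4")
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src),
         "-c:v", "libx264", "-preset", "fast", "-crf", "23",  # viewing copy
         "-c:a", "aac", "-b:a", "128k",
         str(out)],
        check=True,  # raise if ffmpeg exits with an error
    )
    return out

# Example (hypothetical paths):
# transcode_to_house_format(Path("camera/A001C003.mov"), Path("viewing"))
```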

Brady Betzel recently reviewed the Aframe system for postPerspective: https://postperspective.com/review-aframe-cloud-platform.