Tag Archives: Nokia Ozo

Making the jump to 360 Video (Part 1)

By Mike McCarthy

VR headsets have been available for over a year now, and more content is constantly being developed for them. We should expect that rate to increase as new headset models are being released from established technology companies, prompted in part by the new VR features expected in Microsoft’s next update to Windows 10. As the potential customer base increases, the software continues to mature, and the content offerings broaden. And with the advances in graphics processing technology, we are finally getting to a point where it is feasible to edit videos in VR, on a laptop.

While a full VR experience requires true 3D content in order to render a custom perspective based on the position of the viewer’s head, there is a “video” version of VR called 360 Video. The difference between “Full VR” and “360 Video” is that while both allow you to look around in every direction, 360 Video is pre-recorded from a particular point, and you are limited to the view from that spot. You can’t move your head to see around behind something, like you can in true VR. But 360 video can still offer a very immersive experience and arguably better visuals, since the images aren’t being rendered on the fly. 360 video can be recorded in stereoscopic or flat, depending on the capabilities of the cameras used.

Stereoscopic is obviously more immersive, less of a video dome, and inherently supported by the nature of VR HMDs (Head Mounted Displays). I expect that stereoscopic content will be much more popular in 360 Video than it ever was for flat-screen content. Basically, the viewer is already wearing the 3D glasses, so there is no downside, besides needing twice as much source imagery to work with, just as with flat-screen stereoscopic production.

There are a variety of options for recording 360 video, from a single ultra-wide fisheye lens on the Fly360, to dual 180-degree lens options like the Gear 360, Nikon KeyMission and Garmin Virb. GoPro is releasing the Fusion, which will fall into this category as well. The next step up is more lenses, with cameras like the Orah 4i or the Insta360 Pro. Beyond that, you are stepping into the much more expensive rigs with lots of lenses and lots of stitching, but usually much higher final image quality, like the GoPro Omni or the Nokia Ozo. There are also countless rigs that use an array of standard cameras to capture 360 degrees, but these solutions are much less integrated than the all-in-one products that are now entering the market. Regardless of the camera you use, you are going to be recording one or more files in a format fairly unique to that camera, which will need to be processed before it can be used in the later stages of the post workflow.

Affordable cameras

The simplest and cheapest 360 camera option I have found is the Samsung Gear 360. There are two totally different models with the same name, usually differentiated by the year of their release. I am using the older 2016 model, which has a higher resolution sensor, but records UHD instead of the slightly larger full 4K video of the newer 2017 model.

The Gear 360 records two fisheye views that are just over 180 degrees each, from cameras situated back to back in a 2.5-inch sphere. Both captured image circles are recorded onto a single frame, side by side, resulting in files with a 2:1 aspect ratio. These are encoded into JPEG (7776×3888 stills) or HEVC (3840×1920 video) at 30Mb/s and saved onto a MicroSD card. The camera is remarkably simple to use, with only three buttons and a tiny UI screen for selecting recording mode and resolution. If you have a Samsung Galaxy phone, the camera offers a variety of other functions, like remote control and streaming the output to the phone as a viewfinder. Even without a Galaxy phone, the camera did everything I needed to generate 360 footage to stitch and edit with, but it was cool to have a remote viewfinder for the driving shots.
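As a rough sanity check on how much card space a shoot will consume, and assuming the 30Mb figure refers to megabits per second of video (my assumption, since the unit isn’t stated), the math works out like this:

```python
# Rough storage estimate for a 360 camera recording HEVC,
# assuming a 30 megabit-per-second bitrate (hypothetical figure).
bitrate_mbps = 30                            # megabits per second
bytes_per_second = bitrate_mbps * 1e6 / 8    # 3.75 MB per second

mb_per_minute = bytes_per_second * 60 / 1e6
gb_per_hour = bytes_per_second * 3600 / 1e9

print(f"{mb_per_minute:.0f} MB per minute")  # 225 MB per minute
print(f"{gb_per_hour:.1f} GB per hour")      # 13.5 GB per hour
```

So a 64GB MicroSD card would hold several hours of footage at this rate, though the dual-fisheye source still needs to be stitched before editing.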

Pricier cameras

One of the big challenges of shooting with any 360 camera is how to avoid getting gear and rigging in the shot, since the camera records everything around it. Even the tiny integrated tripod on the Gear 360 is visible in the shots, and putting it on the plate of my regular DSLR tripod fills the bottom of the footage. My solution was to use the thinnest support I could to keep the rest of the rigging as far from the camera as possible, and therefore smaller from its perspective. I created a couple of options to shoot with, which are pictured below. These supports are much less intrusive in the recorded images. Besides the camera support, there is the issue of everything else in the shot, including the operator. Since most 360 videos are locked off, an operator may not be needed, but there is no “behind the camera” for hiding gear or anything else. Your set needs to be considered in every direction, since it will all be visible to your viewer. If you can see the camera, it can see you.

There are many different approaches to storing 360 images, which are inherently spherical, as a video file, which is inherently flat. This is the same issue that cartographers have faced for hundreds of years — creating flat paper maps of a planet that is inherently curved. While there are sphere map, cube map and pyramid projection options (among others) based on the way VR headsets work, the equirectangular format has emerged as the standard for editing and distribution encoding, while other projections are occasionally used for certain effects processing or other playback options.
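To make the equirectangular mapping concrete, here is a minimal sketch (not from the article, and the axis conventions are my own assumption) of how a pixel in a 2:1 equirectangular frame corresponds to a direction on the viewing sphere: the horizontal axis maps linearly to longitude, the vertical axis to latitude.

```python
import math

def equirect_to_direction(x, y, width, height):
    """Map a pixel (x, y) in an equirectangular frame to a unit
    direction vector on the viewing sphere. Longitude spans
    -180..180 degrees across the width; latitude spans +90 (top)
    to -90 (bottom) down the height."""
    lon = (x / width - 0.5) * 2 * math.pi    # -pi .. pi
    lat = (0.5 - y / height) * math.pi       # pi/2 .. -pi/2
    return (math.cos(lat) * math.sin(lon),   # x: right
            math.sin(lat),                   # y: up
            math.cos(lat) * math.cos(lon))   # z: forward

# The center pixel of a 3840x1920 frame looks straight ahead:
print(equirect_to_direction(1920, 960, 3840, 1920))  # ~(0.0, 0.0, 1.0)
```

This linear longitude/latitude relationship is also why equirectangular frames look stretched near the top and bottom: each row of pixels covers a full 360 degrees of longitude, even near the poles.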

Usually the objective of the stitching process is to get the images from all of your lenses combined into a single frame with the least amount of distortion and the fewest visible seams. There are a number of software solutions that do this, from After Effects plugins, to dedicated stitching applications like Kolor AVP and Orah VideoStitch-Studio to unique utilities for certain cameras. Once you have your 360 video footage in the equirectangular format, most of the other steps of the workflow are similar to their flat counterparts, besides VFX. You can cut, fade, title and mix your footage in an NLE and then encode it in the standard H.264 or H.265 formats with a few changes to the metadata.

Technically, the only thing you need to add to an existing 4K editing workflow in order to make the jump to 360 video is a 360 camera. Everything else could be done in software, but the other thing you will want is a VR headset or HMD. It is possible to edit 360 video without an HMD, but it is a lot like grading a film using scopes but no monitor. The data and tools you need are all right there, but without being able to see the results, you can’t be confident of what the final product will be like. You can scroll around the 360 video in the view window, or see the whole projected image all distorted, but it won’t have the same feel as experiencing it in a VR headset.

360 Video is not as processing intensive as true 3D VR, but it still requires a substantial amount of power to provide a good editing experience. I am using a Thinkpad P71 with an Nvidia Quadro P5000 GPU to get smooth performance during all these tests.

Stay tuned for Part 2 where we focus on editing 360 Video.


Mike McCarthy is an online editor/workflow consultant with 10 years of experience on feature films and commercials. He has been working on new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

Rise Above

Sundance 2017: VR for Good’s Rise Above 

By Elise Ballard

On January 22, during the Sundance Film Festival in Park City, the Oculus House had an event for their VR for Good initiative, described as “helping non-profits and rising filmmakers bring a variety of social missions to life.” Oculus awarded 10 non-profits $40,000 grants and matched each with VR filmmakers to make a short film related to its community and cause.

One of the films, Rise Above, highlights a young girl’s recovery from sexual abuse and the support and therapy she received from New York City’s non-profit Womankind (formerly New York Asian Women’s Center).

Rise Above is a gorgeous film — shot on the Nokia Ozo camera — and really well done, especially in how it guides your eye to the storytelling going on in a VR 360 environment. I had the opportunity to interview the filmmakers, Ben Ross and Brittany Neff, about their experience. I was curious why they feel VR is one of the best mediums for creating empathy and action for social impact. Check out their website.

Referencing the post process, Ross said he wore a headset the entire time as he worked with the editor, in order to make sure it worked as a VR experience. All post production for VR for Good films was done at Reel FX. In terms of tools, for stitching the footage they used a combination of the Ozo Creator software from Nokia, Autopano Video from Kolor and the Cara plug-in for Nuke. Reel FX finished all the shots in Nuke (again making major use of Cara) and Autodesk’s Flame for seam fixing and rig removal. TD Ryan Hartsell did the graphics work in After Effects, using the Mettle plug-in to help him place the graphics in 360 space and in 3D.

For more on the project and Reel FX’s involvement visit here.

The Oculus VR for Good initiative will be exhibiting at other major film festivals throughout the year, and the films will be distributed by Facebook after the festival circuit.

Visit VR for Good here for more information, news and updates, and to stay connected (and apply!) to this inspiring and cutting-edge project.

Elise Ballard is a Los Angeles-based writer and author of Epiphany: True Stories of Sudden Insight, and the director of development at Cognition and Arc/k Project, a non-profit dedicated to preserving cultural heritage via virtual reality and digital media.

China’s online video network Youku calls on Nokia’s Ozo ecosystem for VR  

One of the largest online video platforms in China, Youku, has chosen the Nokia Ozo VR ecosystem of technologies to create and distribute immersive VR content to more than 500 million monthly active users. The platform logs more than 1.1 billion video views daily.

Youku will use the entire Ozo VR solution, which includes the Ozo Camera, Ozo Software Suite, Ozo Live and Ozo Player SDK, in the creation and distribution of content ranging from film and television to news and documentaries, as well as professional user-generated content featuring Youku’s top talent.

Youku will integrate Nokia’s Ozo Player SDK and Ozo Audio solutions into all its platforms, mobile apps and consumer offerings, enabling its enormous audience to enjoy 3D 360-degree VR. The Ozo Player SDK allows VR pros to create VR app experiences on most major platforms with a single, unified development interface.

Full-featured reference players are also included in the SDK for all supported platforms — including Oculus Desktop, Oculus Mobile/GearVR, HTC Vive and Google VR for Android and iOS. The multi-platform Ozo Player SDK is now available in a free version, as well as a Pro tier with more features and larger deployment options.

 

IBC 2016: VR and 8K will drive M&E storage demand

By Tom Coughlin

While attending the 2016 IBC show, I noticed some interesting trends, cool demos and new offerings. For example, while flying drones were missing, VR goggles were everywhere; IBM was showing 8K video editing using flash memory and magnetic tape; the IBC itself featured a fully IP-based video studio showing the path to future media production using lower-cost commodity hardware with software management; and, it became clear that digital technology is driving new entertainment experiences and will dictate the next generation of content distribution, including the growing trend to OTT channels.

In general, IBC 2016 featured the move to higher-resolution and more immersive content. On display throughout the show was 360-degree video for virtual reality, as well as 4K and 8K workflows. Virtual reality and 8K are driving new levels of performance and storage demand, and these are just some of the ways that media and entertainment pros are increasing the size of video files. Nokia’s Ozo was just one of several multi-camera content capture devices on display for 360-degree video.

Besides multi-camera capture technology and VR editing, the Future Tech Zone at IBC included even larger 360-degree video display spheres than at the 2015 event. These were from Puffer Fish (pictured right). The smaller-sized spherical display was touch-sensitive so you could move your hand across the surface and cause the display to move (sadly, I didn’t get to try the big sphere).

IBM had a demonstration of a 4K/8K video editing workflow using the IBM FlashSystem and IBM Enterprise tape storage technology, which was a collaboration between the IBM Tokyo Laboratory and IBM’s Storage Systems division. This work was done to support the move to 4K/8K broadcasts in Japan by 2018, with a broadcast satellite and delivery of 8K video streams of the 2020 Tokyo Olympic Games. The combination of flash memory storage for working content and tape for inactive content is referred to as FLAPE (flash and tAPE).

The graphic below shows a schematic of the 8K video workflow demonstration.

The argument for FLAPE appears to be that flash performance is needed for editing 8K content, while magnetic tape provides low-cost storage for the 8K content, which may require more than 18TB for an hour of raw footage (depending upon the sampling and frame rate). Note that magnetic tape is typically used for archiving video content, so this is a rather unusual application. The IBM demonstration, plus discussions with media and entertainment professionals at IBC, indicates that with the declining costs of flash memory and the performance demands of 8K, these workflows may finally drive increased demand for flash memory in post production.
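The 18TB-per-hour figure is easy to sanity-check with back-of-the-envelope math. Assuming one plausible format (my assumption, not stated in the article) of 7680×4320 at 60fps with 10-bit 4:2:2 sampling, which averages 20 bits per pixel:

```python
# Back-of-the-envelope raw data rate for one plausible 8K format:
# 7680x4320, 60 fps, 10-bit 4:2:2 chroma subsampling.
# 4:2:2 at 10 bits averages 20 bits/pixel (luma every pixel,
# Cb/Cr shared between horizontal pixel pairs).
width, height, fps = 7680, 4320, 60
bits_per_pixel = 20

bytes_per_frame = width * height * bits_per_pixel / 8
gb_per_second = bytes_per_frame * fps / 1e9
tb_per_hour = gb_per_second * 3600 / 1e3

print(f"{gb_per_second:.2f} GB/s")   # ~4.98 GB/s
print(f"{tb_per_hour:.1f} TB/hour")  # ~17.9 TB/hour
```

That lands right at the article’s ~18TB/hour, and the ~5GB/s sustained rate makes clear why flash, rather than spinning disk, is attractive for the working copy.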

Avid was promoting their Nexis file system, the successor to ISIS. The company uses SSDs for metadata, but generally flash isn’t used for actual editing yet. They agreed that as flash costs drop, flash could find a role for higher resolution and richer media. Avid has embraced open source for their code and provides free APIs for their storage. The company sees a hybrid of on-site and cloud storage for many media and entertainment applications.

EditShare announced a significant update to its XStream EFS Shared Storage Platform (our main image). The update provides non-disruptive scaling to over 5PB with millions of assets in a single namespace. The system provides a distributed file system with multiple levels of hardware redundancy and reduced downtime. An EFS cluster can be configured with a mix of capacity and performance, with SSDs for high-data-rate content and SATA HDDs for cost-efficient, higher-capacity storage — 8TB HDDs have been qualified for the system. The latest release expands optimization support for file-per-frame media.

The IBC IP Interoperability Zone showed a complete IP-based studio (pictured right), created with the cooperation of AIMS and the IABM. The zone brings to life the work of the JT-NM (the Joint Task Force on Networked Media, a combined initiative of AMWA, EBU, SMPTE and VSF) and the AES on a common roadmap for IP interoperability. Central to the IBC Feature Area was a live production studio, based on the technologies of the JT-NM roadmap, that Belgian broadcaster VRT has been using on-air daily throughout this summer as part of the LiveIP Project, a collaboration between VRT, the European Broadcasting Union (EBU) and LiveIP’s 12 technology partners.

Summing Up
IBC 2016 showed some clear trends to more immersive, richer content with the numerous displays of 360-degree and VR content and many demonstrations of 4K and even 8K workflows. Clearly, the trend is for higher-capacity, higher-performance workflows and storage systems that support this workflow. This is leading to a gradual move to use flash memory to support these workflows as the costs for flash go down. At the same time, the move to IP-based equipment will lead to lower-cost commodity hardware with software control.

Storage analyst Tom Coughlin is president of Coughlin Associates. He has over 30 years in the data storage industry and is the author of Digital Storage in Consumer Electronics: The Essential Guide. He also publishes the Digital Storage Technology Newsletter and the Digital Storage in Media and Entertainment Report.

IBC: Thoughts on Dolby and Nokia

By Zak Tucker

Strolling the halls of IBC in Amsterdam this past week, I found a lot of interesting tools and tech. Here are just a few thoughts about a couple of companies I visited.

Dolby
On Picture: Dolby presented its PQ workflow, which enables HDR and SDR deliverables seamlessly. Recognizing that there will be a real transition period as consumers adopt HDR home viewing environments, Dolby has written algorithms that detect the native specs of each Dolby-enabled monitor, so the system can interpret the intent of the PQ color and translate it to that specific monitor. In demos, the HDR media is optically more vibrant, and true-to-life colors are represented more accurately than in traditional SDR. Even the SDR that Dolby derives from the HDR is more vibrant and sharp than traditional SDR.

On Sound: Dolby is pressing forward with its home immersive sound experience. Through its sound bar and associated sub-woofer, Dolby is producing a home Atmos sound experience that is quite compelling. Dolby can also work with the additional speakers that can be installed by home users. Dolby’s home Atmos is able to dynamically adjust to various home speaker installations.

Nokia OZO
They have developed and delivered a purpose-built VR camera that records both picture and sound. The form factor, no bigger than a person’s head, is clean and small, addressing a common concern with VR rigs that are large and overly obtrusive — often an issue with talent, for example, when capturing a live event such as a concert. The camera is capable of north of 4K resolution, and the current stitched deliverable is a 4K, 3D VR file. The accompanying software can accomplish both a fast auto stitch and a higher-quality stitch. The software is also capable of taking a live stream from the VR camera and transmitting it, stitched, to a platform such as YouTube in real time. In the demo, the stitching was quite seamless.

Zak Tucker is president and co-founder of Harbor Picture Company in New York.