Cinnafilm 6.6.19

Category Archives: VR

Lenovo intros next-gen ThinkPads

Lenovo has launched the next generation of its ThinkPad P Series with five new models: the ThinkPad P73, ThinkPad P53, ThinkPad P1 Gen 2, ThinkPad P53s and ThinkPad P43s.

The ThinkPad P53 features the Nvidia Quadro RTX 5000 GPU with RT and Tensor cores, offering realtime raytracing and AI acceleration. It also features Intel Xeon and 9th Gen Core CPUs with up to eight cores (including the Core i9), up to 128GB of memory and 6TB of storage.

This mobile workstation also boasts a new OLED touch display with Dolby Vision HDR for superb color and some of the deepest black levels ever. Building on the innovation behind the ThinkPad P1 power supply, Lenovo is also maximizing the portability of this workstation with a 35 percent smaller power supply. The ThinkPad P53 is designed to handle everything from augmented reality and VR content creation to the deployment of mobile AI or ISV workflows. The ThinkPad P53 will be available in July, starting at $1,799.

At 3.74 pounds and 17.2mm thin, Lenovo’s thinnest and lightest 15-inch workstation — the ThinkPad P1 Gen 2 — includes the latest Nvidia Quadro Turing T1000 and T2000 GPUs. The ThinkPad P1 also features eight-core Intel 9th Gen Xeon and Core CPUs and an OLED touch display with Dolby Vision HDR.

The ThinkPad P1 Gen 2 will be available at the end of June starting at $1,949.

With its 17.3-inch Dolby Vision 4K UHD screen and a 35 percent smaller power adaptor, Lenovo’s ThinkPad P73 offers users maximum workspace and mobility. Like the ThinkPad P53, it features Intel Xeon and Core processors and Nvidia Quadro RTX graphics. The ThinkPad P73 will be available in August starting at $1,849.

The ThinkPad P43s features a 14-inch chassis and will be available in July starting at $1,499.

Rounding out the line is the ThinkPad P53s which combines the latest Nvidia Quadro graphics and Intel Core processors — all in a thin and light chassis. The ThinkPad P53s will be available in June, starting at $1,499.

For the first time, Lenovo is adding new X-Rite Pantone Factory Color Calibration to the ThinkPad P1 Gen 2, ThinkPad P53 and ThinkPad P73. The unique factory color calibration profile is stored in the cloud to ensure more accurate recalibration. This profile allows for dynamic switching between color spaces, including sRGB, Adobe RGB and DCI-P3 to ensure accurate ISV application performance.
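To give a feel for what switching between color spaces involves under the hood: each space pairs a gamut with a transfer function, and converting between them starts by decoding to linear light. As a generic illustration (this is the standard sRGB piecewise transfer function from IEC 61966-2-1, not Lenovo's or X-Rite's actual calibration code):

```python
def srgb_to_linear(c):
    """Decode an sRGB-encoded component (0..1) to linear light."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    """Encode a linear-light component (0..1) back to sRGB."""
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

# Mid gray (sRGB 0.5) is much darker in linear light, roughly 0.214
mid_gray = srgb_to_linear(0.5)
```

Gamut mapping between sRGB, Adobe RGB and DCI-P3 then happens on the linear values via each space's primaries, before re-encoding with the target transfer function.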

The entire ThinkPad portfolio is also equipped with advanced ThinkShield security features – from ThinkShutter to privacy screens to self-healing BIOS that recover when attacked or corrupted – to help protect users from every angle and give them the freedom to innovate fearlessly.

Apple offers augmented reality with Reality Composer

By Barry Goch

In addition to introducing the new Mac Pro and the Pro Display XDR at its Worldwide Developers Conference (WWDC19), Apple had some pretty cool demos. The coolest, in my mind, was the Minecraft augmented reality presentation.

Across the street from the San Jose Convention Center, where the keynote was held, Apple set up “The Studio” in the San Jose Civic. One of the demos there was an AR experience with the new Mac Pro: in reality, you saw only the space frame of Apple’s tower, but in augmented reality you could animate an exploded view. The technology behind this demo is the just-announced ARKit 3 and Reality Composer.

Apple had a couple of stations demoing Reality Composer in The Studio. Apple has a famous legacy of enabling content creators by making new technology easy to use, and Reality Composer is a case in point. I’ve tried building AR experiences in other apps, and it’s not very straightforward: you have to learn a new interface and coding as well — and use yet another app for targeting your AR environment into the real world. The demo I saw of Reality Composer made it look easy, like working in Motion, with drag-and-drop prebuilt behaviors built into the app, along with multiple ways to target your AR experience in the real world.

AR QuickLook technology is part of iOS, and you can even get an AR experience of the new Mac Pro and Pro Display XDR through Apple’s website. Apple also mentioned its new file format for holding AR elements, USDZ, and has created a tool to convert other 3D file formats to USDZ.

With native AR support across Apple’s ecosystem, there is no better time to experiment and learn about augmented reality.


Barry Goch is a finishing artist at LA’s The Foundation and a UCLA Extension Instructor in post production. You can follow him on Twitter at @Gochya.


Dell intros two budget-friendly Precision mobile workstations

Dell is offering two new mobile workstations for designers and graphic artists who are looking for entry-level, workstation-class devices — Dell Precision 3540 and 3541. These budget-friendly machines offer a smaller footprint with high performance. Dell’s Precision line has traditionally been used for intensive workloads, such as machine learning and artificial intelligence, and these entry-level versions are designed to allow artists with smaller budgets access to the Precision line’s capabilities.

The Precision 3540 comes with the latest quad-core 8th generation Intel Core processors, up to 32GB of DDR4 memory, AMD Radeon Pro graphics with 2GB of dedicated memory and 2TB of storage. The Precision 3541 will offer additional power, with 9th generation eight-core Intel Core and six-core Intel Xeon processor options. It will be available with Nvidia Quadro professional graphics with 4GB of dedicated memory and will also offer long battery life for on-the-go productivity.

Both models come with Thunderbolt 3 connectivity and optional features to enhance security, such as fingerprint and smartcard readers, an IR camera and a camera shutter. Both models also have a narrow-edge 15.6-inch display. The 3540 model weighs in at 4.04 pounds, and the 3541 model starts at 4.34 pounds.

The Dell Precision 3540 is available now on Dell.com starting at $799, while the Precision 3541 will be available in late May.


Marvel Studios’ Victoria Alonso to keynote SIGGRAPH 2019

Marvel Studios executive VP of production Victoria Alonso has been named keynote speaker for SIGGRAPH 2019, which will run from July 28 through August 1 in downtown Los Angeles. Registration is now open. The annual SIGGRAPH conference is a melting pot for researchers, artists and technologists, among other professionals.

“Victoria is the ultimate symbol of where the computer graphics industry is headed and a true visionary for inclusivity,” says SIGGRAPH 2019 conference chair Mikki Rose. “Her outlook reflects the future I envision for computer graphics and for SIGGRAPH. I am thrilled to have her keynote this summer’s conference and cannot wait to hear more of her story.”

One of the few women in Hollywood to hold such a prominent title, Alonso has long been admired for her dedication to the industry, which has led to multiple awards and honors, including the 2015 New York Women in Film & Television Muse Award for Outstanding Vision and Achievement, the Advanced Imaging Society’s Harold Lloyd Award (she was its first female recipient) and the 2017 VES Visionary Award (another female first). A native of Buenos Aires, she began her career in visual effects, including a four-year stint at Digital Domain.

Alonso’s film credits include productions such as Ridley Scott’s Kingdom of Heaven, Tim Burton’s Big Fish, Andrew Adamson’s Shrek, and numerous Marvel titles — Iron Man, Iron Man 2, Thor, Captain America: The First Avenger, Iron Man 3, Captain America: The Winter Soldier, Captain America: Civil War, Thor: The Dark World, Avengers: Age of Ultron, Ant-Man, Guardians of the Galaxy, Doctor Strange, Guardians of the Galaxy Vol. 2, Spider-Man: Homecoming, Thor: Ragnarok, Black Panther, Avengers: Infinity War, Ant-Man and the Wasp and, most recently, Captain Marvel.

“I’ve been attending SIGGRAPH since before there was a line at the ladies’ room,” says Alonso. “I’m very much looking forward to having a candid conversation about the state of visual effects, diversity and representation in our industry.”

She adds, “At Marvel Studios, we have always tried to push boundaries with both our storytelling and our visual effects. Bringing our work to SIGGRAPH each year offers us the opportunity to help shape the future of filmmaking.”

The 2019 keynote session will be presented as a fireside chat, allowing attendees the opportunity to hear Alonso discuss her life and career in an intimate setting.


Creating audio for the cinematic VR series Delusion: Lies Within

By Jennifer Walden

Delusion: Lies Within is a cinematic VR series from writer/director Jon Braver. It is available on the Samsung Gear VR and Oculus Go and Rift platforms. The story follows a reclusive writer named Elena Fitzgerald who penned a series of popular fantasy novels, but before the final book in the series was released, the author disappeared. Rumors circulated about the author’s insanity and supposed murder, so two avid fans decide to break into her mansion to search for answers. What they find are Elena’s nightmares come to life.

Delusion: Lies Within is based on an interactive play written by Braver and Peter Cameron. Interactive theater isn’t your traditional butts-in-the-seat passive viewing-type theater. Instead, the audience is incorporated into the story. They interact with the actors, search for objects, solve mysteries, choose paths and make decisions that move the story forward.

Like a film, the theater production is meticulously planned out, from the creature effects and stunts to the score and sound design. With all these components already in place, Delusion seemed like the ideal candidate to become a cinematic VR series. “In terms of the visuals and sound, the VR experience is very similar to the theatrical experience. With Delusion, we are doing 360° theater, and that’s what VR is too. It’s a 360° format,” explains Braver.

While the intent was to make the VR series match the theatrical experience as much as possible, there are some important differences. First, immersive theater allows the audience to interact with the actors and objects in the environment, but that’s not the case with the VR series. Second, the live theater show has branching narratives, and an audience member can choose which path he/she would like to follow. In the VR series, though, there is one set storyline that follows a group that is exploring the author’s house together. The viewer feels immersed in the environment but can’t manipulate it.

L-R: Hamed Hokamzadeh and Thomas Ouziel

According to supervising sound editor Thomas Ouziel from Hollywood’s MelodyGun Group, “Unlike many VR experiences where you’re kind of on rails in the midst of the action, this was much more cinematic and nuanced. You’re just sitting in the space with the characters, so it was crucial to bring the characters to life and to design full sonic spaces that felt alive.”

In terms of workflow, MelodyGun sound supervisor/studio manager Hamed Hokamzadeh chose to use the Oculus Development Kit 2 (DK2) headset with Facebook 360 Spatial Workstation on Avid Pro Tools. “Post supervisor Eric Martin and I decided to keep everything within FB360 because the distribution was to be on a mobile VR platform (although it wasn’t yet clear which platform), and FB360 had worked for us marvelously in the past for mobile and Facebook/YouTube,” says Hokamzadeh. “We initially concentrated on delivering B-format (2nd Order AmbiX) playing back on Gear VR with a Samsung S8. We tried both the Audio-Technica ATH-M50 and Shure SRH840 headphones to make sure it translated. Then we created other deliverables: quad-binaurals, .tbe, 8-channel and a stereo static mix. The non-diegetic music and voiceover were head-locked and delivered in stereo.”
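For context on those deliverables: a full-sphere ambisonic mix of order n carries (n + 1)² channels, which is why a 2nd-order AmbiX master is a nine-channel file (first order is four channels, W, Y, Z, X in AmbiX ordering). A quick sketch of the channel math:

```python
def ambix_channels(order):
    """Channel count for a full-sphere ambisonic signal of the given order."""
    return (order + 1) ** 2

# 1st order -> 4 channels, 2nd order -> 9, 3rd order -> 16
counts = {order: ambix_channels(order) for order in (1, 2, 3)}
```

The other deliverables trade that spatial resolution for playback convenience: quad-binaural bakes the field into four binaural pairs, and the static stereo mix discards head tracking entirely.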

From an aesthetic perspective, the MelodyGun team wanted to have a solid understanding of the audience’s live theater experience and the characters themselves “to make the VR series follow suit with the world Jon had already built. It was also exciting to cross our sound over into more of a cinematic ‘film world’ than was possible in the live theatrical experience,” says Hokamzadeh.

Hokamzadeh and Ouziel assigned specific tasks to their sound team — Xiaodan Li was focused on sound editorial for the hard effects and Foley, and Kennedy Phillips was asked to design specific sound elements, including the fire monster and the alchemist freezing.

Ouziel, meanwhile, had his own challenges of both creating the soundscape and integrating the sounds into the mix. He had to figure out how to make the series sound natural yet cinematic, and how to use sound to draw the viewer’s attention while keeping the surrounding world feeling alive. “You have to cover every movement in VR, so when the characters split up, for example, you want to hear all their footsteps, but we also had to get the audience to focus on a specific character to guide them through. That was one of the biggest challenges we had while mixing it,” says Ouziel.

The Puppets
“Chapter Three: Trial By Fire” provides the best example of how Ouziel tackled those challenges. In the episode, Virginia (Britt Adams) finds herself stuck in Marion’s chamber. Marion (Michael J. Sielaff) is a nefarious puppet master who is clandestinely controlling a room full of people on puppet strings; some are seated at a long dining table and others are suspended from the ceiling. They’re all moving their arms as if dancing to the scratchy song that’s coming from the gramophone.

The sound for the puppet people needed to have a wiry, uncomfortable feel and the space itself needed to feel eerily quiet but also alive with movement. “We used a grating metallic-type texture for the strings so they’d be subconsciously unnerving, and mixed that with wooden creaks to make it feel like you’re surrounded by constant danger,” says Ouziel.

The slow wooden creaks in the ambience reinforce the idea that an unseen Marion is controlling everything that’s happening. Braver says, “Those creaks in Marion’s room make it feel like the space is alive. The house itself is a character in the story. The sound team at MelodyGun did an excellent job of capturing that.”

Once the sound elements were created for that scene, Ouziel then had to space each puppet’s sound appropriately around the room. He also had to fill the room with music while making sure it still felt like it was coming from the gramophone. Ouziel says, “One of the main sound tools that really saved us on this one was Audio Ease’s 360pan suite, specifically the 360reverb function. We used it on the gramophone in Marion’s chamber so that it sounded like the music was coming from across the room. We had to make sure that the reflections felt appropriate for the room, so that we felt surrounded by the music but could clearly hear the directionality of its source. The 360pan suite helped us to create all the environmental spaces in the series. We pretty much ran every element through that reverb.”

L-R: Thomas Ouziel and Jon Braver.

Hokamzadeh adds, “The session got big quickly! Imagine over 200 AmbiX tracks, each with its own 360 spatializer and reverb sends, plus all the other plug-ins and automation you’d normally have on a regular mix. Because things never go out of frame, you have to group stuff to simplify the session. It’s typical to make groups for different layers like footsteps, cloth, etc., but we also made groups for all the sounds coming from a specific direction.”

The 360pan suite reverb was also helpful on the fire monster’s sounds. The monster, called Ember, was sound designed by Phillips. His organic approach was akin to the bear monster in Annihilation, in that it felt half human/half creature. Phillips edited together various bellowing fire elements that sounded like breathing and then manipulated those to match Ember’s tormented movements. Her screams also came from a variety of natural screams mixed with different fire elements so that it felt like there was a scared young girl hidden deep in this walking heap of fire. Ouziel explains, “We gave Ember some loud sounds but we were able to play those in the space using the 360pan suite reverb. That made her feel even bigger and more real.”

The Forest
The opening forest scene was another key moment for sound. The series is set in South Carolina in 1947, and the author’s estate needed to feel like it was in a remote area surrounded by lush, dense forest. “With this location comes so many different sonic elements. We had to communicate that right from the beginning and pull the audience in,” says Braver.

Genevieve Jones, former director of operations at Skybound Entertainment and a producer on Delusion: Lies Within, says, “I love the bed of sound that MelodyGun created for the intro. It felt rich. Jon really wanted to go to the South and shoot that sequence, but we weren’t able to give that to him. Knowing that I could go to MelodyGun and they could bring that richness was awesome.”

Since the viewer can turn his/her head, the sound of the forest needed to change with those movements. A mix of six different winds spaced into different areas created a bed of textures that shifts with the viewer’s changing perspective. It makes the forest feel real and alive. Ouziel says, “The creative and technical aspects of this series went hand in hand. The spacing of the VR environment really affects the way that you approach ambiences and world-building. The house interior, too, was done with a similar approach, with low winds and tones for the corners of the rooms and the different spaces. It gives you a sense of a three-dimensional experience while also feeling natural and in accordance with the world that Jon made.”
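That bed of textures can shift with the viewer's perspective because an ambisonic soundfield can be rotated to counter head movement before it is decoded to binaural. As a simplified illustration (first order only, and sign conventions vary between toolchains), a yaw rotation leaves the W and Z channels untouched and rotates X and Y together:

```python
import math

def rotate_ambix_yaw(w, y, z, x, yaw_rad):
    """Rotate a first-order AmbiX frame (channel order W, Y, Z, X) about the
    vertical axis. Only the horizontal components X and Y are affected."""
    cos_a, sin_a = math.cos(yaw_rad), math.sin(yaw_rad)
    y_rot = cos_a * y + sin_a * x
    x_rot = cos_a * x - sin_a * y
    return w, y_rot, z, x_rot

# A source dead ahead (x = 1, y = 0) rotated 90 degrees lands on the side.
frame = rotate_ambix_yaw(0.707, 0.0, 0.0, 1.0, math.pi / 2)
```

In practice the FB360 renderer applies the equivalent (higher-order) rotation per audio block using the headset's orientation data.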

Bringing Live Theater to VR
The sound of the VR series isn’t a direct translation of the live theater experience. Instead, it captures the spirit of the live show in a way that feels natural and immersive, but also cinematic. Ouziel points to the sounds that bring puppet master Marion to life. Here, they had the opportunity to go beyond what was possible with the live theater performance. Ouziel says, “I pitched to Jon the idea that Marion should sound like a big, worn wooden ship, so we built various layers from these huge wooden creaks to match all his movements and really give him the size and gravitas that he deserved. His vocalizations were made from a couple elements including a slowed and pitched version of a raccoon chittering that ended up feeling perfectly like a huge creature chuckling from deep within. There was a lot of creative opportunity here and it was a blast to bring to life.”


Jennifer Walden is a New Jersey-based audio engineer and writer. Follow her on Twitter @audiojeney.


IDEA launches to create specs for next-gen immersive media

The Immersive Digital Experiences Alliance (IDEA) will launch at NAB 2019 with the goal of creating a suite of royalty-free specifications that address all immersive media formats, including emerging light field technology.

Founding members — including CableLabs, Light Field Lab, Otoy and Visby — created IDEA to serve as an alliance of like-minded technology, infrastructure and creative innovators working to facilitate the development of an end-to-end ecosystem for the capture, distribution and display of immersive media.

Such a unified ecosystem must support all displays, including highly anticipated light field panels. Recognizing that the essential launch point would be to create a common media format specification that can be deployed on commercial networks, IDEA has already begun work on the new Immersive Technology Media Format (ITMF).

ITMF will serve as an interchange and distribution format that will enable high-quality conveyance of complex image scenes, including six-degrees-of-freedom (6DoF), to an immersive display for viewing. Moreover, ITMF will enable the support of immersive experience applications including gaming, VR and AR, on top of commercial networks.

Recognized for its potential to deliver an immersive true-to-life experience, light field media can be regarded as the richest and most dense form of visual media, thereby setting the highest bar for features that the ITMF will need to support and the new media-aware processing capabilities that commercial networks must deliver.

Jon Karafin, CEO/co-founder of Light Field Lab, explains that “a light field is a representation describing light rays flowing in every direction through a point in space. New technologies are now enabling the capture and display of this effect, heralding new opportunities for entertainment programming, sports coverage and education. However, until now, there has been no common media format for the storage, editing, transmission or archiving of these immersive images.”
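Karafin's description maps onto the two-plane (u, v, s, t) parameterization common in light field research: each ray is indexed by where it crosses a camera plane (u, v) and an image plane (s, t), with a color stored per ray. A toy sketch (the dimensions are illustrative and not drawn from any IDEA specification):

```python
import numpy as np

# Discrete light field: a 9x9 grid of 64x64 views, one RGB color per ray.
U, V, S, T = 9, 9, 64, 64
light_field = np.zeros((U, V, S, T, 3), dtype=np.float32)

def ray_color(lf, u, v, s, t):
    """Color carried by the single ray crossing (u, v) and (s, t)."""
    return lf[u, v, s, t]

# A conventional 2D photograph is just one (u, v) slice of the ray set.
center_view = light_field[U // 2, V // 2]
```

The density of that ray sampling is what makes light field media so much larger than conventional video, and why it sets the bar for what ITMF and the networks carrying it must support.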

“We’re working on specifications and tools for a variety of immersive displays — AR, VR, stereoscopic 3D and light field technology, with light field being the pinnacle of immersive experiences,” says Dr. Arianne Hinds, Immersive Media Strategist at CableLabs. “As a display-agnostic format, ITMF will provide near-term benefits for today’s screen technology, including VR and AR headsets and stereoscopic displays, with even greater benefits when light field panels hit the market. If light field technology works half as well as early testing suggests, it will be a game-changer, and the cable industry will be there to help support distribution of light field images with the 10G platform.”

Starting with Otoy’s ORBX scene graph format, a well-established data structure widely used in advanced computer animation and computer games, IDEA will provide extensions to expand the capabilities of ORBX for light field photographic camera arrays, live events and other applications. Further specifications will include network streaming for ITMF and transcoding of ITMF for specific displays, archiving, and other applications. IDEA will preserve backward compatibility with the existing ORBX format.

IDEA anticipates releasing an initial draft of the ITMF specification in 2019. The alliance also is planning an educational seminar to explain more about the requirements for immersive media and the benefits of the ITMF approach. The seminar will take place in Los Angeles this summer.

Photo credit: Light Field Lab, Inc., all rights reserved. Future Vision concept art of a room-scale holographic display.


Behind the Title: Light Sail VR MD/EP Robert Watts

This creative knew as early as middle school that he wanted to tell stories. Now he gets to immerse people in those stories.

NAME: Robert Watts

COMPANY: LA-based Light Sail VR (@lightsailvr)

CAN YOU DESCRIBE YOUR COMPANY?
We’re an immersive media production company. We craft projects end-to-end in the VR360, VR180 and interactive content space, from bespoke creative development all the way through post and distribution. We produce both commercial work and our own original IP — the first of which is called Speak of the Devil VR, an interactive, live-action horror experience where you’re a main character in your own horror movie.

WHAT’S YOUR JOB TITLE?
Managing Partner and Executive Producer

WHAT DOES THAT ENTAIL?
A ton. As a startup, we wear many hats. I oversee all production elements, acting as producer. I run operations, business development and the financials for the company. Then Matt Celia, my business partner and creative director, collaborates on the overall creative for each project to ensure the quality of the experience, as well as making sure it works natively in (that is, is best experienced in) the immersive medium.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I’m very hands-on on set, almost to a fault. So I’ve ended up with some weird (fake) credits, such as fog team, stand-in, underwater videographer, sometimes even assistant director. I do whatever it takes to get the job done — that’s a producer’s job.

WHAT TOOLS DO YOU USE?
Excluding all the VR headsets and tech, on the producing side Google Drive and Dropbox are a producer’s lifeblood, as well as Showbiz Budgeting from Media Services.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I love being on set watching the days and weeks of pre-production and development coalesce. There’s an energy on set that’s both fun and professional, and that truly shows the crew’s dedication and focus to get the job done. As the exec producer, it’s nice being able to strike a balance between being on set and being in the office.

Light Sail VR partners (L-R): Matt Celia and Robert Watts

WHAT’S YOUR LEAST FAVORITE?
Tech hurdles. They always seem to pop up. We’re a production company working on the edge of the latest technology, so something always breaks, and there’s not always a YouTube tutorial on how to fix it. It can really set back one’s day.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
We do “Light Sail Sandwich Club” at lunch and cater a smorgasbord of sandwich fixings and craft services for our teams, contractors and interns. It’s great to take a break from the day and sit down and connect with our colleagues in a personal way. It’s relaxed and fun, and I really enjoy it.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I love what I do, but I also like giving back. I think I’d be using my project management skills in a way that would be a force for good, perhaps at an NGO or entity working on tackling climate change.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
Middle school. My family watched a lot of television and films. I wanted to be an archaeologist after watching Indiana Jones, a paleontologist after Jurassic Park, a submarine commander after Crimson Tide and I fancied being a doctor after watching ER. I got into theater and video productions in high school, and I realized I could be in entertainment and make all those stories I loved as a kid.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
At the tail end of 2018, we produced ten 360-degree episodes for Refinery29 (Sweet Digs 360), ten VR180 episodes (Get Glam, Hauliday) and VR180 spots for Bon Appetit and Glamour. We also wrapped a music video that’s releasing this year.

On top of it all, we’ve been hard at work developing our next original, which we will reveal more details about soon. We’ve been busy! I’m extremely thankful for the wonderful teams that helped us make it all happen.

Now Your Turn

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I am very proud of the diversity project we did with Google, Google: Immerse, as well as our first original, Speak of the Devil. But I think our first original series Now Your Turn is the one I’m going to pick. It’s a five-episode VR180 series that features Geek & Sundry talent showcasing some amazing board games. It’s silly and fun, and we put in a number of easter eggs that make it even better when you’re watching in a headset. I’m proud of it because it’s an example of where the VR medium is going — series that folks tune into week to week.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
My Mac for work and music — I’m constantly listening to music while I work. My Xbox One is where I watch all my content. And, lastly, my Vive setup at home. I like to check out all the latest in VR, from experiences to gaming, and I even work out with it, playing BoxVR or Beat Saber.

WHAT KIND OF MUSIC DO YOU LISTEN TO AT WORK?
My taste spans from classic rock to techno/EDM to Spanish guitar.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I try to have a work-life balance. I don’t set my email notifications to “push.” Instead, I make the choice of when I check my emails. I do it frequently enough that I never feel out of the loop, but that small choice helps me feel in control of the hundreds of things that happen on a day-to-day basis.

I make time every night and on the weekends to spend time with my lovely wife, Jessica. When we’re not watching stuff, we’re seeing friends and playing board games — we’re big nerds. It’s important to have fun!


Sandbox VR partners with Vicon on Amber Sky 2088 experience

VR gaming company Sandbox VR has partnered with Vicon, using its motion capture tools to create next-generation immersive experiences. With Vicon’s motion capture cameras and its location-based VR (LBVR) software Evoke, the Hong Kong-based Sandbox VR transports up to six people at a time into the Amber Sky 2088 experience, which takes place in a future where the fate of humanity hangs in the balance.

Sandbox VR’s adventures resemble movies in which the players become the characters. The company already has two proprietary AAA-quality games in operation across its seven locations; for its third title, Amber Sky 2088, a new motion capture solution was needed. In the futuristic game, users step into the role of androids, granting players abilities far beyond the average human while still scaling the game to their actual movements. To accurately convey that for multiple users in a free-roam environment, precision tracking and flexible scalability were vital. For that, Sandbox VR turned to Vicon.

Set in the twilight of the 21st century, Amber Sky 2088 takes players to a futuristic version of Hong Kong, then through the clouds to the edge of space to fight off an alien invasion. Android abilities allow players to react with incredible strength and move at speeds fast enough to dodge bullets. And while the in-game action is furious, participants in the real world — equipped with VR headsets — freely roam an open environment as Vicon LBVR motion capture cameras track their movement.

Vicon’s motion capture cameras record every player movement, then send the data to its Evoke software, a solution introduced last year as part of its LBVR platform, Origin. Vicon’s solution offers precise tracking while also animating player motion in realtime, creating a seamless in-game experience. Automatic recalibration also makes the experience’s operation easier than ever despite its complex nature, and the system’s scalability means fewer cameras can be used to capture more movement, making it cost-effective for large-scale expansion.

Since its founding in 2016, Sandbox VR has been creating interactive experiences by combining motion capture technology with virtual reality. After opening its first location in Hong Kong in 2017, the company has since expanded to seven locations across Asia and North America, with six new sites on the way. Each 30- to 60-minute experience is created in-house by Sandbox VR, and each can accommodate up to six players at a time.

The recent partnership with Vicon is the first step in Sandbox VR’s expansion plans, which will see it open over 40 experience rooms across 12 new locations around the world by the end of the year. In planning to build and operate those new locations, Sandbox VR chose to start with five systems from Vicon, in part because of Vicon’s collaborative nature.


Lowepost offering Scratch training for DITs, post pros

Oslo, Norway-based Lowepost, which offers an online learning platform for post production, has launched an Assimilate Scratch Training Channel targeting DITs and post pros. The training includes an extensive series of tutorials that guide a post pro or DIT through the features of an entire Scratch workflow. Scratch covers everything from dailies and conform to color grading, visual effects, compositing, finishing, VR and live streaming.

“We’re offering in-depth training of Scratch via comprehensive tutorials developed by Lowepost and Assimilate,” says Stig Olsen, manager of Lowepost. “Our primary goal is to make Scratch training easily accessible to all users and post artists for building their skills in high-end tools that will advance their expertise and careers. It’s also ideal for DaVinci Resolve colorists who want to add another excellent conform, finishing and VR tool to their tool kit.”

Lowepost is offering three months of free access to the Scratch training. The first tutorial, Scratch Essential Training, is available now. A free 30-day trial of Scratch is also available via their website.

Lowepost’s Scratch Training Channel is available for an annual fee of $59 (US).

Behind the Title: Left Field Labs ECD Yann Caloghiris

NAME: Yann Caloghiris

COMPANY: Left Field Labs (@LeftFieldLabs)

CAN YOU DESCRIBE YOUR COMPANY?
Left Field Labs is a Venice, California-based creative agency dedicated to applying creativity to emerging technologies. We create experiences at the intersection of strategy, design and code for our clients, who include Google, Uber, Discovery and Estée Lauder.

But it’s how we go about our business that has shaped who we have become. Over the past 10 years, we have consciously moved away from the traditional agency model and have grown by deepening our expertise, sourcing exceptional talent and, most importantly, fostering a “lab-like” creative culture of collaboration and experimentation.

WHAT’S YOUR JOB TITLE?
Executive Creative Director

WHAT DOES THAT ENTAIL?
My role is to drive the creative vision across our client accounts, as well as our own ventures. In practice, that can mean anything from providing insights for ongoing work to proposing creative strategies to running ideation workshops. Ultimately, it’s whatever it takes to help the team flourish and push the envelope of our creative work.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Probably that I learn more now than I did at the beginning of my career. When I started, I imagined that the executive CD roles were occupied by seasoned industry veterans, who had seen and done it all, and would provide tried and tested direction.

Today, I think that cliché is out of touch with what’s required from agency culture and where the industry is going. Sure, some aspects of the role remain unchanged, such as being a supportive team lead or appreciating the value of great copy. But the pace of change means the role often requires both leveraging past experience and accepting that sometimes a new paradigm is emerging and assumptions need to be adjusted.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Working with the team, and the excitement that comes from workshopping the big ideas that will anchor the experiences we create.

WHAT’S YOUR LEAST FAVORITE?
The administrative parts of a creative business are not always the most fulfilling. Thankfully, tasks like timesheeting, expense reporting and invoicing are becoming less exhausting thanks to better predictive tools and machine learning.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
The early hours of the morning, usually when inspiration strikes — when we haven’t had to deal with the unexpected day-to-day challenges that come with managing a busy design studio.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I’d probably be somewhere at the intersection of an artist, like my mum was, and an engineer, like my dad. There is nothing more satisfying than applying art to an engineering challenge, or vice versa.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I went to school in France, and there wasn’t much room for anything other than school and homework. When I got my Baccalaureate, I decided that from that point onward, whatever I did would be fun, deeply engaging and at a place where being creative was an asset.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
We recently partnered with ad agency RK Venture to craft a VR experience for the New Mexico Department of Transportation’s ongoing ENDWI campaign, which immerses viewers into a real-life drunk-driving scenario.

ENDWI

To best communicate and tell the human side of this story, we turned to rapid breakthroughs in volumetric capture and 3D scanning. Working with Microsoft’s Mixed Reality Capture Studio, we were able to bring every detail of an actor’s performance to life with volumetric performance capture in a way that previous techniques could not.

Bringing a real actor’s performance into a virtual experience is a game changer because of the emotional connection it creates. For ENDWI, the combination of rich immersion with compelling non-linear storytelling proved to affect the participants at a visceral level — with the goal of changing behavior further down the road.

Throughout this past year, we partnered with the VMware Cloud Marketing Team to create a one-of-a-kind immersive booth experience for VMworld Las Vegas 2018 and Barcelona 2018 called Cloud City. VMware’s cloud offering needed a distinct presence to foster a deeper understanding and greater connectivity between brand, product and customers stepping into the cloud.

Cloud City

Our solution was Cloud City, a destination merging future-forward architecture, light, texture, sound and interactions with VMware Cloud experts to give consumers a window into how the cloud, and more specifically VMware Cloud, can be an essential solution for them. VMworld is the brand’s flagship engagement, where hands-on learning helped showcase its cloud offerings. Cloud City garnered 4,000-plus demos, which led to a 20% lead conversion rate in 10 days.

Finally, for Google, we designed and built a platform for the hosting of online events anywhere in the world: Google Gather. For its first release, teams across Google, including Android, Cloud and Education, used Google Gather to reach and convert potential customers across the globe. With hundreds of events to date, the platform now reaches enterprise decision-makers at massive scale, spanning far beyond what has been possible with traditional event marketing, management and hosting.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Recently, a friend and I shot and edited a fun video homage to the original technology boom-town: Detroit, Michigan. It features two cultural icons from the region, an original big block ‘60s muscle car and some gritty electro beats. My four-year-old son thinks it’s the coolest thing he’s ever seen. It’s going to be hard for me to top that.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Human flight, the Internet and our baby monitor!

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Instagram, Twitter, Medium and LinkedIn.

CARE TO SHARE YOUR FAVORITE MUSIC TO WORK TO?
Where to start?! Music has always played an important part in my creative process and the joy I derive from what we do. I have day-long playlists curated around what I’m trying to achieve during that time. Being able to influence how I feel when working on a brief is essential — it helps set me in the right mindset.

Sometimes, it might be film scores when working on visuals, jazz to design a workshop schedule or techno to dial up productivity when doing expenses.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Spend time with my kids. They remind me that there is a simple and unpretentious way to look at life.