Category Archives: 3D

VFX house Kevin adds three industry vets

Venice, California-based visual effects house Kevin, founded by Tim Davies, Sue Troyan and Darcy Parsons, has beefed up its team even further with the hiring of head of CG Mike Dalzell, VFX supervisor Theo Maniatis and head of technology Carl Loeffler. This three-month-old studio has already worked on spots for Jaguar, Land Rover, Target and Old Spice, and is currently working on a series of commercials for the Super Bowl.

Dalzell brings years of experience as a CG supervisor and lead artist — he started as a 3D generalist before focusing on look development and lighting — at top creative studios including Digital Domain, MPC, Psyop, The Mill, Sony Imageworks and Method. He was instrumental in the look development for the VFX Gold Clio- and British Arrow-winning Call of Duty Seize Glory and GE’s Childlike Imagination. He has also worked on commercials for Nissan, BMW, Lexus, Visa, Cars.com, the Air Force and others. Early on, Dalzell honed his skills on music videos in Toronto, and then on feature films such as Iron Man 3 and The Matrix movies, as well as The Curious Case of Benjamin Button.

Maniatis, a Flame artist and on-set VFX supervisor, has a breadth of experience in the US, London and his native Sydney. “Tim [Davies] and I used to work together back in Australia,” he says, “so reconnecting with him and moving to LA has been a blast.”

Maniatis’s work includes spots for Apple Watch 3 + Apple Music’s Roll (directed by Sam Brown), TAG Heuer’s To Jack (directed by and featuring Patrick Dempsey), Destiny 2’s Rally the Troops and Titanfall 2’s Become One (via Blur Studios), and PlayStation VR’s Batman Arkham and Axe’s Office Love, both directed by Filip Engstrom. Prior to joining Kevin, Maniatis worked with Blur Studios, Psyop, The Mill, Art Jail and Framestore.

Loeffler is building the studio’s production infrastructure using the latest Autodesk Flame systems, high-end 3D workstations and render nodes, and is putting new networking and storage systems into place. Kevin’s new Culver City studio will open its doors in Q1 2018, and Loeffler will guide the current growth in both hardware and software, plan for the future and make sure Kevin’s studio is optimized for the needs of production. He has over two decades of experience building out and expanding technologies for facilities including MPC and Technicolor.

Image: (L-R) Mike Dalzell, Carl Loeffler and Theo Maniatis.

Quick Chat: Ntropic CD, NIM co-founder Andrew Sinagra

Some of the most efficient tools being used by pros today were created by their peers, those working in real-world post environments who develop workflows in-house. Many are robust enough to share with the world. One such tool is NIM, a browser-based studio management app for post houses that tracks a production pipeline from start to finish.

Andrew Sinagra, co-founder of NIM Labs and creative director of Ntropic, a creative studio that provides VFX, design, color and live action, was kind enough to answer some trends questions relating to tight turnarounds in post and visual effects.

What do you feel are the biggest challenges facing post and VFX studios in the coming year?
It’s an interesting time for VFX in general. The post-Netflix era has ushered in a whole new range of opportunities, but the demands have shifted. We’re seeing quality expectations for television soar, but schedules and budgets have remained the same — or have tightened.

The challenge facing post production studios is how to continue creating quality, competitive work while also managing faster turnarounds and ever-fluctuating budgets. It seems like an impossible problem, but thankfully tools, technology and talent continue to improve and deliver better results in a fraction of the time. By investing in those three Ts, forward-thinking studios can balance expectations with necessary costs.

What have you found to be the typical pain points for studios with regards to project management in the past? What are the main complaints you hear time and time again?
Throughout my career I have met with many industry pros, from on-the-box artists and creative directors through to heads of production and studio owners. They have all shared their trials and tribulations – as well as their methods for staying ahead of the curve. The common pain point is always the same: “How can I get a clearer view of my studio operations on a daily basis, from resource utilization through running actuals?” It’s a growing concern. Managing budgets in particular has been a major pain point; most studios just want a better way to visualize, and gain back some control over, what’s being spent and where. It’s all about the need for efficiency and clarity of vision on a project.

Is business intelligence very important to post studios at this point? Do you see it as an emerging trend over 2018?
Yes, absolutely. Studios need to know what’s going on, on any project, at a moment’s notice. They need to know if it will be affected by endless change orders, or if they’re consistently underbidding on a specific discipline, or if they’re marking something up in a way that actually affects their overall margins. These are the kinds of statistics and influences that can impact the bottom line, but the problem is they are incredibly difficult to pull out of an ocean of numbers on a spreadsheet.

Studios that invest in business intelligence, and can see such issues immediately quantified, will be capable of performing at a much higher efficiency level than those that do not. The status quo of comparing spreadsheets and juggling emails works to an extent, but it’s very difficult to pull analysis out of that. Studios instead need solutions that help them better visualize their operations from the inside out, enabling stakeholders to make decisions with their brains rather than their guts. I can’t imagine any studio heading into 2018 will want to brave the turbulent seas without that kind of business intelligence on its side.

What are the limitations with today’s approaches to bidding and the time and materials model? What changes do you see around financial modeling in VFX in the coming years?
The time and materials model seems largely dead, and has been for quite some time. I have seen a few studios still working with the time and materials model with specific clients, but as a whole I find studios working to flat bids with explicitly clear statements of work. The burden is then on the studio to stay within its limits and find creative solutions to the project’s challenges. This puts extra stress on producers to fully understand the financial ramifications of decisions made on a day-to-day basis. Will slipping in a client request push the budget when we don’t have the margin to spare? How can I reallocate my crew to be more efficient? Can we reorganize the project so that waiting for client feedback doesn’t stop us dead in the water? These are just a few of the questions that, when answered, can squeeze out that extra 10% to get the job done.

Additionally, having the right information arms the studio with the right ammunition to approach the client for overages when the time comes. Having all the information at your fingertips, including how much time has been spent on a project and what any requested changes would require, gives studios the opportunity to educate their clients. And educating clients is a big part of being profitable.

What will studios need to do in 2018 to ensure continued success? What advice would you give them at this stage?
Other than business intelligence, staying ahead of the curve in today’s environment will also mean staying flexible, scalable and nimble. Nimbleness is perhaps the most important of the three — studios need to have this attribute to work in the ever-changing world of post production. It is rare that projects reach the finish line with the deliveries matching exactly what was outlined in the initial bid. Studios must be able to respond to the inevitable requested changes even in the middle of production. That means being able to make informed decisions that meet the client’s expectations, while also remaining within the scope of the budget. That can mean the difference between a failed project and a triumphant delivery.

Basically, my advice is this: Going into 2018, ask yourself, “Am I using my resources to their maximum potential, or am I leaving man hours on the table?” Take a close look at everything you’re doing and ensure you’re not pouring budget into areas where it’s simply not needed. With so many moving pieces in production, it’s imperative to understand at a glance where your efforts are being placed and how you can better use your artists.


House of Moves adds Selma Gladney-Edelman, Alastair Macleod

Animation and motion capture studio House of Moves (HOM) has strengthened its team with two new hires — Selma Gladney-Edelman was brought on as executive producer and Alastair Macleod as head of production technology. The two industry vets are coming on board as the studio shifts to offer more custom short- and long-form content, and expands its motion capture technology workflows to its television, feature film, video game and corporate clients.

Selma Gladney-Edelman was most recently VP of Marvel Television for its primetime and animated series. She has worked in film production, animation and visual effects, and was a producer on multiple episodic series at Walt Disney Television Animation, Cartoon Network and Universal Animation. As director of production management across all of the Discovery channels, she oversaw thousands of hours of television and film programming, including the TLC projects Say Yes to the Dress, Little People, Big World and Toddlers and Tiaras, while working on the team that garnered an Oscar nom for Werner Herzog’s Encounters at the End of the World and two Emmy wins for Best Children’s Animated Series for Tutenstein.

Scotland native Alastair Macleod is a motion capture expert who has worked in production, technology development and as an animation educator. His production experience includes work on films such as Lord of the Rings: The Two Towers, The Matrix Reloaded, The Matrix Revolutions, 2012, The Twilight Saga: Breaking Dawn — Part 2 and Kubo and the Two Strings for facilities that include Laika, Image Engine, Weta Digital and others.

Macleod pioneered full body motion capture and virtual reality at the research department of Emily Carr University in Vancouver. He was also the head of animation at Vancouver Film School and an instructor at Capilano University in Vancouver. Additionally, he developed PeelSolve, a motion capture solver plug-in for Autodesk Maya.


Behind the Title: Artist Jayse Hansen

NAME: Jayse Hansen

COMPANY: Jayse Design Group

CAN YOU DESCRIBE YOUR COMPANY?
I specialize in designing and animating completely fake-yet-advanced-looking user interfaces, HUDs (head-up displays) and holograms for film franchises such as The Hunger Games, Star Wars, Iron Man, The Avengers, Guardians of the Galaxy, Spider-Man: Homecoming, Big Hero 6, Ender’s Game and others.

On the side, this has led to developing untraditional, real-world, outside-the-rectangle type UIs, mainly with companies looking to have an edge in efficiency/data-storytelling and to provide a more emotional connection with all things digital.

Iron Man

WHAT’S YOUR JOB TITLE?
Designer/Creative Director

WHAT DOES THAT ENTAIL?
Mainly, I try to help filmmakers (or companies) figure out how to tell stories in quick reads with visual graphics. In a film, we sometimes only have 24 frames (one second) to get information across to the audience. It has to look super complex, but it has to be super clear at the same time. This usually involves working with directors, VFX supervisors, editorial and art directors.

With real-world companies, the way I work is similar. I help figure out what story can be told visually with the massive amount of data we have available to us nowadays. We’re all quickly finding that data is useless without some form of engaging story and a way to quickly ingest, make sense of and act on that data. And, of course, with design-savvy users, a necessary emotional component is that the user interface looks f’n rad.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
A lot of R&D! Movie audiences have become more sophisticated, and they groan if a fake UI seems outlandish, impossible or Playskool cartoon-ish. Directors strive to not insult their audience’s intelligence, so we spend a lot of time talking to experts and studying real UIs in order to ground them in reality while still making them exciting, imaginative and new.

WHAT’S YOUR FAVORITE PART OF THE JOB?
Research, breaking down scripts and being able to fully explore and do things that have never been done before. I love the challenge of mixing strong design principles with storytelling and imagination.

WHAT’S YOUR LEAST FAVORITE?
Paperwork!

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Early morning and late nights. I like to jam on design when everyone else is sleeping.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
I actually can’t imagine doing anything else. It’s what I dream about and obsess about day and night. And I have since I was little. So I’m pretty lucky that they pay me well for it!

If I lost my sight, I’d apply for Oculus or Meta brain implants and live in the AR/VR world to keep creating visually.

SO YOU KNEW THIS WAS YOUR PATH EARLY ON?
When I was 10 I learned that they used small models for the big giant ships in Star Wars. Mind blown! Suddenly, it seemed like I could also do that!

As a kid I would pause movies and draw all the graphic parts of films, such as the UIs in the X-wings in Star Wars, or the graphics on the pilot helmets. I never guessed this was actually a “specialty niche” until I met Mark Coleran, an amazing film UI designer who coined the term “FUI” (Fictional User Interface). Once I knew it was someone’s “everyday” job, I didn’t rest until I made it MY everyday job. And it’s been an insanely great adventure ever since.

CAN YOU TALK MORE ABOUT FUI AND WHAT IT MEANS?
FUI stands for Fictional (or Future, Fantasy, Fake) User Interface. UIs have been used in films for a long time to tell an audience many things, such as: their hero can’t do what they need to do (Access Denied) or that something is urgent (Countdown Timer), or they need to get from point A to point B, or a threat is “incoming” (The Map).

Mockingjay Part I

As audiences are getting more tech-savvy, the potential for screens to act as story devices has developed, and writers and directors have gotten more creative. Now entire lengths of story are being told through interfaces, such as in The Hunger Games: Mockingjay Part I, where Katniss, Peeta, Beetee and President Snow have some of their most tense moments.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
The most recent projects I can talk about are Guardians of the Galaxy 2 and Spider-Man: Homecoming, both with the Cantina Creative team and Marvel. For Guardians 2, I had a ton of fun designing and animating various screens, including Rocket, Gamora and Star-Lord’s glass screens and the large “Drone Tactical Situation Display” holograms for the Sovereign (gold people). Spider-Man was my favorite superhero as a child, so I was honored to be asked to define the “Stark-Designed” UI design language of the HUDs, holograms and various AR overlays.

I spent a good amount of time researching the comic book version of Spider-Man. His suit and abilities are actually quite complex, and I ended up writing a 30-plus-page guide to all of its functions so I could build out the HUD and blueprint diagrams in a way that made sense to Marvel fans.

In the end, it was a great challenge to blend the combination of the more military Stark HUDs for Iron Man, which I’m very used to designing, and a new, slightly “webby” and somewhat cute “training-wheels” UI that Stark designed for the young Peter Parker. I loved the fact that in the film they played up the humor of a teenager trying to understand the complexities of Stark’s UIs.

Star Wars: The Force Awakens

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I think Star Wars: The Force Awakens is the one I was most proud to be a part of. It was my one bucket list film to work on from childhood, and I got to work with some of the best talents in the business. Not only JJ Abrams and his production team at Bad Robot, but with my longtime industry friends Navarro Parker and Andrew Kramer.

WHAT SOFTWARE DID YOU RELY ON?
As always, we used a ton of Maxon Cinema 4D, Adobe’s After Effects and Illustrator, and Video Copilot’s Element 3D to pull off rather complex and lengthy design sequences, such as the Starkiller Base hologram and the R2-D2/BB-8 “Map to Luke Skywalker” holograms.

Cinema 4D was essential in allowing us to be super creative while still meeting rather insane deadlines. It also integrates so well with the Adobe suite, which allowed us to iterate really quickly when the inevitable last-minute design changes came flying in. I would do initial textures in Adobe Illustrator, then design in C4D, and transfer that into After Effects using the Element 3D plugin. It was a great workflow.

YOU ALSO CREATE VR AND AR CONTENT. CAN YOU TELL US MORE ABOUT THAT?
Yes! Finally, AR and VR are allowing what I’ve been doing for years in film to actually happen in the real world. With a Meta (AR) or Oculus (VR) headset you can actually walk around your UI like an Iron Man hologram and interact with it like the volumetric UIs we did for Ender’s Game.

For instance, today with Google Earth VR you can use a holographic mapping interface like in The Hunger Games to plan your next vacation. With apps like Medium, Quill, Tilt Brush or Gravity Sketch you can design 3D parts for your robot like Hiro did in Big Hero 6.

Big Hero 6

While wearing a Meta 2, you can surround yourself with multiple monitors of content and pull 3D models from them and enlarge them to life size.

So we have a deluge of new abilities, but most designers have only designed on flat traditional monitors or phone screens. They’re used to the two dimensions of up and down (X and Y), but have never had the opportunity to use the Z axis. So you have all kinds of new challenges like, “What does this added dimension do for my UI? How is it better? Why would I use it? And what does the back of a UI look like when other people are looking at it?”

For instance, in the Iron Man HUD, most of the time I was designing for when the audience is looking at Tony Stark, which is the back of the UI. But I also had to design it from the side. And it all had to look proper, of course, from the front. UI design becomes a bit like product design at this point.

In AR and VR, similar design challenges arise. When we share volumetric UIs, we will see other people’s UIs from the back. At times we want to be able to understand them, and at other times they should be disguised, blurred or shrouded for privacy reasons.

How do you design when your UI can take up the whole environment? How can a UI give you important information without distracting you from the world around you? How do you deal with additive displays where black is not a color you can use? And on and on. These are all things we tackle with each film, so we have a bit of a head start in those areas.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
I love tech, but it would be fun to be stuck with just a pen, paper and a book… for a while, anyway.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I’m on Twitter (@jayse_), Instagram (@jayse_) and Pinterest (skyjayse). Aside from that, I also started a new FUI newsletter to discuss some of the behind-the-scenes aspects of this type of work.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
Heck yeah. Lately, I find myself working to Chillstep and Deep House playlists on Spotify. But check out The Cocteau Twins. They sing in a “non-language,” and it’s awesome.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I chill with my best friend and fiancée, Chelsea. We have a rooftop wet-bar area with a 360-degree view of Las Vegas from the hills. We try to go up each evening at sunset with our puppy Bella and just chill. Sometimes it’s all fancy-like with a glass of wine and fruit. Chelsea likes to make it all pretty.

It’s a long way from just 10 years ago, when we were hunting spare change in the car to afford 99-cent nachos from Taco Bell, so we’re super appreciative of how far we’ve come. And because of that, no matter how many times my machine has crashed, or how many changes my client wants — we always make time for just each other. It’s important to keep perspective and realize your work is not life or death, even though in films sometimes they try to make it seem that way.

It’s important to always have something that is only for you and your loved ones that nobody can take away. After all, as long as we’re healthy and alive, life is good!


Chaos Group acquires Render Legion and its Corona Renderer

Chaos Group has purchased Prague-based Render Legion, creator of the Corona Renderer. With this new product and Chaos Group’s own V-Ray, the company now offers even more rendering solutions for M&E and the architectural visualization world.

Known for its ease of use, the Corona Renderer has become a popular choice for architectural visualization, but according to Chaos Group’s David Tracy, “There are a few benefits for M&E. Corona plans to implement some VFX-related features, such as hair and skin with the help of the V-Ray team. Also, Corona is sharing technology, like the way they optimize dome lights. That will definitely be a benefit for V-Ray users in the VFX space.”

The Render Legion team, including its founders and developers, will join Chaos Group as they continue to develop Corona using additional support and resources provided through the deal.

Chaos Group’s Academy Award-winning renderer, V-Ray, will continue to be a core component of the company’s portfolio. Both V-Ray and Corona will benefit from joint collaboration, bringing complementary features and optimizations to each product.

The Render Legion acquisition is Chaos Group’s largest investment to date and its third investment in a visualization company in the last two years, following interactive presentation platform CL3VER and virtual reality pioneer Nurulize. According to Chaos Group, the computer graphics industry is expected to reach $112 billion in 2019, fueled by a rise in demand for 3D visuals. This, they say, has presented a prime opportunity for companies that make the creation of photorealistic imagery more accessible.

Main Image: (L-R) Chaos Group co-founder Vlado Koylazov and Render Legion CEO/co-founder Ondřej Karlík.


Red Giant Trapcode Suite 14 now available

By Brady Betzel

Red Giant has released Trapcode Suite 14, an update to its Adobe After Effects-focused plug-in toolset, including new versions of Trapcode Particular and Form as well as an update to Trapcode Tao.

The biggest updates seem to be in Red Giant’s flagship product, Trapcode Particular 3. Trapcode Particular is now GPU-accelerated through OpenGL, with a proclaimed 4X speed increase over previous versions. The Designer has been re-imagined and seems to take on a more Magic Bullet-esque look and feel. You can now include multiple particle systems inside the same 3D space, which adds both to the complexity you can achieve and to the skill level needed to work with Particular.

You can now also load your own 3D model OBJ files as emitters in the Designer panel or use any image in your comp as a particle. There are also a bunch of new presets that have been added to start you on your Particular system building journey — over 210 new presets, to be exact.

Trapcode Form has been updated to Version 3 with the updated Designer, the ability to add 3D models and animated OBJ sequences as particle grids, the ability to load images for use as particles, a new graphing system for more precise control over the system, and over 70 presets in the Designer.

Trapcode Tao has been updated with depth-of-field effects to allow for that beautiful camera-realistic blur that really sets pro After Effects users apart.

Trapcode Particular 3 and Form 3 are paid updates, while Tao is free for existing users. If you want to update only Tao, make sure you select only Tao in the installer; otherwise you will install new Trapcode plug-ins over your old ones.

Trapcode Particular 3 is available now for $399. The update is $149 and the academic version is $199. You can also get it as a part of the Trapcode Suite 14 for $999.

Trapcode Form 3 is available now for $199. The update is $99 and the academic version costs $99. It can be purchased as part of the Trapcode Suite 14 for $999.

Check out the new Trapcode Suite 14 bundle.



Maxon debuts Cinema 4D Release 19 at SIGGRAPH

Maxon was at this year’s SIGGRAPH in Los Angeles showing Cinema 4D Release 19 (R19). The next generation of Maxon’s pro 3D app offers a new viewport and a new Sound Effector, and additional features for Voronoi Fracturing have been added to the MoGraph toolset. It also boasts a new Spherical Camera, the integration of AMD’s ProRender technology and more. Designed to serve individual artists as well as large studio environments, Release 19 offers a streamlined workflow for general design, motion graphics, VFX, VR/AR and all types of visualization.

With Cinema 4D Release 19, Maxon also introduced a few re-engineered foundational technologies, which the company will continue to develop in future versions. These include core software modernization efforts, a new modeling core, integrated GPU rendering for Windows and Mac, and OpenGL capabilities in BodyPaint 3D, Maxon’s pro paint and texturing toolset.

More details on the offerings in R19:
Viewport Improvements provide artists with added support for screen-space reflections and OpenGL depth-of-field, in addition to the screen-space ambient occlusion and tessellation features (added in R18). Results are so close to final render that client previews can be output using the new native MP4 video support.

MoGraph enhancements expand on Cinema 4D’s toolset for motion graphics with faster results and added workflow capabilities in Voronoi Fracturing, such as the ability to break objects progressively, add displaced noise details for improved realism or glue multiple fracture pieces together more quickly for complex shape creation. An all-new Sound Effector in R19 allows artists to create audio-reactive animations based on multiple frequencies from a single sound file.

The new Spherical Camera allows artists to render stereoscopic 360° virtual reality videos and dome projections. Artists can specify a latitude and longitude range, and render in equirectangular, cubic string, cubic cross or 3×2 cubic format. The new spherical camera also includes stereo rendering with pole smoothing to minimize distortion.

New Polygon Reduction works as a generator, so it’s easy to reduce entire hierarchies. The reduction is pre-calculated, so adjusting the reduction strength or desired vertex count is extremely fast. The new Polygon Reduction preserves vertex maps, selection tags and UV coordinates, ensuring textures continue to map properly and providing control over areas where polygon detail is preserved.

Level of Detail (LOD) Object features a new interface element that lets customers define and manage settings to maximize viewport and render speed, create new types of animations or prepare optimized assets for game workflows. Level of Detail data exports via the FBX 3D file exchange format for use in popular game engines.

AMD’s Radeon ProRender technology is now seamlessly integrated into R19, providing artists a cross-platform GPU rendering solution. Though just the first phase of integration, it provides a useful glimpse into the power ProRender will eventually provide as more features and deeper Cinema 4D integration are added in future releases.

Modernization efforts in R19 reflect Maxon’s development legacy and offer the first glimpse into the company’s planned ‘under-the-hood’ future efforts to modernize the software, as follows:

  • Revamped Media Core gives Cinema 4D R19 users a completely rewritten software core that increases speed and memory efficiency for image, video and audio formats. Native support for MP4 video without QuickTime delivers advantages when previewing renders, incorporating video as textures or motion tracking footage, for a more robust workflow. Export of production formats, such as OpenEXR and DDS, has also been improved.
  • Robust Modeling offers a new modeling core with improved support for edges and N-gons, which can be seen in the Align and Reverse Normals commands. More modeling tools and generators will directly use this new core in future versions.
  • BodyPaint 3D now uses an OpenGL painting engine, giving R19 artists who paint color and add surface details in film, game design and other workflows a real-time display of reflections, alpha, bump or normal maps, and even displacement, for improved visual feedback and texture painting. Redevelopment efforts to improve the UV editing toolset in Cinema 4D continue, with the first fruits of this work available in R19: faster and more efficient options for converting point and polygon selections, growing and shrinking UV point selections, and more.

Calabash animates characters for health PSA

It’s a simple message, told in a very simple way — having a health issue and being judged for it hurts. A PSA for The Simon Foundation, titled Rude2Respect, was animated by Chicago’s Calabash in conjunction with creative design studio Group Chicago.

Opening with the typography “Challenging Health Stigma,” the PSA features two friends — a short, teal-colored teardrop-shaped blob known simply as Blue and his slender companion Pink — taking a walk on a bright sunny day in the city. Blue nervously says, “I’m not sure about this,” to which Pink responds, “You can’t stay home forever.” From there the two embark on what seems like a simple stroll to get ice cream, but there is a deeper message about how such common events can be fraught with anxiety for those suffering from an array of health conditions, often resulting in awkward stares, well-intentioned but inappropriate comments or plain rudeness. Blue and Pink decide it’s the people making the comments who are in the wrong and continue on to get ice cream. The spot ends with the simple words “Health stigma hurts. We can change lives,” followed by a link to www.rude2respect.org.

“We had seen Calabash’s work and sought them out,” says Barbara Lynk, Group Chicago’s creative director. “We were impressed with how well their creative team immediately understood the characters and their visual potential. Creatively they brought a depth of experience on the conceptual and production side that helped bring the characters to life. They also understood the spare visual approach we were trying to achieve. It was a wonderful creative collaboration throughout the process, and they are a really fun group of creatives to work with.”

Based on illustrated characters created by Group Chicago’s founder/creative director Kurt Meinecke, Calabash creative director Wayne Brejcha notes that early on in the creative process they decided to go with what he called a “two-and-a-half-D look.”

“There is a charm in the simplicity of Kurt’s original illustrations with the flat shapes that we had to try very hard to keep as we translated Blue and Pink to the 3D world,” Brejcha says. “We also didn’t want to overly complicate it with a lot of crazy camera moves rollercoastering through the space or rotating around the characters. We constrained it to feel a little like two-and-a-half dimensions – 2D characters, but with the lighting and textures and additional physical feel you expect with 3D animation.

“We spent a good deal of time with thumbnail boardomatics, a scratch track and stand-in music as it began to gel,” he continues. “Kurt searched out some piano music for the intro and outro, which also set tone, and we cast for voices with the personalities of the figures in mind. After a few conversations with Kurt and Barb we understood the personalities of Blue and Pink very well. They’re archetypes or incarnations of two stages of dealing with, say, going out in public with some medical apparatus you’re attached to that’s plainly visible to everyone. The Blue guy is self-conscious, defensive, readily upset and also ready to bring a little push-back to the folks who call out his non-normative qualities. Pink is a little further along in accepting the trials. She can shake off with equanimity all the outright insults, dopey condescension and the like. She’s something of a mentor or role model for Blue. The characters are made of simple shapes, so animator Nick Oropezas did a lot of tests and re-animation to get just the right movements, pauses, timing and expressions to capture those spirits.”

For Sean Henry, Calabash’s executive producer, the primary creative obstacles centered on finding the right pacing for the story. “We played with the timing of the edits all the way through production,” he explains. “The pace of it had a large role to play in the mood, which is more thoughtful than your usual rapid-fire ad. Finding the right emotions for the voices was also a major concern. We needed warmth and a friendly mentoring feel for Pink, and a feisty, insecure but likeable voice for Blue. Our voice talent nailed those qualities. Additionally, the dramatic events in the spot happen only in the audio, with Pink and Blue responding to off-screen voices and action, so the sound design and music had a major storytelling role to play as well.”

Calabash called on Autodesk Maya for the characters and Foundry’s Nuke for effects/compositing. Adobe Premiere was used for the final edit.


Nugen adds 3D Immersive Extension to Halo Upmix

Nugen Audio has updated its Halo Upmix with a new 3D Immersive Extension, adding further options beyond the existing Dolby Atmos bed track capability. The 3D Immersive Extension now provides ambisonic-compatible output as an alternative to channel-based output for VR, game and other immersive applications. This makes it possible to upmix, re-purpose or convert channel-based audio for an ambisonic workflow.
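The underlying idea of converting channel-based audio into an ambisonic representation can be illustrated with a minimal first-order (B-format) encoder. This is a generic sketch of the standard plane-wave encoding math, not Nugen’s implementation; the speaker azimuths and the FuMa channel convention used here are assumptions for illustration.

```python
import numpy as np

# Nominal azimuths in degrees (counterclockwise from front) for a 5.0 bed.
# These angles are illustrative assumptions, not values from Halo Upmix.
SPEAKER_AZIMUTHS = {"L": 30, "R": -30, "C": 0, "Ls": 110, "Rs": -110}

def encode_foa(channels):
    """Encode named channel signals into first-order B-format (FuMa W, X, Y, Z).

    channels: dict mapping channel name -> 1-D numpy array of samples
    (all the same length). Each channel is treated as a plane wave at its
    nominal azimuth with zero elevation.
    """
    n = len(next(iter(channels.values())))
    w = np.zeros(n)
    x = np.zeros(n)
    y = np.zeros(n)
    z = np.zeros(n)
    for name, sig in channels.items():
        az = np.radians(SPEAKER_AZIMUTHS[name])
        w += sig / np.sqrt(2.0)   # omnidirectional component (FuMa -3 dB weight)
        x += sig * np.cos(az)     # front-back figure-of-eight
        y += sig * np.sin(az)     # left-right figure-of-eight
        # z stays zero: a horizontal-only bed carries no height information
    return np.stack([w, x, y, z])
```

Once in B-format, the signal can be rotated with the listener’s head orientation and decoded to any speaker layout or binaural output, which is what makes the representation attractive for VR and game work.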

With this 3D Immersive Extension, Halo fully supports Avid’s newly announced Pro Tools 12.8, now with native 7.1.2 stems for Dolby Atmos mixing. The combination of Pro Tools 12.8 and the Halo 3D Immersive Extension can provide a more fluid workflow for audio post pros handling multi-channel and object-based audio formats.

Halo Upmix is available immediately at a list price of $499 for both OS X and Windows, with support for Avid AAX, AudioSuite, VST2, VST3 and AU formats. The new 3D Immersive Extension replaces the Halo 9.1 Extension and can be purchased for $199. Owners of the existing Halo 9.1 Extension can upgrade to the Halo 3D Immersive Extension at no additional cost. Support for native 7.1.2 stems in Avid Pro Tools 12.8 is available on launch.

Red’s Hydrogen One: new 3D-enabled smartphone

In its always-subtle way, Red has stated that “the future of personal communication, information gathering, holographic multi-view, 2D, 3D, AR/VR/MR and image capture just changed forever” with the introduction of Hydrogen One, a pocket-sized, glasses-free “holographic media machine.”

Hydrogen One is a standalone, full-featured, unlocked multi-band smartphone, operating on Android OS, that promises “look around depth in the palm of your hand” without the need for separate glasses or headsets. The device features a 5.7-inch professional hydrogen holographic display that switches between traditional 2D content, holographic multi-view content, 3D content and interactive games, and it supports both landscape and portrait modes. Red has also embedded a proprietary H30 algorithm in the OS that will convert stereo sound into multi-dimensional audio.

The Hydrogen system incorporates a high-speed data bus to enable a comprehensive and expandable modular component system, including future attachments for shooting high-quality motion, still and holographic images. It will also integrate into the professional Red camera program, working together with Scarlet, Epic and Weapon as a user interface and monitor.

Future users are already talking about this “nifty smartphone with glasses-free 3D,” and one has gone so far as to describe the announcement as “the day 360-video became Betamax, and AR won the race.” Others are more tempered in their enthusiasm, viewing this as a really expensive smartphone with a holographic screen that may or may not kill 360 video. Time will tell.

Initially priced between $1,195 and $1,595, the Hydrogen One is targeted to ship in Q1 of 2018.