Category Archives: motion graphics

Industry vet Alex Moulton joins NYC’s Trollbäck+Company

New York’s Trollbäck+Company has hired Alex Moulton as chief creative officer, tasking him with helping businesses and organizations develop sustainable brands through design-driven strategy and mixed media.

Moulton, who joins the agency from Vice Media, was recently at the helm of NBC Universo’s highly regarded brand refresh, as well as show packaging for ESPN’s The Undefeated In-Depth: Serena With Common.

“Alex brings an invaluable perspective to Trollbäck+Company as both an artist and entrepreneur,” says founder Jakob Trollbäck. “In his short time here, he has already reinvigorated the collective creative energy of our company. This clearly stems from his constant quest to dig deeper as a creative problem solver, which falls perfectly into our philosophy of ‘Discard everything that means nothing.’”

Says Moulton, “My vision for Trollbäck+Company is very clear: design culturally relevant, sustainable brands — from initial strategy and positioning to content and experiential activations — with a nimble and holistic approach that makes us the ultimate partner for CMOs that care about designing an enduring brand and bringing it to market with integrity.”

Prior to Trollbäck+Company, as senior director, creative and content at Vice, Moulton helped launch digital content channel Live Nation TV (LNTV) — a joint venture for which he led brand creative, content development, production and partnership initiatives.

As executive creative director at advertising agency Eyeball, Moulton led product launches, rebrands and campaigns for major brands, including Amazon, New York Public Radio, Wildlife Conservation Society’s New York Aquarium, A&E, CMT, Disney, E!, Nickelodeon, Oxygen, Ovation and VH1.

An early adopter of audio branding, Moulton founded his own branding agency and record label, Expansion Team, in 2002. As chief creative officer of the company, he created the sonic identities of Aetna, Amazon Studios/Originals, Boeing, JetBlue and Rovi, as well as more than 15 TV networks, including CNN International, Discovery, PBS, Universal and Comedy Central.

A DJ, composer and speaker about topics that combine music and design, Moulton has been featured in Billboard, V Man, Electronic Musician and XLR8R and has performed at The Guggenheim.

Behind the Title: Director/Designer Ash Thorp

NAME: Ash Thorp (@ashthorp)

COMPANY: ALT Creative, Inc.

CAN YOU DESCRIBE YOUR COMPANY?
ALT Creative is co-owned by my wife Monica and myself. She helps coordinate and handle the company operations, while I manage the creative needs of clients. We work with a select list of outside contractors as needed, mainly depending on the size and scale of the project.

WHAT’S YOUR JOB TITLE?
I fulfill many roles, but if I had to summarize, I would say I am most commonly hired as a director or designer.

WHAT DOES THAT ENTAIL?
Directing is about facilitating the team to achieve the best outcome on a given project. My ability to communicate with and engage my team toward a visionary goal is my top priority as a director. As a designer, I look at my role as an individual problem solver. My goal is to find the root of what is needed or requested and solve it using design as a mental process of solution.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
I believe that directing is more about communication than about how well you can design, so many would be surprised by the amount of time and energy needed outside of “creative” tasks, such as emails, critiques, listening, observation and deep analysis.

WHAT’S YOUR FAVORITE PART OF THE JOB?
As a director, I love the freedom to expose the ideas in my mind to others and work closely with them to bring them to life. It’s immensely liberating and rewarding.

WHAT’S YOUR LEAST FAVORITE?
Redundancy often eats away at my ambition. Explaining my vision over and over to numerous teammates and partners can be taxing at times.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
The late evening because that is often when I have my mind to myself and am free of outside world distractions and noise.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Nothing. I strongly believe that this is what I was put on earth to do. This is the path I have been designed and focused on since I was a child.

SO YOU KNEW EARLY ON THIS WOULD BE YOUR PATH?
I grew up with a very artistic family; my mother’s side of the family displays creative traits in one media or another. They were and still are all very deeply committed to supporting me in my creative endeavors. Based on my upbringing, it was a natural progression to also be a creative person.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
As for client projects that are publicly released, I most recently worked on the Assassin’s Creed feature film and Call of Duty: Infinite Warfare video game.

For my own projects, I designed and co-directed a concept short for Lost Boy with Anthony Scott Burns. In addition, I released two personal projects: None is a short expression film devised to capture a tone and mood of finding oneself in a city of darkness, and Epoch is an 11-minute space odyssey that merges my deep love of space and design.

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
With Epoch being the most recently released project, I have received so many kind and congratulatory correspondences from viewers about how much they love the film. I am very proud of all the hard work and internal thought, development and personal growth it took to build this project with Chris Bjerre. I believe Epoch shows who I truly am, and I consider it one of the best projects of my personal career to date.

WHAT SOFTWARE DID YOU RELY ON FOR EPOCH?
We used a pretty wide spectrum of tools. Our general production tool kit consisted of Adobe Photoshop for images and stills, texture building and 2D image editing; Adobe Bridge for reviewing frames and keeping a clear vision of the project; Adobe Premiere for editing everything from the beginning animatic to the final film; and, of course, our main staple in 3D was Maxon Cinema 4D, which we used to construct all of the final scenes and render everything using OctaneRender.

We used Cinema 4D for everything — from building shots for the rough animatic to compiling entire scenes and shots for final render. We used it to animate the planets, moons, orbits, lights and the Vessel. It really is a rock-solid piece of software; I couldn’t imagine trying to build a film like Epoch without it. It allowed us to capture the animations, look, lighting and shots seamlessly from the project’s inception.

WHAT WAS YOUR INSPIRATION FOR THIS WORK?
I am personally inspired by so many things. Epoch was a personal tribute to Stanley Kubrick’s 2001: A Space Odyssey, Alien, Carl Sagan, my love of space and space travel, classical sci-fi art and literature, and my personal love of graphic design all combined into one. We put tremendous effort into Epoch to pay proper homage to these things, yet also invite a new audience to experience something uniquely new. We hope you all enjoyed it!

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
The Internet, computers and physical traveling devices (like cars, planes).

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
I try to limit my time spent on social media, but I have two Facebook accounts, plus Instagram, Twitter and Behance.

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
I frequently listen to music while I work, as it helps me fall into a deeply focused state of mind. The type of music varies, as some genres work better than others — they trigger different emotions for different tasks. When I am in deep thought, I listen to composers whose work has no lyrics that might pull away my mind’s focus. When I am doing ordinary tasks or busy work, I listen to anything from heavy metal to drum and bass. The range of music really varies for me, as it’s often based on my current mood. Music is a big part of my workday and my life.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
I actually let the stress in and let it shape my decision making. I feel if I run away from it or unwind my mind, it takes double the effort to go back in to work. I embrace it as being a part of the high consumption industry in which I have chosen to work. It’s not always ideal and is often very demanding, but I often let it be the spark of the fire of my work.


Digging Deep: Helping launch the OnePlus 3T phone

By Jonathan Notaro

It’s always a big deal when a company drops a new smartphone. The years of planning and development culminate in a single moment, and the consumers are left to judge whether or not the new device is worthy of praise and — more importantly — worthy of purchase.

For bigger companies like Google and Apple, a misstep with a new phone release can often amount to nothing more than a hiccup in their operations. But for newer upstarts like OnePlus, it’s a make or break event. When we got the call at Brand New School to develop a launch spot for the company’s 3T smartphone, along with the agency Carrot Creative, we didn’t hesitate to dive in.

The Idea
OnePlus has built a solid foundation of loyal fans with their past releases, but with the 3T they saw the chance to build their fanbase out to more everyday consumers who may not be as tech-obsessed as their existing fans. It is an entirely new offering and, as creatives, the chance to present such a technologically advanced device to a new, wider audience was an opportunity we couldn’t pass up.

Carrot wanted to create something for OnePlus that gave viewers a unique sense of what the phone was capable of — to capture the energy, momentum and human element of the OnePlus 3T. The 3T is meant to be an extension of its owner, so this spot was designed to explore the parallels between man and machine. Doing this can run the risk of being cliché, so we opted for futuristic, abstract imagery that gets the point across effectively without being too heavy handed. We focused on representing the phone’s features that set it apart from other devices in this market, such as its powerful processor and its memory and storage capabilities.

How We Did It
Inspired by the brooding, alluring mood reflected in the design for the title sequence of The Girl With the Dragon Tattoo, we set out to meld lavish shots of the OnePlus 3T with robotically-infused human anatomy, drawing up initial designs in Autodesk Maya and Maxon Cinema 4D.

When the project moved into the animation phase, we stuck with Maya and used Nuke for compositing. Type designs were done in Adobe Illustrator and animated in Adobe After Effects.

Collaboration is always a concern when there are this many different scenes and moving parts, but this was a particular challenge. With a CG-heavy production like this, there’s no room for error, so we had to make sure that all of the different artists were on the same page every step along the way.

Our CG supervisor Russ Wootton and technical director Dan Bradham led the way and compiled a crack team to make this thing happen. I may be biased, but they continue to amaze me with what they can accomplish.

The Final Product
The project was a two-month production process. Along the way, we found that working with Carrot and the brand was a breath of fresh air, as they were very knowledgeable and amenable to what we had in mind. They afforded us the creative space to take a few risks and explore some more abstract, avant-garde imagery that I felt represented what they were looking to achieve with this project.

In the end, we created something that I hope cuts through the crowded landscape of product videos and appeals to both the brand’s diehard-tech-savvy following and consumers who may not be as deep into that world.

Fueled by the goal of conveying the underlying message of “raw power” while balancing the scales of artificial and human elements, we created something I believe is beautiful, compelling and completely unique. Ultimately though, the biggest highlight was seeing the positive reaction the piece received when it was released. Normally, reaction from consumers would be centered solely on the product, but to have the video receive praise from a very discerning audience was truly satisfying.


Jonathan Notaro is a director at Brand New School, a bicoastal studio that provides VFX, animation and branding. 


Nickelodeon gets new on-air brand refresh

The children’s network Nickelodeon has debuted an all-new brand refresh of its on-air and online look and feel. Created with animation, design, global branding and creative agency Superestudio, based in Buenos Aires, Argentina, Nick’s new look features an array of kids interacting with the real world and Nick’s characters in live-action and graphic environments.

The new look consists of almost 300 deliverables, including bumpers, IDs, promo toolkits and graphic developments that first rolled out across the network’s US linear platform, followed by online, social media and off-channel. Updated elements for the network’s international channels will follow.

“We really wanted to highlight how much surprise and fun are parts of kids’ lives, so we took as our inspiration the surreal nature of GIFs, memes and emoticons and created an entire new visual vocabulary,” says Michael Waldron, SVP, creative director art and design for Nickelodeon Group and Nick@Nite. “Using a mix of real kids and on-air talent, the refresh looks through the lens of how kids see things — the unpredictable, extraordinary and joyful nature of a child’s imagination. Superestudio was the right company for this refresh because they use a great mix of different techniques, and they brought a fresh viewpoint that had just the right amount of quirk and whimsy.”

Nickelodeon’s new look was created by combining real kids with 2D and 3D graphics, producing imaginative reinterpretations of Nickelodeon’s properties and characters as real-world playgrounds for kids to bring to life, rearrange and redesign. From turning SpongeBob’s face into a tongue-twisted fun zone to kids rearranging and rebuilding Lincoln Loud from The Loud House, everything — from the overhead and docu-style camera angles to the seamless blend of real-world and tactile elements — reinforces that sense of play.

Nickelodeon’s classic orange logo is now set against an updated color palette of bright tones, including purple, light blue, lime and cream.

According to Superestudio executive creative director Ezequiel Rormoser, “The software that we used is Adobe After Effects and Maxon Cinema 4D. I think the most interesting thing is how we mixed live action with graphics, not in terms of technical complexity, but in the way they interact in an unexpected way.”


Flavor Detroit welcomes VFX artist/designer Scott Stephens

Twenty-year industry veteran Scott Stephens has joined Flavor Detroit as senior VFX artist/designer. Most recently the lead designer at Postique, Stephens was previously a key part of the post boutique Section 8, which he co-founded and where he served as lead designer from its launch in 2001.

Known for his work with top brands and directors on major commercial campaigns for Blue Cross Blue Shield (BCBS), Chrysler, Expedia, Food Network, Mazda and Six Flags, to name but a few, Stephens also brings vast experience creating content that maximizes unique environments and screens of all sizes.

Recent projects include the Amazon Kindle release in Times Square, the Ford Focus theatrical release for the Electric Music Festival, BCBS media for the Pandora app, Buick’s multi-screen auto show installations and the Mount St. Helens installation for the National Park Service.


Lost in Time game show embraces ‘Interactive Mixed Reality’

By Daniel Restuccio

The Future Group — which has partnered with Fremantle Media, Ross Video and Epic Games — has created a new super-agile entertainment platform that blends linear television and game technology into a hybrid format called “Interactive Mixed Reality” (IMR).

The brainchild of Bård Anders Kasin, this innovative content-deployment medium generated a storm of industry buzz at NAB 2016, and their first production, Lost in Time — a weekly primetime game show — is scheduled to air this month on Norwegian television.

The Idea
The idea originated more than 13 years ago in Los Angeles. In 2003, at age 22, Kasin, a self-taught multimedia artist from Notodden, Norway, sent his CV and a bunch of media projects to Warner Bros. in Burbank, California, in hopes of working on The Matrix. They liked it. His interview was on a Wednesday and by Friday he had a job as a technical director.

Kasin immersed himself in the cutting-edge movie revolution that was The Matrix franchise. The Wachowskis’ visionary production was a masterful inspiration, featuring a compelling sci-fi action story, Oscar-winning editing, breakthrough visual effects (“bullet time”) and an expanded media universe that included video games and the anime-style short The Animatrix. The Matrix Reloaded and The Matrix Revolutions were shot at the same time, along with more than an hour of footage designed specifically for the video game. The Matrix Online, an Internet gaming platform, was a direct sequel to The Matrix Revolutions.

L-R: Bård Anders Kasin and Jens Petter Høili.

Fast forward to 2013: Kasin connected with software engineer and serial entrepreneur Jens Petter Høili, founder of EasyPark and Fairchance. “There was this producer I knew in Norway,” explains Kasin, “who runs this thing called the Artists’ Gala charity. He called and said, ‘There’s this guy you should meet. I think you’ll really hit it off.’” Kasin and Høili met for lunch and discussed the projects each was working on. “We both immediately felt there was a connection,” recalls Kasin. No persuading was necessary. “We thought that if we combined forces we were going to get something that’s truly amazing.”

That meeting of the minds led to the merging of their companies and the formation of The Future Group. The mandate of the Oslo-based company is to revolutionize the television medium by combining linear TV production with cutting-edge visual effects, interactive gameplay, home-viewer participation and e-commerce. Their IMR concept ditches the limiting, individual virtual reality (VR) headset but keeps the idea of creating content that is a multi-level, intricate and immersive experience.

Lost in Time
Fast forward again, this time to 2014. Through another mutual friend, The Future Group formed an alliance with Fremantle Media. Fremantle, a global media company, has produced some of the highest-rated and longest-running shows in the world, and is responsible for top international entertainment brands such as Got Talent, Idol and The X Factor.

Kasin started developing the first IMR prototype. At this point, the Lost in Time production had expanded to include Ross Video and Epic Games. Ross Video is a broadcast technology innovator and Epic Games is a video game producer and the inventor of the Unreal game engine. The Future Group, in collaboration with Ross Video, engineered the production technology and developed a broadcast-compatible version of the Unreal game engine called Frontier, shown at NAB 2016, to generate high-resolution, realtime graphics used in the production.

On January 15, 2015, the first prototype was shown. When Fremantle saw the prototype, they were amazed and went directly to stage two, moving to the larger stages at Dagslys Studios. “Lost in Time has been the driver for the technology,” explains Kasin. “We’re a very content-driven company. We’ve used that content to drive the development of the platform and the technology, because there’s nothing better than having actual content to set the requirements for the technology rather than building technology for general purposes.”

In Lost in Time, three studio contestants are set loose on a greenscreen stage and perform timed, physical game challenges. The audience, which could be watching at home or on a mobile device, sees the contestant seamlessly blended into a virtual environment built out of realtime computer graphics. The environments are themed as western, ice age, medieval times and Jurassic period sets (among others) with interactive real props.

The audience can watch the contestants play the game or participate in the contest as players on their mobile device at home, riding the train or literally anywhere. They can play along or against contestants, performing customized versions of the scripted challenges in the TV show. The mobile content uses graphics generated from the same Unreal engine that created the television version.

“It’s a platform,” reports partner Høili, referring to the technology behind Lost in Time. A business model is a way you make money, notes tech blogger Jonathan Clarks, and a platform is something that generates business models. So while Lost in Time is a specific game show with specific rules, built on television technology, it’s really a business technology framework where multiple kinds of interactive content could be generated. Lost in Time is like the Unreal engine itself, software that can be used to create games, VR experiences and more, limited only by the imagination of the content creator. What The Future Group has done is create a high-tech kitchen from which any kind of cuisine can be cooked up.

Soundstages and Gear
Lost in Time is produced on two greenscreen soundstages at Dagslys Studios in Oslo. The main “gameplay set” takes up all of Studio 1 (5,393 square feet) and the “base station set” is on Studio 3 (1,345 square feet). Over 150 liters (40 gallons) of ProCyc greenscreen paint was used to cover both studios.

Ross Video, in collaboration with The Future Group, devised an integrated technology of hardware and software that supports the Lost in Time production platform. This platform consists of custom cameras, lenses, tracking, control, delay, chroma key, rendering, greenscreen, lighting and switcher technology. This system includes the new Frontier hardware, introduced at NAB 2016, which runs the Unreal game engine 3D graphics software.

Eight Sony HDC-2500 cameras running HZC-UG444 software are used for the production. Five are deployed on the “gameplay set.” One camera rides on a technocrane, two are on manual pedestal dollies and one is on Steadicam. For fast-action tracking shots, another camera sits on the Furio RC dolly that rides on a straight track that runs the 90-foot length of the studio. The Furio RC pedestal, controlled by SmartShell, guarantees smooth movement in virtual environments and uses absolute encoders on all axes to send complete 3D tracking data into the Unreal engine.

There is also one Sony HDC-P1 camera that is used as a static, center stage, ceiling cam flying 30 feet above the gameplay set. There are three cameras in the home base set, two on Furio Robo dollies and one on a technocrane. In the gameplay set, all cameras (except the ceiling cam) are tracked with the SolidTrack IR markerless tracking system.

All filming is done at 1080p25 and output as RGB 444 via SDI. A custom LUT on the cameras avoids clipping and preserves an expanded dynamic range for post work. All nine camera ISOs — the separate camera “clean feeds” — are recorded with the “flat” LUT in RGB 444. For all other video streams, including keying and compositing, LUT boxes invert the signal back to Rec 709.
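As a rough illustration of what “inverting back to Rec 709” involves, the standard Rec. 709 transfer curve and its inverse can be written in a few lines. This is a simplified textbook model, not the production’s custom LUTs:

```python
def rec709_oetf(light: float) -> float:
    """Rec. 709 opto-electronic transfer function (linear scene light -> signal)."""
    if light < 0.018:
        return 4.5 * light
    return 1.099 * light ** 0.45 - 0.099

def rec709_inverse(signal: float) -> float:
    """Invert a Rec. 709 signal back to linear light, as a LUT box does conceptually."""
    if signal < 4.5 * 0.018:
        return signal / 4.5
    return ((signal + 0.099) / 1.099) ** (1 / 0.45)
```

A real LUT box applies the same idea as a pre-sampled lookup table per channel rather than evaluating the formula per pixel.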

Barnfind provided the fiber optic network infrastructure that links all the systems. Ross Video Dashboard controls the BarnOne frames as well as the router, Carbonite switchers, Frontier graphics system and robotic cameras.

A genlock signal distributed via OpenGear syncs all the gear to a master clock. The Future Group added proprietary code to Unreal so the render engine can genlock, receive and record linear timecode (LTC), and output video via SDI in all industry-standard formats. They also extended the engine to control lights via DMX, send and receive GPI signals, communicate with the custom sensors, buttons, switches and wheels used for interacting with the games, and control motion-simulation equipment.
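LTC is, at heart, a frame count dressed up as HH:MM:SS:FF, and at the show’s 25fps there is no drop-frame complication, so the conversion is plain integer arithmetic. A minimal sketch (illustrative only, not The Future Group’s code):

```python
FPS = 25  # Lost in Time records at 1080p25, so LTC runs at 25 frames per second

def frames_to_timecode(total_frames: int, fps: int = FPS) -> str:
    """Convert an absolute frame count to HH:MM:SS:FF timecode."""
    frames = total_frames % fps
    seconds = (total_frames // fps) % 60
    minutes = (total_frames // (fps * 60)) % 60
    hours = total_frames // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

def timecode_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert HH:MM:SS:FF timecode back to an absolute frame count."""
    h, m, s, f = (int(part) for part in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f
```

At 29.97fps NTSC rates the same task needs drop-frame correction, which is one reason a fixed 25fps pipeline keeps sync logic simple.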

In order for the “virtual cameras” in the graphics systems and the real cameras viewing the real elements to share the exact same perspective, an “encoded” camera lens is required that reports focal length (zoom) and focus data. In addition, the virtual lens field of view (FOV) must be properly calibrated to match the FOV of the real lens. Full-servo digital lenses with 16-bit encoders are needed for virtual productions. Lost in Time uses three Canon lenses with these specifications: Canon Hj14ex4.3B-IASE, Canon Hj22ex7.6B-IASE-A and Canon Kj17ex7.7B-IASE-A.
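The FOV matching described here follows from the basic pinhole-lens relation: FOV = 2·atan(sensor width / (2·focal length)). A small sketch, where the sensor width is an assumed value (roughly that of a 2/3-inch broadcast sensor), shows why the virtual camera must track the encoded zoom continuously:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view in degrees from the pinhole camera model."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

SENSOR_WIDTH_MM = 9.6  # assumed width of a 2/3-inch broadcast sensor

# At the wide end of a 4.3mm lens the FOV is huge; zoomed in it collapses,
# so the virtual camera's FOV must be updated from lens encoder data per frame.
wide = horizontal_fov_deg(4.3, SENSOR_WIDTH_MM)
tele = horizontal_fov_deg(60.2, SENSOR_WIDTH_MM)  # 4.3mm at a 14x zoom ratio
```

Real calibration also has to account for lens distortion and the shift of the entrance pupil through the zoom range, which is why encoded lenses are mapped against test charts rather than computed purely from this formula.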

The Lost in Time camera feeds are routed to the Carbonite family hardware: Ultrachrome HR, Carbonite production frame and Carbonite production switcher. Carbonite Ultrachrome HR is a stand-alone multichannel chroma key processor based on the Carbonite Black processing engine. On Lost in Time, the Ultrachrome switcher accepts the Sony camera RGB 444 signal and uses high-resolution chroma keyers, each with full control of delay management, fill color temperature for scene matching, foreground key and fill, and internal storage for animated graphics.

Isolated feeds of all nine cameras are recorded, plus two quad-splits with the composited material and the program feed. Metus Ingest, a proprietary hardware solution from The Future Group, was used for all video recording. Metus Ingest can simultaneously capture and record up to six HD channels of video and audio from multiple devices on a single platform.

Post Production
While the system is capable of broadcasting live, they decided not to go live for the debut, instead doing only a modest amount of post to retain the live feel. That said, the potential of the post workflow on Lost in Time arguably sets a whole new post paradigm. “Post allows us to continue to develop the virtual worlds for a longer amount of time,” says Kasin. “This gives us more flexibility in terms of storytelling. We’re always trying to push the boundaries with the creative content — how we tell the story of the different challenges.”

All camera metadata — position, rotation, lens data and so on — and all game interaction were recorded in the Unreal engine with a proprietary system, allowing the graphics to be played back later as a recorded session. This also let the editors change any part of the graphics non-destructively: they could replace 3D models or textures, change the tracking or point of view of any of the virtual cameras in post, or add cameras for more virtual “coverage.”
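As an illustration of how such non-destructive replay might be structured — all names here are hypothetical, since the actual recording system is proprietary — the key idea is to keep the recorded per-frame samples immutable and derive modified copies for post:

```python
from dataclasses import dataclass, field, replace

@dataclass(frozen=True)
class CameraSample:
    """One frame of camera metadata (hypothetical illustration)."""
    frame: int
    position: tuple       # world-space XYZ from the tracking system
    rotation: tuple       # pan, tilt, roll
    focal_length: float   # reported by the encoded lens
    focus: float

@dataclass
class CameraTake:
    """Per-frame camera metadata recorded during the live session."""
    samples: dict = field(default_factory=dict)

    def record(self, sample: CameraSample) -> None:
        self.samples[sample.frame] = sample

    def replay(self, frame: int) -> CameraSample:
        # Play the graphics back exactly as shot.
        return self.samples[frame]

    def retarget(self, frame: int, **overrides) -> CameraSample:
        # Build a modified copy for post (e.g. a new virtual point of view),
        # leaving the original recording untouched.
        return replace(self.samples[frame], **overrides)
```

Because the recording is the source of truth and every change is a derived copy, editors can swap models, re-point cameras or add "coverage" without ever losing the as-broadcast version.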

Lost in Time episodes were edited as a multicam project, based on the program feed, in Adobe Premiere CC, with a multi-terabyte storage solution from Pixit Media running Tiger Technology’s workflow manager. The EDL from the final edit is fed through a custom system, which then builds a timeline in Unreal to output EXR sequences for a final composite.
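The EDL handoff can be pictured with a minimal parser for CMX3600-style event lines (an illustrative sketch with made-up reel names; the custom system that drives Unreal is proprietary):

```python
import re

# One CMX3600-style event line: event number, reel, track, transition, then
# source-in, source-out, record-in and record-out timecodes.
EVENT = re.compile(
    r"^(\d+)\s+(\S+)\s+(\S+)\s+(\S+)\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})"
)

def parse_edl(text: str):
    """Return (reel, source_in, source_out) for each cut in an EDL.

    A downstream system could map each reel to a recorded camera session
    and render only the frames between source-in and source-out.
    """
    events = []
    for line in text.splitlines():
        match = EVENT.match(line.strip())
        if match:
            events.append((match.group(2), match.group(5), match.group(6)))
    return events
```

Driving the render from the cut list this way means only the frames that actually appear in the edit need to be output as EXR sequences for compositing.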

That’s it for now, but be sure to visit this space again to see part two of our coverage on The Future Group’s Lost in Time. Our next story will include the real and virtual lighting systems, the SolidTrack IR tracking system, the backend component, and interviews with Epic Games’ Kim Libreri about Unreal engine development/integration and with a Lost in Time episode editor.


Daniel Restuccio, who traveled to Oslo for this piece, is a writer, producer and teacher. He is currently multimedia department chairperson at California Lutheran University in Thousand Oaks.


Alkemy X adds creative director Geoff Bailey

Alkemy X, which offers live-action production, design, high-end VFX and post services, has added creative director Geoff Bailey to its New York office, which has now almost doubled in staff. The expansion comes after Alkemy X served as the exclusive visual effects company on M. Night Shyamalan’s Split.

Alkemy X and Bailey started collaborating in 2016 when the two worked together on a 360 experiential film project for EY (formerly Ernst & Young) and brand consultancy BrandPie. Bailey was creative director on the project, which was commissioned for EY’s Strategic Growth Forum held in Palm Desert, California, last November. The project featured Alkemy X’s live-action, VFX, animation, design and editorial work.

“I enjoy creating at the convergence of many disciplines and look forward to leveraging my branding knowledge to support Alkemy X’s hybrid creation pipeline — from ideation and strategy, to live-action production, design and VFX,” says Bailey.

Most recently, Bailey was a creative director at Loyalkaspar, where he creatively led the launch campaign for A&E’s Bates Motel. He also served as creative director/designer on the title sequence for the American launch of A&E’s The Returned, and as CD/director on a series of launch spots for the debut of Vice Media’s TV channel Viceland.

Prior to that, Bailey freelanced for several New York design firms as a director, designer and animator. His freelance résumé includes work for HBO, Showtime, Hulu, ABC, Cinemax, HP, Jay-Z, U2, Travel Channel, Comedy Central, CourtTV, Fuse, AMC Networks, Kiehl’s and many more. Bailey holds an MFA in film production from Columbia University.


Nutmeg adds Broadway Video’s former design group

New York City-based Nutmeg, a creative marketing and post production house, has acquired Broadway Video’s design team formerly known as FAC5. Under the Nutmeg brand, they are now known as NTMG Design.

The team of four — executive creative producer Doug LeBow, executive creative director Fred Salkind, creative director David Rogers and art director Karolina Dawson — is an Emmy, Telly and PromaxBDA award-winning creative collective working on design across multiple media platforms. Existing clients that could benefit from the new services include broadcast networks, cable channels and brands.

With services that include main titles and show packaging, experiential and event design, promotions and image campaigns, the group has worked with a variety of clients on a wide range of projects, including Nickelodeon HALO Awards; Nickelodeon Kids’ Choice Awards; The Emmys for Don Mischer Productions; Indy 500 100th Anniversary for ESPN; HBO’s Rock and Roll Hall of Fame Induction Ceremony for Line-by-Line Productions; Thursday Night Football and Sunday Night Football tune-in promo packaging for CBS Sports; AT&T Concert Series for iHeart Media; The Great Human Race for National Geographic Channel; The Peabody Awards for Den of Thieves and others.

“Nutmeg has always embraced growth,” says Nutmeg executive producer Laura Vick. “As our clients and the marketplace shift to engage end users, the addition of a full-service design team allows us to offer all aspects of content creation under one roof. We can now assist at the inception of an idea to help create complete visual experiences — show opens, trade shows, corporate interiors or digital billboards.”

“We look at these new design capabilities as both a new frontier unto itself, and as yet another component of what we’re already doing — telling compelling stories,” says Nutmeg executive creative director Dave Rogan. “Nothing at Nutmeg is created in a vacuum, so having these new areas of design cross over into an interactive web environment, for example, is natural.”

The new NTMG Design team will be working within Nutmeg’s midtown location. Their suite contains five workstations supported by a 10-box renderfarm, Maxon Cinema 4D, Adobe After Effects, one seat of Flame, Assimilate Scratch access for color and an insert stage for practical shooting. It is further supported by 28TB of Infortrend storage.

While acknowledging tools are important, executive creative director Fred Salkind says, “Sometimes when I’m asked what we work with, I say Scotch tape and scissors, because it’s the idea that puts the tools to work, not the other way around.”

Main Photo by Eljay Aguillo. L-R: Fred Salkind, David Rogers, Doug LeBow and Karolina Dawson.


Review: Maxon Cinema 4D Studio Release 18

By Brady Betzel

Each year I get to test out some of the latest and greatest software and hardware releases our industry has to offer. One of my favorites — and most challenging — is Maxon’s Cinema 4D. I say challenging because while I love Cinema 4D, I don’t use it every day, so in order to test it thoroughly I watched tutorials on Cineversity to brush up on what I’d forgotten and what’s new.

I’ve reviewed Cinema 4D Release 15 through R18. I started using the product when I was studying at California Lutheran University in Thousand Oaks, California, which coincidentally is about 10 minutes from Maxon’s Newbury Park office.

Voronoi Fracture

Each version update has been packed full of remarkable additions and updates. From the grass generator in R15 and the Reflectance channel in R16 to the lens distortion tools in R17 and the multitude of updates in R18 — Cinema 4D keeps on cranking out the hits. I say multitude because there are a ton of updates packed into the latest Cinema 4D Release 18 update. You can check out a complete list of them, as well as comparisons between the Cinema 4D Studio, Visualize, Broadcast, Prime, BodyPaint 3D and Lite Release 18 versions, on the Maxon site.

For this review, I’m going to touch on three of what I think are the most compelling updates in Release 18: the new Voronoi Fracture, Thin Film Shader and the Push Apart Effector. Yes, I know there are a bazillion other great updates to Cinema 4D R18 — such as Weight Painting, new Knife Tools, Inverse Ambient Occlusion, the ability to save cache files externally and many more — but I’m going to stick to the features that I think stand out.

Keep in mind that I am using Cinema 4D Studio R18 for this review, so if you don’t have Studio, some of the features might not be available in your version. For instance, I am going to touch on some of the MoGraph toolset updates, and those are only inside the Studio and Broadcast versions. Finally, while you should use a powerful workstation to get the smoothest and most robust experience when using Cinema 4D R18, I am using a tablet with a quad-core Intel i7 3.1GHz processor, 8GB of RAM and an Intel Iris 6100 GPU. That is definitely on the lower end of processing power for this app, but it works, and I have to credit Maxon for making it work so well.

Voronoi Fracture
If, like me, you’ve never heard of the term Voronoi, check out the first paragraph of its Wikipedia page. A very simple way to imagine a Voronoi diagram is a bunch of cell-like polygons that are all connected (there’s a much more intricate and deeply mathematical definition, but I can barely understand it, and it’s really beyond the scope of this review). In Cinema 4D Studio R18, the Voronoi Fracture object allows us to easily, and I mean really easily, procedurally break apart objects like MoGraph text, or any other object, without the need for external third-party plug-ins such as Nitro4D’s Thrausi.
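
The idea is easy to see in a few lines of plain Python. This is purely a conceptual sketch of a 2D Voronoi partition — the function name is mine, and Cinema 4D of course fractures 3D geometry, not grids:

```python
def voronoi_cells(seeds, width, height):
    """Label each point of a width x height grid with the index of its
    nearest seed. The resulting regions are the cell-like polygons a
    Voronoi diagram describes; a fracture tool splits geometry along
    the boundaries between such cells (in 3D rather than 2D)."""
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            # Compare squared distances; the closest seed owns this cell
            nearest = min(
                range(len(seeds)),
                key=lambda i: (x - seeds[i][0]) ** 2 + (y - seeds[i][1]) ** 2,
            )
            row.append(nearest)
        grid.append(row)
    return grid
```

Scattering more seed points produces more, smaller fragments, which is essentially what the point-amount parameter under the Sources menu controls.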

To apply Voronoi Fracture in as few steps as possible: apply the Voronoi Fracture object (located in the MoGraph menu) to your object, adjust parameters under the Sources menu (like distribution type or point amount), add effectors to cause dispersion, keyframe your values and render. With a little practice you can explode your raytraced MoGraph text in no time. The best part is that your object will not look fractured until it is animated, something that used to take real work to achieve, so this is a great update.

Thin Film Shader
Transparent objects, such as glass bottles, windows and bubbles, are hard to recreate in a photorealistic way. In Cinema 4D R18, Maxon has added the new Thin Film Shader, which can add the film-like quality that you see on bubbles or soap. It’s an incredible addition to Cinema 4D, furthering the idea that Maxon is concentrating on adding features that improve efficiency for people like me who want to use Cinema 4D, but sometimes don’t because making a material like Thin Film would take a long time.

To apply the Thin Film to your object: in the Reflectance channel of the material you want to add the Thin Film property to, add a new Beckmann or GGX layer, lower the Specular Strength of this layer to zero, and under Layer Color choose Texture > Effects > Thin Film. From there, if you want to see the Thin Film as a true layer of film, change the layer’s composite setting to Add; you should then see it properly. You can get some advanced tips from the great tutorials over at Cineversity and from Andy Needham (Twitter: @imcalledandy) on lynda.com. One tip I learned from Andy is to change the Index of Refraction, found under the Shader properties, to get some different looks.
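
The look itself comes from thin-film interference, which is worth a quick sketch because it explains the Index of Refraction tip. For a soap-like film in air at normal incidence, light reflecting off the top surface picks up a half-wave phase shift, so the wavelengths that reflect most strongly satisfy 2nd = (m + 1/2)λ. A plain-Python illustration of that physics (the function name is mine, not part of any Maxon API):

```python
def constructive_wavelengths_nm(thickness_nm, ior, lo=380.0, hi=780.0):
    """Visible wavelengths most strongly reflected by a soap-like thin
    film in air at normal incidence. With the half-wave phase shift at
    the top surface, constructive interference requires
    2 * n * d = (m + 1/2) * wavelength."""
    path = 2.0 * ior * thickness_nm  # optical path difference, in nm
    waves = []
    m = 0
    while True:
        lam = path / (m + 0.5)
        if lam < lo:          # below the visible range; nothing longer left
            break
        if lam <= hi:
            waves.append(lam)
        m += 1
    return waves
```

For a 300nm film at n = 1.33 the boosted wavelength lands near 532nm (green); raise the index of refraction to 1.5 and it shifts to 600nm (orange) — which is why tweaking the IOR changes the look so noticeably.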

Push Apart Effector
The new Push Apart Effector is a simple but super-powerful addition to Cinema 4D. The easiest way to describe the Push Apart Effector is to imagine a bunch of objects in an array, or created with a Cloner, where all of your objects are touching; the Push Apart Effector helps push them away from each other. To decrease the intersection of your clones, you can dial in the specific radius of your objects (like a sphere) and then tell Cinema 4D R18 how many times you want it to look through the scene by specifying iterations. The more iterations, the less chance your objects will intersect, but the more time it will take to compute.
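
That iterate-until-separated behavior is easy to picture in plain Python. The sketch below is purely conceptual — my own toy relaxation loop, not Maxon's implementation:

```python
import math

def push_apart(points, radius, iterations):
    """Toy push-apart pass: any two equal-radius circles closer than
    2 * radius are nudged apart along the line between their centers.
    More iterations leave fewer remaining intersections but take
    longer to compute -- exactly the trade-off described above."""
    pts = [list(p) for p in points]
    target = 2.0 * radius
    for _ in range(iterations):
        for i in range(len(pts)):
            for j in range(i + 1, len(pts)):
                dx = pts[j][0] - pts[i][0]
                dy = pts[j][1] - pts[i][1]
                dist = math.hypot(dx, dy)
                if 0.0 < dist < target:
                    push = (target - dist) / 2.0  # each circle moves half the overlap
                    nx, ny = dx / dist, dy / dist
                    pts[i][0] -= nx * push
                    pts[i][1] -= ny * push
                    pts[j][0] += nx * push
                    pts[j][1] += ny * push
    return [tuple(p) for p in pts]
```

A few passes are enough to relax three overlapping unit circles to (near) non-intersection; note that exactly coincident points are skipped here, a tie a production effector would have to break some other way.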

Summing Up
I love Maxon’s continual development of Cinema 4D in Release 18. I specifically love that while they are adding new features, like Weight Painting and the updated Knife Tools, they are also improving efficiency for people like me who love to work in Cinema 4D but sometimes skip it because of the steep learning curve and the technical know-how needed to operate it. You should not fear, though: I cannot emphasize enough how much you can learn at Cineversity, Lynda.com and on YouTube from an expert like Sean Frangella. Whether you are new to the world of Cinema 4D, mildly experienced like me, or an expert, you can always learn something new.

Something I love about Maxon’s licensing for education is that if you go to a qualified school, you can get a free Cinema 4D license. Instructors can get access to Cineversity to use the tutorials in their curriculum as well as project files to use. It’s an amazing resource.

Thin Film Render

If you are an Adobe After Effects user, don’t forget that you automatically get a free version of Cinema 4D bundled with After Effects — Cinema 4D Lite. Even though you have to have After Effects open to use Cinema 4D Lite, it is still a great way to dip your toes into the 3D world, and maybe even bring your projects back into After Effects for some compositing.

Cinema 4D R18’s pricing breaks down like this, listed as full commercial license/annual license/upgrade from R17:

Cinema 4D Studio Release 18: $3,695/$650/$995
Cinema 4D Visualize Release 18: $2,295/$500/$795
Cinema 4D Broadcast Release 18: $1,695/$400/$795
Cinema 4D Prime Release 18: $995/$250/$395

Another interesting option is Maxon’s short-term licensing in three- or six-month chunks for the Studio version ($600/$1,100), and 75 percent of the fees you pay for a short-term license can be applied to your purchase of a full license later. Keep in mind, when using software as powerful and robust as Cinema 4D, you are making an investment that will pay off with concentrated effort in learning the software. With a few hours of training from some of the top trainers — like Tim Clapham on www.helloluxx.com, Greyscalegorilla.com and Motionworks.com — you will be off and running in 3D land.
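
As a quick worked example of that 75 percent credit (the helper function is mine, just illustrating the arithmetic):

```python
def upgrade_cost(short_term_fee, full_license_price, credit_rate=0.75):
    """Net cost of a full license after crediting 75 percent of the
    short-term license fees already paid, per the terms above."""
    credit = credit_rate * short_term_fee
    return full_license_price - credit

# A three-month Studio license ($600) credited against the full
# Studio price ($3,695) leaves $3,245 to pay:
# upgrade_cost(600, 3695) -> 3245.0
```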

For everyday Cinema 4D creations and inspiration, check out @beeple_crap on Instagram. He produces amazing work all the time.

In this review, I tested some of the new updates to Cinema 4D Studio R18 with sample projects from Andy Needham’s Lynda.com class Cinema 4D R18: New Features and Joren Kandel’s awesome website, which offers tons of free content to play with while learning the new tools.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Ingenuity Studios helps VFX-heavy spot get NASCAR-ready

Hollywood-based VFX house Ingenuity Studios recently worked on a 60-second Super Bowl spot for agency Pereira & O’Dell promoting Fox Sports’ coverage of the Daytona 500, which takes place on February 26. The ad, directed by Joseph Kahn, features people from all over the country gearing up to watch the Daytona 500, including footage from NASCAR races, drivers and, for some reason, actor James Van Der Beek.

The Ingenuity team had only two weeks to turn around this VFX-heavy spot, called Daytona Day. Some CG elements include a giant robot, race cars and crowds. While they were working on the effects, Fox was shooting footage in Charlotte, North Carolina and Los Angeles.

“When we were initially approached about this project, we knew the turnaround would be a challenge,” explains creative director/VFX supervisor Grant Miller. “Editorial wasn’t fully locked until the Thursday before the big game! With such a tight deadline, preparing as much as we could in advance was key.”

Portions of the shoot took place at the Daytona Speedway, and since it was an off day the stadium and infield were empty. “In preparation, our CG team built the entire Daytona stadium while we were still shooting, complete with cheering CG crowds, RVs filling the interior, pit crews, etc.,” says Miller. “This meant that once shots were locked we simply needed to track the camera, adjust the lighting and render all the stadium passes for each shot.”

Additional shooting took place at the Charlotte Motor Speedway and in downtown Los Angeles and Pasadena, California.

In addition to prepping CG for set extensions, Ingenuity also got a head start on the giant robot that shows up halfway through the commercial.  “Once the storyboards were approved and we were clear on the level of detail required, we took our ‘concept bot’ out of ZBrush, retopologized and unwrapped it, then proceeded to do surfacing and materials in Substance Painter. While we had some additional detailing to do, we were able to get the textures 80 percent completed by applying a variety of procedural materials to the mesh, saving a ton of manual painting.”

Other effects work included over 40 CG NASCAR vehicles to fill the track, additional cars for the traffic jam and lots of greenscreen and roto work to get the scenes shot in Charlotte into Daytona. There was also a fair bit of invisible work that included cleaning up sets, removing rain, painting out logos, etc.

Other tools used include Autodesk’s Maya, The Foundry’s Nuke and BorisFX’s Mocha.