
VFX company Kevin launches in LA

VFX vets Tim Davies, Sue Troyan and Darcy Parsons have partnered to open the Los Angeles-based VFX house, Kevin. The company is currently up and running in a temp studio in Venice, while construction is underway on Kevin’s permanent Culver City location, scheduled to open early next year.

When asked about the name, as none of the partners are actually named Kevin, Davies said, “Well, Kevin is always there for you! He’s your best mate and will always have your back. He’s the kind of guy you want to have a beer with whenever he’s in town. Kevin knows his stuff and works his ass off to make sure you get what you need and then some!” Troyan added, “Kevin is a state of mind.”

Davies is on board as executive creative director, overseeing the collective creative output of the company. Having led teams of artists for over 25 years, he was formerly at Asylum Visual Effects and The Mill as creative director and head of 2D. Among his works are multiple Cannes Gold Lion-winning commercials, including HBO’s “Voyeur” campaign for Jake Scott, Nike Golf’s Ripple for Steve Rogers, Old Spice’s Momsong for Steve Ayson, Old Spice’s Dadsong for Andreas Nilsson, and Old Spice’s Whale and Rocket Car for Steve Rogers.

Troyan will serve as senior executive producer of Kevin, having previously worked on campaigns at The Mill and Method. Parsons, an owner and partner of Kevin, has had a career spanning multiple roles, including producer, VFX producer and executive producer.

Launch projects for Kevin include recent spots for Wieden + Kennedy Portland, The Martin Agency and Spark44.

Main Image: L-R: Darcy Parsons, Sue Troyan, Tim Davies

Xytech launches MediaPulse Managed Cloud at IBC

Facility management software provider Xytech has introduced a cloud and managed services offering, MediaPulse Managed Cloud. Hosted in Microsoft Azure, MediaPulse Managed Cloud is a secure platform offering full system management.

MediaPulse Managed Cloud is available through any web browser and compatible with iOS, Android and Windows mobile devices. The new managed services handle most administrative functions, including daily backups, user permissions and screen layouts. The offering is available with several options, including a variety of language packs, allowing for customization and localization.

Slated for shipping in October, MediaPulse Managed Cloud is compliant with European privacy laws and enables secure data transmission across multiple geographies.

Xytech debuted MediaPulse Managed Cloud at IBC2017. The show was the company’s first as a member of the Advanced Media Workflow Association, a community-driven forum focused on advancing business-driven solutions for networked media workflows.

Axle Video rebrands as Axle AI

Media management company Axle Video has rebranded as Axle AI. The company has also launched its new Axle AI software, allowing users to automatically index and search large amounts of video, image and audio content.

Axle AI is available either as software, which runs on standard Mac hardware, or as a self-contained software/hardware appliance. Both options provide integrations with leading cloud AI engines. The appliance also includes embedded processing power that supports direct visual search for thousands of hours of footage with no cloud connectivity required. Axle AI has an open architecture, so new third-party capabilities can be added at any time.

Axle has also launched Axle Media Cloud with Wasabi, a 100% cloud-based option for simple media management. The offering is available now and is priced at $400 per month for 10 terabytes of managed storage, 10 user accounts and up to 10 terabytes of downloaded media per month.

In addition, Axle Embedded is a new version of Axle software that can run directly on storage solutions from a range of industry partners, including G-Technology and Panasas. As with Axle Media Cloud, all of Axle AI’s automated tagging and search capabilities are simple add-ons to the system.

Blackmagic’s new Ultimatte 12 keyer with one-touch keying

Building on the 40-year heritage of its Ultimatte keyer, Blackmagic Design has introduced the Ultimatte 12, a realtime hardware compositing processor for broadcast-quality keying, adding augmented reality elements into shots, working with virtual sets and more. The Ultimatte 12 features new algorithms and color science, enhanced edge handling, greater color separation and color fidelity, and better spill suppression.

The 12G-SDI design gives Ultimatte 12 users the flexibility to work in HD and switch to Ultra HD when they are ready. Sub-pixel processing is said to boost image quality and textures in both HD and Ultra HD. The Ultimatte 12 is also compatible with most SD, HD and Ultra HD equipment, so it can be used with existing cameras.

With Ultimatte 12, users can create lifelike composites and place talent into any scene, working with both fixed cameras and static backgrounds or automated virtual set systems. It also enables on-set previs in television and film production, letting actors and directors see the virtual sets they’re interacting with while shooting against a green screen.

Here are a few more Ultimatte 12 features:

  • For augmented reality, on-air talent typically interacts with glass-like computer-generated charts, graphs, displays and other objects with colored translucency. Adding tinted, translucent objects is very difficult with a traditional keyer, and the results don’t look realistic. Ultimatte 12 addresses this with a new “realistic” layer compositing mode that can add tinted objects on top of the foreground image and key them correctly. (A bare-bones sketch of what a keyer does appears after this list.)
  • One-touch keying technology analyzes a scene and automatically sets more than 100 parameters, simplifying keying as long as the scene is well-lit and the cameras are properly white-balanced. With one-touch keying, operators can pull a key accurately and with minimum effort, freeing them to focus on the program with fewer distractions.
  • Ultimatte 12’s new image processing algorithms, large internal color space and automatic internal matte generation let users work on different parts of the image separately with a single keyer.
  • For color handling, Ultimatte 12 has new flare, edge and transition processing to remove backgrounds without affecting other colors. The improved flare algorithms can remove green tinting and spill from any object — even dark shadow areas or through transparent objects.
  • Ultimatte 12 is controlled via Ultimatte Smart Remote 4, a touch-screen remote device that connects via Ethernet. Up to eight Ultimatte 12 units can be daisy-chained together and connected to the same Smart Remote, with physical buttons for switching and controlling any attached Ultimatte 12.
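
For readers curious about what a keyer actually does, here is a bare-bones green-screen composite in Python/NumPy. To be clear, this is not Ultimatte’s algorithm, and the function name and threshold are purely illustrative; it only shows the basic steps a hardware keyer automates and refines: derive a matte, suppress green spill and blend the foreground over the background.

    import numpy as np

    def naive_green_key(fg, bg, threshold=0.15):
        """fg, bg: float RGB images in [0, 1] with identical shapes."""
        r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
        # Matte: the greener a pixel is than its other channels, the more background it is.
        greenness = g - np.maximum(r, b)
        alpha = np.clip(1.0 - greenness / threshold, 0.0, 1.0)
        # Crude spill suppression: clamp green to the brighter of red and blue.
        despilled = fg.copy()
        despilled[..., 1] = np.minimum(g, np.maximum(r, b))
        # Standard "over" composite of the keyed foreground onto the background.
        a = alpha[..., None]
        return despilled * a + bg * (1.0 - a)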

Ultimatte 12 is now available from Blackmagic Design resellers.

postPerspective Impact Award winners from SIGGRAPH 2017

Last April, postPerspective announced the debut of our Impact Awards, celebrating innovative products and technologies for the post production and production industries that will influence the way people work. We are now happy to present our second set of Impact Awards, celebrating the outstanding offerings presented at SIGGRAPH 2017.

Now that the show is over, and our panel of VFX/VR/post pro judges has had time to decompress, dig out and think about what impressed them, we are happy to announce our honorees.

And the winners of the postPerspective Impact Award from SIGGRAPH 2017 are:

  • Faceware Technologies for Faceware Live 2.5
  • Maxon for Cinema 4D R19
  • Nvidia for OptiX 5.0  

“All three of these technologies are very worthy recipients of our first postPerspective Impact Awards from SIGGRAPH,” said Randi Altman, postPerspective’s founder and editor-in-chief. “These awards celebrate companies that define the leading-edge of technology while producing tools that actually make users’ working lives easier and projects better, and our winners certainly fall into that category.

“While SIGGRAPH’s focus is on VFX, animation, VR/AR and the like, the types of gear they have on display vary. Some are suited for graphics and animation, while others have uses that slide into post production. We’ve tapped real-world users in these areas to vote for our Impact Awards, and they have determined what tools might be most impactful to their day-to-day work. That’s what makes our awards so special.”

There were many new technologies and products at SIGGRAPH this year, and while only three won an Impact Award, our judges felt there were other updates worth calling out as well.

Blackmagic Design’s Fusion 9 was certainly turning heads, and Nvidia’s VRWorks 360 Video was called out as well. Chaos Group also caught our judges’ attention with V-Ray for Unreal Engine 4.

Stay tuned for future Impact Award winners in the coming months — voted on by users for users — from IBC.

War for the Planet of the Apes

Editor William Hoy — working on VFX-intensive War for the Planet of the Apes

By Mel Lambert

For William Hoy, ACE, story and character come first. He also likes to use visual effects “to help achieve that idea.” This veteran film editor points to director Alex Proyas’ VFX-heavy I, Robot, director Matt Reeves’ 2014 Dawn of the Planet of the Apes and Reeves’ new installment, War for the Planet of the Apes, as “excellent examples of this tenet.”

War for the Planet of the Apes, the final part of the current reboot trilogy, follows a band of apes and their leader as they are forced into a deadly conflict with a rogue paramilitary faction known as Alpha-Omega. After the apes suffer unimaginable losses, their leader begins a quest to avenge his kind, setting up an epic battle that will determine the fate of both species and the future of the planet.

The film marks the picture editor’s second collaboration with Reeves; Hoy initially secured an interview with the director through industry associates. “Matt and I hit it off immediately. We liked each other,” Hoy recalls. “Dawn of the Planet of the Apes had a very short schedule for such a complicated film, and Matt had his own ideas about the script — particularly how the narrative ended. He was adamant that he ‘start over’ when he joined the film project.

“The previous Dawn script, for example, had [the lead ape character] Caesar and his followers gaining intelligence and driving motorized vehicles,” Hoy says. “Matt wanted the action to be incremental, which, it turned out, was okay with the studio. But a rewritten script meant that we had a very tight shoot and post schedule. Swapping its release date with X-Men: Days of Future Past gave us an additional four or five weeks, which was a huge advantage.”

William Hoy, ACE (left), Matt Reeves (right).

Such a close working relationship on Dawn of the Planet of the Apes meant that Hoy came to the third installment in the current trilogy with a good understanding of the way that Reeves likes to work. “He has his own way of editing from the dailies, so I can see what we will need on rough cut as the filmed drama is unfolding. We keep different versions in Avid Media Composer, with trusted performances and characters, and can see where they are going” with the narrative. Having worked with Reeves over the past two decades, Stan Salfas, ACE, served as co-editor on the project, joining prior to the Director’s Cut.

A member of the Academy of Motion Picture Arts and Sciences, Hoy also worked with director Randall Wallace on We Were Soldiers and The Man in the Iron Mask, with director Phillip Noyce on The Bone Collector and with director Zack Snyder on Watchmen, a film “filled with emotional complexity and heavy with visual effects,” he says.

An Evolutionary Editing Process
“Working scene-by-scene with motion capture images and background artwork laid onto the Avid timeline, I can show Matt my point of view,” explains Hoy. “We fill in as we go — it’s an evolutionary process. I will add music and some sound effects for that first cut so we can view it objectively. We ask, ‘Is it working?’ We swap around ideas and refine the look. This is a film that we could definitely not have cut on film; there are simply too many layers as the characters move through these varied backgrounds. And with the various actors in motion capture suits giving us dramatic performances, with full face movements [CGI-developed facial animation], I can see how they are interacting.”

To oversee the dailies on location, Hoy set up a Media Composer editing system in Vancouver, close to the film locations used for principal photography. “War for the Planet of the Apes was shot on Arri Alexa 65 digital cameras that deliver 6K images,” the editor recalls. “These files were down-sampled to 4K and delivered to Weta Digital [in New Zealand] as source material, where they were further down-sampled to 2K for CGI work and then up-sampled back to 4K for the final release. I also converted our camera masters to 2K DNxHD 32/36 for editing color-timed dailies within my Avid workstation.”
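
Hoy doesn’t detail the tooling behind those offline files, but as a generic sketch of the idea, here is one way a camera master could be transcoded to DNxHD 36 for offline editing, using Python to drive ffmpeg. This assumes ffmpeg is installed, the file names are placeholders, and it is not the production’s actual dailies pipeline; DNxHD 36 is the roughly 36Mb/s, 1920x1080 offline flavor of the codec.

    import subprocess

    def make_offline_clip(src, dst):
        """Transcode a camera master to a DNxHD 36 offline editing file."""
        subprocess.run([
            "ffmpeg", "-i", src,
            "-c:v", "dnxhd", "-b:v", "36M",   # DNxHD 36 bitrate
            "-s", "1920x1080", "-r", "24",    # 1080p24 offline frame size and rate
            "-pix_fmt", "yuv422p",            # 8-bit 4:2:2, as DNxHD 36 expects
            "-c:a", "pcm_s16le",              # uncompressed audio for easy syncing
            dst,
        ], check=True)

    make_offline_clip("A001_C001_master.mov", "A001_C001_dnxhd36.mov")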

In terms of overall philosophy, “we did not want to give away Caesar’s eventual demise. From the script, I determined that the key arc was the unfolding mystery of ‘What is going on?’ And ‘Where will it take us?’ We hid that Caesar [played by Andy Serkis] is shot with an arrow, and initially just showed the blood on the hand of the orangutan, Maurice [Karin Konoval]; we had to decide how to hide that until the key moment.”

Because of the large number of effect-heavy films that Hoy has worked on, he is considered an action/visual effects editor. “But I am drawn to performances of actors and their characters,” he stresses. “If I’m not invested in their fate, I cannot be involved in the action. I like to bring an emotional value to the characters, and visualize battle scenes. In that respect Matt and I are very much in tune. He doesn’t hide his emotion as we work out a lot of the moves in the editing room.”

For example, in Dawn of the Planet of the Apes, Koba, a human-hating bonobo who led a failed coup against Caesar, is leading apes against the human population. “It was unsatisfying that the apes would be killing humans while the humans were killing apes. Instead, I concentrated on the POV of Caesar’s oldest son, Blue Eyes. We see the events through his eyes, which changed the overall idea of the battle. We shot some additional material, but most of the scene — probably 75% — existed; we also spoke with the FX house about the new CGI material,” which involved re-imaged action of horses and backgrounds within the virtual sets fashioned by Weta Digital.

Hoy utilized VFX tools on various layers within his Media Composer sessions that carried the motion capture images, plus the 3D channels, in addition to different backgrounds. “Sometimes we could use one background version and other times we might need to look around for a new perspective,” Hoy says. “It was a trial-and-error process, but Matt was very receptive to that way of working; it was very collaborative.”

Twentieth Century Fox’s War for the Planet of the Apes.

Developing CGI Requests for Key Scenes
By working closely with Weta Digital, the editor could develop new CGI requests for key scenes and then have them rendered as necessary. “We worked with the post-viz team to define exactly what we needed from a scene — maybe to put a horse into a blizzard, for example. Ryan Stafford, the film’s co-producer and visual effects producer, was our liaison with the CGI team. On some scenes I might have as many as a dozen or more separate layers in the Avid, including Caesar, rendered backgrounds, apes in the background, plus other actors in middle and front layers” that could be moved within the frame. “We had many degrees of freedom so that Matt and I could develop alternate ideas while still preserving the actors’ performances. That way of working could be problematic if you have a director who couldn’t make up his mind; happily, Matt is not that way!”

Hoy cites one complex scene that needed to be revised dramatically. “There is a segment in which Bad Ape [an intelligent chimpanzee who lived in the Sierra Zoo before the Simian Flu pandemic] is seen in front of a hearth. That scene was shot twice because Matt did not consider it frenetic enough. The team returned to the motion capture stage and re-shot the scene [with actor Steve Zahn]. That allowed us to start over again with new, more frantic physical performances against resized backgrounds. We drove the downstream activities – asking Weta to add more snow in another scene, for example, or maybe bring Bad Ape forward in the frame so that we can see him more clearly. Weta was amazing during that collaborative process, with great input.”

The editor also received a number of sound files for use within his Avid workstation. “In the beginning, I used some library effects and some guide music — mostly cues from composer Michael Giacchino’s Dawn score, via music editor Paul Apelgren. Later, when the picture was in one piece, I received some early sketches from the sound design team. For the Director’s Cut we had a rough cut with no CGI from Weta Digital. But when we received more sound design, I would create temp mixes on the Avid, with a 5.1-channel mix for the sound-editorial team using maybe 24 tracks of effects, dialog and music elements. It was a huge session, but Media Composer is versatile. After turning over that mix to Will Files, the film’s sound designer, supervising sound editor and co-mixer, I was present with Matt on the re-recording stage for maybe six weeks of the final mix as the last VFX elements came in. We were down to the wire!”

Hoy readily concedes that while he loves to work with new directors — “and share their point of view” — returning to a director with whom he has collaborated previously is a rewarding experience. “You develop a friendly liaison because it becomes easier once you understand the ways in which a director works. But I do like to be challenged with new ideas and new experiences.” He may get to work again with Reeves on the director’s next outing, The Batman, “but since Matt is still writing the screenplay, time will tell!”


Mel Lambert is principal of Content Creators, an LA-based copywriting and editorial service, and can be reached at mel.lambert@content-creators.com. Follow him on Twitter @MelLambertLA. He is also a long-time member of the UK’s National Union of Journalists.


Pixomondo streamlines compute management with Deadline

There’s never a dull moment at Pixomondo, where artists and production teams juggle feature film, TV, theme park and commercial VFX projects between offices in Toronto, Vancouver, Los Angeles, Frankfurt, Stuttgart, Shanghai and Beijing. The Academy Award- and Emmy-winning VFX studio securely manages its on-premises compute resources across its branches and keeps its rendering pipeline running 24/7 using Thinkbox’s Deadline, which it standardized on in 2010.

In recent months, Pixomondo has increasingly been computing workstation tasks on its render farm via Deadline and has moved publishing to Deadline as well. Sebastian Kral, Pixomondo’s global head of pipeline, says, “By offloading more to Deadline, we’re able to accelerate production. Our artists don’t have to wait for publishes to finish before they move onto the next task, and that’s really something. Deadline’s security is top-notch, which is extremely important for us given the secretive nature of some of our projects.”

Kral is particularly fond of Deadline’s Python API, which allows his global team to develop custom scripts that minimize the minutiae artists must deal with, resulting in a productivity boon. “Deadline gives us incredible flexibility. The Python API is fast, reliable and more usable than a command line entry point, so we can script so many things on our own, which is convenient,” says Kral. “We can build submission scripts for texture conversions, and create proxy data when a render job is done, so our artists don’t have to think about whether or not they need a QT of a composite.”
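
As a rough illustration of the kind of submission script Kral describes, here is a minimal sketch using Deadline’s Standalone Python API, which talks to the Deadline Web Service. The host, port, job keys and the texture-conversion command are hypothetical placeholders, not Pixomondo’s actual pipeline code.

    from Deadline.DeadlineConnect import DeadlineCon

    # Connect to a (hypothetical) Deadline Web Service instance.
    conn = DeadlineCon("deadline-webservice", 8082)

    job_info = {
        "Name": "texture_conversion_shot010",   # placeholder job name
        "UserName": "artist01",                 # placeholder artist
        "Plugin": "CommandLine",                # run an arbitrary command on the farm
        "Frames": "0",                          # single-task job
    }
    plugin_info = {
        "Executable": "/usr/local/bin/convert_textures",   # hypothetical tool
        "Arguments": "--shot shot010 --format tx",
    }

    # SubmitJob returns the new job's properties on success.
    print(conn.Jobs.SubmitJob(job_info, plugin_info))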

Power Rangers. Images courtesy of Pixomondo.

The ability to set environment variables for renders, or render as a specific user, allows Pixomondo’s artists to send tasks to the farm with an added layer of security. With seven facilities worldwide, and the possibility of new locations based on production needs, Pixomondo has also found Deadline’s ability to enable multi-facility rendering valuable.

“Deadline is packed with a ton of great out-of-the-box features, in addition to the new features that Thinkbox implements in new iterations; we didn’t even need to build our own submission tool, because Deadline’s submission capabilities are so versatile,” Kral notes. “It also has a very user-friendly interface that makes setup quick and painless, which is great for getting new hires up to speed quickly and connecting machines across facilities.”

Pixomondo’s more than 400 digital artists are productive around the clock, taking advantage of alternating time zones at facilities around the world. Nearly every rendering decision at the studio is made with Deadline in mind, as it presents rendering metrics in an intuitive way that allows the team to more accurately estimate project turnaround. “When opening Deadline to monitor a render, it’s always an enjoyable experience because all the information I need is right there at my fingertips,” says Kral. “It provides a meaningful overview of our rendering resource spread. We just log in, test renders, and we have all the information needed to determine how long each task will take using the available machines.”

Behind the Title: 3008 Editorial’s Matt Cimino and Greg Carlson

NAMES: Matt Cimino and Greg Carlson

COMPANY: 3008 Editorial in Dallas

WHAT’S YOUR JOB TITLE?
Cimino: We are sound designers/mixers.

WHAT DOES THAT ENTAIL?
Cimino: Audio is a storytelling tool. Our job is to enhance the story directly or indirectly and create the illusion of depth, space and a sense of motion with creative sound design and then mix that live in the environment of the visuals.

Carlson: And whenever someone asks, I always tend to prioritize sound design before mixing. Although I love every aspect of what we do, when a spot hits my room as a blank slate, it’s really the sound design that can take it down a hundred different paths. And for me, it doesn’t get better than that.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Carlson: I’m not sure a brief job title can encompass what anyone really does. I am a composer as well as a sound designer/mixer, so I bring that aspect into my work. I love musical elements that help stitch a unified sound into a project.

Cimino: That there really isn’t “a button” for that!

WHAT’S YOUR FAVORITE PART OF THE JOB?
Carlson: The freedom. Having the opportunity to take a project where I think it should go and along the way, pushing it to the edge and back. Experimenting and adapting makes every spot a completely new trip.

Matt Cimino

Cimino: I agree. It’s the challenge of creating an expressive and aesthetically pleasing experience by taking the soundtrack to a whole new level.

WHAT’S YOUR LEAST FAVORITE?
Cimino: Not much. However, being an imperfect perfectionist, I get pretty bummed when I do not have enough time to perfect the job.

Carlson: People always say, “It’s so peaceful and quiet in the studio, as if the world is tuned out.” The downside of that is producer-induced near heart attacks. See, when you’re rocking out at max volume and facing away from the door, well, people tend to come in and accidentally scare you to death.

WHAT IS YOUR FAVORITE TIME OF THE DAY?
Cimino: I’m a morning person!

Carlson: Time is an abstract notion in a dark room with no windows, so no time in particular. However, the funniest time of day is when you notice you’re listening about 15 dB louder than the start of the day. Loud is better.

IF YOU DIDN’T HAVE THIS JOB, WHAT WOULD YOU BE DOING INSTEAD?
Cimino: Carny. Or Evel Knievel.

Carlson: Construction/carpentry. Before audio, I had lots of gritty “hands-on” jobs. My dad taught me about work ethic, to get my hands dirty and to take pride in everything. I take that same approach with every spot I touch. Now I just sit in a nice chair while doing it.

WHY DID YOU CHOOSE THIS PROFESSION? HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
Cimino: I’ve had a love for music since high school. I used to read all the liner notes on my vinyl. One day I remember going through my father’s records and thinking at that moment, I want to be that “sound engineer” listed in the notes. This led me to study audio at Columbia College in Chicago. I quickly gravitated towards post production audio classes and training. When I wasn’t recording and mixing music, I was doing creative sound design.

Carlson: I was always good with numbers and went to Michigan State to be an accountant. But two years in, I was unhappy. All I wanted was to work on music and compose, so I switched to audio engineering and never looked back. I knew the second I walked into my first studio, I had found my calling. People always say there isn’t a dream job; I disagree.

CAN YOU DESCRIBE YOUR COMPANY?
Cimino: A fun, stress-free environment full of artistry and technology.

Carlson: It is a place I look forward to every day. It’s like a family, solely focused on great creative.

CAN YOU NAME SOME RECENT SPOTS YOU HAVE WORKED ON?
Cimino: Snapple, RAM, Jeep, Universal Orlando, Cricket Wireless, Maserati.

Carlson: AT&T, Lay’s, McDonald’s, Bridgestone Golf.

Greg Carlson

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
Carlson: It’s nearly impossible to pick one, but there is a project I see as pivotal in my time here in Dallas. It was shortly after I arrived six years ago. I think it was a boost to my confidence and, in turn, enhanced my style. The client was The Home Depot and the campaign was Let’s Do This. A creative I admire greatly here in town gave me the chance to spearhead the sonic approach for the work. There are many moments, milestones and memories, but this was a special project to me.

Cimino: There are so many. One of the most fun campaigns I worked on was for Snapple, where each spot opened with the “pop!” of the Snapple cap. I recorded several pops (close-mic’d) and selected one that I manipulated to sound larger than life while still retaining the sound of the brand’s signature cap being popped open. After the cap pops, the spot transforms into an exploding fruit infusion. The sound was created by smashing Snapple bottles for the glass break; crushing, smashing and squishing fruit with my hands; and using a hydrophone to record splashing and underwater sounds to create the slow-motion effect of the fruit morphing. So much fun.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT.
Cimino: During a mix, my go-tos are iZotope, Sound Toys and Slate Digital. Outside the studio I can’t live without my Apple!

Carlson: ProTools, all things iZotope, Native Instruments.

THIS IS A HIGH-STRESS JOB WITH DEADLINES AND CLIENT EXPECTATIONS. WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
Cimino: Family and friends. I love watching my kiddos play select soccer. Relaxing pool or beachside with a craft cider. Or on a single path/trail with my mountain bike.

Carlson: I work on my home, build things, like to be outside. When I need to detach for a bit, I prefer dangerous power tools or being on a body of water.

Review: Polaroid Cube+

By Brady Betzel

There are a lot of options out there for outdoor, extreme sports cameras — GoPro is the first that comes to mind with their Hero line, but even companies like Garmin have their own versions that are gaining traction in the niche action camera market. Polaroid has been trying their hand in lots of product markets lately, from camera sliders to monopods and even video cameras with the Polaroid Cube+.

I’m a big fan of GoPro cameras, but one thing that might keep people away is the price. So what if you want something that will record video and take still pictures at a lower cost? That’s where the Polaroid Cube+ fits in. It’s a cube-shaped HD camera that is not much larger than a few sugar cubes. It can film HD video (technically 720p at 30, 60 or 120 fps; 1080p at 30 or 60 fps; or 1440p at 30 fps), as well as take still images at four megapixels interpolated into eight megapixels.

Right off the bat you’ll read “4MP interpolated into 8MP,” which really means it’s a 4MP camera sensor that uses some sort of algorithm, like bicubic interpolation, to blow up your image with a minimal amount of quality loss. Think of it this way — if you are viewing images on your smartphone, you probably won’t see a lot of problems except for your image being a little soft. Other than that tricky bit of word play (which is not uncommon among camera manufacturers), the Cube+ has a decent retail price at just $150.
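
If you are wondering what that interpolation amounts to in practice, it is simply resampling the sensor image up to a larger pixel count. Here is a quick sketch with Python and Pillow; the file names are placeholders, and scaling each dimension by roughly 1.41 is what doubles the megapixel count.

    from PIL import Image

    img = Image.open("cube_plus_photo.jpg")      # ~4MP source image
    w, h = img.size
    # Scaling width and height by sqrt(2) roughly doubles the pixel count.
    upscaled = img.resize((int(w * 1.414), int(h * 1.414)), Image.BICUBIC)
    upscaled.save("cube_plus_photo_8mp.jpg")     # ~8MP, softer but larger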

In my mind, this is a camera that can be used as an educational tool for young filmmakers, or by a filmmaker who wants to grab a really sneaky b-roll shot in a tight space without paying a high cost. The sound quality isn’t great, but it’s good for reference when syncing cameras, or in an emergency when there is no other audio recording.

Inside the box you get the Cube+ in black, red or teal; a microUSB cable to charge the Cube+ and connect it to your computer; a user guide; and an 8GB microSD card. There is a WiFi button, a power/record button and a back cover. Your microSD card lives under the back cover, and the connection for the microUSB cable can be found there as well.

The Cube+ has WiFi built in, so you can access the camera from your Android phone or iPhone, control the camera and its settings, or browse the camera’s contents. You must have Polaroid’s app to control the Cube+’s camera settings; otherwise, it will default to whatever you last set. To start filming or taking pictures, you hold the power button for three seconds to turn the camera on. You click the button on top twice to start recording video, then click once more to stop recording. You click just once to take a picture.

The Cube+ films through a 124-degree lens that has the fisheye look of many wide-angle action cams. According to Polaroid, the Cube+ has image stabilization built in, but I still found the footage to be shaky. It’s possible the video would be shakier without it, but the footage needed some post production stabilization work.

In my opinion, what really sets this camera apart from other action cameras, besides the price point, is the magnet inside the camera that allows you to stick it to anything magnetic without buying additional accessories. Others should consider adding that to their lineup too.

I took the Cube+ to the Santa Barbara Zoo with one of my sons recently and wasn’t afraid to give it to him to film or take pictures with. Since it is splash proof, it can even get a little wet. Again, I really love the ability to mount the Cube+ to almost anything with the magnet on the bottom, which is pretty strong. We were riding the train around the zoo, and I stuck it to the train rail without worrying about it falling off. But I did notice that the magnet got pretty warm in use, bordering on too hot to touch. Just something to keep in mind if you let kids use it.

In the end, the Polaroid Cube+ is not on the quality level of the GoPro Hero 5 Session, but it might be good for someone filming for the first time who doesn’t want to spend a lot of money. And at $150, it might be a good b-roll camera when used in conjunction with your phone’s camera.

You can check out more about the Polaroid Cube+ in its user manual.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on Life Below Zero and Cutthroat Kitchen. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Keslow Camera acquires Clairmont Camera — Denny Clairmont Retires

Signaling the end of an era, Denny Clairmont, one of the industry’s most respected talents in front of and behind the camera, is retiring. Keslow Camera is buying his company, Clairmont Camera, including its Vancouver and Toronto operations. The acquisition is expected to be complete on or before August 4.

Keslow Camera says it will retain the teams at Clairmont’s Vancouver and Toronto facilities, which have been offering professional digital and film cameras, lenses and accessories to the area since the 1980s. All operations within California are slated to eventually be consolidated into Keslow Camera’s headquarters in Culver City. The move will more than quadruple Keslow Camera’s anamorphic and vintage lens inventory and add a substantial range of custom camera equipment to the company’s portfolio.

Denny Clairmont, along with his brother, Terry, established the movie equipment and camera rental company that would become Clairmont Camera in 1976. In 2011, Clairmont received the John A. Bonner Medal of Commendation from the Academy of Motion Picture Arts and Sciences (AMPAS), awarded by the Academy Board of Governors upon the recommendation of the Scientific and Technical Awards Committee. Clairmont and Ken Robings won a Technical Achievement Award from the Society of Camera Operators (SOC) for the lens perspective system, and Clairmont has won two Emmys from the Academy of Television Arts and Sciences for his role in the development of special lens systems.

“Clairmont Camera is my life’s work, and I never stopped searching for innovative ways to serve our clients,” says Clairmont. “I have long respected Robert Keslow and the team at Keslow Camera for their integrity, quality of management, best-in-class customer service and successful performance. I am confident they are the right company to honor my heritage and founding vision going forward.”