
OptiTrack reveals new skeletal solver

OptiTrack has a new skeletal solver that brings artifact-free, realtime character animation to its optical motion capture systems.

Key features of the OptiTrack skeletal solver include:

– Accurate human movement tracking in realtime
– Major advances in solve quality and artifact-free streaming of character data
– Compatible with any OptiTrack system, including those used for live-action camera tracking, virtual camera tracking and virtual reality
– Supports industry-standard tools, including Epic Games’ Unreal Engine, Unity Technologies’ Unity realtime platform and Autodesk MotionBuilder
– Extremely low latency (less than 10 milliseconds; see the streaming sketch after this list)
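
For a sense of what consuming such a stream involves, here is a minimal Python sketch of a client that receives skeleton frames over UDP and estimates end-to-end latency. The wire format, port and field layout below are invented for illustration only; real integrations would use OptiTrack’s NatNet SDK or the engine plugins named above.

```python
import socket
import struct
import time

# Hypothetical wire format for illustration only -- this is NOT OptiTrack's
# actual NatNet protocol: [frame_id: uint32][timestamp: double][n_joints: uint32]
# followed by n_joints * (x, y, z, qx, qy, qz, qw) float32 tuples.
PACKET_HEADER = struct.Struct("<IdI")
JOINT = struct.Struct("<7f")

def receive_skeleton_frames(host="0.0.0.0", port=9000):
    """Yield (frame_id, joints, latency_ms) tuples from a mocap stream."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _ = sock.recvfrom(65535)
        frame_id, sent_ts, n_joints = PACKET_HEADER.unpack_from(data, 0)
        joints = [JOINT.unpack_from(data, PACKET_HEADER.size + i * JOINT.size)
                  for i in range(n_joints)]
        # With synchronized sender/receiver clocks this approximates
        # end-to-end latency; the article quotes under 10ms for the solver.
        latency_ms = (time.time() - sent_ts) * 1000.0
        yield frame_id, joints, latency_ms
```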

As a complement to its new skeletal solver, OptiTrack has introduced an equally high-performing finger-tracking solution created in partnership with Manus VR. Embedded with OptiTrack’s signature active pulse technology, inertial measurement units (IMUs) and bend sensors, the gloves deliver accurate, continuous finger-tracking data in real time that is fully compatible with existing character animation and VR pipelines when used with OptiTrack systems.
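
How IMU and bend-sensor signals can complement each other is easy to sketch: a gyro gives smooth, low-latency motion but drifts over time, while a bend sensor is drift-free but noisier. Below is a minimal complementary filter for a single finger joint. This is a generic sensor-fusion pattern assumed for illustration; neither Manus VR nor OptiTrack documents its fusion this way.

```python
# Hypothetical fusion of one finger joint's angle from a gyro rate (deg/s)
# and a bend-sensor reading (deg). Illustrative only.

def fuse_joint_angle(prev_angle_deg, gyro_rate_dps, bend_angle_deg,
                     dt=0.01, alpha=0.98):
    """Complementary filter: integrate the gyro for smooth short-term
    motion, then pull toward the bend sensor to cancel drift."""
    predicted = prev_angle_deg + gyro_rate_dps * dt
    return alpha * predicted + (1.0 - alpha) * bend_angle_deg
```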

Sandbox VR partners with Vicon on Amber Sky 2088 experience

VR gaming company Sandbox VR has been working with Vicon motion capture tools to create next-generation immersive experiences. Using Vicon’s motion capture cameras and its location-based VR (LBVR) software Evoke, the Hong Kong-based Sandbox VR is working to transport up to six people at a time into the Amber Sky 2088 experience, which takes place in a future where the fate of humanity hangs in the balance.

Sandbox VR’s adventures resemble movies in which the players become the characters. With two proprietary AAA-quality games already in operation across its seven locations, Sandbox VR needed a new motion capture solution for its third title, Amber Sky 2088. In the futuristic game, users step into the role of androids, granting players abilities far beyond the average human while still scaling the game to their actual movements. Accurately conveying that for multiple users in a free-roam environment demanded precision tracking and flexible scalability. For that, Sandbox VR turned to Vicon.

Set in the twilight of the 21st century, Amber Sky 2088 takes players to a futuristic version of Hong Kong, then through the clouds to the edge of space to fight off an alien invasion. Android abilities allow players to react with incredible strength and move fast enough to dodge bullets. And while the in-game action is furious, participants in the real world, equipped with VR headsets, freely roam an open environment as Vicon LBVR motion capture cameras track their movement.

Vicon’s motion capture cameras record every player movement, then send the data to its Evoke software, a solution introduced last year as part of its LBVR platform, Origin. Vicon’s solution offers precise tracking while also animating player motion in realtime, creating a seamless in-game experience. Automatic recalibration also makes the experience’s operation easier than ever despite its complex nature, and the system’s scalability means fewer cameras can be used to capture more movement, making it cost-effective for large-scale expansion.

Since its founding in 2016, Sandbox VR has been creating interactive experiences by combining motion capture technology with virtual reality. After opening its first location in Hong Kong in 2017, the company has expanded to seven locations across Asia and North America, with six new sites on the way. Each 30- to 60-minute experience is created in-house by Sandbox VR, and each can accommodate up to six players at a time.

The recent partnership with Vicon is the first step in Sandbox VR’s expansion plans, which will see it open over 40 experience rooms across 12 new locations around the world by the end of the year. In considering its plans to build and operate new locations, the VR maker chose to start with five systems from Vicon, in part because of the company’s collaborative nature.


Reallusion intros three tools for mocap, characters

Reallusion has launched three new motion capture and character creation products: Character Creator 3, a stand-alone character creation tool; Motion Live, a realtime motion capture solution; and 3D Face Motion Capture with Live Face for iPhone X. With these products Reallusion is offering a total solution to build, morph, animate and gamify 3D characters.

Character Creator 3 (CC3), the new generation of iClone Character Creator, has separated from iClone to become a professional stand-alone tool. With a new quad base, roundtrip editing with ZBrush and photorealistic rendering using Iray, Character Creator 3 is a full character-creation solution for generating optimized 3D characters that are ready for games or intensive artistic design.

CC3 provides a new game character base with topology optimized for mobile, game and AR/VR developers. The big breakthrough is the integration of InstaLOD’s model and material optimization technologies to generate game-ready characters that are animatable on the fly, covering the complete character pipeline: polygon reduction, material merging, texture baking, remeshing and LOD generation.
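
To make the LOD part of that pipeline concrete, here is a small Python sketch of the kind of LOD chain such tools automate. The level names, triangle ratios and texture sizes are hypothetical placeholders, not InstaLOD settings or API calls.

```python
from dataclasses import dataclass

@dataclass
class LodLevel:
    name: str
    triangle_ratio: float   # fraction of source triangle count to keep
    texture_size: int       # baked texture resolution in pixels

# Hypothetical chain: full-detail hero asset down to a distant/mobile fallback.
LOD_CHAIN = [
    LodLevel("LOD0", 1.00, 4096),
    LodLevel("LOD1", 0.50, 2048),
    LodLevel("LOD2", 0.25, 1024),
    LodLevel("LOD3", 0.10, 512),
]

def build_lods(source_triangle_count):
    """Report the triangle budget each LOD would be decimated to."""
    return {lod.name: int(source_triangle_count * lod.triangle_ratio)
            for lod in LOD_CHAIN}

print(build_lods(80_000))  # {'LOD0': 80000, 'LOD1': 40000, 'LOD2': 20000, 'LOD3': 8000}
```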

CC3 launches this month and is available now for preorder for $199.

iClone Motion Live, the multidevice motion capture system, connects industry-standard motion gear — including Rokoko, Leap Motion, Xsens, Faceware, OptiTrack, Noitom and iPhone X — into one solution.

Motion Live’s intuitive plug-and-play design makes it simple to connect complicated mocap devices and to animate custom imported characters or fully rigged 3D characters generated by Character Creator, Daz Studio or other industry-standard sources.

Reallusion has also debuted 3D Face Motion Capture for iPhone X with the Live Face app for iClone. As a result, users can record instant facial motion capture on any 3D character with an iPhone X. Reallusion has expanded the technology behind Animoji and Memoji to lift iPhone X animation and motion capture to the next level for studios and independent creators. The solution combines the power of iPhone X mocap with iClone Motion Live to blend face motion capture with Xsens, Perception Neuron, Rokoko, OptiTrack and Leap Motion for a truly realtime live experience in full-body mocap.
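
Conceptually, a bridge like Live Face reads the per-frame blendshape coefficients that ARKit computes on the phone and remaps them onto a character’s morph targets. In the sketch below, the coefficient names (jawOpen, eyeBlinkLeft) are real ARKit identifiers, but the mapping table and the apply_morph callback are hypothetical stand-ins for a host rig.

```python
# Hypothetical remap of ARKit blendshape weights onto character morphs.
CHARACTER_MORPH_MAP = {
    "jawOpen": "Mouth_Open",
    "eyeBlinkLeft": "Eye_Blink_L",
    "eyeBlinkRight": "Eye_Blink_R",
    "browInnerUp": "Brow_Raise_Inner",
}

def retarget_face_frame(arkit_coeffs, apply_morph):
    """Drive character morphs from one frame of ARKit data.

    arkit_coeffs: dict of blendshape name -> weight in [0, 1]
    apply_morph:  callback(morph_name, weight) into the host rig
    """
    for arkit_name, morph_name in CHARACTER_MORPH_MAP.items():
        apply_morph(morph_name, arkit_coeffs.get(arkit_name, 0.0))
```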


Alkemy X joins forces with Quietman, adds CD Megan Oepen

Creative content studio Alkemy X has entered into a joint venture with long-time New York City studio Quietman. In addition, Alkemy X has brought on director/creative director Megan Oepen.

The Quietman deal will see founder and creative director Johnnie Semerad moving the operations of his company into Alkemy X, where both parties will share all creative talent, resources and capabilities.

“Quietman’s reputation of high-end, award-winning work is a tribute to Johnnie’s creative and entrepreneurial spirit,” says Justin B. Wineburgh, Alkemy X president/CEO. “Over the course of two decades, he grew and evolved Quietman from a fledgling VFX boutique into one of the most renowned production companies in advertising and branded content. By joining forces with Alkemy X, we’ll no doubt build on each other’s legacies collectively.”

Semerad co-founded Quietman in 1996 as a Flame-based visual effects company. Since then, it has expanded into the full gamut of production and post production services, producing more than 100 Super Bowl spots, and earning a Cannes Grand Prix, two Emmy Awards and other honors along the way.

“What I’ve learned over the years is that you have to constantly reinvest and reinvent, especially as clients increasingly demand start-to-finish projects,” says Semerad. “Our partnership with Alkemy X will elevate how we serve existing and future clients together, while bolstering our creative and technical resources to reach our potential as commercial filmmakers. The best part of this venture? I’ve always been listed with the Qs, but now, I’m with the As!”

Alkemy X is also teaming up with Oepen, an award-winning creative director and live-action director with 20 years of broadcast, sports and consumer brand campaign experience. Notable clients include Google, the NBA, MLB, PGA, NASCAR, Dove Beauty, Gatorade, Sprite, ESPN, Delta Air Lines, Home Depot, Regal Cinemas, Chick-Fil-A and Yahoo! Sports. Oepen was formerly the executive producer and director for Red Bull’s Non-Live/Long Format Productions group, and headed Under Armour’s Content House. She was also the creator behind Under Armour Originals.


Maxon intros Cinema 4D Release 20

Maxon will be at Siggraph this year showing Cinema 4D Release 20 (R20), the next iteration of its 3D design and animation software. Release 20 introduces high-end features for VFX and motion graphics artists, including node-based materials, volume modeling, CAD import and an evolution of the MoGraph toolset.

Maxon expects Cinema 4D Release 20 to be available this September for both Mac and Windows operating systems.

Key highlights in Release 20 include:
Node-Based Materials – This feature provides new possibilities for creating materials — from simple references to complex shaders — in a node-based editor. With more than 150 nodes to choose from, each performing a different function, artists can combine nodes to easily build complex shading effects. Users new to a node-based material workflow can still rely on Cinema 4D’s standard Material Editor interface to create the corresponding node material in the background automatically. Node-based materials can be packaged into assets with user-defined parameters exposed in an interface similar to Cinema 4D’s Material Editor.
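
The core idea of a node-based material can be shown in a few lines: each node computes one value, nodes are wired together, and the graph is evaluated per shading sample. The following is a generic toy sketch of that idea, not Cinema 4D’s actual node API.

```python
# Toy node graph: a Checker node wired into the factor input of a Mix node.
def checker(u, v, scale=8):
    """Checker pattern node: returns 0.0 or 1.0 per (u, v) sample."""
    return float((int(u * scale) + int(v * scale)) % 2)

def mix(a, b, factor):
    """Mix node: linear blend between two values."""
    return a * (1.0 - factor) + b * factor

def evaluate_material(u, v):
    # Evaluating the graph for one shading sample.
    return mix(0.1, 0.9, checker(u, v))

print(evaluate_material(0.3, 0.7))  # 0.9
```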

MoGraph Fields – New capabilities in this procedural animation toolset offer an entirely new way to define the strength of effects by combining falloffs, from simple shapes to shaders, sounds, objects and formulas. Artists can layer Fields atop each other with standard mixing modes and remap their effects. They can also group multiple Fields together and use them to control effectors, deformers, weights and more.
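
The Fields concept reduces to a simple pattern: each field maps a point in space to a strength between 0 and 1, and fields are layered with blend modes and remap curves. Here is a generic Python sketch of that pattern (again, not Cinema 4D’s API).

```python
import math

def spherical_falloff(p, center=(0.0, 0.0, 0.0), radius=100.0):
    """Strength 1 at the center, fading to 0 at the radius."""
    return max(0.0, 1.0 - math.dist(p, center) / radius)

def linear_falloff(p, axis=0, start=0.0, end=200.0):
    """Strength ramping from 0 to 1 along one axis."""
    t = (p[axis] - start) / (end - start)
    return min(1.0, max(0.0, t))

def layered_strength(p):
    # "Multiply" blend of two falloffs, then a smoothstep remap.
    s = spherical_falloff(p) * linear_falloff(p)
    return s * s * (3.0 - 2.0 * s)

print(layered_strength((50.0, 0.0, 0.0)))  # ~0.043
```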

CAD Data Import – Popular CAD formats can be imported into Cinema 4D R20 with a drag and drop. A new scale-based tessellation interface allows users to adjust detail to build amazing visualizations. Step, Solidworks, JT, Catia V5 and IGES formats are supported.

Volume Modeling – Users can create complex models by adding or subtracting basic shapes in Boolean-type operations using Cinema 4D R20’s OpenVDB-based Volume Builder and Mesher. They can also procedurally build organic or hard-surface volumes using any Cinema 4D object, including new Field objects. Volumes can be exported in sequenced .vdb format for use in any application or render engine that supports OpenVDB.
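
The Boolean operations behind volume modeling have a compact mathematical core: on signed distance fields (SDFs), union is a pointwise minimum and subtraction is a maximum against the negated operand. Below is a small NumPy sketch of that math; OpenVDB performs the same operations on sparse grids at much larger scale.

```python
import numpy as np

def sphere_sdf(grid_coords, center, radius):
    """Signed distance to a sphere: negative inside, positive outside."""
    return np.linalg.norm(grid_coords - center, axis=-1) - radius

# Sample a small 64^3 grid over [-1, 1]^3.
axis = np.linspace(-1.0, 1.0, 64)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1)

a = sphere_sdf(grid, center=np.array([-0.2, 0.0, 0.0]), radius=0.5)
b = sphere_sdf(grid, center=np.array([0.2, 0.0, 0.0]), radius=0.5)

union = np.minimum(a, b)          # add shapes together
difference = np.maximum(a, -b)    # subtract b from a
# The zero level set of each result is the surface a mesher would extract.
```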

ProRender Enhancements – ProRender in Cinema 4D R20 extends the GPU-rendering toolset with key features including subsurface scattering, motion blur and multipasses. Also included are Metal 2 support, an updated ProRender core, out-of-core textures and other architectural enhancements.

Core Technology Modernization – As part of the transition to a more modern core in Cinema 4D, R20 comes with substantial API enhancements, the new node framework, further development on the new modeling framework and a new UI framework.

During Siggraph, Maxon will have guest artists presenting at its booth each day of the show. Presentations will be live streamed on C4DLive.com.


iPi Motion Capture V.4 software offers live preview

iPi Soft, makers of motion capture technology, has introduced iPi Motion Capture Version 4, the next version of its markerless motion capture software. Version 4 includes realtime preview capability for a single depth sensor. Other new features and enhancements include support for new depth sensors (Intel RealSense D415/D435, ASUS Xtion2 and Orbbec Astra/Astra Pro); improved arm and body tracking; and support for action cameras such as GoPro and SJCAM. With Version 4, iPi Soft also introduces a perpetual license model.

The realtime tracking feature in Version 4 uses iPi Recorder (free software from iPi Soft for capturing, playback and processing of video from multiple cameras and depth sensors) to communicate with the iPi Mocap Studio software, which tracks in realtime and instantly transfers motion to 3D characters. This lets users see how the motion will look on a 3D character, and improve the motion accordingly, at the time of acting and recording, without redoing multiple iterations of acting, recording and offline tracking.

Live tracking results can then be stored to disk for additional offline post processing, such as tracking refinement (to improve tracking accuracy), manual corrections and jitter removal.

iPi Mocap Version 4 currently includes the realtime tracking feature for a single depth sensor only. iPi Soft is scheduled to bring realtime functionality for multiple depth sensors to users by the end of this year.

Development of plug-ins for popular 3D game engines, including Unreal Engine and Unity, is also underway.

Tracking improvements include:
• Realtime tracking of human performance for live preview with a single depth sensor (Basic and Pro configurations). Motion can be transferred to a 3D character.
• Improved individual body-part tracking: after performing initial tracking, users can redo tracking for selected body parts to fix tracking errors more quickly.
• Head and hand tracking used in conjunction with Sony’s PS Move motion controllers now takes joint limits into account.

New sensors and cameras supported include:
• Support for Intel RealSense D415/D435 depth cameras, Asus Xtion2 motion sensors and Orbbec Astra/Astra Pro 3D cameras.
• Support for action cameras such as GoPro and SJCAM, including wide-angle models, allows users to come closer to the camera, decreasing space requirements.
• The ability to calibrate the individual internal parameters of any camera helps users correctly reconstruct 3D information from video for improved overall tracking quality (see the calibration sketch after this list).
• The ability to load unsynchronized videos from multiple cameras and then use iPi Recorder to sync and convert the footage to the .iPiVideo format used by iPi Mocap Studio.
• Support for fast-motion action cameras: video frame rates can reach up to 120fps, allowing extremely fast motions to be tracked.
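
Per-camera internal (intrinsic) calibration of this kind is a standard computer vision procedure. As a point of reference, here is a minimal sketch using OpenCV’s cv2.calibrateCamera, which recovers focal lengths, principal point and lens distortion from checkerboard images. The file names and board size are placeholders, and iPi Soft’s own calibration tooling presumably differs.

```python
import cv2
import numpy as np

BOARD = (9, 6)  # inner corners of the printed checkerboard
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for fname in ["calib_01.png", "calib_02.png", "calib_03.png"]:  # placeholders
    img = cv2.imread(fname)
    if img is None:
        continue
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Solve for the camera's internal parameters from all detected boards.
rms, camera_matrix, dist_coeffs, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Intrinsics:\n", camera_matrix)          # focal lengths, principal point
print("Distortion coefficients:", dist_coeffs.ravel())
```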

Version 4’s perpetual license is not time-limited and includes two years of full support and software updates. Afterwards, users can subscribe to a support plan to continue receiving full support and software updates, or simply continue using their latest software version.

iPi Motion Capture Version 4 is also available on a subscription basis. Prices range from $165 to $1,995 depending on the edition (Express, Basic or Pro) and the duration of the subscription.

The Basic edition supports up to six Sony PS3 Eye cameras or two Kinect sensors and tracks a single actor. The Pro edition features full 16-camera/four-depth-sensor capability and can track up to three actors. A 30-day free trial of Version 4 is available.


House of Moves adds Selma Gladney-Edelman, Alastair Macleod

Animation and motion capture studio House of Moves (HOM) has strengthened its team with two new hires — Selma Gladney-Edelman was brought on as executive producer and Alastair Macleod as head of production technology. The two industry vets are coming on board as the studio shifts to offer more custom short- and long-form content, and expands its motion capture technology workflows to its television, feature film, video game and corporate clients.

Selma Gladney-Edelman was most recently VP at Marvel Television, working on its primetime and animated series. She has worked in film production, animation and visual effects, and was a producer on multiple episodic series at Walt Disney Television Animation, Cartoon Network and Universal Animation. As director of production management across all of the Discovery channels, she oversaw thousands of hours of television and film programming, including the TLC projects Say Yes to the Dress, Little People, Big World and Toddlers and Tiaras, while working on the team that garnered an Oscar nom for Werner Herzog’s Encounters at the End of the World and two Emmy wins for Best Children’s Animated Series for Tutenstein.

Scotland native Alastair Macleod is a motion capture expert who has worked in production, technology development and as an animation educator. His production experience includes work on films such as Lord of the Rings: The Two Towers, The Matrix Reloaded, The Matrix Revolutions, 2012, The Twilight Saga: Breaking Dawn — Part 2 and Kubo and the Two Strings for facilities that include Laika, Image Engine, Weta Digital and others.

Macleod pioneered full body motion capture and virtual reality at the research department of Emily Carr University in Vancouver. He was also the head of animation at Vancouver Film School and an instructor at Capilano University in Vancouver. Additionally, he developed PeelSolve, a motion capture solver plug-in for Autodesk Maya.


Technicolor Experience Center launches with HP Mars Home Planet

By Dayna McCallum

Technicolor’s Tim Sarnoff and Marcie Jastrow oversaw the official opening of the Technicolor Experience Center (TEC), with the help of HP’s Sean Young and Rick Champagne, on June 15. The kickoff event also featured the announcement that the TEC is teaming up with HP to develop HP Mars Home Planet, an experimental VR experience to reinvent life on Mars for one million humans.

The purpose-built TEC space is located in Blackwelder creative park, a business district in Culver City designed specifically for the needs of creative and media companies. The center, dedicated to bringing artists and scientists together to explore immersive media, covers almost 27,000 square feet, with 3,000 square feet dedicated to motion capture. The TEC serves as a hub connecting Technicolor’s creative houses and research labs across the globe (including an R&D team from France that made an appearance during the event via a remote demo) with technology partners such as HP.

Sarnoff, Technicolor deputy CEO and president of production services, said, “The TEC is about realizing the aspirations of all the players who are part of the nascent immersive ecosystem we work in, from content creation, to content distribution and content consumption. Designing and delivering immersive experiences will require a massive convergence of artistic, technological and economic talent. They will have to come together productively. That is why the TEC has been formed. It is designed to be a practical place where we take theoretical constructs and move systematically to tactical implementation through a creative and dynamic process of experimentation.”

The HP Mars Home Planet project is a global, immersive media collaboration uniting engineers, architects, designers, artists and students to design an urban area on Mars in a VR environment. The project will be built on the terrain from Fusion’s “Mars 2030” game, which draws on NASA research, imagery and expertise. In addition to HP, Fusion and the TEC, partners include Nvidia, Unreal Engine, Autodesk and HTC Vive. Additional details will be released at Siggraph 2017.

Young, worldwide segment manager for product development and AEC for HP Inc., said of the Mars project, “To ensure fidelity and professional-grade quality and a fantastic end-user experience, the TEC is going to oversee the virtual reality development process of the work that is going to be done by collaborators from all over the world. It is an incredible opportunity for anybody from anywhere in the world that is interested in VR to work with Technicolor.”


OptiTrack’s parent company merging with Planar Systems

Planar Systems, a Leyard company and a worldwide provider of display systems, has entered into a definitive merger agreement to acquire NaturalPoint, which makes optical tracking and motion capture solutions, for $125 million in an all-cash transaction. NaturalPoint makes OptiTrack, TrackIR and SmartNav products.

The acquisition brings together companies specializing in complementary technologies to increase focus on high-growth, strategic opportunities in augmented and virtual reality, and in other market segments like drone tracking, computer visualization and animation.

NaturalPoint is headquartered two hours south of the Planar campus in Oregon and employs a team of 60 in North America. The company has a 25,000-square-foot facility for its optical tracking business.

The acquisition is subject to customary closing conditions and is expected to close in the fourth calendar quarter of 2016 or early in the first calendar quarter of 2017. NaturalPoint will remain a separate business with its own executive team, customers and market initiatives.