
Category Archives: video games

Lenovo intros next-gen ThinkPads

Lenovo has launched the next generation of its ThinkPad P Series with the release of five new ThinkPads: the ThinkPad P73, ThinkPad P53, ThinkPad P1 Gen 2, ThinkPad P53s and ThinkPad P43s.

The ThinkPad P53 features the Nvidia Quadro RTX 5000 GPU with RT and Tensor cores, offering realtime raytracing and AI acceleration. It now features Intel Xeon and 9th Gen Core-class CPUs with up to eight cores (including the Core i9), up to 128GB of memory and 6TB of storage.

This mobile workstation also boasts a new OLED touch display with Dolby Vision HDR for superb color and some of the deepest black levels ever. Building on the innovation behind the ThinkPad P1 power supply, Lenovo is also maximizing the portability of this workstation with a 35 percent smaller power supply. The ThinkPad P53 is designed to handle everything from augmented reality and VR content creation to the deployment of mobile AI or ISV workflows. The ThinkPad P53 will be available in July, starting at $1,799.

At 3.74 pounds and 17.2mm thin, Lenovo’s thinnest and lightest 15-inch workstation — the ThinkPad P1 Gen 2 — includes the latest Nvidia Quadro Turing T1000 and T2000 GPUs. The ThinkPad P1 also features eight-core Intel 9th Gen Xeon and Core CPUs and an OLED touch display with Dolby Vision HDR.

The ThinkPad P1 Gen 2 will be available at the end of June starting at $1,949.

With its 17.3-inch Dolby Vision 4K UHD screen and a 35 percent smaller power adaptor, Lenovo’s ThinkPad P73 offers users maximum workspace without sacrificing portability. Like the ThinkPad P53, it features Intel Xeon and Core processors and the most powerful Nvidia Quadro RTX graphics. The ThinkPad P73 will be available in August starting at $1,849.

The ThinkPad P43s features a 14-inch chassis and will be available in July starting at $1,499.

Rounding out the line is the ThinkPad P53s which combines the latest Nvidia Quadro graphics and Intel Core processors — all in a thin and light chassis. The ThinkPad P53s will be available in June, starting at $1,499.

For the first time, Lenovo is adding new X-Rite Pantone Factory Color Calibration to the ThinkPad P1 Gen 2, ThinkPad P53 and ThinkPad P73. The unique factory color calibration profile is stored in the cloud to ensure more accurate recalibration. This profile allows for dynamic switching between color spaces, including sRGB, Adobe RGB and DCI-P3 to ensure accurate ISV application performance.
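Switching between color spaces in practice means swapping gamut primaries and transfer functions. As an illustrative sketch using the published transfer curves for the three spaces named above (the X-Rite/Lenovo profile format itself is not public, so this shows only the general idea):

```python
# Illustrative only: each color space pairs a gamut with a transfer
# function. These are the published curves; the actual X-Rite/Lenovo
# calibration profile format is not public.

def srgb_encode(linear):
    # Piecewise sRGB curve (IEC 61966-2-1).
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def adobe_rgb_encode(linear):
    # Adobe RGB (1998): pure power curve, gamma = 563/256.
    return linear ** (256 / 563)

def dci_p3_encode(linear):
    # DCI-P3 (cinema): pure power curve, gamma = 2.6.
    return linear ** (1 / 2.6)

# The same 18% gray patch lands at a different code value in each
# space, which is why the display must know which curve it is emulating.
mid_gray = 0.18
print(srgb_encode(mid_gray), adobe_rgb_encode(mid_gray), dci_p3_encode(mid_gray))
```

A calibrated profile lets the panel apply the right curve (and gamut mapping) per space, rather than approximating all three with one fixed response.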

The entire ThinkPad portfolio is also equipped with advanced ThinkShield security features – from ThinkShutter to privacy screens to a self-healing BIOS that recovers when attacked or corrupted – to help protect users from every angle and give them the freedom to innovate fearlessly.

Quick Chat: Lord Danger takes on VFX-heavy Devil May Cry 5 spot

By Randi Altman

Visual effects for spots have become more and more sophisticated, and the recent Capcom trailer promoting the availability of its game Devil May Cry 5 is a perfect example.

The Mike Diva-directed Something Greater starts off like it might be a commercial for an antidepressant, with images of a woman cooking dinner for some guests, people working at a construction site, a bored guy trimming hedges… but suddenly each of our “Everyday Joes” turns into a warrior fighting baddies in a video game.

Josh Shadid

The hedge trimmer’s right arm turns into a futuristic weapon, the construction worker evokes a panther to fight a monster, and the lady cooking is seen with guns a blazin’ in both hands. When she runs out of ammo, and to the dismay of her dinner guests, her arms turn into giant saws. 

Lord Danger’s team worked closely with Capcom USA to create this over-the-top experience, providing everything from production to VFX to post, including sound and music.

We reached out to Lord Danger founder/EP Josh Shadid to learn more about their collaboration with Capcom, as well as their workflow.

How much direction did you get from Capcom? What was their brief to you?
Capcom’s fight-games director of brand marketing, Charlene Ingram, came to us with a simple request — make a memorable TV commercial that did not use gameplay footage but still illustrated the intensity and epic-ness of the DMC series.

What was it shot on and why?
We shot on both the Arri Alexa Mini and the Phantom Flex 4K using Zeiss Super Speed MkII prime lenses, thanks to our friends at Antagonist Camera, and a Technodolly motion control crane arm. We used the Phantom on the Technodolly to capture the high-speed shots. We used that setup to speed-ramp through character actions while maintaining 4K resolution for post in both the garden and kitchen transformations.

We used the Alexa Mini on the rest of the spot. It’s our preferred camera for most of our shoots because we love the combination of its size and image quality. The Technodolly allowed us to create frame-accurate, repeatable camera movements around the characters so we could seamlessly stitch together multiple shots as one. We also needed to cue the fight choreography to sync up with our camera positions.

You had a VFX supervisor on set. Can you give an example of how that was beneficial?
We did have a VFX supervisor on site for this production. Our usual VFX supervisor is one of our lead animators — having him on site means we’re often starting elements of our post production workflow while we’re still shooting.

Assuming some of it was greenscreen?
We shot elements of the construction site and gardening scene on greenscreen. We used pop-ups to film these elements on set so we could mimic camera moves and lighting perfectly. We also took photogrammetry scans of our characters to help rebuild parts of their bodies during transition moments, and to emulate flying without requiring wire work — which would have been difficult to control outside during windy and rainy weather.

Can you talk about some of the more challenging VFX?
The shot of the gardener jumping into the air while the camera spins around him twice was particularly difficult. The camera starts on a 45-degree frontal, swings behind him and then returns to a 45-degree frontal once he’s in the air.

We had to digitally recreate the entire street, so we used the technocrane at the highest position possible to capture data from a slow pan across the neighborhood in order to rebuild the world. We also had to shoot this scene in several pieces and stitch them together. Since we didn’t use wire work to suspend the character, we also had to recreate the lower half of his body in 3D to achieve a natural-looking jump position. That, combined with the CG weapon elements, made for a challenging composite — but in the end, it turned out really dramatic (and pretty cool).

Were any of the assets provided by Capcom, or were they all created from scratch?
We were provided with the character and weapons models from Capcom — but these were in-game assets, and if you’ve played the game you’ll see that the environments are often dark and moody, so the textures and shaders really didn’t apply to a real-world scenario.

Our character modeling team had to recreate and re-interpret what these characters and weapons would look like in the real world — and they had to nail it — because game culture wouldn’t forgive a poor interpretation of these iconic elements. So far the feedback has been pretty darn good.

In what ways did being the production company and the VFX house on the project help?
The separation of creative from production and post production is an outdated model. The time it takes to bring each team up to speed, to manage the communication of ideas between creatives and to ensure there is a cohesive vision from start to finish, increases both the costs and the time it takes to deliver a final project.

We shot and delivered all of Devil May Cry’s Something Greater in four weeks total, all in-house. We find that working as the production company and VFX house reduces the ratio of managers per creative significantly, putting more of the money into the final product.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 


From artist to AR technologist: What I learned along the way

By Leon Hui

As an ARwall co-founder and chief technology officer (CTO), I manage all things relating to technology for the company. This includes overseeing software and technology development, design, engineering, IT, troubleshooting and everything in between. When we launched the company, I personally developed the critical pieces of technology required to realize the overall ARwall concept.

I came into augmented reality (AR) as a game development software engineer, and that plays a big role in how I approach this new medium. Stepping into ARwall, it became my job to produce artistic realtime graphics for AR backdrops and settings, while also pursuing technological advancements that will move the AR industry forward.

Rene Amador presents in front of an ARwall screen. The TV monitor in the foreground shows the camera’s perspective.

CEO Rene Amador and I found that the best way to make sure the company retained its artistic values was to bring on highly talented artists, coders and engineers with diverse skill sets in both art and tech. It’s our mission to not let the scales tip one way or the other, and to focus on bringing in both artistic and tech talent.

With the continuing convergence of entertainment and technology, it is vital for a creative technology company to continue to advance, while maintaining and nurturing artistic integrity.

Here is what we have learned along the way in striking this balance:

Diversify Your Hiring
Going into AR, or any other immersive field, it is very important that one understands realtime graphics.

So, while it’s useful for my company to hire engineers that have graphics and coding backgrounds — as many game engineers do — it’s still crucial to hire for the individual strengths of both tech and art. At ARwall, our open roles could be combined for one gifted individual, or isolated with an emphasis on either artistry or coding, for those with specialties.

Because we are dealing with high-quality realtime graphics, the ARwall team would be similar to the team profiles of any AAA game studio. We never deviated from an artistic trajectory — we just brought technology along for the ride. We think of talent recruitment as a crucial process in our advancement and always have our eyes out for our next game developer to fill roles ranging from technical, environment, material and character artist to graphics, game engine and generalist engineer.

Expand Your Education
If someone with a background in film or TV post production came to work in a new tech industry, like AR, they would need to expand their own education. It’s challenging, but not impossible. While my company’s current emphasis is on game developers and CG artists, the backgrounds of fellow co-founders Rene Amador, Eric Navarrette and Jocelyn Hsu sit in ad agencies, television digital development, post production and beyond.

Jocelyn Hsu on an XR set, a combination of physical set pieces with the CG set extension running in the background.

There are a variety of toolsets and concepts left to learn, including: the software development life cycle; Microsoft Project or Hansoft; Agile methodology; the definition of “realtime graphics” and how it works; the top-dog game engine tools, including Unity and Unreal Engine 4; and digital asset creation pipelines for game engines, among others.

The transition is largely based on one’s game development background but, of course, there is always a learning curve when entering a new industry.

Focus on the Balance
We understand that the core of a “technology company,” as we bill ourselves, is still the foundational technology. However, depending on the type of technology, companies need staffers that have a high-level mastery of the technology in order to demonstrate its full potential to others. It just happens that with AR technology there is an inherently visual aspect, which translates to a need for superior artistry in unison with the precise technology.

For AR technology to show well and look appealing, high-quality artistry is very much needed. This can be a difficult balance to maintain if focus or purpose is lost. For ARwall, we aim to hire talent that excels at art or engineering, or both.

ARwall expanded its offerings to stake its claim as a technology company, but built on each founder’s roots as artists, engineers and producers. Tech and art aren’t mutually exclusive; rather, with focus, education and time to search for the right talent, technology companies can excel with invention and keep their creative edge, all at once.


Leon Hui brings to the team 20+ years of technical experience as a software engineer focusing on realtime 3D graphics, VR/AR and systems architecture. He has held lead/senior technical roles on 15 shipped AAA titles as a veteran of top developers, including EA, Microsoft Studios and Konami Digital Entertainment. He was previously TD at Skydance Interactive. ARwall is based in Burbank.

 


Nvidia intros Turing-powered Titan RTX

Nvidia has introduced its new Nvidia Titan RTX, a desktop GPU that provides the kind of massive performance needed for creative applications, AI research and data science. Driven by the new Nvidia Turing architecture, Titan RTX — dubbed T-Rex — delivers 130 teraflops of deep learning performance and 11 GigaRays per second of raytracing performance.

Turing features new RT Cores to accelerate raytracing, plus new multi-precision Tensor Cores for AI training and inferencing. These two engines — along with more powerful compute and enhanced rasterization — will help speed the work of developers, designers and artists across multiple industries.

Designed for computationally demanding applications, Titan RTX combines AI, realtime raytraced graphics, next-gen virtual reality and high-performance computing. It offers the following features and capabilities:
• 576 multi-precision Turing Tensor Cores, providing up to 130 Teraflops of deep learning performance
• 72 Turing RT Cores, delivering up to 11 GigaRays per second of realtime raytracing performance
• 24GB of high-speed GDDR6 memory with 672GB/s of bandwidth — two times the memory of previous-generation Titan GPUs — to fit larger models and datasets
• 100GB/s Nvidia NVLink, which can pair two Titan RTX GPUs to scale memory and compute
• Performance and memory bandwidth sufficient for realtime 8K video editing
• VirtualLink port, which provides the performance and connectivity required by next-gen VR headsets
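The 672GB/s figure follows directly from the memory configuration. As a quick sanity check (assuming the 384-bit bus and 14Gbps-per-pin GDDR6 signaling typical of Turing-class boards; neither figure is stated in the list above):

```python
# Peak memory bandwidth = bus width in bytes x per-pin data rate.
# Assumed values (not stated in the spec list): 384-bit bus, 14 Gbps GDDR6.
bus_width_bits = 384
data_rate_gbps = 14

bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(bandwidth_gb_s)  # 672.0 GB/s, matching the quoted spec
```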

Titan RTX provides multi-precision Turing Tensor Cores for breakthrough performance from FP32, FP16, INT8 and INT4, allowing faster training and inference of neural networks. It offers twice the memory capacity of previous-generation Titan GPUs, along with NVLink to allow researchers to experiment with larger neural networks and datasets.
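Lower precisions are faster largely because each value costs fewer bits to store, move and multiply. A minimal NumPy sketch of the idea behind INT8 inference (generic symmetric quantization, not Nvidia’s Tensor Core implementation):

```python
import numpy as np

# Generic symmetric quantization sketch, not Nvidia's implementation:
# FP32 weights become INT8 at 1/4 the memory footprint, with a small,
# bounded approximation error.
rng = np.random.default_rng(0)
weights_fp32 = rng.standard_normal(1024).astype(np.float32)

# Map [-max|w|, +max|w|] linearly onto the INT8 range [-127, 127].
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.round(weights_fp32 / scale).astype(np.int8)

# Dequantize to measure the approximation error.
restored = weights_int8.astype(np.float32) * scale
max_error = np.abs(weights_fp32 - restored).max()

print(weights_fp32.nbytes, weights_int8.nbytes)  # 4096 vs 1024 bytes
assert max_error <= scale / 2 + 1e-6  # error bounded by half a quantization step
```

The 4x smaller footprint means 4x less memory traffic per weight, which is where much of the inference speedup comes from.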

Titan RTX accelerates data analytics with RAPIDS. RAPIDS open-source libraries integrate seamlessly with the world’s most popular data science workflows to speed up machine learning.

Titan RTX will be available later in December in the US and Europe for $2,499.


Epic Games’ Unreal Engine 4.21 adds more mobile optimizations, efficiencies

Epic Games’ Unreal Engine 4.21 is designed to offer developers greater efficiency, performance and stability for those working on any platform.

Unreal Engine 4.21 adds even more mobile optimizations to both Android and iOS, up to 60% speed increases when cooking content and more power and flexibility in the Niagara effects toolset for realtime VFX. Also, the new production-ready Replication Graph plugin enables developers to build multiplayer experiences at a scale that hasn’t been possible before, and Pixel Streaming allows users to stream interactive content directly to remote devices with no compromises on rendering quality.

Updates in Unreal Studio 4.21 also offer new capabilities and enhanced productivity for users in the enterprise space, including architecture, manufacturing, product design and other areas of professional visualization. Unreal Studio’s Datasmith workflow toolkit now includes support for Autodesk Revit, and enhanced material translation for Autodesk 3ds Max, all enabling more efficient design review and iteration.

Here is more about the key features:
Replication Graph: The Replication Graph plugin, which is now production-ready, makes it possible to customize network replication in order to build large-scale multiplayer games that would not be viable with traditional replication strategies.

Niagara Enhancements: The Niagara VFX feature set continues to grow, with substantial quality of life improvements and Nintendo Switch support added in Unreal Engine 4.21.

Sequencer Improvements: New capabilities within Sequencer allow users to record incoming video feeds to disk as OpenEXR frames and create a track in Sequencer, with the ability to edit and scrub the track as usual. This enables users to synchronize video with CG assets and play them back together from the timeline.

Pixel Streaming (Early Access): With the new Pixel Streaming feature, users can author interactive experiences such as product configurations or training applications, host them on a cloud-based GPU or local server, and stream them to remote devices via web browser without the need for additional software or porting.

Mobile Optimizations: The mobile development process gets even better thanks to all of the mobile optimizations that were developed for Fortnite‘s initial release on Android, in addition to all of the iOS improvements from Epic’s ongoing updates. With the help of Samsung, Unreal Engine 4.21 includes all of the Vulkan engineering and optimization work that was done to help ship Fortnite on the Samsung Galaxy Note 9 and is 100% feature compatible with OpenGL ES 3.1.

Much Faster Cook Times: In addition to the optimized cooking process, low-level code avoids performing unnecessary file system operations, and cooker timers have been streamlined.

Gauntlet Automation Framework (Early access): The new Gauntlet automation framework enables developers to automate the process of deploying builds to devices, running one or more clients and/or servers, and processing the results. Gauntlet scripts can automatically profile points of interest, validate gameplay logic, check return values from backend APIs and more. Gauntlet has been battle tested for months in the process of optimizing Fortnite, and is a key part of ensuring it runs smoothly on all platforms.

Animation System Optimizations and Improvements: Unreal Engine’s animation system continues to build on best-in-class features thanks to new workflow improvements, better surfacing of information, new tools, and more.

Blackmagic Video Card Support: Unreal Engine 4.21 also adds support for Blackmagic video I/O cards for those working in film and broadcast. Creatives in the space can now choose between Blackmagic and AJA Video Systems, the two most popular options for professional video I/O.

Improved Media I/O: Unreal Engine 4.21 now supports 10-bit video I/O, audio I/O, 4K, and Ultra HD output over SDI, as well as legacy interlaced and PsF HD formats, enabling greater color accuracy and integration of some legacy formats still in use by large broadcasters.

Windows Mixed Reality: Unreal Engine 4.21 natively supports the Windows Mixed Reality (WMR) platform and headsets, such as the HP Mixed Reality headset and the Samsung HMD Odyssey headset.

Magic Leap Improvements: Unreal Engine 4.21 supports all the features needed to develop complete applications on Magic Leap’s Lumin-based devices — rendering, controller support, gesture recognition, audio input/output, media, and more.

Oculus Avatars: The Oculus Avatar SDK includes an Unreal package to assist developers in implementing first-person hand presence for the Rift and Touch controllers. The package includes avatar hand and body assets that are viewable by other users in social applications.

Datasmith for Revit (Unreal Studio): Unreal Studio’s Datasmith workflow toolkit for streamlining the transfer of CAD data into Unreal Engine now includes support for Autodesk Revit. Supported elements include materials, metadata, hierarchy, geometric instancing, lights and cameras.

Multi-User Viewer Project Template (Unreal Studio): A new project template for Unreal Studio 4.21 enables multiple users to connect in a realtime environment via desktop or VR, facilitating interactive, collaborative design reviews across any work site.

Accelerated Automation with Jacketing and Defeaturing (Unreal Studio): Jacketing automatically identifies meshes and polygons that have a high probability of being hidden from view, and lets users hide, remove or move them to another layer; this command is also available through Python so Unreal Studio users can integrate this step into automated preparation workflows. Defeaturing automatically removes unnecessary detail (e.g. blind holes, protrusions) from mechanical models to reduce polygon count and boost performance.

Enhanced 3ds Max Material Translation (Unreal Studio): There is now support for most commonly used 3ds Max maps, improving visual fidelity and reducing rework. Those in the free Unreal Studio beta can now translate 3ds Max material graphs to Unreal graphs when exporting, making materials easier to understand and work with. Users can also leverage improvements in BRDF matching from V-Ray materials, especially metal and glass.

DWG and Alias Wire Import (Unreal Studio): Datasmith now supports DWG and Alias Wire file types, enabling designers to import more 3D data directly from Autodesk AutoCAD and Autodesk Alias.


Our SIGGRAPH 2018 video coverage

SIGGRAPH is always a great place to wander around and learn about new and future technology. You can see amazing visual effects reels and learn how the work was created by the artists themselves. You can get demos of new products, and you can immerse yourself in a completely digital environment. In short, SIGGRAPH is educational and fun.

If you weren’t able to make it this year, or attended but couldn’t see it all, we would like to invite you to watch our video coverage from the show.

SIGGRAPH 2018


2nd-gen AMD Ryzen Threadripper processors

At the SIGGRAPH show, AMD announced the availability of its 2nd-generation AMD Ryzen Threadripper 2990WX processor with 32 cores and 64 threads. These new AMD Ryzen Threadripper processors are built using 12nm “Zen+” x86 processor architecture. Second-gen AMD Ryzen Threadripper processors support the most I/O and are compatible with existing AMD X399 chipset motherboards via a simple BIOS update, offering builders a broad choice for designing the ultimate high-end desktop or workstation PC.

The 32-core/64-thread Ryzen Threadripper 2990WX and the 24-core/48-thread Ryzen Threadripper 2970WX are purpose-built for prosumers who crave raw computational power to dispatch the heaviest workloads. The 2nd-gen AMD Ryzen Threadripper 2990WX offers up to 53 percent faster multithreaded performance and up to 47 percent more rendering performance for creators than the Intel Core i9-7980XE.

This new AMD Ryzen Threadripper X series comes with higher base and boost clocks for users who need high performance. The 16 cores and 32 threads in the 2950X model offer up to 41 percent more multithreaded performance than the Core i9-7900X.

Additional performance and value come from:
• AMD StoreMI technology: All X399 platform users will now have free access to AMD StoreMI technology, enabling configured PCs to load files, games and applications from a high-capacity hard drive at SSD-like read speeds.
• Ryzen Master Utility: Like all AMD Ryzen processors, the 2nd-generation AMD Ryzen Threadripper CPUs are fully unlocked. With the updated AMD Ryzen Master Utility, AMD has added new features, such as fast core detection both on die and per CCX; advanced hardware controls; and simple, one-click workload optimizations.
• Precision Boost Overdrive (PBO): A new performance-enhancing feature that allows multithreaded boost limits to be raised by tapping into extra power delivery headroom in premium motherboards.

With a simple BIOS update, all 2nd-generation AMD Ryzen Threadripper CPUs are supported by a full ecosystem of new motherboards and all existing X399 platforms. Designs are available from top motherboard manufacturers, including ASRock, ASUS, Gigabyte and MSI.

The 32-core, 64-thread AMD Ryzen Threadripper 2990WX is available now from global retailers and system integrators. The 16-core, 32-thread AMD Ryzen Threadripper 2950X processor is expected to launch on August 31, and the AMD Ryzen Threadripper 2970WX and 2920X models are slated for launch in October.


Epic Games launches Unreal Engine 4.20

Epic Games has introduced Unreal Engine 4.20, which allows developers to build even more realistic characters and immersive environments across games, film and TV, VR/AR/MR and enterprise applications. The Unreal Engine 4.20 release combines the latest realtime rendering advancements with improved creative tools, making it even easier to ship games across all platforms. With hundreds of optimizations, especially for iOS, Android and Nintendo Switch — which have been built for Fortnite and are now rolled into Unreal Engine 4.20 and released to all users — Epic is providing developers with the scalable tools they need for these types of projects.

Artists working in visual effects, animation, broadcast and virtual production will find enhancements for digital humans, VFX and cinematic depth of field, allowing them to create realistic images across all forms of media and entertainment. In the enterprise space, Unreal Studio 4.20 includes upgrades to the UE4 Datasmith plugin suite, such as SketchUp support, which make it easier to get CAD data prepped, imported and working in Unreal Engine.

Here are some key features of Unreal Engine 4.20:

A new proxy LOD system: Users can handle sprawling worlds via UE4’s production-ready Proxy LOD system for the easy reduction of rendering cost due to poly count, draw calls and material complexity. Proxy LOD offers big gains when developing for mobile and console platforms.

A smoother mobile experience: Over 100 mobile optimizations developed for Fortnite come to all 4.20 users, marking a major shift for easy “shippability” and seamless gameplay optimization across platforms. Major enhancements include improved Android debugging, mobile Landscape improvements, RHI thread on Android and occlusion queries on mobile.

Works better with Switch: Epic has improved Nintendo Switch development by releasing tons of performance and memory improvements built for Fortnite on Nintendo Switch to 4.20 users as well.

Niagara VFX (early access): Unreal Engine’s new programmable VFX editor, Niagara, is now available in early access and will help developers take their VFX to the next level. This new suite of tools is built from the ground up to give artists unprecedented control over particle simulation, rendering and performance for more sophisticated visuals. This tool will eventually replace the Unreal Cascade particle editor.

Cinematic depth of field: Unreal Engine 4.20 delivers tools for achieving depth of field at true cinematic quality in any scene. This brand-new implementation replaces the Circle DOF method. It’s faster, cleaner and provides a cinematic appearance through the use of a procedural bokeh simulation. Cinematic DOF also supports alpha channel and dynamic resolution stability, and has multiple settings for scaling up or down on console platforms based on project requirements. This feature debuted at GDC this year as part of the Star Wars “Reflections” demo by Epic, ILMxLAB and Nvidia.

Digital human improvements: In-engine tools now include dual-lobe specular/double Beckman specular models, backscatter transmission in lights, boundary bleed color subsurface scattering, iris normal slot for eyes and screen space irradiance to build the most cutting-edge digital humans in games and beyond.

Live record and replay: All developers now have access to code from Epic’s Fortnite Replay system. Content creators can easily use footage of recorded gameplay sessions to create incredible replay videos.

Sequencer cinematic updates: New features include frame accuracy, media tracking, curve editor/evaluation and Final Cut Pro 7 XML import/export.

Shotgun integration: Shotgun, a production management and asset tracking solution, is now supported. This will streamline workflows for Shotgun users in game development who are leveraging Unreal’s realtime performance. Shotgun users can assign tasks to specific assets within Unreal Engine.

Mixed reality capture support (early access): Users with virtual production workflows will now have mixed reality capture support that includes video input, calibration and in-game compositing. Supported webcams and HDMI capture devices allow users to pull real-world greenscreened video into the engine, and supported tracking devices can match your camera location to the in-game camera for more dynamic shots.

AR support: Unreal Engine 4.20 ships with native support for ARKit 2, which includes features for creating shared, collaborative AR experiences. Also included is the latest support for Magic Leap One and Google ARCore 1.2.

Metadata control: Import metadata from 3ds Max, SketchUp and other common CAD tools to batch-process objects by property or expose metadata via scripts. Metadata enables more creative uses of Unreal Studio, such as Python script commands for updating all meshes of a certain type, or displaying relevant information in interactive experiences.

Mesh editing tools: Unreal Engine now includes a basic mesh editing toolset for quick, simple fixes to imported geometry without having to fix them in the source package and re-import. These tools are ideal for simple touch-ups without having to go to another application. Datasmith also now includes a base Python script that can generate Level of Detail (LOD) meshes automatically.
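Datasmith’s actual script isn’t shown here, but automatic LOD generation is usually some form of mesh decimation. A hypothetical sketch of the simplest variant, vertex clustering (snap vertices to a coarse grid and merge those that share a cell):

```python
import numpy as np

def decimate(vertices, triangles, cell_size):
    """Vertex-clustering decimation: a generic LOD technique,
    not Datasmith's actual script."""
    # Assign each vertex to a grid cell.
    cells = np.floor(vertices / cell_size).astype(np.int64)
    # Merge vertices that share a cell; the merged position is the
    # average of the originals.
    _, inverse, counts = np.unique(
        cells, axis=0, return_inverse=True, return_counts=True)
    inverse = inverse.reshape(-1)  # keep 1-D across NumPy versions
    merged = np.zeros((counts.size, 3))
    np.add.at(merged, inverse, vertices)
    merged /= counts[:, None]
    # Remap triangle indices and drop degenerate triangles
    # (those with two or more corners merged into one vertex).
    t = inverse[triangles]
    keep = (t[:, 0] != t[:, 1]) & (t[:, 1] != t[:, 2]) & (t[:, 0] != t[:, 2])
    return merged, t[keep]

# A unit square (two triangles) collapses entirely on a coarse grid
# and survives intact on a fine one.
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], dtype=float)
tris = np.array([[0, 1, 2], [1, 3, 2]])
v_lo, t_lo = decimate(verts, tris, cell_size=10.0)  # 1 vertex, 0 triangles
v_hi, t_hi = decimate(verts, tris, cell_size=0.5)   # 4 vertices, 2 triangles
```

Production tools typically use higher-quality edge-collapse decimation (e.g. quadric error metrics), but the clustering version shows the shape of the problem: fewer vertices, remapped indices, degenerate faces dropped.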

Non-destructive re-import: Achieve faster iteration through the new parameter tracking system, which monitors updates in both the source data and Unreal Editor, and only imports changed elements. Previous changes to the scene within Unreal Editor are retained and reapplied when source data updates.


V-Ray GPU is Chaos Group’s new GPU rendering architecture

Chaos Group has redesigned its V-Ray RT product. The new V-Ray GPU rendering architecture, according to the company, effectively doubles the speed of production rendering for film, broadcast and design artists. This represents a redesign of V-Ray’s kernel structure, ensuring both high speed and accuracy.

Chaos Group has renamed V-Ray RT to V-Ray GPU, wanting to establish the latter as a professional production renderer capable of supporting volumetrics, advanced shading and other smart tech coming down the road.

Current internal tests have V-Ray GPU running 80 percent faster on Nvidia’s Titan V, a big gain from previous benchmarks on the Titan Xp, and up to 10-15x faster than an Intel Core i7-7700K, with the same high level of accuracy across interactive and production renders. (For its testing, Chaos Group uses a battery of production scenes to benchmark each release.)

“V-Ray GPU might be the biggest speed leap we’ve ever made,” says Blagovest Taskov, V-Ray GPU lead developer at Chaos Group. “Redesigning V-Ray GPU to be modular makes it much easier for us to exploit the latest GPU architectures and to add functionality without impacting performance. With our expanded feature set, V-Ray GPU can be used in many more production scenarios, from big-budget films to data-heavy architecture projects, while providing more speed than ever before.”

Representing over two years of dedicated R&D, V-Ray GPU builds on nine years of GPU-driven development in V-Ray. New gains for production artists include:

• Volume Rendering – Fog, smoke and fire can be rendered with the speed of V-Ray GPU. It’s compatible with V-Ray Volume Grid, which supports OpenVDB, Field3D and Phoenix FD volume caches.
• Adaptive Dome Light – Cleaner image-based lighting is now faster and even more accurate.
• V-Ray Denoising – Offering GPU-accelerated denoising across render elements and animations.
• Nvidia AI Denoiser – Fast, real-time denoising based on Nvidia OptiX AI-accelerated denoising technology.
• Interface Support – Instant filtering of GPU-supported features lets artists know what’s available in V-Ray GPU (starting within 3ds Max).

V-Ray GPU will be made available as part of the next update of V-Ray Next for 3ds Max beta.

VES names award nominees

The Visual Effects Society (VES) has announced the nominees for its 16th Annual VES Awards, which recognize visual effects artistry and innovation in film, animation, television, commercials and video games and the VFX supervisors, VFX producers and hands-on artists who bring this work to life.

Blade Runner 2049 and War for the Planet of the Apes have tied for the most feature film nominations with seven each. Despicable Me 3 is the top animated film contender with five nominations, and Game of Thrones leads the broadcast field and scores the most nominations overall with 11.

Nominees in 24 categories were selected by VES members via events hosted by 10 of its sections, including Australia, the Bay Area, London, Los Angeles, Montreal, New York, New Zealand, Toronto, Vancouver and Washington. The VES Awards will be held on February 13 at the Beverly Hilton Hotel. The VES Georges Méliès Award will be presented to Academy Award-winning visual effects supervisor Joe Letteri, VES. The VES Lifetime Achievement Award will be presented to producer/writer/director Jon Favreau. Comedian Patton Oswalt will once again host.

Here are the nominees:

Outstanding Visual Effects in a Photoreal Feature

 

Blade Runner 2049

John Nelson

Karen Murphy Mundell

Paul Lambert

Richard Hoover

Gerd Nefzer

 

Guardians of the Galaxy Vol. 2

Christopher Townsend

Damien Carr

Guy Williams

Jonathan Fawkner

Dan Sudick

Kong: Skull Island

Jeff White

Tom Peitzman

Stephen Rosenbaum

Scott Benza

Michael Meinardus

 

Star Wars: The Last Jedi

Ben Morris

Tim Keene

Eddie Pasquarello

Daniel Seddon

Chris Corbould

 

War for the Planet of the Apes

Joe Letteri

Ryan Stafford

Daniel Barrett

Dan Lemmon

Joel Whist

 

Outstanding Supporting Visual Effects in a Photoreal Feature

 

Darkest Hour

Stéphane Nazé

Warwick Hewitt

Guillaume Terrien

Benjamin Magana

Downsizing

James E. Price

Susan MacLeod

Lindy De Quattro

Stéphane Nazé

 

Dunkirk

Andrew Jackson

Mike Chambers

Andrew Lockley

Alison Wortman

Scott Fisher

 

Mother!

Dan Schrecker

Colleen Bachman

Ben Snow

Wayne Billheimer

Peter Chesney

 

Only the Brave

Eric Barba

Dione Wood

Matthew Lane

Georg Kaltenbrunner

Michael Meinardus

 

Outstanding Visual Effects in an Animated Feature

 

Captain Underpants

David Soren

Mark Swift

Mireille Soria

David Dulac

 

Cars 3

Brian Fee

Kevin Reher

Michael Fong

Jon Reisch

Coco

Lee Unkrich

Darla K. Anderson

David Ryu

Michael K. O’Brien

 

Despicable Me 3

Pierre Coffin

Chris Meledandri

Kyle Balda

Eric Guillon

 

The Lego Batman Movie

Rob Coleman

Amber Naismith

Grant Freckelton

Damien Gray

The Lego Ninjago Movie

Gregory Jowle

Fiona Chilton

Miles Green

Kim Taylor

 

Outstanding Visual Effects in a Photoreal Episode

 

Agents of S.H.I.E.L.D.: Orientation Part 1

Mark Kolpack

Sabrina Arnold

David Rey

Kevin Yuille

Gary D’Amico

 

Game of Thrones: Beyond the Wall

Joe Bauer

Steve Kullback

Chris Baird

David Ramos

Sam Conway

 

Legion: Chapter 1

John Ross

Eddie Bonin

Sebastien Bergeron

Lionel Lim

Paul Benjamin

 

Star Trek: Discovery: The Vulcan Hello

Jason Michael Zimmerman

Aleksandra Kochoska

Ante Dekovic

Mahmoud Rahnama

 

Stranger Things 2: The Gate

Paul Graff

Christina Graff

Seth Hill

Joel Sevilla

Caius the Man

 

Outstanding Supporting Visual Effects in a Photoreal Episode

 

Black Sails: XXIX

Erik Henry

Terron Pratt

Yafei Wu

David Wahlberg

Paul Dimmer

 

Fear the Walking Dead: Sleigh Ride

Peter Crosman

Denise Gayle

Philip Nussbaumer

Martin Pelletier

Frank Ludica

 

Mr. Robot: eps3.4_runtime-err0r.r00

Ariel Altman

Lauren Montuori

John Miller

Luciano DiGeronimo

 

Outlander: Eye of the Storm

Richard Briscoe

Elicia Bessette

Aladino Debert

Filip Orrby

Doug Hardy

 

Taboo: Pilot

Henry Badgett

Tracy McCreary

Nic Birmingham

Simon Rowe

Colin Gorry

 

Vikings: On the Eve

Dominic Remane

Mike Borrett

Ovidiu Cinazan

Paul Wishart

Paul Byrne

 

Outstanding Visual Effects in a Real-Time Project

 

Assassin’s Creed Origins

Raphael Lacoste

Patrick Limoges

Jean-Sebastien Guay

Ulrich Haar

 

Call of Duty: WWII

Joe Salud

Atsushi Seo

Danny Chan

Jeremy Kendall

 

Fortnite: A Hard Day’s Night

Michael Clausen

Gavin Moran

Brian Brecht

Andrew Harris

 

Sonaria

Scot Stafford

Camille Cellucci

Kevin Dart

Theresa Latzko

 

Uncharted: The Lost Legacy

Shaun Escayg

Tate Mosesian

Eben Cook

 

Outstanding Visual Effects in a Commercial

 

Beyond Good and Evil 2

Leon Berelle

Maxime Luère

Dominique Boidin

Remi Kozyra

 

Kia Niro: Hero’s Journey

Robert Sethi

Anastasia von Rahl

Tom Graham

Chris Knight

Dave Peterson

 

Mercedes Benz: King of the Jungle

Simon French

Josh King

Alexia Paterson

Leonardo Costa

 

Monster: Opportunity Roars

Ruben Vandebroek

Clairellen Wallin

Kevin Ives

Kyle Cody

 

Samsung: Do What You Can’t, Ostrich

Diarmid Harrison-Murray

Tomek Zietkiewicz

Amir Bazazi

Martino Madeddu

 

Outstanding Visual Effects in a Special Venue Project

 

Avatar: Flight of Passage

Richard Baneham

Amy Jupiter

David Lester

Thrain Shadbolt

 

Corona: Paraiso Secreto

Adam Grint

Jarrad Vladich

Roberto Costas Fernández

Ed Thomas

Felipe Linares

 

Guardians of the Galaxy: Mission: Breakout!

Jason Bayever

Amy Jupiter

Mike Bain

Alexander Thomas

 

National Geographic Encounter: Ocean Odyssey

Thilo Ewers

John Owens

Gioele Cresce

Mariusz Wesierski

 

Nemo and Friends SeaRider

Anthony Apodaca

Kathy Janus

Brandon Benepe

Nick Lucas

Rick Rothschild

 

Star Wars: Secrets of the Empire

Ben Snow

Judah Graham

Ian Bowie

Curtis Hickman

David Layne

 

Outstanding Animated Character in a Photoreal Feature

 

Blade Runner 2049: Rachael

Axel Akkeson

Stefano Carta

Wesley Chandler

Ian Cooke-Grimes

Kong: Skull Island: Kong

Jakub Pistecky

Chris Havreberg

Karin Cooper

Kris Costa

 

War for the Planet of the Apes: Bad Ape

Eteuati Tema

Aidan Martin

Florian Fernandez

Mathias Larserud

War for the Planet of the Apes: Caesar

Dennis Yoo

Ludovic Chailloleau

Douglas McHale

Tim Forbes

 

Outstanding Animated Character in an Animated Feature

 

Coco: Héctor

Emron Grover

Jonathan Hoffman

Michael Honsel

Guilherme Sauerbronn Jacinto

 

Despicable Me 3: Bratt

Eric Guillon

Bruno Dequier

Julien Soret

Benjamin Fournet

 

The Lego Ninjago Movie: Garma Mecha Man

Arthur Terzis

Wei He

Jean-Marc Ariu

Gibson Radsavanh

 

The Boss Baby: Boss Baby

Alec Baldwin

Carlos Puertolas

Rani Naamani

Joe Moshier

 

The Lego Ninjago Movie: Garmadon

Matthew Everitt

Christian So

Loic Miermont

Fiona Darwin

 

Outstanding Animated Character in an Episode or Real-Time Project

 

Black Mirror: Metalhead

Steven Godfrey

Stafford Lawrence

Andrew Robertson

Iestyn Roberts

 

Game of Thrones: Beyond the Wall – Zombie Polar Bear

Paul Story

Todd Labonte

Matthew Muntean

Nicholas Wilson

 

Game of Thrones: Eastwatch – Drogon Meets Jon

Jonathan Symmonds

Thomas Kutschera

Philipp Winterstein

Andreas Krieg

 

Game of Thrones: The Spoils of War – Drogon Loot Train Attack

Murray Stevenson

Jason Snyman

Jenn Taylor

Florian Friedmann

 

Outstanding Animated Character in a Commercial

 

Beyond Good and Evil 2: Zhou Yuzhu

Dominique Boidin

Maxime Luère

Leon Berelle

Remi Kozyra

 

Mercedes Benz: King of the Jungle

Steve Townrow

Joseph Kane

Greg Martin

Gabriela Ruch Salmeron

 

Netto: The Easter Surprise – Bunny

Alberto Lara

Jorge Montiel

Antoine Mariez

Jon Wood

 

Samsung: Do What You Can’t – Ostrich

David Bryan

Maximilian Mallmann

Tim Van Hussen

Brendan Fagan

 

Outstanding Created Environment in a Photoreal Feature

 

Blade Runner 2049: Los Angeles

Chris McLaughlin

Rhys Salcombe

Seungjin Woo

Francesco Dell’Anna

 

Blade Runner 2049: Trash Mesa

Didier Muanza

Thomas Gillet

Guillaume Mainville

Sylvain Lorgeau

Blade Runner 2049: Vegas

Eric Noel

Arnaud Saibron

Adam Goldstein

Pascal Clement

 

War for the Planet of the Apes: Hidden Fortress

Greg Notzelman

James Shaw

Jay Renner

Gak Gyu Choi

 

War for the Planet of the Apes: Prison Camp

Phillip Leonhardt

Paul Harris

Jeremy Fort

Thomas Lo

 

Outstanding Created Environment in an Animated Feature

 

Cars 3: Abandoned Racetrack

Marlena Fecho

Thidaratana Annee Jonjai

Jose L. Ramos Serrano

Frank Tai

 

Coco: City of the Dead

Michael Frederickson

Jamie Hecker

Jonathan Pytko

Dave Strick

 

Despicable Me 3: Hollywood Destruction

Axelle De Cooman

Pierre Lopes

Milo Riccarand

Nicolas Brack

 

The Lego Ninjago Movie: Ninjago City

Kim Taylor

Angela Ensele

Felicity Coonan

Jean Pascal leBlanc

 

Outstanding Created Environment in an Episode, Commercial or Real-Time Project

 

Assassin’s Creed Origins: City of Memphis

Patrick Limoges

Jean-Sebastien Guay

Mikael Guaveia

Vincent Lombardo

 

Game of Thrones: Beyond the Wall – Frozen Lake

Daniel Villalba

Antonio Lado

José Luis Barreiro

Isaac de la Pompa

 

Game of Thrones: Eastwatch

Patrice Poissant

Deak Ferrand

Dominic Daigle

Gabriel Morin

 

Still Star-Crossed: City

Rafael Solórzano

Isaac de la Pompa

José Luis Barreiro

Óscar Perea

 

Stranger Things 2: The Gate

Saul Galbiati

Michael Maher

Seth Cobb

Kate McFadden

 

Outstanding Virtual Cinematography in a Photoreal Project

 

Beauty and the Beast: Be Our Guest

Shannon Justison

Casey Schatz

Neil Weatherley

Claire Michaud

 

Guardians of the Galaxy Vol. 2: Groot Dance/Opening Fight

James Baker

Steven Lo

Alvise Avati

Robert Stipp

 

Star Wars: The Last Jedi – Crait Surface Battle

Cameron Nielsen

Albert Cheng

John Levin

Johanes Kurnia

 

Thor: Ragnarok – Valkyrie’s Flashback

Hubert Maston

Arthur Moody

Adam Paschke

Casey Schatz

 

Outstanding Model in a Photoreal or Animated Project

 

Blade Runner 2049: LAPD Headquarters

Alex Funke

Steven Saunders

Joaquin Loyzaga

Chris Menges

 

Despicable Me 3: Dru’s Car

Eric Guillon

François-Xavier Lepeintre

Guillaume Boudeville

Pierre Lopes

 

Life: The ISS

Tom Edwards

Chaitanya Kshirsagar

Satish Kuttan

Paresh Dodia

 

US Marines: Anthem – Monument

Tom Bardwell

Paul Liaw

Adam Dewhirst

 

Outstanding Effects Simulations in a Photoreal Feature

 

Kong: Skull Island

Florent Andorra

Alexis Hall

Raul Essig

Branko Grujcic

 

Only the Brave: Fire & Smoke

Georg Kaltenbrunner

Thomas Bevan

Philipp Zaufel

Himanshu Joshi

 

Star Wars: The Last Jedi – Bombing Run

Peter Kyme

Miguel Perez Senent

Ahmed Gharraph

Billy Copley

Star Wars: The Last Jedi – Mega Destroyer Destruction

Mihai Cioroba

Ryoji Fujita

Jiyong Shin

Dan Finnegan

 

War for the Planet of the Apes

David Caeiro Cebrián

Johnathan Nixon

Chet Leavai

Gary Boyle

 

Outstanding Effects Simulations in an Animated Feature

 

Cars 3

Greg Gladstone

Stephen Marshall

Leon JeongWook Park

Tim Speltz

 

Coco

Kristopher Campbell

Stephen Gustafson

Dave Hale

Keith Klohn

 

Despicable Me 3

Bruno Chauffard

Frank Baradat

Milo Riccarand

Nicolas Brack

Ferdinand

Yaron Canetti

Allan Kadkoy

Danny Speck

Mark Adams

 

The Boss Baby

Mitul Patel

Gaurav Mathur

Venkatesh Kongathi

 

Outstanding Effects Simulations in an Episode, Commercial or Real-Time Project

 

Game of Thrones: Beyond the Wall – Frozen Lake

Manuel Ramírez

Óscar Márquez

Pablo Hernández

David Gacituaga

 

Game of Thrones: The Dragon and the Wolf – Wall Destruction

Thomas Hullin

Dominik Kirouac

Sylvain Nouveau

Nathan Arbuckle

 

Heineken: The Trailblazers

Christian Bohm

Andreu Lucio Archs

Carsten Keller

Steve Oakley

 

Outlander: Eye of the Storm – Stormy Seas

Jason Mortimer

Navin Pinto

Greg Teegarden

Steve Ong

 

Outstanding Compositing in a Photoreal Feature

 

Blade Runner 2049: LAPD Approach and Joy Holograms

Tristan Myles

Miles Lauridsen

Joel Delle-Vergin

Farhad Mohasseb

 

Kong: Skull Island

Nelson Sepulveda

Aaron Brown

Paolo Acri

Shawn Mason

 

Thor: Ragnarok: Bridge Battle

Gavin McKenzie

David Simpson

Owen Carroll

Mark Gostlow

 

War for the Planet of the Apes

Christoph Salzmann

Robin Hollander

Ben Morgan

Ben Warner

 

Outstanding Compositing in a Photoreal Episode

 

Game of Thrones: Beyond the Wall – Frozen Lake

Óscar Perea

Santiago Martos

David Esteve

Michael Crane

 

Game of Thrones: Eastwatch

Thomas Montminy Brodeur

Xavier Fourmond

Reuben Barkataki

Sébastien Raets

 

Game of Thrones: The Spoils of War – Loot Train Attack

Dom Hellier

Thijs Noij

Edwin Holdsworth

Giacomo Matteucci

 

Star Trek: Discovery

Phil Prates

Rex Alerta

John Dinh

Karen Cheng

 

Outstanding Compositing in a Photoreal Commercial

 

Destiny 2: New Legends Will Rise

Alex Unruh

Michael Ralla

Helgi Laxdal

Timothy Gutierrez

 

Nespresso: Comin’ Home

Matt Pascuzzi

Steve Drew

Martin Lazaro

Karch Koon

 

Samsung: Do What You Can’t – Ostrich

Michael Gregory

Andrew Roberts

Gustavo Bellon

Rashabh Ramesh Butani

 

Virgin Media: Delivering Awesome

Jonathan Westley

John Thornton

Milo Paterson

George Cressey

 

Outstanding Visual Effects in a Student Project

 

Creature Pinup

Christian Leitner

Juliane Walther

Kiril Mirkov

Lisa Ecker

 

Hybrids

Florian Brauch

Romain Thirion

Matthieu Pujol

Kim Tailhades

 

Les Pionniers de l’Univers

Clementine Courbin

Matthieu Guevel

Jérôme Van Beneden

Anthony Rege

 

The Endless

Nicolas Lourme

Corentin Gravend

Edouard Calemard

Romaric Vivier

 


Behind the Title: Sounding Sweet audio producer/MD Ed Walker

NAME: Ed Walker

COMPANY: Sounding Sweet (@sounding_sweet)

CAN YOU DESCRIBE YOUR STUDIO?
We are a UK-based independent recording and audio production company with a recording studio in Stratford-upon-Avon, Warwickshire, and separate postproduction facilities in Leamington Spa. Our recording studio is equipped with the latest technology, including a 7.1 surround sound dubbing suite and two purpose-built voiceover booths, which double as Foley studios and music recording spaces when necessary. We are also fully equipped to record ADR, via Source Connect and ISDN.

WHAT’S YOUR JOB TITLE?
Audio producer, sound engineer and managing director — take your pick.

WHAT DOES THAT ENTAIL?
As we are a small business, I am very hands-on, and my responsibilities change on a daily basis. They may include pitching to new clients, liaising with existing clients, overseeing projects from start to finish and ensuring our audio deliveries as a team are over and above what the client is expecting.

WHAT WOULD SURPRISE PEOPLE THE MOST ABOUT WHAT FALLS UNDER THAT TITLE?
Creating and implementing interactive sound into video games is a technical challenge. While I don’t write code myself, as part of working in this industry, I have had to develop a technical understanding of game development and software programming in order to communicate effectively and achieve my audio vision.

WHAT’S YOUR FAVORITE PART OF THE JOB?
I often get the opportunity to go out and record supercars and motorbikes, as well as occasionally recording celebrity voiceovers in the studio. We work with clients both locally and globally, often working across different time zones. We are definitely not a 9-to-5 business.

WHAT’S YOUR LEAST FAVORITE?
Working through the night during crunch periods is hard. However, we understand that the main audio effort is usually applied toward the end of a project, so we are kind of used to it.

WHAT’S YOUR FAVORITE TIME OF THE DAY?
I would have to say first thing in the morning. My studio is so close to home that I get to see my family before I go to work.

IF YOU DID NOT HAVE THIS JOB WHAT WOULD YOU BE DOING INSTEAD?
If I wasn’t producing audio I would have to be doing something equally creative. I need an outlet for my thoughts and emotions, perhaps video editing or creating visual effects.

HOW EARLY ON DID YOU KNOW THIS WOULD BE YOUR PATH?
I have always loved music, as both my parents are classically trained musicians. After trying to learn lots of different instruments, I realized that I had more of an affinity with sound recording. I studied “Popular Music and Recording” at university. Later on, I realized that a lot of the music recording skills I had learned were transferable to creating sound effects for computer games.

CAN YOU NAME SOME RECENT PROJECTS YOU HAVE WORKED ON?
– BMW Series 7 launch in Bahrain — sound design
– Jaguar F-Pace launch in Bahrain — sound design
– Forza Horizon 3 for Microsoft/Playground Games — audio design
– Guitar Hero Live for Activision — audio design

Forza Horizon 3 Lamborghini

WHAT IS THE PROJECT THAT YOU ARE MOST PROUD OF?
I worked as a sound designer at Codemasters for several years, and I have very fond memories of working on Dirt 2. It sounded awesome back in 2009 in surround sound on the Xbox 360! More recently, Sounding Sweet’s work for Playground Games on Forza Horizon 3 was a lot of fun, and I am very proud of what we achieved.

NAME THREE PIECES OF TECHNOLOGY YOU CAN’T LIVE WITHOUT?
A portable sound recorder, an iPhone and a kettle.

WHAT SOCIAL MEDIA CHANNELS DO YOU FOLLOW?
Facebook, LinkedIn and Twitter

DO YOU LISTEN TO MUSIC WHILE YOU WORK?
All kinds of music — classics, reggae, rock, electronic, the Stones, Led Zeppelin… the list is truly endless.

WHAT DO YOU DO TO DE-STRESS FROM IT ALL?
My wife is half Italian, so we often visit her “homeland” to see the family. This really is the time when I get to switch off.