This was a fairly quick, spare-time exploration of Ultra Dynamic Sky / Weather in Unreal Engine 5.2. It’s a relatively simple scene using assets from Quixel, Unreal and Twinmotion. With UDS the first requirement is to remove all standard lighting and atmosphere components from the Unreal scene – sun/sky, height fog etc. – so the plugin’s components can work as one integrated system. UDS has hundreds of exposed variables and a large number of presets, which makes set-up extremely fast. Unsurprisingly this scene starts with the ‘rain’ preset.
From the point of view of rendering, this uses Lumen, not hardware ray-tracing. Having compared the two a while back I was impressed by how Lumen treats translucency in particular, which seemed appropriate for both foliage and, to some extent, surface water. In addition, it’s extremely fast. Lumen is the default in UE 5.2 scenes.
As well as particles, UDS supports material effects. Enabling these requires a small amount of Blueprint editing, but it’s not especially time-consuming or difficult. All that’s needed before the output node of the material Blueprint is (in this instance) a wet-weather element, which processes the base colour, roughness and normal values. Ripples and streaks can also be controlled here.
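UDS’s wet-weather processing lives in a Blueprint, but the general idea can be sketched in a few lines. This is my own illustrative approximation, not the plugin’s actual implementation: wet surfaces darken slightly and become much smoother, and the blend constants (0.3 and 0.8) are arbitrary assumptions.

```python
def apply_wetness(base_color, roughness, wetness):
    """Blend a material toward a 'wet' look. Wet surfaces darken
    slightly and become much smoother (hence more reflective).
    All inputs are in [0, 1]; wetness = 0 leaves the material as-is."""
    # Darken the albedo a little as water saturates the surface
    wet_color = tuple(c * (1.0 - 0.3 * wetness) for c in base_color)
    # A water film dominates the micro-surface, so roughness drops sharply
    wet_roughness = roughness * (1.0 - 0.8 * wetness)
    return wet_color, wet_roughness

# Fully wet version of a rough, mid-brown material:
color, rough = apply_wetness((0.5, 0.4, 0.3), 0.9, 1.0)
```

In the real Blueprint the same blend is driven per-pixel by the weather state, with the normal map perturbed for ripples and streaks.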
The one issue I had with this scene was the trees. These are Quixel Megascans European Beech and I initially used the foliage painter to provide background density. This however resulted in extreme motion artefacts due to wind movement, which I assume is some conflict between motion vectors and temporal anti-aliasing. I couldn’t resolve this, so instead I hand-placed a couple of hundred trees – a process which only took around 20 minutes. Apart from this it was a very straightforward scene.
The scene runs in real-time on a Quadro RTX 5000, although I haven’t optimised it. For the cinematic I set TAA samples to 8 – nothing else. Render resolution was 2252 x 944 (2.39:1 at 2048 x 858 plus 10%, to give a wiggle in After Effects a bit of room to move around). I decided on camera shake in AE just because it’s easier to tweak when the shots are lined up. I also added film-grain (Kodak T320). Render time for the whole cinematic with sub-sampling in UE was around 25 minutes.
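The overscan arithmetic can be checked with a small helper. Rounding each dimension to the nearest even pixel count is my assumption (renderers and video codecs generally expect even dimensions), but it reproduces the numbers used here:

```python
def overscan(width, height, margin):
    """Pad a target resolution by a safety margin, rounding each
    dimension to the nearest even pixel count."""
    def even(x):
        return int(round(x / 2.0)) * 2
    return even(width * (1.0 + margin)), even(height * (1.0 + margin))

print(overscan(2048, 858, 0.10))  # → (2252, 944)
```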
I think the metaphor for this kind of test may have changed. Instead of storyboarding in advance, my approach would be more like wandering around a real-world environment with a hand-held camcorder. It’s very refreshing.
The first in an intelligent, multi-layered crime series featuring DI Eva Harris. A vicious killer on the loose, a traitor on the team – and a past that’s catching up with her. Can DI Harris see the truth before it’s too late?
An expert killer who takes his victims’ eyes – can DI Harris see clearly enough to uncover the truth?
On the first day of her new job, Harris is called to the scene of a brutal murder at the heart of Surrey society. A shocking crime, with a meticulous killer: the victim’s eyes have been removed and the body drained of blood, with no forensic evidence left at the scene.
Her boss insists it must be the return of a killer who escaped justice several years ago, leaving a trail of bloody ritualistic killings. Harris isn’t so sure: both sets of victims have had their eyes removed, but other details are significantly different.
But Harris’s desperate desire to uncover the truth is complicated by her secret mission to find corruption at the heart of the police. Who can she trust in her own team? Harris is also hiding a secret past of her own – can she find the killer and learn who to depend on, before her dark history catches up with her?
With the availability of Nvidia RTX graphics cards, ray-tracing in real-time applications – most especially Unity HDRP and Unreal Engine – has become a viable solution. The benefits are improved image quality and reduced production times, although these can come at the expense of performance unless carefully managed.
During lockdown and persisting into 2021 I’ve experienced a large (read: huge) increase in demand for video production using real-time engines, mostly Unity but also Twinmotion, and this looks as though it’s going to persist. For the sake of my sanity, this is an overview of a production pipeline using Unity HDRP to produce commercial videos for B2B applications.
Interior using Unity HDRP with path tracing. 5000 samples, c.7 mins per frame at 1920×1080.
I work for Kantar in the consulting division, my client is Unilever and I live in the dark recesses of the Customer Insight and Innovation Centre in Leatherhead, at least I did when there wasn’t a global pandemic on. The role originally involved creating real-time projects for research, including quantitative and biometric research, as well as commercial presentations. Using real-time to produce videos for wider distribution has become very popular though, and methodologies have had to adapt accordingly.
Kantar have developed a real-time tool called VR INFINITY, which combines a Unity front-end with a cloud-based CMS for content, and is designed to run on relatively low-specification laptops. The benefit of this is that category managers and marketing executives are able to produce and experiment with their own 3d scenes, making it an effective tool. In some situations though, there is a requirement for higher-quality visualisations, hence HDRP.
The aesthetic objective here is to match or beat the visual quality provided by teams at advertising agencies using offline renderers such as VRay and Corona. With caveats, this is broadly achievable.
Exterior with HDRI map, direct light, using screen space reflection, screen space global illumination and screen space ambient occlusion.
Back to basics
‘In 3d computer graphics, ray tracing is a technique for modelling light transport for use in a wide variety of rendering algorithms for generating digital images.’ There are a number of variants on the theme of ray-tracing, and as noted by Wikipedia they come with increasing computational cost. As a process though, ray-tracing is inherently parallelisable, which makes it viable on massively parallel RISC / ARM-style architectures such as graphics cards. (RISC: Reduced Instruction Set Computing – often the model for GPUs – vs. CISC, Complex Instruction Set Computing. With apologies for generalisations, think of this as the difference between multiplying 7×5 and adding 7+7+7+7+7. The addition will process faster on a RISC processor than the multiplication will on a CISC processor, such as an Intel CPU. Amongst other things, this is why Apple have recently moved to ARM-based architectures for computers, in addition to phones and tablets.)
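As a toy illustration of why the workload parallelises so well, here is the kernel at the heart of any ray-tracer: a ray–sphere intersection test. Every pixel’s ray runs this independently of every other pixel’s, which is exactly the shape of work GPUs are built for. (The function names and plain-tuple vector handling are my own sketch-level choices, not any engine’s API.)

```python
import math

def ray_sphere_hit(origin, direction, centre, radius):
    """Return the distance along a (normalised) ray to the nearest
    intersection with a sphere, or None if the ray misses.
    Solves |origin + t*direction - centre|^2 = radius^2 for t."""
    oc = tuple(o - c for o, c in zip(origin, centre))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so a = 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# Each pixel's ray is independent of its neighbours' rays, so
# millions of these tests can run at once on thousands of GPU threads.
t = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)  # → 4.0
```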
In nature, light is emitted by a light source and is either absorbed, reflected, refracted or fluoresces with the surfaces it meets. In doing so some characteristics of the light may change, for example its colour may appear different as a result of interactions. Think of a brightly coloured ball being placed by a neutral coloured wall, and note the colour of the reflection on the wall’s surface. Such interactions cause the visual richness and complexity of the world we see around us.
Inevitably, simulating these complex interactions is a compute-heavy task and requires multiple solutions. In addition to real-time ray-tracing in the form of ambient occlusion, reflections and global illumination, physically based rendering (PBR) techniques using realistic materials – enabled by microfacet BRDFs such as GGX – bring greater realism, and both are now supported by powerful hardware. It’s a fun time to be involved with computer graphics (but then, this has always been true).
A brief look at Physically Based Rendering
Without getting into too much detail, PBR shaders use microfacet BRDFs such as GGX. Bi-directional Reflectance Distribution Functions have been at the heart of computer graphics pretty much since day one, and have frequently gone by the names of their creators, such as Blinn, Lambert and Phong. In recent years normal maps have replaced bump maps as the default method for describing surface details that are too small or numerous to be effectively modelled by geometry, and have been an enabler for PBR.
‘Microfacet BRDF models are based on a geometrical description of the material surface as a collection of microfacets whose dimensions are much greater than the wavelength.’ (EPFL, Lionel Simonot, Université de Poitiers, France).
Arguably the key term in BRDF is ‘bi-directional,’ as it refers to the effective reversibility of a ray path, meaning the effect of a ray emanating from a light source can also be measured by a ray emanating from the camera / eye, which in many ways is a medieval concept. (Some monks thought ‘light’ was a property that emanated from the eye. Just shows how wrong you can be). Underlying this is conservation of energy, meaning that to be physically realistic no more energy can be output than is input.
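The GGX distribution mentioned above is compact enough to show directly. This is the standard Trowbridge–Reitz form with the common alpha = roughness² remapping (as used by Disney and Unreal), written here as a plain function rather than shader code:

```python
import math

def ggx_ndf(n_dot_h, roughness):
    """GGX / Trowbridge-Reitz normal distribution function, with the
    common alpha = roughness^2 remapping. Returns the relative density
    of microfacets whose normals align with the half-vector h."""
    a2 = roughness ** 4  # alpha^2, where alpha = roughness^2
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# A smoother surface concentrates its microfacets far more tightly
# around the macro surface normal than a rough one does:
peak_smooth = ggx_ndf(1.0, 0.1)
peak_rough = ggx_ndf(1.0, 0.9)
```

The sharp peak at low roughness is what produces tight, mirror-like highlights; at high roughness the same energy is spread across a much wider lobe.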
PBR shaders will have a number of different components depending on the specific workflow they’re designed for (such as metallic or specular). These may include albedo, metallicity, roughness/smoothness and surface normal, as well as height maps, ambient occlusion, detail maps etc. Like HDRI maps, it’s possible to produce these from scratch using photographic equipment, but doing so is generally not a trivial exercise. Fortunately tools such as Substance, Quixel Mixer and Materialize allow for the creation of PBR shaders in a more streamlined fashion. Modelling both metallic and dielectric surfaces has become more accurate and more straightforward.
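As a concrete example of the metallic workflow, the base reflectivity (F0) a shader feeds into its Fresnel term is conventionally a blend between a fixed dielectric value of roughly 4% and the albedo colour. The helper below is an illustrative sketch of that blend, not any particular engine’s implementation:

```python
def base_reflectivity(albedo, metallic):
    """Metallic workflow: dielectrics reflect ~4% of light head-on
    regardless of colour, while metals take their reflectance tint
    directly from the albedo map. 'metallic' blends between the two."""
    return tuple(0.04 * (1.0 - metallic) + a * metallic for a in albedo)

# A pure dielectric (plastic, wood) versus a pure metal (gold-ish albedo):
plastic_f0 = base_reflectivity((0.8, 0.2, 0.2), 0.0)  # → (0.04, 0.04, 0.04)
gold_f0 = base_reflectivity((1.0, 0.77, 0.34), 1.0)   # → (1.0, 0.77, 0.34)
```

This is why a metallic map can be a single greyscale channel: it simply selects how much of the albedo is treated as specular tint rather than diffuse colour.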
The point here is simply that PBR shaders have increased the level of visual fidelity in computer graphics in representation of materials, ray-tracing has increased the level of fidelity with respect to lighting, and GPU acceleration has enabled these two to be combined in a timely fashion – mostly. We’ll come onto path tracing in a bit.
Finally… The nuts and bolts.
So after a lengthy preamble we’ll get on to using this stuff in Unity HDRP. The objective is to go from a medium fidelity render of a real-time scene to a substantially higher fidelity render / animation with minimal intervention. Let’s look at the source material:
The scene shown is part of a large series of models developed by Kantar for use on comparatively low performance devices. All models and textures have been built from photographic site surveys, and in this application are all unlit. They’re used in conjunction with planograms and data visualisation, often in the context of very large retail environments (think: the biggest stores Walmart has).
Using real-time ray-tracing in HDRP we can, with comparatively minimal intervention, improve rendering quality substantially (but of course only on computers with ray-trace compatible graphics cards).
Working with lights in real-time ray-tracing is a delight in comparison to off-line renderers. Positioning spot and area lights and seeing the effect of global illumination changes as they occur is an enormous workflow benefit. It also allows for experimentation, which is always welcome.
Finally, an example of a path-traced version of the same scene:
Ray-tracing is used here to avoid or at least reduce the need to bake light maps into models, and to improve on the quality of screen space reflection, GI and AO, the disadvantages of which revolve primarily around their inability to assess effects outside of the camera frustum, resulting in artefacts with moving cameras.
Kantar’s library of scenes and retail environments is a huge and incredibly flexible resource for any company involved in retail. For more information contact Carl [dot] Goodman [at] Kantar [dot] com.
LIFESIGN, the second book in the DI Eva Harris series, is due to be available as an Audible original in September of 2021. Following the discovery of body parts preserved in formaldehyde in a disused water treatment plant, Eva Harris finds herself in a race against time. Someone has been abducting researchers and conducting gruesome experiments on them. The hunt for the killer draws her into the bizarre world of body modification and transhumanism where she encounters the glacial Anna Seifert, head of a pharmaceutical company with dark secrets. Is Seifert involved in the abductions? Eva will find herself tested to the limit before she discovers the truth.
LIFESIGN is narrated by Louise Brealey, known for Sherlock and A Discovery of Witches amongst many other film and TV credits.
Goldmann Germany, a Penguin Random House imprint, has acquired the rights to 20/20. The agreement was made by Sandra Sawicka at Marjacq, and covers mass market, trade paperback, hardcover and e-book. It’s an immensely exciting development, as Penguin Random House are the largest publisher in Germany!
Hera are delighted to announce that they have secured rights to the first in a thrilling new police procedural series by Carl Goodman. 20/20, the start of the D.I. Eva Harris detective series, will be published in Summer 2021. World rights were acquired from Sandra Sawicka at Marjacq.
Keshini Naidoo said of the deal:
‘Reading 20/20 on submission was a terrifying experience – the killer is one of the most frightening I have read and the narrative is so incredibly compelling. Thank goodness that D.I. Eva Harris is resilient, capable and a true kick-ass detective, making this police procedural debut a truly addictive read. I’m so delighted that Carl is joining us at Hera Books, and I can’t wait for crime fans to see the secrets within 20/20 for themselves later this summer!’
Carl Goodman commented:
‘I’m thrilled Hera have gone with 20/20, it seems like a perfect fit. Scratch the surface and the affluent world Eva finds herself in is dark, menacing, and worryingly close to home. I’m excited to be working with Hera to explore it!’
Sandra Sawicka added:
‘I’m really delighted that Carl’s incredibly intelligent, twisty crime debut found a home with Hera Books. I hope this will be the beginning of a long and flourishing relationship and readers will be able to enjoy D.I. Eva Harris’ adventures for years to come.’