I was hired by Rascal Post to help build an animated city for the A7S Nirvana music video. According to the concept, we needed to come up with a way to have deforming – bending and twisting – buildings made out of room modules in Unreal Engine.

I came up with a building generator in Houdini. It takes in a base shape for the building and references to the Unreal Static Mesh assets used for the different room modules, rooftop antennas, etc. It then generates geometry for the building core, rooftops and foundation, plus a point cloud that carries all the parameters Unreal needs for the Instanced Static Mesh components. Thanks to the per-instance random value, we can further randomize the shaders for instanced modules – adding random texture offsets, random color tinting, different normal map distortion for glass surfaces and so on.
Thanks to Houdini Engine, this could be used directly inside Unreal Engine to generate as many different buildings as we needed.
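
As a rough sketch of what the Unreal side of that looks like, the point data can be turned into instances roughly like this (FRoomModulePoint and BuildFromPointCloud are invented names for illustration; in practice Houdini Engine handles this plumbing):

```cpp
// A minimal sketch, not the production setup: feeding Houdini point-cloud data
// into Instanced Static Mesh components. FRoomModulePoint and BuildFromPointCloud
// are hypothetical names; in the real pipeline Houdini Engine does this step.
#include "Components/InstancedStaticMeshComponent.h"

struct FRoomModulePoint
{
    FTransform Transform;   // per-point transform exported from Houdini
    int32      ModuleIndex; // which room-module Static Mesh this point references
};

void BuildFromPointCloud(const TArray<FRoomModulePoint>& Points,
                         const TArray<UInstancedStaticMeshComponent*>& ModuleISMs)
{
    for (const FRoomModulePoint& Point : Points)
    {
        if (ModuleISMs.IsValidIndex(Point.ModuleIndex))
        {
            // One instance per point; the material later uses the PerInstanceRandom
            // node to vary texture offsets, tint and glass normal distortion.
            ModuleISMs[Point.ModuleIndex]->AddInstance(Point.Transform);
        }
    }
}
```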

I implemented a building deformation controller that consists of two parts:

  1. Instanced modules (Instanced Static Meshes) are transformed using multithreaded C++ code (see the sketch below).
  2. The rest of the building (the procedurally generated geometry for building cores and rooftops) was pre-sliced in Houdini and deformed inside Unreal using matching vertex shader math.
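
A minimal sketch of the idea behind the first part, assuming the rest-pose instance transforms are already cached (BendTwistTransform and UpdateDeformedInstances are invented names; the real deformation math is more involved and matches the vertex shader exactly):

```cpp
#include "Async/ParallelFor.h"
#include "Components/InstancedStaticMeshComponent.h"

// Illustrative deformation only: twist around the building's Z axis
// proportionally to instance height (the production math also bends).
static FTransform BendTwistTransform(const FTransform& Rest, float TwistPerUnitHeight)
{
    const float AngleRad = TwistPerUnitHeight * Rest.GetLocation().Z;
    const FQuat Twist(FVector::UpVector, AngleRad);
    FTransform Out = Rest;
    Out.SetLocation(Twist.RotateVector(Rest.GetLocation()));
    Out.SetRotation(Twist * Rest.GetRotation());
    return Out;
}

void UpdateDeformedInstances(UInstancedStaticMeshComponent* ISM,
                             const TArray<FTransform>& RestTransforms,
                             float TwistPerUnitHeight)
{
    TArray<FTransform> Deformed;
    Deformed.SetNum(RestTransforms.Num());

    // Every instance is independent, so the transforms can be computed in parallel.
    ParallelFor(RestTransforms.Num(), [&](int32 Index)
    {
        Deformed[Index] = BendTwistTransform(RestTransforms[Index], TwistPerUnitHeight);
    });

    // Push all transforms in one batch and mark the render state dirty once.
    ISM->BatchUpdateInstancesTransforms(0, Deformed, /*bWorldSpace=*/false,
                                        /*bMarkRenderStateDirty=*/true);
}
```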

As a result we had very good performance, allowing a real-time preview of a large city inside the Unreal Engine editor.

As always when working in Unreal, it is quite a challenge to get all of this working in Edit mode in the Sequencer. The problem is that, when making games, all your logic is normally expected to execute in Play mode. When doing VFX work, you almost never enter Play mode; you live in the Sequencer, scrubbing the playhead around with the mouse like in any other 3D animation software. So the challenge is to have everything calculated and updated properly without entering Play mode, as well as when entering Play mode (during the final render). There are a lot of caveats to working this way.
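
One basic building block of that, sketched below with an invented ABuildingDeformActor class: an actor that keeps ticking in editor viewports, so its logic runs while scrubbing the Sequencer as well as in Play mode.

```cpp
// A minimal sketch, assuming a deformation-driving actor; ABuildingDeformActor
// is an invented name. The key detail is ShouldTickIfViewportsOnly(), which lets
// Tick() run in editor viewports while scrubbing the Sequencer, without Play mode.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "BuildingDeformActor.generated.h"

UCLASS()
class ABuildingDeformActor : public AActor
{
    GENERATED_BODY()

public:
    ABuildingDeformActor()
    {
        PrimaryActorTick.bCanEverTick = true;
        PrimaryActorTick.bStartWithTickEnabled = true;
    }

    // Without this override, the actor would only tick in Play mode.
    virtual bool ShouldTickIfViewportsOnly() const override { return true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // Re-evaluate the building deformation here so the editor preview
        // updates on every Sequencer scrub as well as during the final render.
    }
};
```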

Big props to Russell Tickner for actually creating all the nice shots in the final video – awesome lighting and environment setup in Unreal!

Another set of challenges always comes up when using ray tracing in Unreal Engine. This time I discovered a nasty bug with Instanced Static Meshes in ray tracing/path tracing: it looks like UE does not calculate world normals correctly when instances have a rotation in their transform, producing incorrect lighting and unusable reflections/refractions for glass windows. We worked around it with a simple cheat – rendering the scene twice, once without windows and once with fully opaque windows (which worked fine), and then blending between the two in post, based on fresnel.
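
For reference, the blend in comp amounts to something like this (Schlick's approximation is my shorthand here; the exact fresnel curve used in post may have differed):

F = F0 + (1 - F0) * (1 - cos θ)^5
final = (1 - F) * render_without_windows + F * render_with_opaque_windows

where θ is the angle between the view direction and the window normal, and F0 is the reflectance at normal incidence.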

