Texturing with COPs

In theory, COPs could replace Substance Designer and be even more powerful, because you have full access to other Houdini networks and your 3D data. You could use them to generate texture details based on geometry and even on other, adjacent objects in the scene. Unfortunately, COPs are slow – very slow. Most advanced texturing work is done via COP VOPs, and in that context, as soon as you modify or add a node, Houdini recompiles everything and re-cooks your COP network. While doing that, it becomes unresponsive for quite a long time, which makes working with it a rather unpleasant endeavor.

Before explaining how I scrapped the idea of using COPs and did everything in Substance Designer, I will quickly cover the basic COPs workflow – it is still useful for a lot of cool tricks, for example for baking some sort of custom utility texture that you will later use in the game engine.

The most important technique I want to share is how to get 3D geometry attributes in COPs:

  1. First, split your geometry along UV seams (vertex split).
  2. Then transform the 3D geometry into 2D UV space (move each point to its UV coordinates).
  3. Then, in a COP VOP or VEX snippet (which executes for each pixel of the 2D texture), use the xyzdist function to find the closest point on the flattened geometry for each pixel. It also gives you the primitive ID and the parametric UV coordinates of that point inside the primitive. This works because your flattened geometry lives in the same coordinate space as your pixels – 0-1 space on the XY plane – so you can use the pixel's UV coordinates directly. With that, you can use the Primitive Attribute node (the primuv VEX function) to read any geometry attribute from your mesh – including its 3D position, normal, or anything else you can use in generating your textures. Just remember that this node must reference the actual, non-flattened 3D mesh. That works because both meshes share the same number of primitives in the same order.

Once you have the 3D position on the mesh for each texture pixel, you can do a lot of other things – like calling xyzdist again to find the distance to other meshes.
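To make this concrete, here is a minimal VEX sketch of the lookup, as it might appear in a pixel-level snippet inside a COP VOP. The op: paths and attribute choices are placeholders, and X/Y/R/G/B are the globals bound in the COP VEX context:

    // Runs once per pixel; X and Y are the normalized pixel coordinates.
    string flat_geo = "op:/obj/asset/OUT_FLAT"; // UV-flattened copy of the mesh (placeholder path)
    string orig_geo = "op:/obj/asset/OUT_MESH"; // original 3D mesh, same primitive order (placeholder path)

    int    prim;
    vector parm_uv;
    // Closest point on the flattened geometry; this works because the
    // flattened mesh and the pixels share the same 0-1 XY space.
    xyzdist(flat_geo, set(X, Y, 0), prim, parm_uv);

    // Read any attribute from the ORIGINAL mesh at the same parametric spot.
    vector pos = primuv(orig_geo, "P", prim, parm_uv);
    vector nml = primuv(orig_geo, "N", prim, parm_uv);

    // Bonus: distance from this texel's 3D position to some other mesh.
    float dist = xyzdist("op:/obj/other_object/OUT", pos);

    // For example, write the world-space position into the pixel color.
    assign(R, G, B, pos);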

Here is a video where I try to explain this workflow:

Baking

Houdini Game Tools (SideFX Labs since Houdini 18) includes a nice COPs-based texture baker (GameDev Maps Baker). It's very fast compared to the default, Mantra-based baker. Unfortunately, it does not support some features out of the box. Fortunately, you have access to everything inside it, so you can extend it to your needs.

For my needs, I added the ability to bake out bent normal maps (a feature supported in Unity HDRP and Unreal that simulates more realistic environment reflections and lighting on objects). I also added an option to normalize the baked position map to the bounding unit sphere (recommended for Substance Designer), and the ability to specify a different file format for each baked texture, so I don't need to convert them manually after baking and can save some disk space.
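For reference, the bounding-sphere normalization amounts to something like this point wrangle sketch – the attribute name is my own choice, and the exact 0-1 remap convention SD expects may differ:

    // Remap P into the geometry's bounding sphere, then into 0-1
    // so it can be baked out as a color texture.
    vector bbmin  = getbbox_min(0);
    vector bbmax  = getbbox_max(0);
    vector center = (bbmin + bbmax) * 0.5;
    float  radius = max(length(bbmax - center), 1e-6); // guard against zero-size geo
    v@normalized_pos = ((@P - center) / radius) * 0.5 + 0.5;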

GameDev Maps Baker is a SOP node, and if you set it to automatic mode (by disabling the Manual checkbox), it works just like any other SOP node: put it inside your SOP network and it will bake out all the textures whenever the network cooks and reaches the baker node.

Custom Baker HDA (Houdini Indie)
File download:

custom_baker.hdalc

Texturing with Substance Designer and Substance Automation Toolkit

Because COPs were too slow, I decided to use Substance Designer to generate my textures. SD is pretty much an industry standard, so it seemed a reasonable choice for a texturing pipeline. Because Substance Designer is a strictly 2D procedural texturing tool, you don't have access to 3D geometry – all the data you need has to be baked into textures beforehand.

Generating textures in SD was a bit tricky. Because I wanted to generate unique, geometry-based textures for each 3D asset, I needed to work in 3D space and only convert to 2D texture space at the end. This way all generated texture details stay correctly oriented even if a UV island is rotated, and – most importantly – you can eliminate seams between UV islands. The easiest way to achieve this is to do everything with triplanar mapping.

One catch with triplanar mapping is that you can't use normal maps directly, as they are tangent-space dependent. The easiest way around this is to not use them at all: use only height maps, and convert them to normal maps after the triplanar projection. Of course, this is not a complete solution – especially if you have no height map available and must use intricate normal maps – but for most surface details it gives good enough results. To be fair, most substances out there are built by first producing a height map and generating the normal map from it internally. You can usually ignore the normal map, reference the height map directly, convert it to a normal map yourself after the triplanar mapping, and get a normal map of the same quality. So this works fine for almost any existing substance you might want to use in your texturing workflow.
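In my case this logic is built from Substance Designer nodes, but the underlying math is simple. Here is a pixel-level VEX sketch of the height-based triplanar blend, assuming the baked position (pos) and normal (nml) of the current texel are available; the texture path and tiling scale are placeholders:

    // Blend weights derived from the normal, sharpened so the three
    // planar projections don't smear into each other.
    vector n = normalize(nml);
    vector w = pow(abs(n), 4.0);
    w /= w.x + w.y + w.z; // weights sum to 1

    float scale = 0.1; // placeholder tiling scale
    float hx = texture("heightmap.rat", pos.y * scale, pos.z * scale); // X plane
    float hy = texture("heightmap.rat", pos.x * scale, pos.z * scale); // Y plane
    float hz = texture("heightmap.rat", pos.x * scale, pos.y * scale); // Z plane

    float height = hx * w.x + hy * w.y + hz * w.z;
    // Only now convert the blended height to a normal map (in SD: a Normal
    // node after the triplanar), instead of blending tangent-space normal
    // maps directly.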

After generating the normal map, I combine it with the normal map baked from the high-poly geometry to get the final normal map.
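One well-behaved way to combine two tangent-space normal maps is Reoriented Normal Mapping (Substance Designer's Normal Combine node offers comparable blend modes) – shown here as a VEX sketch, not necessarily the exact blend used in my graph, operating on raw 0-1 map colors:

    // Reorient the detail normal (from the height map) around the
    // base normal (baked from the high poly).
    vector blend_normals(vector base_color; vector detail_color)
    {
        vector t = base_color   * set( 2.0,  2.0, 2.0) + set(-1.0, -1.0,  0.0);
        vector u = detail_color * set(-2.0, -2.0, 2.0) + set( 1.0,  1.0, -1.0);
        vector r = normalize(t * dot(t, u) / t.z - u);
        return r * 0.5 + 0.5; // back to a 0-1 map color
    }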

As for the texturing itself – I used ready-made sandstone and rock substances from Substance Source and blended them using procedural masks generated from the input (baked) textures. The output textures in this case are made specifically for the Unity engine's High Definition Render Pipeline.

Once my substance file is set up – with proper input and output nodes – I can use the Substance Automation Toolkit's sbsrender command-line tool to render out the final textures. You give it the path to the substance archive you want to render, specify the graph name inside the archive, set all input parameter values (paths to input textures, numeric values, etc.), set the output resolution, and finally specify the output texture paths. This can easily be integrated into Houdini PDG using a Python TOP node, which lets me automate unique texture generation for each asset if needed. More on this in the next part of the article!
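For illustration, an invocation looks roughly like this – all paths, graph and parameter names below are placeholders, and the $outputsize value is log2 (so 11,11 means 2048x2048); check sbsrender render --help for the exact options:

    sbsrender render --inputs rock_material.sbsar \
        --input-graph rock_material \
        --set-entry position@bakes/asset01_position.png \
        --set-entry world_normal@bakes/asset01_normal_ws.png \
        --set-value '$outputsize@11,11' \
        --output-format png \
        --output-path textures/asset01 \
        --output-name '{outputNodeName}'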
