Procedural rocks
First let’s start with a little demo of what I achieved with rock generation. Here I demonstrate two different rock styles generated from the same input geometry – rendered in Blender Eevee. To play with one of the demo asset styles in real time, you can check it out on Sketchfab here!
To see how it looks in “The Collector” project demo – the project I created this for – check out this link.
For my project, the user input was a basic geometry blockout of the cave, exported as FBX from Blender. Using that, I could generate all the procedural content. This is the level of manual work that I think is acceptable and allows for a good balance between manual and procedural.
Of course you will need to add additional information to the geometry to give context to the input data. One approach is to use some kind of naming convention to specify the purpose of the input geometry. For example, I named cave rock blocks differently for walls, “ceiling” pieces and free-standing boulders, and used parenting to identify geometry classes. I also had basic blockout geometry of the base floor and the columns, which I later used as “collider” geometry that influences the rock shape.
Importing FBX
You have two ways to import FBX files in Houdini. One is using File->Import->Filmbox FBX from the main menu, which creates an obj-level hierarchy that matches the hierarchy inside the FBX (one File SOP node in each object, reading just one object out of the FBX file). For some use cases this might be useful, but not for procedural processing of an unknown number and structure of objects. You can use Python to walk the obj node hierarchy, but it is much more flexible to do everything in SOPs – especially when dealing with PDG.
The other option is to use a File SOP node to read the FBX yourself. Here you get just one SOP node – a single object with all objects merged and all object transformations baked into the geometry (points already transformed into world space). Luckily Houdini stores all object information as primitive attributes – including object name and transformation values. You can use that information to extract individual objects, their transformation matrices and material assignments, and basically re-create your FBX object and transformation hierarchy inside the SOP context.
Houdini geometry spreadsheet showing attributes of a SOP-level node after importing an FBX with multiple objects
Extracting transforms
The first step is to separate objects based on their name. I do this with a detail wrangle node count_pieces (executed only once) – manually going through all primitives and assigning an integer attribute class for each unique object name found. I use two handy VEX functions – nuniqueval and uniqueval. The first one returns the number of unique attribute values. The second returns a unique attribute value by index. So for each primitive I loop through all unique name attribute values and compare its name to them. If there is a match, I store the loop index as the piece class (ID).
When each primitive has a class attribute assigned, I can use a for-each SOP loop to work with individual objects – or in my case, because I’m using PDG to parallelize everything, I extract one piece with a blast node based on the @name work item attribute. This attribute is passed from the active PDG work item when cooking the PDG graph. For debug purposes I can just pass in any other name to debug a specific piece. From this point on I work with each object separately.
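As a small sketch of that blast setup (the exact expression depends on your PDG setup – treat this as an assumption, not the original file’s exact parameter), the Blast node’s Group parameter can combine a geometry attribute pattern with a backtick expression that expands the active work item’s attribute:

@name=`@name`

The left @name matches the geometry’s primitive attribute; the backticked @name evaluates to the current PDG work item’s name attribute when the node cooks.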
VEX code for the “count_pieces” detail wrangle node:
//Returns the number of unique values from an integer or string attribute
i@count = nuniqueval(0, "prim", "name");

int prims = nprimitives(0);
for (int p = 0; p < prims; p++)
{
    for (int i = 0; i < i@count; i++)
    {
        //Return i-th unique attribute value
        string val = uniqueval(0, "prim", "name", i);
        if (val == prim(0, "name", p))
            setprimattrib(0, "class", p, i);
    }
}
Attribute wrangle (Detail) code to create transformation matrix and inverse matrix:
v@translate = point(0, "fbx_translation", 0);
v@scale = point(0, "fbx_scale", 0);
v@rotate = degrees(point(0, "fbx_rotation", 0));

//Store transformation matrix and inverse matrix
4@xform = maketransform(0, 0, v@translate, v@rotate, v@scale);
4@ixform = invert(4@xform);
When loading an FBX with the SOP File node, you get translation, rotation and scale as separate point attributes named fbx_translation, fbx_scale and fbx_rotation. Because I’m working with each piece separately, I can promote these attributes to detail attributes (stored per object instead of per point or primitive).
I’m also creating a combined transformation matrix attribute. To later un-bake the transformation from points, I also store the inverse of the transform matrix. You can multiply your point coordinates with the inverse matrix to remove any baked-in transformation.
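For example, a minimal point wrangle sketch of that un-baking step, assuming the 4@ixform detail attribute from the snippet above:

//Read the stored inverse transform and apply it to every point,
//moving the object back into its local space
matrix ixform = detail(0, "ixform");
@P *= ixform;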
Once your blockout geometry is loaded, you can create different SOP branches for doing different things. For example, take the rock blockout pieces and do rock generation. Take the ground piece and do sand generation. Filter out collision meshes and group them so they can be referenced in other branches, while taking just the column collider meshes and generating old destroyed column assets in another branch.
In this way, you can basically have all your asset generation for this type of environment in a single Houdini file if you want. Of course that would not be practical in larger teams, where it is better to split things up. In fact, you should probably utilize the Houdini Digital Assets feature to share different node setups so they can be used in different Houdini files across your team. Nevertheless, I just wanted to share some ideas on building asset workflows with Houdini.
Shaping
Rock shaping itself is nothing special – just a bunch of VDB SDF (signed distance field) operations, re-meshing, polygon reduction and some more VDB SDF operations.
Tip!
The part that makes a big difference is how you apply noises to your SDF. Most people convert it to geometry, smooth the normals, do the regular Point VOP/Wrangle noise displacement and then recalculate normals. What works much better is using VDB and a Volume VOP/Wrangle to work with the SDF directly. An SDF is just a single-value volume where each voxel stores the distance to the closest surface – so you can just offset the “surface” attribute using 3D noises, effectively distorting the SDF. This produces better looking details and takes care of self-intersecting geometry.
In the image on the left, you can see a comparison of the two methods. As you can see, the displace-along-normals method works well with relatively small displacement values. With bigger values it starts to produce self-intersecting geometry and geometry stretching in some areas. At the same time, the VDB SDF displace method produces even displacement in all cases. With heavier displacements it even starts to produce nice geometry overhangs – almost like vector displacement. Also there is no stretching, because the geometry is calculated at the end – after the displacement.
Note: VDB has an optimised hierarchical structure – it stores values only where it matters. For SDFs, that is only close to the surface boundary. Offsetting the surface too much will push it out of the VDB’s active cells. To avoid that, you must increase the interior and exterior voxel band parameters when creating the VDB from geometry.
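Here is a minimal Volume Wrangle sketch of this SDF displacement, assuming a distance VDB named “surface” converted with a band wide enough for the noise amplitude (the amplitude and frequency parameters are my own, added for illustration):

//Offset the signed distance with 3D noise to distort the implicit surface.
//Negative offsets push the surface outwards, positive ones carve into it.
float amp  = chf("amplitude");    //displacement strength
float freq = chf("frequency");    //noise frequency
float n = noise(@P * freq) - 0.5; //center the noise around zero
@surface += n * amp;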
According to the “Collector” concept, all these rocks were actually some sort of sediment that settled over ancient civilization ruins, enclosing the remains in rock which then got eroded. Therefore I needed to do “collision detection” with the old ruins.
I again solved this using VDB SDF shaping operations.
First I convert the collider geometry to a volume using the IsoOffset node. Then I blur the volume and later use it in VDB Reshape SDF (Dilate) as a mask (second input). This node lets you expand the SDF in the masked area. After that I use a VDB boolean operation to cut out the collider geometry (converted to an SDF). Then a couple of other operations smooth the edges (again using the blurred collider volume as a mask) and finally add some noise and other distortions to the new, expanded part around the collider. The end result looks like the rock has formed around the obstacles, while leaving holes for the obstacles themselves.
Sand
Sand generation is an interesting and very useful Houdini use case.
I implemented it as a manual step – after generating all the rock assets. I import back all generated rock pieces and use them as geometry colliders in shaping the sand.
The main trick was to use the xyzdist and primuv VEX functions to get the distance to the nearest rock surface and also sample the rock’s surface normal at that point. Using that information, you can tell if a ground point is inside the rock or outside. I care mostly about parts outside the rocks – and using simple geometry offsets controlled by a ramp based on distance from the rock, plus some noises, I simulate sand build-ups close to rocks. I also use the distance info to generate masks for material blending as vertex colors, which I later use in Unity in my custom shader.
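A minimal point wrangle sketch of that sampling, run on the ground mesh with the merged rock geometry in the second input (assumes the rocks have point normals; the output attribute names here are mine, not from the original setup):

int hitprim;
vector hituv;
//distance to the nearest point on the rock surface
f@rockdist = xyzdist(1, @P, hitprim, hituv);
//sample position and normal of the rock surface at that point
vector hitpos = primuv(1, "P", hitprim, hituv);
vector hitnml = primuv(1, "N", hitprim, hituv);
//if the surface normal points away from us, we are inside the rock
i@inside = dot(hitnml, @P - hitpos) < 0;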
As a last step I do a heavy polyreduce on the parts that are inside the rocks.
Overall, sand generation is the simplest part of the whole project. It is also a good example of where proceduralism really shines – at any moment I can change the layout of my rocks and re-generate my ground mesh almost instantly. Imagine re-sculpting it by hand each time you move something!
Stalagmites and Stalactites
According to the concept of the “Collector”, I wanted to generate stalactites and stalagmites (fun fact: stalactites are the things hanging from the ceiling and stalagmites are the things building up on the ground – usually below stalactites, where dissolved sediment is dripping).
I chose two separate approaches for this.
For stalactites I decided to add them to the rock geometry – generating them during the rock shaping process. Their placement can be purely procedural. My approach is simple – first I convert the input geometry to a simpler shape, because I care only about large forms. Then with the Measure node I get the curvature – I care only about convex parts that are facing downwards. Using a point wrangle node, I manually mask out the sides of the mesh (using a ramp with relpointbbox). Finally I have a density attribute which I use to scatter 50 points. After that I filter out bad points – points that are too close to edges or holes. I do this by randomly scattering points around the seed point and checking if they intersect with the geometry when projected upwards.
//test random points in a radius to see if this is a valid place for a stalactite
for (int i = 0; i < 30; i++)
{
    vector hitpos, hituvw;
    vector2 uv;
    uv.x = random(i * 23.33);
    uv.y = f@radius * random(i * 3423.44);
    vector2 cpos = sample_circle_uniform(uv);
    vector ipos = @P + set(cpos.x, -0.1, cpos.y);
    int r = intersect(1, ipos, {0,1,0}, hitpos, hituvw);
    if (r == -1)
    {
        removepoint(0, @ptnum); //test failed
        break;
    }
}
Tip: To generate different geometry features for different types of rocks, you can use a Split node and do a simple name check (with HScript or Python) in its parameter. In my case I just checked if the rock’s name contains the “wall”, “top” or “boulder” string and then have a different SOP branch for shaping the rock, as in the sketch below.
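For example (a minimal sketch – the exact pattern depends on your naming convention), the Split node’s Group parameter accepts an attribute pattern like this to route all “wall” pieces into one branch:

@name=*wall*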
For stalagmites (thingies on the ground) I needed manual control of the placement and size.
First I start by importing the final sand geometry (therefore the sand needs to be generated before this) and placing seed points on it by hand using the TopoBuild node (probably there is another node for this?). I also set the stalagmite radius attribute for each point manually using a simple point wrangle. I needed all this manual control purely for aesthetic, compositional purposes. The rest of the process was procedural generation.
When I have my seed points, I can do the same thing I did with the rock pieces – I filter out one point based on the @pdg_index work item attribute and generate a single stalagmite. In this way I can plug this into PDG and generate all the stalagmites in parallel.
First I started by generating a circle around the point and projecting it onto the sand. Then I extruded this upwards and applied a shaping ramp (ok, this ramp also is a bit of a manual input). After that – a couple of layers of veins and noises to make the final shape.
For UV mapping I used manual seam generation – picking one random point from the lower edge loop and one point from the other end, then just finding the shortest path and using that as a seam for UV Flatten. To minimize distortion I decided to do two seams – one on each opposite side. To do that, I simply find another base point – the one that has the largest distance from the first point.
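A minimal detail wrangle sketch of finding that second base point (assuming the first point’s number is stored in a detail attribute seam_pt0 – the attribute names are mine, for illustration):

//brute-force search for the point farthest from the first seam point
int p0 = detail(0, "seam_pt0");
vector pos0 = point(0, "P", p0);

float maxdist = -1;
int farpt = -1;
for (int pt = 0; pt < npoints(0); pt++)
{
    float d = distance(pos0, point(0, "P", pt));
    if (d > maxdist)
    {
        maxdist = d;
        farpt = pt;
    }
}
i@seam_pt1 = farpt;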
Simple enough and produces good results.
Another important step is to transfer normals from the base surface to the lower part of the stalagmites/stalactites to help blend the lighting – so it looks like one continuous object.
To eliminate the texture seam, I generate the so-called “dirt skirt” using Houdini Game Tools (now SideFX Labs). It is a very cool tool.
Strata layers - experimental approach
At one point I explored stratified rock generation. There are a couple of tricky things in that process. One is generating strata cutting planes with a user-controlled distribution. The other is chipping the corners and edges of strata layers to simulate erosion.
For the first problem I wanted to implement point scattering based on a user-controlled ramp. This sounds easy, but is a bit tricky to do. Luckily people have solved this before and I found a very nice explanation by Matt Ebb.
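One common way to do this (a minimal sketch, not necessarily Matt Ebb’s exact method) is inverse-transform sampling: build a cumulative distribution from the ramp, then map uniform random numbers through it. Here scattering along one axis, with a hypothetical "density" ramp parameter:

//Detail wrangle: scatter points along Y with density controlled by a ramp
int npts = chi("npoints");
int samples = 256;

//build a cumulative distribution function (CDF) from the ramp
float cdf[];
float total = 0;
for (int i = 0; i < samples; i++)
{
    total += chramp("density", (i + 0.5) / samples);
    append(cdf, total);
}

for (int j = 0; j < npts; j++)
{
    //uniform random value mapped through the inverse CDF
    float u = random(j * 17.31) * total;
    int bin = 0;
    while (bin < samples - 1 && cdf[bin] < u)
        bin++;
    //jitter within the chosen bin
    float y = (bin + random(j * 91.77)) / samples;
    addpoint(0, set(0.0, y, 0.0));
}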
As for chipping edges and corners, I dug into the Material Fracture node in Houdini 18 to see how corner chipping was implemented there, and found some ideas.
Here is a video where I try to explain the basic ideas:
UV Mapping rocks
Procedural UV mapping is tricky – especially for complex, organic surfaces. While the UV Flatten node does everything for you, first you must generate good seams, and that is the hard part. There are some auto seam generation options (and even more in Game Tools (now SideFX Labs)), but none of them do a good job on complex organic surfaces. By “not a good job” I mean that most of the time the results were not usable. I got a ton of overlaps, distortions, unnecessary holes, etc.
First I spent a lot of time trying to fix the auto-generated seams with VEX and different SOP nodes – I managed to fix most unnecessary seams in the middle of the geometry. I also managed to fix some overlapping parts, but still – usually I would get too many small UV islands, and UV Flatten also produced a lot of weird distortions – parts of UV islands got scaled down for unknown reasons, producing very uneven texel density and visible seams on the final textured models.
Adding RizomUV to the pipeline
Comparison of Houdini UV mapping results and RizomUV
While not planned initially, because of the bad results with Houdini’s built-in UV mapping tools, I was pleasantly surprised that it was quite easy to automate UV mapping using RizomUV and get perfect results. How this works is a bit strange – you write a LUA script and tell RizomUV to run it with a command line option. The strange part is that it opens the full GUI version of the app and you must include a “Quit” command in your script to kill it once it finishes. As for the script itself – it is very easy to create, because every operation you do in the RizomUV GUI produces LUA command history which you can just copy and paste into your script. RizomUV produces excellent results – very few UV islands, intelligently placed seams, no overlaps and perfectly even texel density. And it is fast!
To do UV mapping in RizomUV, I’m doing a nasty hack in SOPs. There is probably a more elegant way to do this, but whatever – this works too, and it works when executed from a PDG graph.
Edit: It looks like SideFX Labs (former Game Tools) now includes a RizomUV node. I have not tried it myself, but it looks like it uses a similar workflow of generating LUA code on the fly.
First I export the geometry with a File node as an obj file. After that I call a Python node that creates the appropriate LUA script and saves it to disk. It then runs RizomUV.exe, passes this .lua file as a command line parameter… and waits for it to finish.
Then comes the dirty part – loading back the obj file. The only way I could get it to work was to merge another File node (which loads the obj and has no parent connection) with my main node stream (which I first clean out with a Blast node). This way Houdini always loads the updated obj file when it gets to cooking the Merge node – and that is after the Python SOP has finished.
After that I just transfer back the geometry attributes from before (exporting to obj strips all attributes).
And one last thing – I do optional UV island scaling and re-packing based on a previously calculated “is-outside” attribute, to scale down UV islands that are on the outside of the rock and will be invisible most of the time. This is needed just for the “Collector” scene and not for generic rock asset generation.
Here is the Python code (including the LUA code) I used to call RizomUV and do automatic unwrapping:
import hou
import os
import subprocess

# path to the RizomUV executable
rizom_exe = r"C:\Program Files\Rizom Lab\RizomUV 2019\rizomuv.exe"

hip_path = hou.hscriptExpandString("$HIP")
os.chdir(hip_path)

# we will generate the LUA script here
path_of_script = hou.expandString("$HIP/rizom_script.lua")

# this is the temporary mesh used for geometry exchange;
# RizomUV expects forward slashes in paths
geo_file_name = hou.expandString("$HIP/unity/test_mesh.obj")
geo_file_name = geo_file_name.replace("\\", "/")

lua_script = ('ZomLoad({File={Path="GEO_PATH", ImportGroups=true, XYZ=true}, NormalizeUVW=true})\n' +
    'ZomIslandGroups({Mode="SetGroupsProperties", WorkingSet="Visible", MergingPolicy=8322, GroupPaths={ "RootGroup" }, Properties={Pack={Rotate={Step=15}}}})\n' +
    'ZomIslandGroups({Mode="SetGroupsProperties", WorkingSet="Visible", MergingPolicy=8322, GroupPaths={ "RootGroup" }, Properties={Pack={Resolution=500}}})\n' +
    'ZomIslandGroups({Mode="SetGroupsProperties", WorkingSet="Visible", MergingPolicy=8322, GroupPaths={ "RootGroup" }, Properties={Pack={MaxMutations=2}}})\n' +
    'ZomSet({Path="Vars.AutoSelect.Mosaic.Developability", Value=0.75})\n' +
    'ZomSelect({PrimType="Edge", WorkingSet="Visible", Select=true, ResetBefore=true, ProtectMapName="Protect", FilterIslandVisible=true, Auto={QuasiDevelopable={Developability=0.75, IslandPolyNBMin=1, FitCones=false, Straighten=true}, HandleCutter=true, StoreCoordsUVW=true, FlatteningMode=0, FlatteningUnfoldParams={Iterations=1, BorderIntersections=true, TriangleFlips=true}}})\n' +
    'ZomCut({PrimType="Edge", WorkingSet="Visible"})\n' +
    'ZomLoad({Data={CoordsUVWInternalPath="Mesh.Tmp.AutoSelect.UVW"}})\n' +
    'ZomIslandGroups({Mode="DistributeInTilesByBBox", WorkingSet="Visible", MergingPolicy=8322})\n' +
    'ZomIslandGroups({Mode="DistributeInTilesEvenly", WorkingSet="Visible", MergingPolicy=8322, UseTileLocks=true, UseIslandLocks=true})\n' +
    'ZomPack({ProcessTileSelection=false, RecursionDepth=1, RootGroup="RootGroup", WorkingSet="Visible", Scaling={Mode=2}, Rotate={}, Translate=true, LayoutScalingMode=2})\n' +
    'ZomSave({File={Path="FBX_PATH", UVWProps=true}, __UpdateUIObjFileName=true})\n' +
    'ZomQuit()')

# load and save both use the same exchange file
lua_script = lua_script.replace("GEO_PATH", geo_file_name)
lua_script = lua_script.replace("FBX_PATH", geo_file_name)

# save LUA script to file
with open(path_of_script, "w") as script_file:
    script_file.write(lua_script)

# execute RizomUV with cli options and wait for it to finish
commandArgs = '-cfi "{0}"'.format(path_of_script)
result = subprocess.check_output('"{0}" {1}'.format(rizom_exe, commandArgs))

# force reloading of the File SOP node to load back the unwrapped geo
out_node = hou.node("../file3")
out_node.parm('reload').pressButton()
Columns
This part is something a bit different – just an awesome trick for creating edge “wear” on geometry. I use it to generate damage on the old structure columns in the “Collector”.
A basic breakdown of the method (see the sketch after the list for the expand and noise steps):
- Remesh your object.
- Expand it a bit, then smooth it.
- Add some displacement noise.
- Do a boolean intersection with the original mesh.
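A minimal point wrangle sketch of the expand and noise steps, run on the remeshed copy (the parameter names are hypothetical, a Smooth node would sit between the expand and the boolean in the actual chain, and it assumes point normals – add a Normal node before this):

//push points outwards along their normals, plus a noise offset;
//the boolean intersection with the original then chips the edges
//wherever this noisy copy dips inside the original surface
float expand = chf("expand");       //uniform outward offset
float amp    = chf("noise_amp");    //noise strength
float freq   = chf("noise_freq");   //noise frequency
float n = noise(@P * freq) - 0.5;
@P += normalize(@N) * (expand + n * amp);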