
## Velocity Fields

To create this effect, we need a custom velocity vector field (a volume) that moves particles around the surface of the 3D model in a specific way: pulling distant particles closer to the surface and then pushing them along the surface in an upward motion. Because this velocity field is precalculated, it only works for static meshes.
Using custom velocity fields (velocity volumes) for particle advection is a very popular VFX technique. It is a simple but effective way to achieve visually complex particle motion without actually running complicated per-particle logic.
It is also supported by Unreal Engine, which samples vector fields in its particle systems via 3D textures (a velocity field in UE is just a 3D texture containing a direction vector for each 3D point, similar to volumes in Houdini). With Niagara it is also very fast, so you can have millions of particles.

##### What is a velocity vector field?

Imagine having a fog volume, but instead of storing fog density, we store a 3D vector at each point. This vector defines the direction and magnitude (length of the vector) of the velocity at each point.
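In Houdini VEX terms, reading such a field is a single function call; a minimal sketch, assuming a vector volume named "vel" connected to input 0 of a wrangle:

```
// Sample a vector volume named "vel" at an arbitrary position.
vector pos = {0.5, 1.0, 0.0};              // any point in space
vector vel = volumesamplev(0, "vel", pos); // trilinearly interpolated velocity
float  speed = length(vel);                // magnitude = how fast to move
```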

“Fog” volume – each point stores a density value

Vector Volume – each point stores both direction and magnitude

Moving points through the velocity field

First, let’s describe what we want to achieve. We want our particles to be attracted towards the surface of the 3D mesh, and, as they get closer, to be pulled along and around the surface in an upward motion.
To get this motion, we will have a vector field (volume) enclosing our 3D model and all particles. It will store motion direction and speed information that the particles can read and travel along. The particles will not actually interact with the 3D mesh itself; everything comes from the vector field.

To calculate the velocity direction for this effect at each point of the volume, you need to know a couple of vectors:

1. The direction and distance towards the closest point on the 3D mesh surface
2. The direction along the 3D mesh surface (tangent), i.e. the direction you want the particles to travel on the surface of your mesh

Once you know these, you can simply blend between them depending on the distance from the surface. When a point is far from the surface, it gets pushed mostly towards the surface; as soon as it gets closer, it gets pushed mostly along the surface.
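The blend itself is just a linear interpolation driven by distance. Here is a minimal VEX sketch; the example input values and the falloff range are placeholders, not values from the original scene:

```
// Example inputs: in practice these are computed per voxel.
vector to_surface = {0, 0, -1}; // direction towards the nearest surface point
vector tangent    = {0, 1, 0};  // flow direction along the surface
float  dist       = 0.2;        // distance from the surface

// 0 near the surface (follow the tangent), 1 far away (move towards the surface)
float blend = smooth(0.0, 1.0, dist);
vector vel = normalize(lerp(tangent, to_surface, blend));
```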

There are a few additional things you can add. One is to push points that are inside the surface out towards it more aggressively, so particles quickly pop out and stick to the surface. You can also do something different when getting closer to the volume borders or to special parts of the model, by blending to some other directional vector.

You can also introduce some curl noise to break things up.
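VEX has a built-in curl noise function, so layering it on top of the blended velocity is a couple of lines; the channel names here are assumptions:

```
// Add curl noise on top of the blended field to break up uniform motion.
vector n = curlnoise(v@P * chf("noise_freq")); // divergence-free noise vector
v@vel += n * chf("noise_amp");                 // scale and add to the velocity
```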

Example of blending to a special direction when close to the horse’s eyes

Blending between two vectors using linear interpolation

Blending between “direction to surface” and “along the surface” vectors based on distance from surface.

## Creating velocity volumes in Houdini

When working with volumes in Houdini, I like to use the Volume Wrangle node (VEX). (You can use VOPs, but I think that is a waste of time: if you know the VOP nodes, you effectively already know VEX, and typing it is much faster.) The Volume Wrangle gets executed for each voxel (a 3D point in the volume, based on its resolution). As when doing geometry wrangles, you get a @P position attribute for each voxel.
You can then use this position in space to calculate the distance to the 3D mesh surface (plug your geometry into the other inputs of the wrangle node for access), read surface attributes from the nearest surface point, and so on.
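Put together, the whole field calculation fits in one Volume Wrangle. This is a sketch, assuming the wrangle runs on a vector volume named "vel", the mesh is plugged into input 1, and it carries a precomputed "tangent" point attribute; the channel names are also assumptions:

```
// Volume Wrangle: runs once per voxel, v@P is the voxel position.
int prim;
vector uv;
float dist = xyzdist(1, v@P, prim, uv);          // distance from this voxel to the surface
vector surf_P = primuv(1, "P", prim, uv);        // nearest point on the surface
vector to_surface = normalize(surf_P - v@P);     // direction towards the surface
vector tangent = primuv(1, "tangent", prim, uv); // flow direction along the surface

// Follow the tangent near the surface, move towards it when far away.
float blend = smooth(0.0, chf("falloff"), dist);
v@vel = normalize(lerp(tangent, to_surface, blend)) * chf("speed");
```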

### Calculating Surface Direction

As I said, all of this is very easy. The tricky and most important part is calculating the direction along the surface in a meaningful and art-directable way. As you can imagine, there is no single “correct” direction along the surface, so it is entirely up to you to come up with some kind of logic.

Here are several techniques I use, depending on artistic needs and input mesh shape:

### Calculate from Up vector

The simplest method is to take a general direction vector (an up vector) and calculate the tangent vector from the surface normal and this vector using cross products. This works well for round objects, but it completely ignores the bent leg direction on the horse statue.

Point wrangle VEX:

```
vector general_dir = {0, 1, 0};            // the general direction we want the flow to follow
vector side_dir = cross(v@N, general_dir); // direction perpendicular to the normal and general direction
v@tangent = cross(side_dir, v@N);          // surface tangent vector; N, side_dir and tangent are orthogonal
```

tangent vector calculated from upwards general direction vector

### UV Based Tangent space

This is useful if your model has good, meaningful UV mapping that can be used for direction. For example, if your mesh is a tree where each branch is mapped so the texture runs along the branch, you can use the PolyFrame node to calculate tangentu and tangentv vectors from the UV coordinates.

##### Tangent Space

This is actually what is usually called “tangent space” in CG: per-vertex 3D direction vectors that show in which direction the U and V coordinates increase. Together with the normal, the tangent directions give you a coordinate space (a set of axes) describing how the texture is aligned in 3D space at that surface point. Knowing this, you can transform data from texture space to object space, for example transform normal map information into an object-space surface normal (this is how normal mapping works).

tangent vector shows the direction of U coordinate change on the surface in 3D space.
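The normal mapping transform mentioned above can be sketched in a few lines of VEX. The tangentu/tangentv attribute names are the ones PolyFrame writes by default, and the sample normal value is a made-up placeholder:

```
// Transform a tangent-space normal (as sampled from a normal map)
// into object space using the per-point tangent frame.
vector nmap = set(0.1, 0.05, 0.99);  // example tangent-space normal (x, y, z)
vector T = normalize(v@tangentu);    // U direction on the surface
vector B = normalize(v@tangentv);    // V direction on the surface
vector N = normalize(v@N);           // surface normal
vector n_obj = normalize(nmap.x * T + nmap.y * B + nmap.z * N);
```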

In my case with the horse statue model, this is not useful, as the UV coordinates are quite random and do not provide any meaningful surface direction information. There are too many UV islands, and they are randomly rotated, not pointing in any particular direction.

Here is an example where a UV-based tangent space works well. Each tree branch is mapped along the V direction of the texture.

### Tangents from Curves

This is my favourite method and the one I used in my video. In short: you create some curves, calculate direction vectors (tangents) for them using the PolyFrame node, and then simply Attribute Transfer the tangents back to your mesh. Just be mindful of the curve directions, as they determine the tangent direction.

I used the Labs Straight Skeleton 3D node to generate “skeleton” curves procedurally. But you can also augment that with curves generated by other methods, or even hand-drawn curves for a completely art-directable effect.

A simple attribute transfer will only get you so far, as it ignores all the subtle 3D surface direction changes that are not captured by the curves. To fix that, you can apply the same method as when calculating the tangent vector from a general direction (up vector): just use the transferred curve tangent as the up vector.

```
vector side_dir = cross(v@N, v@tangentu); // orthogonal to the surface normal and the transferred tangentu vector
v@tangentu = cross(side_dir, v@N);        // new tangent vector from the surface normal and the side vector
```
###### Get Houdini Example:

particle_vector_field_example_01.hiplc

## Setting up Niagara particle system

This part is very simple. First you need to import your vector field into Unreal Engine. To do this, export it from Houdini as an .fga file using the Labs Vector Field Export node.

After that, add an Apply Vector Field module to the Particle Update stage of your Niagara emitter and point it to the imported vector field asset. This module applies the vector field velocity either as a force or as the final particle velocity (or a mix of both). For a basic setup, that is it.

Of course, you also need to emit your particles somewhere. If you emit them only on the surface of the mesh, it will not look very interesting, as no particles will be flying in mid-air. If you emit them in a bounding volume surrounding your mesh, there will not be enough density to make the surface swirls visible. I suggest making a special emission source mesh with clumps of geometry close to the surface, but also some distance away from it. You can also combine this with some particles emitted in the surrounding volume.

Emitter source mesh I used in my demo

## Faking Depth of Field with particle material

I used the RadialGradientExponential material function to generate a circular particle alpha procedurally. You can then use the Distance_Blend material function to vary particle radius, sharpness, and alpha based on distance from the camera. By making particles larger and more transparent close to the camera, you get a fake depth of field effect, which might be impossible to achieve with post-processing (semi-transparent particles do not write to the depth buffer and are therefore ignored by the post-process DOF).