
Perlin Noise and Unity Compute Shaders

Learning about Perlin Noise and how to use Compute Shaders in Unity

FancyFennec
Oct 12, 2021

Table of Contents

0. Motivation
1.0 Perlin Noise Introduction
2.0 Compute Shaders
2.1 How to use Compute Shaders
2.2 Setup of the Unity Project
2.3 Running the Shader
3.0 Perlin Noise
3.1 Creating the Displacement Material
3.2 Adding the Gradients
3.3 The Wave Effect

0. Motivation

Sometimes it is just not enough to run your program on the CPU, and sometimes you can greatly improve its performance by letting the GPU do the work. In Unity you can do this with a compute shader. Essentially, if you have anything that is super slow on the CPU and can be parallelised, chances are that it runs a lot better as a compute shader. In this article we are going to look at Perlin Noise and how to work with compute shaders in Unity.
With Perlin Noise you can create many great things, but today we will create a wave effect by using it together with vertex displacement in Unity.

Disclaimer:
This will probably not be the most performant implementation ever. If you need something that runs well, you can check Wikipedia.

1.0 Perlin Noise Introduction

Have you ever looked at standard noise and been disgusted by how ugly and boring it looks?

ugliness in its purest form

Then worry not. Because there is Perlin Noise…

ahhh… perfection

How does Perlin Noise work? The basic idea is that instead of just generating a random value for each pixel, we generate a grid of random vectors and use them to build a smooth noise function. The random vectors are called gradients because, at each grid point, the resulting function increases fastest in the direction of the corresponding vector.

Me thinking about Perlin Noise

If we take the scalar product of a normalized random vector v and the vector (x, y), we can create a function defined by

(x, y) ↦ ⟨v, (x, y)⟩, where (x, y) ∈ [-1, 1]².

It is a relatively boring function with constant gradient v, but let's plot it in Mathematica and see what we get.

just a constant slope

This might not look very impressive, but it is a start. What we would like to do now is create a function that has gradient v at (x, y) = (0, 0) and is 0 on the border of the square, i.e. wherever x or y is -1 or 1. We will achieve this by multiplying the function above with a drop-off function.

Consider the polynomial f(x)=3x²-2x³ and its graph.

3x²-2x³

If we take the composition of f and the function g(x)=1-abs(x), we get a nice drop-off function that looks like this:

(f ∘ g)(x)

Combining the drop-off with the slope function from before we get

(x, y) ↦ ⟨v, (x, y)⟩ · (f ∘ g)(x) · (f ∘ g)(y), where (x, y) ∈ [-1, 1]²
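To make the formula concrete, here is a minimal CPU-side C# sketch of such a single-gradient "bump" (the class and function names are mine, not from the original Mathematica code):

    using UnityEngine;

    public static class PerlinBump
    {
        // f(x) = 3x² - 2x³, evaluated on [0, 1]
        static float F(float x) => 3f * x * x - 2f * x * x * x;

        // (f ∘ g)(x) with g(x) = 1 - |x|: equals 1 at x = 0 and drops to 0 at x = ±1
        static float DropOff(float x) => F(1f - Mathf.Abs(x));

        // ⟨v, (x, y)⟩ · (f ∘ g)(x) · (f ∘ g)(y) for (x, y) ∈ [-1, 1]² and a unit gradient v
        public static float Bump(Vector2 v, float x, float y)
        {
            float slope = v.x * x + v.y * y;          // scalar product with the gradient
            return slope * DropOff(x) * DropOff(y);   // scaled by the drop-off in x and y
        }
    }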

Let’s put everything together in Mathematica and see how it looks.

better :3

This looks a lot more promising than before. The last step is to create one of these functions for every single point on the grid and then sum them all up. That way we get a function whose slope at each grid point coincides with the gradient that we defined for that point.

something that looks like Perlin Noise

This looks pretty much like Perlin Noise is supposed to look (and way better than ugly, pathetic normal noise). In the final shader we will use a lookup table of gradients that we compute in advance and pass to the shader.
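For reference, here is a rough CPU-side C# sketch of that summation over a grid of precomputed gradients. It only mirrors the idea; the compute shader version comes later, and the grid size, the seed and all names here are arbitrary choices of mine:

    using UnityEngine;

    public static class PerlinCpu
    {
        const int GridSize = 17;                       // 17 x 17 gradients, chosen arbitrarily
        static readonly Vector2[,] gradients = BuildGradients(GridSize);

        static Vector2[,] BuildGradients(int size)
        {
            var rng = new System.Random(1234);         // fixed seed, just for reproducibility
            var g = new Vector2[size, size];
            for (int y = 0; y < size; y++)
                for (int x = 0; x < size; x++)
                {
                    float angle = (float)(rng.NextDouble() * 2.0 * Mathf.PI);
                    g[x, y] = new Vector2(Mathf.Cos(angle), Mathf.Sin(angle));
                }
            return g;
        }

        static float F(float x) => 3f * x * x - 2f * x * x * x;
        static float DropOff(float x) => F(1f - Mathf.Abs(x));

        // p is given in grid coordinates and is assumed to lie inside the grid,
        // i.e. 0 <= p.x, p.y < GridSize - 1.
        public static float Noise(Vector2 p)
        {
            int x0 = Mathf.FloorToInt(p.x);
            int y0 = Mathf.FloorToInt(p.y);
            float sum = 0f;

            // Only the four surrounding grid points contribute, because the
            // drop-off function is zero at distance 1 and beyond.
            for (int j = 0; j <= 1; j++)
                for (int i = 0; i <= 1; i++)
                {
                    Vector2 offset = p - new Vector2(x0 + i, y0 + j);   // lies in [-1, 1]²
                    Vector2 gradient = gradients[x0 + i, y0 + j];
                    sum += Vector2.Dot(gradient, offset) * DropOff(offset.x) * DropOff(offset.y);
                }
            return sum;
        }
    }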

Remark:

Instead of the polynomial above, it might be a good idea to use

f(x) = 6x⁵ - 15x⁴ + 10x³

because its second derivative is 0 at 0 and 1. That way you get better behaviour at the border of each cell and therefore better results.
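In code, swapping the fade curve is a one-line change. A quick C# comparison of the two polynomials (the helper names are mine):

    public static class FadeCurves
    {
        // Original: 3t² - 2t³. The first derivative vanishes at t = 0 and t = 1, the second does not.
        public static float Cubic(float t) => 3f * t * t - 2f * t * t * t;

        // Smoother: 6t⁵ - 15t⁴ + 10t³. Both the first and the second derivative vanish at t = 0 and t = 1.
        public static float Quintic(float t) => t * t * t * (t * (t * 6f - 15f) + 10f);
    }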

2.0 Compute Shaders

Compute shaders in Unity are small programs that run on the GPU. They are written in HLSL, a shading language whose syntax looks similar to C.

I found the Microsoft reference to be a useful place to find information about HLSL. You can find all sorts of things there, like all the possible allowed operations or all the functions that already exist and how to use them.

If you work with compute shaders in Visual Studio, I highly recommend installing the HLSL Tools for Visual Studio extension. It makes your life a lot easier.

Remark:
Somehow not all of the intrinsic functions work in Unity… For example, there is a noise intrinsic that is supposed to generate Perlin Noise, but it doesn't work in compute shaders (it is only supported in texture shaders used for procedural texture fills), so we will have to roll our own.

2.1 How to use Compute Shaders

Using compute shaders works in the following way:

  • Create a variable of ComputeShader type in your script. I usually do it by loading the shader from the Resources folder, e.g.
    (ComputeShader)Resources.Load("name_of_my_shader")
  • Set any textures or floats or whatever your heart requires in the compute shader with the respective Set method, e.g. SetFloat. Check the reference of the ComputeShader class. We will see examples of how to do that later.
  • Call the Dispatch method to execute the compute shader (see the sketch after the remark below).

Remark:
You can even pass your own datatype to the shader with SetBuffer.
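Putting these steps together, a minimal sketch could look like the following. The shader name, the kernel name, the property names and the texture size are placeholders that have to match your own shader; later in this article we will do the same thing from an editor script instead of a MonoBehaviour:

    using UnityEngine;

    public class ComputeShaderRunner : MonoBehaviour
    {
        void Start()
        {
            // 1. Load the compute shader from a Resources folder.
            var shader = (ComputeShader)Resources.Load("name_of_my_shader");

            // 2. Create a texture the shader is allowed to write to and set the inputs.
            var texture = new RenderTexture(256, 256, 0) { enableRandomWrite = true };
            texture.Create();
            int kernel = shader.FindKernel("CSMain");
            shader.SetTexture(kernel, "Result", texture);
            shader.SetFloat("time", Time.time);

            // 3. Dispatch: with [numthreads(8,8,1)] we need 256 / 8 = 32 thread groups per axis.
            shader.Dispatch(kernel, 32, 32, 1);
        }
    }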

2.2 Setup of the Unity Project

We create a new HDRP project and choose a fitting name. I pick HDRP because I want to create a vertex displacement shader for a wave-like effect (it might be that URP is enough, but I am too lazy to check :D). I will call mine ComputeShaderExample.
Next we are going to create an editor script in which we will run our compute shader. I use an editor script because we can inspect render textures directly in the editor, which makes debugging easier. Also, you immediately see the results of your code when you switch back to the Unity editor. If you want, you can run the compute shader in a MonoBehaviour instead; the code should be almost the same.
Create a new folder called Resources and then create a compute shader asset via Create -> Shader -> Compute Shader.
I call my shader PerlinNoise. This will create a new compute shader that already contains some code.

Let us inspect the code. CSMain is the main function that the shader runs. It takes a uint3 argument called id and updates a two-dimensional texture called Result. We can interpret the id variable as the index of the pixel we are currently processing. Calling id.xy will get us the indices, and we can access the corresponding pixel in the texture with Result[id.xy].

I usually move all my scripts into a directory called Scripts under the Assets directory. Now create a new directory called Editor underneath Assets and put a C# script called WaveEditor into it. This is the main file that we will be working with. In order for it to work, WaveEditor needs to derive from Editor and it needs the CustomEditor attribute. If you want to learn more about how to write editor scripts you can read about them here. My empty WaveEditor script looks like this:
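Sketched out, with Wave as an assumed name for the MonoBehaviour attached to the Wave object, that is roughly:

    using UnityEditor;
    using UnityEngine;

    // Tells Unity that this editor is responsible for inspecting the Wave component.
    [CustomEditor(typeof(Wave))]
    public class WaveEditor : Editor
    {
        // The compute shader logic will be added here in the next sections.
    }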

2.3 Running the Shader

Let’s update our editor script. We want it to run the compute shader when we click on our Wave game object. First we need to create a texture and allow the shader to write to it, then we load the shader from the Resources folder and pass the texture to it. To run the shader we just call the Dispatch method.

To see the outputs of the shader I created an object field that allows us to inspect the texture in the editor.
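In sketch form, that could look like this. Where exactly to dispatch is a matter of taste; here it happens in OnEnable, i.e. whenever the object is selected, and the texture size of 256 is an arbitrary choice:

    using UnityEditor;
    using UnityEngine;

    [CustomEditor(typeof(Wave))]
    public class WaveEditor : Editor
    {
        RenderTexture texture;

        void OnEnable()
        {
            // Create a texture the compute shader is allowed to write to.
            texture = new RenderTexture(256, 256, 0) { enableRandomWrite = true };
            texture.Create();

            // Load the shader from the Resources folder and hand it the texture.
            var shader = (ComputeShader)Resources.Load("PerlinNoise");
            int kernel = shader.FindKernel("CSMain");
            shader.SetTexture(kernel, "Result", texture);

            // One thread group covers 8x8 pixels, so we need width / 8 groups per axis.
            shader.Dispatch(kernel, texture.width / 8, texture.height / 8, 1);
        }

        public override void OnInspectorGUI()
        {
            // An object field so we can click through to the render texture and inspect it.
            EditorGUILayout.ObjectField("Result", texture, typeof(RenderTexture), false);
        }
    }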

If we now click on the Wave object, we should see a fractal that looks like the Sierpinski triangle in the editor.

Result of the default Shader

3.0 Perlin Noise

Let us start implementing what we did in Mathematica at the beginning. We want to take the dot product of (x, y) with a random vector and multiply it with a drop-off function. The dot product is already implemented in HLSL, so we can just call it. There is also a function for computing powers, but since our exponents are very low, I will just write them out. I also picked another RenderTextureFormat for the texture: RFloat will be enough. In the end my first version of the shader looked like this:

Note that we can define the signature of a function at the start of the script and then define its body after the CSMain function.
If we look closely at the texture in the inspector we can see that it actually worked. But it is hard to tell (either that or I am suffering from some sort of colour blindness).

It’s not just red :D

To see the results of the compute shader, let’s create a displacement material. In the end we want to have the displacement material anyway to create a beautiful wave effect.

3.1 Creating the Displacement Material

Create a plane with a lot of subdivisions in Blender and export it to Unity. Add it to the scene, call it Wave, and attach the Wave script to it.

Plane with a lot of subdivisions

Create a new shader graph via Create -> Shader -> Blank Shader Graph and call it VertexDisplacementGraph.
I am not going into details about how to work with shader graphs, but on the left you can see all the properties that are exposed by your graph, and if you right click anywhere you can add more nodes. Make sure that you select HDRP as your active target on the right. In the end your shader graph should look something like this:

Vertex Displacement Graph

It reads a height value from a texture and then proportionally displaces the corresponding vertex in the z-direction. It also has to generate normals from the height, otherwise the surface looks very flat, and it adds controls for the strength of the displacement and of the normals.
Create a new material called VertexDisplacement, select our vertex displacement graph as its shader, and attach the material to the Wave object. You won't see much yet, because we haven't passed our texture to the shader, so let's do that. We set the texture in our material in exactly the same way that we did it for our compute shader; we just need to figure out the name of the field. If we click on the shader graph we can find all of its properties in the inspector.

Shader Graph Properties

In our case, the texture is called “_Texture2D”. Now we can update the editor script to get the material and set the texture.
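The relevant addition to the editor script could look roughly like this, assuming the Wave object carries a MeshRenderer with the VertexDisplacement material on it:

    // 'target' is the component currently shown in the inspector, i.e. our Wave script.
    var renderer = ((Wave)target).GetComponent<MeshRenderer>();

    // "_Texture2D" has to match the property name shown in the shader graph inspector.
    renderer.sharedMaterial.SetTexture("_Texture2D", texture);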

To finish up the material I fiddled around with the displacement and the normal_strength in the material until I was happy. I picked a displacement value of 0.02 and a normal_strength of 2. This is how it ended up looking.

Getting there

Now it is a lot easier to tell what our compute shader is actually doing.

3.2 Adding the Gradients

Let’s begin by updating our Perlin Noise compute shader file. In the end we want to get, for each pixel, the gradient vectors of the 4 surrounding grid points. Let us update the shader to allow us to do that.

Note that we can call id.xyxy to return a 4-dimensional vector. Similarly we can use id.yx to switch x and y or call v.xz to just access the x and z coordinates of v.
We should see the following results in the editor.

Almost there

Right now we are exactly where we finished working with Mathematica. As the next step we will generate a set of gradients and pass them to the shader. We can do that with a ComputeBuffer. Sadly the Unity documentation doesn’t have a nice example of how to use Compute Buffers. But essentially you can define your own structs that you can pass to the shader. In our case we want to pass an array of 2-dimensional vectors to the shader.

When we call the constructor of the ComputeBuffer class we need to tell it how many elements it will hold and how many bytes each element occupies. Because each element is a 2-dimensional vector of floats, its size is going to be 2 times the size of a float, and we will need 256 of them. The sizeof operator is very useful for this: in case you are working with your own struct you can call sizeof(your_struct) (in an unsafe context) and you will get the right size.
The GetRandomDirection method generates a random normalised vector. We initialise 256 of them and set them in our ComputeBuffer. Finally we can pass the ComputeBuffer to the shader with the SetBuffer method.
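On the C# side that could look roughly like this (run after loading the shader and finding the kernel; the buffer name "gradients" is an assumption and simply has to match whatever the shader declares):

    // 256 two-dimensional vectors, each taking up 2 * sizeof(float) bytes.
    var gradients = new Vector2[256];
    for (int i = 0; i < gradients.Length; i++)
    {
        gradients[i] = GetRandomDirection();   // a random normalised 2D vector
    }

    var gradientBuffer = new ComputeBuffer(gradients.Length, 2 * sizeof(float));
    gradientBuffer.SetData(gradients);
    shader.SetBuffer(kernel, "gradients", gradientBuffer);

Remember to call Release on the buffer once you no longer need it, otherwise Unity will warn you about leaked compute buffers.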
Now we can update the script and use the gradients that we have just passed in.

With the upgraded shader we can now work with higher frequencies of Perlin Noise. Here is how that can look:

Remark:
With the drop-off function that we have used so far, we get ugly artefacts at the borders of the grid cells. I fixed this as mentioned earlier, by using the higher-degree polynomial instead (lines 22/23 of the shader).

3.3 The Wave Effect

Achieving the wave effect is fairly straightforward. We add multiple layers of Perlin Noise with smaller and smaller amplitudes on top of each other. Additionally we pass the elapsed time to the shader and use it to add an offset to the noise function. This will make our wave move. Here are the final editor script and compute shader. You can also check out the project on GitHub.
If you are looking for a slightly more complex project that uses compute shaders, you can look at this erosion simulation implemented in Unity.

Wave Effect

Final Editor Script:
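The exact script lives in the GitHub project linked above; as a rough approximation, the pieces developed in this article combine like this (Wave, the "gradients" and "time" property names, and the editor-update loop are my assumptions):

    using UnityEditor;
    using UnityEngine;

    [CustomEditor(typeof(Wave))]
    public class WaveEditor : Editor
    {
        const int Size = 256;

        RenderTexture texture;
        ComputeShader shader;
        ComputeBuffer gradientBuffer;
        int kernel;

        void OnEnable()
        {
            // Single-channel float texture that the compute shader writes the noise into.
            texture = new RenderTexture(Size, Size, 0, RenderTextureFormat.RFloat) { enableRandomWrite = true };
            texture.Create();

            shader = (ComputeShader)Resources.Load("PerlinNoise");
            kernel = shader.FindKernel("CSMain");
            shader.SetTexture(kernel, "Result", texture);

            // Precomputed lookup table of random unit gradients.
            var gradients = new Vector2[256];
            for (int i = 0; i < gradients.Length; i++) gradients[i] = GetRandomDirection();
            gradientBuffer = new ComputeBuffer(gradients.Length, 2 * sizeof(float));
            gradientBuffer.SetData(gradients);
            shader.SetBuffer(kernel, "gradients", gradientBuffer);

            // Feed the texture to the displacement material on the Wave object.
            var renderer = ((Wave)target).GetComponent<MeshRenderer>();
            renderer.sharedMaterial.SetTexture("_Texture2D", texture);

            // Re-dispatch continuously so the wave keeps moving in the editor.
            EditorApplication.update += Run;
        }

        void OnDisable()
        {
            EditorApplication.update -= Run;
            gradientBuffer?.Release();
        }

        void Run()
        {
            // The elapsed time offsets the noise inside the shader, which makes the wave move.
            shader.SetFloat("time", (float)EditorApplication.timeSinceStartup);
            shader.Dispatch(kernel, Size / 8, Size / 8, 1);
        }

        public override void OnInspectorGUI()
        {
            EditorGUILayout.ObjectField("Result", texture, typeof(RenderTexture), false);
        }

        static Vector2 GetRandomDirection()
        {
            float angle = Random.Range(0f, 2f * Mathf.PI);
            return new Vector2(Mathf.Cos(angle), Mathf.Sin(angle));
        }
    }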

Final Compute Shader:

FancyFennec

I am a Software Developer by day and Game Developer by night.