Wednesday, March 14, 2012

Normal Mapping

Previously, I calculated the normal of each vertex in order to do some shading. The result was quite nice, but the detail wasn’t great and made the mountains look rounded. Normal mapping is a technique whereby you create a texture which contains normal information at texture detail (as opposed to at vertex detail) and use it when calculating the lighting for a pixel. It means much higher shading detail without having to increase the number of points in the model. Both of these projects use it so I thought I’d give it a lash.

As with all these graphical things, the first go was pretty rewarding, but getting it (almost) right was very tough work! Here’s the first pic; you can see the shading is going to be much more detailed than before:

JellyEngineXna 2012-02-12 22-20-39-12

Basically there are 3 main things involved here: 1) lighting and shading, 2) creating the normal map from a height map, and 3) calculating the normals, binormals and tangents to work out the lighting. For 1) I took an HLSL sample of bump mapping from rbwhitaker's bump map tutorial, but 2) and 3) are a little more complicated. Let’s talk about 3), binormals and tangents, first.

A normal is a vector pointing straight out of (perpendicular to) the surface, and calculating it for a sphere is easy enough: just normalise the vector from the centre to the point. Binormals and tangents aren’t actually that complicated, but I found a few online articles that make them seem so, so I’ll try to simplify here. They are both vectors tangential to the sphere (so should really be called tangent & bitangent) but at 90 degrees to each other. It’s easier to visualise them as the north and east arrows on a map: the binormal says which way is north, and the tangent says which way is east.

Of course on a sphere you can calculate them as if they were north & east, but the poles get a little complicated because at the top of the north pole, nowhere is north and nowhere is east – any direction is south! Applying this to my planet resulted in some male pattern baldness at the poles:

JellyEngineXna 2012-03-09 21-45-03-45

This image shows the normals (pink), binormals (blue) and tangents (red); you can see how at the pole they become a dot and the lighting is wrong.

JellyEngineXna 2012-03-09 21-44-58-16

So how to fix this? Luckily, the planet is actually made from 6 subdivided faces of a cube, so each face can have its own north and east without ever having a north pole. For example, the face on the top of the cube has its north (binormal) pointing along the z-axis and its east (tangent) pointing along the x-axis, with no pole pinching. So the code to calculate the normal, binormal and tangent looks something like this for any point on any face:

Vector3 normal = Vector3.Normalize(point); // straight out from the centre: just the normalised position
Vector3 tangent = Vector3.Normalize(pointToTheLeft - pointToTheRight); // "east" across the face
Vector3 binormal = Vector3.Normalize(pointAbove - pointBelow); // "north" up the face
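
As an aside, once you have those two tangential directions you can also get the normal from them with a cross product, which is the way to go when the surface isn’t a sphere and you can’t just normalise the position. A rough sketch (swap the cross arguments if the result points into the surface for your winding):

// the normal is perpendicular to both tangential directions
Vector3 crossNormal = Vector3.Normalize(Vector3.Cross(binormal, tangent));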

At the edges there’s a problem, e.g. there’s no point above the top, but this can be resolved with skirting – I’ll post about that next time because I haven’t gotten it right yet… although that’s never stopped me before!! Also, for non-spherical models the normal would be worked out in relation to its neighbours (roughly as in the sketch above), but this simple example will do for now. Here’s what this looks like with a standard normal map (a bump); there are multiple lines in most places because each face and each sub-patch calculates its own normals, binormals and tangents for shared points:

JellyEngineXna 2012-03-13 00-14-36-17

Ok so that’s binormals and tangents; now how to work out the normal map dynamically given a height map? Luckily for me there’s a thing called a Sobel filter, which is a little bit of maths that works out what the normal should be at any point. To simplify it a bit: to work out the normal of a pixel, it gets the y-magnitude (binormal) by looking at the heights above and below, and the x-magnitude (tangent) by looking at the heights to the left and right. Even more luckily for me (and my shader phobia), Catalin Zima blogged some HLSL code that implements the Sobel filter. Unfortunately it wasn’t that easy; when I tried it out I got some strange results:

JellyEngineXna 2012-03-10 17-29-56-40

Each face on its own looks ok but they obviously differ at the borders … what’s going on?? This problem was a real headwrecker, and it just highlights that you can’t always plug things in from various sources & expect them to work without having a proper understanding of what’s going on. A hint is that the normal map created by Catalin’s code as shown in the first image of this post is generally green while most other examples of normal maps are generally purple. It boils down to one place where the dx and dy components are put in the red and blue channels whereas usually they are put in the red and green channels of the pixel:

float4 N = float4(normalize(float3(dX, 1.0f / normalStrength, dY)), 1.0f);

This is absolutely fine if, when you calculate the lighting, you read x and y back from those same channels, but it breaks down when the lighting shader expects x and y to be in different channels.

I changed this to use the more conventional channels and added in a level parameter to halve the normalStrength when calculating a normal map for a subdivided patch (it’s the same area image but half the normal intensity). Here’s the full normal map creation shader I used, adapted from Catalin’s:


// adapted from http://www.catalinzima.com/tutorials/4-uses-of-vtf/terrain-morphing/
float normalStrength = 0.5;
float texelWidth = 1; // set from the app as 1f / textureSideLength
int level = 1;        // the patch subdivision level

float4 PSNormal(vertexOutput IN): COLOR
{
    // sample the eight neighbouring heights around this texel
    float tl = abs(tex2D(heightMapSampler, IN.texcoord.xy + texelWidth * float2(-1.0, -1.0)).x); // top left
    float l  = abs(tex2D(heightMapSampler, IN.texcoord.xy + texelWidth * float2(-1.0,  0.0)).x); // left
    float bl = abs(tex2D(heightMapSampler, IN.texcoord.xy + texelWidth * float2(-1.0,  1.0)).x); // bottom left
    float t  = abs(tex2D(heightMapSampler, IN.texcoord.xy + texelWidth * float2( 0.0, -1.0)).x); // top
    float b  = abs(tex2D(heightMapSampler, IN.texcoord.xy + texelWidth * float2( 0.0,  1.0)).x); // bottom
    float tr = abs(tex2D(heightMapSampler, IN.texcoord.xy + texelWidth * float2( 1.0, -1.0)).x); // top right
    float r  = abs(tex2D(heightMapSampler, IN.texcoord.xy + texelWidth * float2( 1.0,  0.0)).x); // right
    float br = abs(tex2D(heightMapSampler, IN.texcoord.xy + texelWidth * float2( 1.0,  1.0)).x); // bottom right

    // Compute dX using the Sobel kernel:
    // -1  0  1
    // -2  0  2
    // -1  0  1
    float dX = tr + 2.0*r + br - tl - 2.0*l - bl;

    // Compute dY using the Sobel kernel:
    // -1 -2 -1
    //  0  0  0
    //  1  2  1
    float dY = bl + 2.0*b + br - tl - 2.0*t - tr;

    // x and y go into the red and green channels; the z ("up") component comes from
    // normalStrength, adjusted by the patch subdivision level
    float4 N = float4(normalize(float3(dX, dY, 1.0 / (normalStrength * (1 + pow(2, level))))), 1.0);
    // was: float4 N = float4(normalize(float3(dX, 1.0f / normalStrength, dY)), 1.0f);

    // pack from [-1, 1] into [0, 1] so it can be stored in the texture
    return N * 0.5 + 0.5;
}
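
The normalStrength, texelWidth and level globals are effect parameters, so they’d be set from the C# side before the normal map is drawn; roughly like this (normalEffect, heightMapTextureSize and patchLevel are just illustrative names):

// sketch: fill in the shader's globals from the app before drawing the normal map
normalEffect.Parameters["normalStrength"].SetValue(0.5f);
normalEffect.Parameters["texelWidth"].SetValue(1f / heightMapTextureSize); // 1 / textureSideLength
normalEffect.Parameters["level"].SetValue(patchLevel);                     // the patch subdivision level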

This gives some purdy results, but there’s still more to do (you can see a faint dividing line down the centre of the first image because normals aren’t calculated correctly at patch edges, so I’ll need to skirt around a patch to get the values past the borders):

JellyEngineXna 2012-03-14 12-23-13-95


JellyEngineXna 2012-03-14 12-28-23-15

FPS and video memory are getting hit hard, so I’ll clean up the code & make sure I’m not doing anything I don’t need to. Next I’ll do skirting so I can get rid of those faint but pesky dividing lines.

Saturday, January 28, 2012

Textured Procedural Planet

…aka back to square one, albeit the terrain is round this time, so back to round one? I added some textures to the planet, generated on the GPU; they look ok but there’s still lots of work to do. First some outtakes: it looked like a load of shite at the start…

JellyEngineXna 2012-01-27 19-15-05-88JellyEngineXna 2012-01-27 19-15-16-54

… then a bit like an old leather ball…

JellyEngineXna 2012-01-27 19-23-18-91JellyEngineXna 2012-01-27 19-26-42-21

… then all contoury …

JellyEngineXna 2012-01-27 19-30-04-16JellyEngineXna 2012-01-27 19-29-49-84

… (which is actually not too bad looking) then it started to come together a bit …

JellyEngineXna 2012-01-27 19-35-58-32

The problem with this though is that the source texture is applied at the same scale for each patch, so there’s a noticeable jump when changing levels: a higher level might show a rock, but when it subdivides there are 4 rocks. This gif shows the gifferences when approaching; it looks like the centre patch uses a completely different texture.

texturemonoscaleprob

So it made sense to scale them according to the level (texCoords *= pow(2, maxlevels - patchlevel)), but this introduced some pretty nasty artefacts when zoomed out due to tiling the same texture over and over. Applying mipmapping to the texture I was using helped a bit but it’s still very obvious.
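
For reference, that per-level scaling only amounts to a couple of lines; a rough sketch with illustrative names:

// sketch: scale the patch's texture coordinates by how far the patch is from the deepest
// level, so the source texture covers roughly the same world-space size at every level
float textureScale = (float)Math.Pow(2, maxLevels - patchLevel);
Vector2 scaledTexCoord = baseTexCoord * textureScale;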

JellyEngineXna 2012-01-28 00-26-54-35

The transitions looked very layered so I added a bit of noise:

JellyEngineXna 2012-01-28 12-51-12-78JellyEngineXna 2012-01-28 12-51-43-54

All in all it’s not bad for a first go but could be a lot better. I was having fun flying about the place so I made a little video:

 

…then noticed a problem I had neglected from before; it’s obvious in the lines going through the brown pit on the right at 0:34.

JellyEngineXna 2012-01-28 12-19-10-78

I thought it was cracking at first because it only seemed to occur when creating other patches, but on further investigation it wasn’t cracking; it was something to do with textures. Another gif to show the gifferences…

pointlinearclampproblem

The points are fine; it’s the texture on top that’s creating the artefact. It ended up being a leftover setting from creating patches. To get the heights from the GPU I draw the noise for each vertex in a patch to an image and read it back to reset the vertices, and so that neighbouring patches end up with the same height values I draw it as a point-clamped list of lines. A subsequent draw would then still be using the point-clamped sampler and create the artefact. Looking for a fix led to some good advice: don’t assume sampler states when drawing, explicitly set them every time you need them.
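
In XNA terms that just means setting the sampler state explicitly around each draw instead of relying on whatever the previous pass left behind; a rough sketch (the pass names in the comments are mine):

// sketch: don't assume the sampler state left over from a previous draw
GraphicsDevice.SamplerStates[0] = SamplerState.PointClamp; // height read-back pass wants exact texel values
// ... draw the per-vertex noise/heights to the render target ...
GraphicsDevice.SamplerStates[0] = SamplerState.LinearWrap; // normal textured rendering
// ... draw the textured patches ...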

I’ve found a few other very impressive procedural planet projects so I might trawl their blogs to see if they mention how to do textures so well. So for next time…

  • Adding noise to the texture isn’t great and needs a good bit of tweaking. I could try a look-up table & incorporate slope as in this old post from Ysaneya (Infinity).
  • I need to hide the tiling somehow; it comes up and is dealt with in the old post from Ysaneya above, but the description is just “lighting, shadowing, other effects”.
  • I’ll need a better noise algo; the one I’m using is very homogeneous (“samey”). This project called lithosphere might help find a good one quickly.
  • The FPS struggles whenever patches are being created & I’m not showing that many polys, so I might need to do some optimisation. I pass data around a lot between CPU and GPU, so I think these articles by Shawn Hargreaves might help find out whether I’m CPU or GPU bound and try to balance both.

Lots to do!

Tuesday, January 17, 2012

Less Abnormal…

This problem took me MONTHS to resolve. First I had to laze about and ignore it for about 6 months until a new year’s resolution made me look at it again. The problem is that each patch is created independently (i.e. it might not have any neighbours yet), so when calculating the normal vectors at the edges the calculation is biased towards the patch itself. It’s kinda hard to explain in words so hopefully a diagram will help:

image

A normal is a normalised vector perpendicular to the surface at a certain point. It might help if you imagine walking on the surface: at peaks and troughs you’ll be upright, but on slopes you’ll have to compensate by leaning in (or fall over, whatever floats your boat). In this example I’m trying to work out what the normals are for the green “patch” (the green V) and I don’t yet have the blue or the red patch, so I have to calculate a normal (pink arrows) for each of the 3 points in the green patch using just those 3 points.

I can work out the centre normal fully using the 2 adjacent points to the left and the right. However, the left and right normals only have 2 points available for their calculation (left & centre for the left normal, right & centre for the right normal), so they end up perpendicular to the left and right vectors. When the red patch comes along it happens to work out well: the normal on the right indicates a slope. Unfortunately, when the blue patch comes along it will look wrong: the left normal should be pointing straight up because it’s a peak, not a slope. To make matters worse, when the blue patch works out its right normal it will look like a slope in the opposite direction, so lighting will shade the slopes with a very sharp divide at the top.

To solve this I could redo neighbouring normals whenever a new patch is generated, but that would mean lots of rewriting of the vertex buffer and would be visually obvious until the neighbouring patch was created. The way I opted for was to extend the patch area by one cell (equivalent to getting the middle point of the blue and red patches) and work out normals for all the inner points (the same number of points as in the unextended patch); there’s a rough sketch of this below the next pair of screenshots. It worked out pretty well; here’s a before and after shot where the latitude lines disappear:

JellyEngineXna 2012-01-17 21-35-20-06

JellyEngineXna 2012-01-17 21-35-25-64
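
The extended-patch idea boils down to something like this simplified sketch (illustrative names, using central differences rather than averaged triangle normals; swap the cross arguments if the normals point the wrong way for your winding):

// positions covers the patch extended by one cell on every side, so it is
// (n + 2) x (n + 2) while the real patch is n x n
for (int y = 1; y <= n; y++)
{
    for (int x = 1; x <= n; x++)
    {
        Vector3 across = positions[x + 1, y] - positions[x - 1, y];
        Vector3 up = positions[x, y + 1] - positions[x, y - 1];
        // every inner point has all four neighbours available, so no edge bias
        vertices[x - 1, y - 1].Normal = Vector3.Normalize(Vector3.Cross(up, across));
    }
}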

And a close up with pink normals. On the peak in the before shot there’s divergence, with normals at the same point going in different directions. In the after shot the normals on the peak look like one normal, but really they are 4 different normals all pointing in the same direction:

JellyEngineXna 2012-01-17 21-40-58-54

JellyEngineXna 2012-01-17 21-41-10-70

It’s a bit less obvious, but on the mountain side you can see that some lines in the before shot, caused by divergent normals where patches meet, disappear in the after shot. I made a gif to show the gifferences but the 256 colour palette isn’t the best:

peak

I think I’ll have a little look at texturing next, to get the planet looking like the terrain I had SEVERAL years ago… pretty shameful thinking this is progressing so slow but sure I’ll keep tapping away at it, one day at a time*.

*time may be anything from 6 months to a year or two!