Sunday, July 24, 2011

L-System Trees

I continued working with the Algorithmic Beauty of Plants book: I implemented the parametric and context-sensitive L-systems from chapter 1, made the trees from 3D tubes (instead of lines) and tried some of the examples from chapter 2, Modeling of trees. I had some success but couldn’t reproduce some of the models, mainly the section with tri-branching trees.
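Representing the parametric rules turned out to be the easy part: a production is just a symbol, an optional condition on its parameters, and a function that builds the replacement modules. Here’s a minimal sketch of the kind of thing I mean in C# (the Module/Production types and names are illustrative, not my actual code):

using System;
using System.Collections.Generic;
using System.Linq;

class Module
{
    public char Symbol;
    public float[] Params;
    public Module(char symbol, params float[] p) { Symbol = symbol; Params = p; }
}

class Production
{
    public char Predecessor;                              // symbol this rule rewrites
    public Func<Module, bool> Condition;                  // parametric condition, e.g. l > minLength
    public Func<Module, IEnumerable<Module>> Successor;   // builds the replacement modules

    public bool Matches(Module m)
    {
        return m.Symbol == Predecessor && (Condition == null || Condition(m));
    }
}

static class LSystem
{
    // One derivation step: rewrite every module in parallel, copying unmatched modules through.
    public static List<Module> Step(List<Module> current, List<Production> rules)
    {
        var next = new List<Module>();
        foreach (var m in current)
        {
            var rule = rules.FirstOrDefault(r => r.Matches(m));
            if (rule != null) next.AddRange(rule.Successor(m));
            else next.Add(m);
        }
        return next;
    }
}

Context-sensitivity just means Matches also looks at the neighbouring modules (skipping over bracketed branches), and the turtle interpreter turns the final string of modules into the 3D tubes.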

Here are my results of the rules on page 56, varied with the parameters on page 57:

p56

And here’s what the original ones from the book look like:

p56original

They look pretty similar, apart from (b). Note that in (b) the height of my version is the same as the other trees, but in the book it is much shorter. This is odd because the table on page 57 modifies only the branch contraction rate (r2), not the trunk contraction rate (r1), which is always 0.9, so they should all be the same height.

In fact, it’s not clear what starting length gives the images above: the axiom sets the length to 1 and the width to 10, and if the length-to-width ratio really were 1:10 the result would be a lot more squat, like this:

JellyEngineXna 2011-07-24 15-10-02-33

And it’s still pretty squat at 10:10:

JellyEngineXna 2011-07-24 15-10-26-57

I found a ratio of 100:10 worked best to yield results similar to the book.

The experiments on page 59 looked pretty much identical to the book’s (again using a ratio of 100:10, so an initial axiom of A(100,10) instead of A(1,10)). Here’s mine:

p59

And here’s the book’s:

p59original

I had the least luck reproducing the examples in the Ternary branching section. In the book, (a) is shown as:

image

… which has a degree of irregularity… I don’t know where the model gets this from, as there is no stochasticity specified and the tropism applied is straight down (like the force of gravity), so why would the branch on the right bow down more than any other? My result is a much more regular model:

p60a
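For reference, this is roughly how I apply tropism to the turtle heading after each forward step. It’s a sketch of the ABOP formulation (rotate the heading H towards the tropism vector T by an angle proportional to |H x T|, scaled by the susceptibility e) using XNA’s Vector3/Matrix types; the names are mine. With T pointing straight down, every branch gets bent by exactly the same rule, which is why I can’t see where the asymmetry would come from:

// Microsoft.Xna.Framework types
static Vector3 ApplyTropism(Vector3 heading, Vector3 tropism, float e)
{
    // alpha = e * |H x T|, rotating about the axis H x T
    Vector3 axis = Vector3.Cross(heading, tropism);
    float len = axis.Length();
    if (len < 1e-6f)
        return heading;                 // heading already parallel to the tropism vector

    float angle = e * len;
    axis /= len;
    Matrix rotation = Matrix.CreateFromAxisAngle(axis, angle);
    return Vector3.Normalize(Vector3.TransformNormal(heading, rotation));
}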

The closest I got in the ternary branching section was (b), although I had to fiddle with a few parameters (setting the width increase rate to 1.432 instead of the 1.732 given). Here’s the version in the book:

image

And here’s my result:

p60b vr1.432

(c) and (d) were real headaches… I can’t get these to look anything like the ones in the book… I’m pretty sure something’s missing or the parameters specified are wrong, especially the elongation rate (lr) for (c): on page 61 it’s defined as 1.790, while all the others are 1.1 or less. Plugging this into my L-system/turtle interpreter results in a very tall tree (looking up at it in the first image, and looking at it head-on but zoomed out far enough to see the whole tree in the second):

JellyEngineXna 2011-07-24 15-38-40-99

JellyEngineXna 2011-07-24 15-39-01-39

I’m pretty sure my coding is right on this one: if you take the start piece from the axiom, which is defined as F(200), and apply p2 (F(l) -> F(l*lr)) 8 times with lr set to 1.790, you get 200*(1.790^8), which is about 21079. For the other trees lr is 1.109, and the base trunk height after 8 iterations of p2 would be 200*(1.109^8), which is about 457. However, the image in the book shows the tree at the same height and roughly the same proportion to its width as the other examples:

image
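Just to double-check that arithmetic (a throwaway snippet, nothing beyond System.Math):

using System;

class TrunkHeightCheck
{
    static void Main()
    {
        double tall  = 200 * Math.Pow(1.790, 8);   // ~21079: the very tall trunk I get for (c)
        double other = 200 * Math.Pow(1.109, 8);   // ~457: the trunk height of the other examples
        Console.WriteLine("{0:F0} vs {1:F0}", tall, other);
    }
}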

If I muck about with the parameters a bit I can get something that looks a bit more like it:

JellyEngineXna 2011-07-24 15-47-56-31

… but it’s not quite as nice. (d) is the same story really; I can’t reproduce it from the parameters given… :(

I’m *pretty* sure I’m right, but this book has been out since 1990 and there’s nothing about it in the errata. I tried to find these models in the virtual lab available from algorithmicbotany.org but had no luck, so I might fire off an email and see if anyone associated with the book can help me out…

Other than that, I implemented horizon culling on the planet with a lot of help from an article at crappy coding, redid some of my basic rotation stuff (I had everything mirrored through the z axis), and tried to streamline how new patches are formed on the planet (it didn’t work!). I’m not sure what to do next; I had a look at some demoscene programs and am now utterly depressed at how little I’ve done! I might try generating normal maps from the heightmap of a planet patch for lighting, maybe do atmospheric scattering, texturing of the planet and/or trees, adding leaves maybe? Hiring someone else to do this for me?? :)

Tuesday, June 14, 2011

GPU Planet Update

Jeez, I can’t believe it’s been almost 2 years since I last did anything with this… I picked it up again a few weeks ago, tried to get back into it, made a little progress and thought I’d share.

A major problem getting back into it was migrating from XNA 3.1 to XNA 4 – there are lots and lots of changes, and one of the worst I found was the restrictions placed on shaders by the Reach and HiDef profiles. I found an article which described how to target Reach and HiDef with the same project, which was very handy; otherwise I’d have had to develop 2 separate projects in parallel if I wanted to use both of my laptops for developing (one’s far more portable than the other but is limited to Reach). So with a lot of help from this cheat sheet I got my project running again, and tried to learn a bit about shaders to see if I could get GPU noise working on shader models below 3_0.

In the process, I used a ping-pong technique to create Conway’s Game Of Life simulation (much better in 1080p):
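The ping-pong part boils down to two render targets that swap roles every frame: the shader reads the previous generation as a texture and writes the next generation into the other target. A sketch in XNA 4 terms (the Game of Life effect and the names are my own placeholders):

RenderTarget2D current, previous;    // same size, created once at start-up

void StepLife(GraphicsDevice device, SpriteBatch spriteBatch, Effect lifeEffect)
{
    // Render the next generation into 'current'; the life rule's pixel shader
    // samples the sprite texture (the previous generation) via sampler register 0.
    device.SetRenderTarget(current);
    spriteBatch.Begin(SpriteSortMode.Immediate, BlendState.Opaque,
                      SamplerState.PointClamp, null, null, lifeEffect);
    spriteBatch.Draw(previous, device.Viewport.Bounds, Color.White);
    spriteBatch.End();
    device.SetRenderTarget(null);

    // Swap: this frame's output becomes next frame's input.
    RenderTarget2D tmp = current; current = previous; previous = tmp;
}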

The next stage was to get back to the original goal of having the GPU generate all the noise, as I’d found that doing the noise in software was far too slow.

The plan was to pass the vertices to the shaders, let them calculate the noise, then get back a heightmap I can use to update the vertices. One of the problems was that the vertices form a curved surface and I need to flatten it for the pixel shader to give me back a regular heightmap. As each of my patches has texture coordinates that go from 0,0 to 1,1, I can create a flat image the pixel shader can work with by using the texture coordinate to generate the position in clip space, and storing the original position (which the shader will use to get the noise) in a texture coordinate. So, for example, a point at 123, 234, 345 with texture coordinate 0.5,0.5 would end up in clip space at position 0,0 (clip space goes from –1 to 1), with the original position stored as a texture coordinate. There’s also a problem with half-pixel offsets, as Drilian describes. In the end my shaders look something like this:

struct vertexInput {
    float4 position : POSITION;
    float2 texcoord : TEXCOORD;
};

struct vertexOutput {
    float4 hPosition : POSITION;   // clip-space position built from the texture coordinate
    float3 wPosition : TEXCOORD1;  // original (curved) position, passed through for the noise
};

float texel;   // half-pixel offset, as per Drilian's post

vertexOutput VS_flat(vertexInput IN)
{
    vertexOutput OUT;

    // Map the 0..1 texture coordinate onto the -1..1 clip-space quad,
    // nudged by half a pixel so texels and pixels line up.
    float clipx = 2 * IN.texcoord.x - 1;
    float clipy = 2 * -IN.texcoord.y + 1;
    OUT.hPosition = float4(clipx - texel, clipy + texel, 0, 1);

    // Keep the original position; the pixel shader uses it to look up the noise.
    OUT.wPosition = IN.position.xyz;

    return OUT;
}

float4 PS_noise(vertexOutput IN) : COLOR
{
    float3 p = IN.wPosition;
    return inoise(p) * 0.5 + 0.5;   // pack the -1..1 noise into 0..1 for the heightmap
}



And this technique let me create fBm on a ps_2_0 pixel shader by adding each octave of the fBm in a separate ping-pong pass:
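Driving it from the C# side is the same ping-pong idea, once per octave: each pass renders the flattened patch, samples the running total from the previous target, adds one octave at the current frequency and amplitude, and writes into the other target. Roughly like this (a sketch; the effect parameter names and the patch vertex/index arrays are placeholders for my own):

void AccumulateFbmOctaves(GraphicsDevice device, Effect noiseEffect,
                          VertexPositionTexture[] patchVertices, short[] patchIndices,
                          ref RenderTarget2D src, ref RenderTarget2D dst,
                          int octaves, float lacunarity, float gain)
{
    float frequency = 1.0f, amplitude = 1.0f;
    for (int i = 0; i < octaves; i++)
    {
        device.SetRenderTarget(dst);
        noiseEffect.Parameters["previousOctaves"].SetValue(src);   // running total so far
        noiseEffect.Parameters["frequency"].SetValue(frequency);
        noiseEffect.Parameters["amplitude"].SetValue(amplitude);

        foreach (EffectPass pass in noiseEffect.CurrentTechnique.Passes)
        {
            pass.Apply();
            device.DrawUserIndexedPrimitives(PrimitiveType.TriangleStrip,
                patchVertices, 0, patchVertices.Length,
                patchIndices, 0, patchIndices.Length - 2);
        }
        device.SetRenderTarget(null);

        // Ping-pong and move on to the next octave.
        RenderTarget2D tmp = src; src = dst; dst = tmp;
        frequency *= lacunarity;
        amplitude *= gain;
    }
}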





I used this in my original project and tried to update the vertices with the heightmap values generated by the shader. This didn’t work at first: I ended up with lots of cracks because of the way the pixel shader interpolates.
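The readback itself is just a GetData on the render target once it’s no longer set on the device, then pushing each vertex out by its height. A sketch, assuming one heightmap pixel per patch vertex in the same order (heightmap, patchVertices, planetRadius and maxHeight are my own names):

void ApplyHeightmap(RenderTarget2D heightmap, VertexPositionTexture[] patchVertices,
                    float planetRadius, float maxHeight)
{
    Color[] heights = new Color[heightmap.Width * heightmap.Height];
    heightmap.GetData(heights);   // works in XNA 4 once the target is unbound

    for (int i = 0; i < patchVertices.Length; i++)
    {
        float h = heights[i].R / 255f;                               // the shader packed the noise into 0..1
        Vector3 dir = Vector3.Normalize(patchVertices[i].Position);  // push out from the planet centre
        patchVertices[i].Position = dir * (planetRadius + h * maxHeight);
    }
}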



JellyEngineXna 2011-06-11 20-31-45-30



The upper-left point (at tex coord 0,0) would have the correct noise value, but I couldn’t get the lower-right point to draw the correct noise value at tex coord 1,1, so when two patches met they would have different height values for the same vertex. In the end I managed to get around it by rendering with a line strip instead of a triangle strip. This picture shows the same noise values along the edges of the patches:



JellyEngineXna 2011-06-14 10-47-12-57



So, although it looks like I’ve made almost no progress since about 3 years ago (yikes!), I have! … made almost no progress, that is :) The main difference is that this planet can render at run time – all the other planets needed some sort of pre-rendering. Here’s a little fly-around at 17x17 vertices per patch so you can see the popping and subdividing:





In the end… it might be better to generate all the noise every time, for the vertices and for the pixels; caching the vertex info and textures is very heavy on memory. Hmm, what’ll I work on next for an update in 2013…

Thursday, December 24, 2009

Blooming Starfields

It took AGES to add bloom. I used an XNA tutorial for bloom, and in order to do the various passes (pick the brightest spots, blur them, recombine with the original image) it called ResolveBackBuffer to copy the back buffer to a texture for the next pass. I don’t fully understand it yet, but it seems this call invalidates the back buffer, so subsequent shaders write as if they were the first. I’ll try to illustrate what I mean by showing the problems I had. My plan was to draw the stars & clusters first, bloom that, then add the nebula.

Here’s what the result was like without the bloom stage; the nebula is added to the background stars (no bloom) as expected:

starfield without bloom

And here it is with the bloom stage included in the middle; it seems to be overwritten by the nebula stage:

nebula overwrites starfield with bloom

In the end, I did a ResolveBackBuffer to a ResolveTexture2D after the bloom stage and drew that to the back buffer before calling the fBm/nebula shader. This finally gave me what I was looking for.

starfield with bloom
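For the record, the draw order that finally worked looks roughly like this (a sketch in XNA 3.1 terms; DrawStarfield, DrawNebula and bloomComponent are placeholders for my own code, with the bloom passes coming from the XNA bloom sample):

void DrawScene(GameTime gameTime)
{
    // 1. Stars and clusters straight to the back buffer.
    DrawStarfield();

    // 2. Bloom: extract the bright spots, blur, recombine (the sample does its own
    //    ResolveBackBuffer calls internally for these passes).
    bloomComponent.Draw(gameTime);

    // 3. Copy the bloomed back buffer to a texture before anything else stomps on it.
    device.ResolveBackBuffer(resolveTexture);    // resolveTexture is a ResolveTexture2D

    // 4. Draw that copy back as the base image...
    spriteBatch.Begin(SpriteBlendMode.None);
    spriteBatch.Draw(resolveTexture, Vector2.Zero, Color.White);
    spriteBatch.End();

    // 5. ...and finally add the fBm nebula on top of it.
    DrawNebula();
}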

I’m still not 100% happy with the results: I had imagined the bloomed background stars would really light up the nebula, but they actually seem dimmer than the stars outside of it. I might try using the same fBm noise result to place some stars so they sit in the densest parts of the nebula.

The main thing I’m not happy with, though, is that the code is a total mess; I just kept hacking and hacking away until I got results. I’m sure a team of monkeys would have produced neater code in less time. It’ll be a while before I get these images into a skybox – so until then, HAPPY CHRISTMAS!

Procedural Starfield Texture (First Go)

I decided to try my hand at a procedural starfield. I figured it’d be an easy enough shader to do (I’m still a total beginner), plus I’ll need the same techniques to generate cached images of planet patches (the previous GPU noise posts were generated per-pixel every frame). So I started off with some blue fractal Brownian motion noise for the background:

stars_fBm

Then I added some code to make simple noise white wherever it was over a certain threshold. I thought these spots would appear as stars, but it was all blobby:

stars_fBmn

So then I scaled it by passing p*scale instead of p to inoise:

stars_fBmsn

Not bad… but the stars (little blobs) are too uniform, so I added another layer of noise at a second scale (bigger blobs) so it looks like brighter stars or clustering:

stars_fBmsnc

It looks ok… a bit lame though… The problems I see are that the stars have aliasing artefacts and the scene doesn’t… impress. Zoomed in, the stars look crap, like Tetris blocks or something:

stars aliased unimpressive

I think it needs some effects like glow or bloom or MORE BLOOM! So I’ll look into that next.

Here’s the shader code so far, pretty simple stuff:

vertexOutput VS(vertexInput IN)
{
    vertexOutput OUT;
    OUT.hPosition = IN.position;
    OUT.texcoord  = IN.texcoord * noiseScale;
    OUT.wPosition = IN.position.xyz * noiseScale;   // scaled position fed to the noise functions
    return OUT;
}

float4 PS_test(vertexOutput IN): COLOR
{
    float3 p = IN.wPosition;

    // Background: blue-tinted fBm.
    float res = fBm(p, oct, lac, gain);
    float4 color = res;
    color.rg = 0; // make it blue
    color.a = 1;

    // Small-scale noise over a threshold becomes stars...
    float star = inoise(p * speckleScale);
    if (star > speckle)
        color = lerp(color, star, 0.9);

    // ...and a second, larger scale gives brighter stars / clustering.
    float starcluster = inoise(p * speckleScale2);
    if (starcluster > speckle2 + 0.1)
        color = lerp(color, starcluster, 0.9);

    return color;
}

technique test
{
    pass p0
    {
        VertexShader = compile vs_3_0 VS();
        PixelShader  = compile ps_3_0 PS_test();
    }
}

Monday, December 21, 2009

GPU Perlin Noise

I updated my DirectX before starting and somehow my MDX project became all liney, like this:

Uhhh

Because of this, and because most of the online samples I find to plagiarize (er, be inspired by) are usually in XNA, I decided to convert. Unfortunately there’s no automatic conversion tool, so I had to open a new project, add my old files and try to fix up the errors. Some things were annoying, like the lack of absolute mouse movement and having to load a font from a file (the size is fixed in the file), but overall the classes are better named and organised. Here’s my first shader on the planet in XNA – stripes:

stripey

So after a quick crash course on shaders, I found Perlin had published HLSL code for Improved Perlin noise in GPU Gems 2 – all of which is free online! I had real trouble getting it going, however: every texture I generated was blank, and I didn’t know enough about HLSL to figure out why. There seemed to be a tutorial on ziggyware, but ziggyware’s been under attack from hackers so I couldn’t get at it. I tried out some other projects like Drilian’s, but again the noise didn’t work in the XNA version I made. Eventually I found a post where someone had similar problems, which led me to the Google cache of the XNA GPU Perlin noise tutorial by Patrick of recreationstudios (he’s made some amazing progress since I deserted this project 2 years ago). All I needed to do was initialise the textures that all the noise functions use. Perlin had HLSL functions to do this, but I hadn’t a clue how to call them (supposedly you can’t in XNA), so I moved the HLSL code to C#, created the textures on the CPU and set them on the GPU. Patrick’s tutorial explains this much better (prize-winningly better, apparently) so I won’t go into details.
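The gist of the CPU-side initialisation is filling a small texture with the permutation table and handing it to the effect. Something like this sketch in XNA 3.1 terms (the parameter name, and whether you use Perlin’s fixed table or a random shuffle, depend on the HLSL you ported; the gradient texture is set up the same way):

// Needs System.Linq for Enumerable. The shader should sample this with point filtering.
void InitialiseNoiseTextures(GraphicsDevice device, Effect noiseEffect)
{
    Random rng = new Random(1234);
    int[] perm = Enumerable.Range(0, 256).OrderBy(x => rng.Next()).ToArray();

    // 256x1 texture with the permutation value packed into the red channel.
    Color[] data = new Color[256];
    for (int i = 0; i < 256; i++)
        data[i] = new Color(new Vector4(perm[i] / 255f, 0, 0, 0));

    Texture2D permTexture = new Texture2D(device, 256, 1, 1, TextureUsage.None, SurfaceFormat.Color);
    permTexture.SetData(data);

    noiseEffect.Parameters["permTexture"].SetValue(permTexture);   // parameter name depends on the HLSL
}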

So here are some screens of different types of Perlin noise on the planet:

Noise

NoiseRidgedMF

In these shots the noise is being calculated per-pixel each frame, based on the geometry. Surprisingly, getting closer to the planet so that the level of detail is higher (the 4 child patches are drawn instead of the parent patch) didn’t change the surface visibly. This had to do with the fact that there isn’t much geometry difference between the parents and the children; when I changed the patch size from 33x33 to 5x5 there was definite popping in the textures.

Here’s a video of flying about a bit:

Update: shading every frame means I can animate!

Wednesday, December 9, 2009

Procedural LOD Planet Textures

I got a rough draft of procedural LOD planet textures working and thought I’d share some screenshots. The main difference from the last post is that this planet subdivides its patches and generates a new texture for each new patch as the camera gets closer, whereas the planet in the last post was static (it only generated vertices and textures at start-up). Currently all the textures are created in software, so it’s hella slow – too slow to play with, really. Once a texture is generated and cached it’s very fast (capped at 60fps), but generating on the fly, even with a texture size of only 17x17 pixels (the same as the number of vertices per patch), is jerky (there’s a rough sketch of the per-pixel loop after the screenshots). Here are 6 levels of 256x256 textures zooming in on the one spot:

ProceduralTextureLevel0

ProceduralTextureLevel1

ProceduralTextureLevel2

ProceduralTextureLevel3

ProceduralTextureLevel4

ProceduralTextureLevel5
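The per-pixel loop behind each new patch texture is roughly this (a simplified sketch; fBmNoise, pixelToSphere and the Vector3 type stand in for my actual noise and patch-mapping code), which is why doing it on the CPU for every new patch hurts:

// One full fBm evaluation per pixel, per patch, all on the CPU.
float[,] GeneratePatchHeights(int size,
                              Func<int, int, Vector3> pixelToSphere,
                              Func<Vector3, float> fBmNoise)
{
    var heights = new float[size, size];
    for (int y = 0; y < size; y++)
        for (int x = 0; x < size; x++)
            heights[x, y] = fBmNoise(pixelToSphere(x, y)) * 0.5f + 0.5f;   // map -1..1 to 0..1
    return heights;
}
// ...the result then gets copied into a texture and cached against the patch.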

Next steps are to try to speed it up so it’s acceptable to play with, so I’ll probably do 2 things: use EQATEC to profile my app, and do the image generation in hardware. But before I do that, I somehow messed up the geometry so it winds clockwise instead of anti-clockwise… or the other way around… whatever I did, I’ll try to undo it.
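At least the winding mix-up is easy to poke at: flip (or disable) culling while debugging and see which way the faces actually wind. In MDX terms:

// The default is counter-clockwise culling (clockwise winding is front-facing), so
// flipping it makes it obvious which way my triangles ended up wound.
device.RenderState.CullMode = Direct3D.Cull.Clockwise;
// device.RenderState.CullMode = Direct3D.Cull.None;   // or just draw both sides while debugging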

Update: Here’s a video. Fraps and Windows Movie Maker seem to work well with YouTube; my previous efforts at uploading video turned out horrible! I’m only running the textures at 17x17 so the video runs smoothly (i.e. I know it looks like a big steaming pile of sh!t).

Monday, December 7, 2009

Seams Like Years…

It’s been a while since I updated this… a really long while, but I’m going to try to get back into it again. My current problem is something I never really fixed the last time: seams. Here’s an example:

Seams

Weirdly enough, if I save the textures I generate for the faces, they line up fine with no seam (the darkish hole should line up with the pit above):

NoSeamsOnLeftBackAndRightTextures

So, having changed my texture generator around a bit, I’m 100% sure it’s not my texture code or some sort of floating-point error, but how MDX is texturing the faces. There must be a way to get MDX to just display the bitmap I give it and not mess up the edges…

Update 1: I had some mag & min filtering going on which was causing some (but not all) of my seam problems. When I got rid of this code:

// device.SamplerState[0].MinFilter = Direct3D.TextureFilter.Anisotropic;
// device.SamplerState[0].MagFilter = Direct3D.TextureFilter.Anisotropic;



It looks like this now (still a problem with the top and bottom patches, but going from front to left to back to right and around again is seamless now):



FilterSeamsFixed





I’m not sure what’s causing it, but you can see that from front to top to back to bottom the textures I’m generating seem to be slightly offset – the image on the right offsets them in the correct direction and they match up much better:



nooffset offset



Update 2: There were some errors in how I collated the heights from the sub-patches. Now top to bottom match up perfectly, and I have no seams!



NoSeams



NoSeamsTextures





Next, it’s back to the adaptive planet (the one that properly changes its patches depending on LOD) and an attempt at making the textures in hardware.