The Developer’s Cry

Yet another blog by a hobbyist programmer

Rendering random nebulae (part 1)

As I wrote in my previous post, I've been playing with a random 3D star field lately. I'd like to add some nebulae to this star field to fill the black void of space. Previously, I worked with dozens of NASA photos of existing nebulae. This time I want the scene to be entirely fictional, so there is no place for real nebulae.

Bring Out The Gimp

There is a way to draw nebulae by hand. Fire up the GIMP (or Photoshop, or the like) and select Filters : Render : Clouds : Plasma. You now get a colorful mix onscreen, which is probably not quite what you had in mind. Now select Colors : Desaturate to turn it into a grayscale image. Then select Colors : Colorize and drag the Hue slider to pick the desired color for the nebula. To make the nebula look nicer than this, you would probably have to mess with transparent layers, but this is the basic idea.

Programmatically speaking

One way to go would be to use ImageMagick's library to do the described operations. It seems, however, that MagickWand (ImageMagick's C API) does not include a plasma renderer. The C++ counterpart, Magick++, does, but being mostly a standard C programmer, I'm quite intimidated by its use of templates and references. And to be honest, I could not get any decent-looking nebula out of ImageMagick on the command line either.

I took some neat plasma rendering code in standard C from the source code of the GIMP. Putting it like this makes it sound easier than it was, but I adapted it to work on an RGBA pixel buffer. I also wrote filters for desaturating (grayscaling), blurring, colorizing, alpha blending, and something that I dubbed "fadebuffer", which is really "gray value to alpha": it puts the gray value into the alpha channel.
All filters work on an RGBA pixel buffer and all have the form:

void render_filter_name(unsigned char *pixels, int width, int height);
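
To give an idea, here is a minimal sketch of what the "fadebuffer" filter could look like in this form. This is an illustration rather than the actual code; in particular, passing the parameter through a static variable (set by fadebuffer_factor(), which appears further down) is my assumption:

static float fade_factor = 1.0f;    /* set through fadebuffer_factor() */

void fadebuffer_factor(float f) {
    fade_factor = f;
}

void fadebuffer(unsigned char *pixels, int width, int height) {
    int i, n = width * height;

    for (i = 0; i < n; i++) {
        unsigned char *p = &pixels[i * 4];
        /* assumes the buffer was grayscaled first, so R == G == B */
        int a = (int)(p[0] * fade_factor);
        p[3] = (a > 0xff) ? 0xff : (unsigned char)a;
    }
}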

My Objective-C PixelBuffer class has a method named renderFilter that allows me to use any implemented filter on that image.

@implementation PixelBuffer

-(void)renderFilter:(void(*)(unsigned char *, int, int))filter_func {
    /* hand this buffer's pixels to a plain C filter function */
    filter_func(pixels, width, height);
}

@end

This renderFilter method can be used as follows:

PixelBuffer *pixbuf = [[PixelBuffer alloc] initWithWidth:128 andHeight:128];

[pixbuf renderFilter:plasma];      // render the plasma cloud
[pixbuf renderFilter:blur];        // soften it
[pixbuf renderFilter:grayscale];   // desaturate
fadebuffer_factor(1.5f);           // set the fadebuffer parameter
[pixbuf renderFilter:fadebuffer];  // move the gray value into alpha
colorize_color4f(r, g, b, a);      // set the nebula color
[pixbuf renderFilter:colorize];    // tint the gray cloud

The result is code that is both readable and powerful.

In pure Objective-C you would have used a selector, and maybe filter classes derived from PixelBuffer, but I like this way of doing things as it lets you plug in plain standard C code easily.

Formulas

I want to share two formulas that I used. The first computes the desaturation of a fully colored plasma cloud:

gray = (int)(0.3f * red + 0.59f * green + 0.11f * blue);  /* classic luma weights */
if (gray > 0xff)
    gray = 0xff;
if (gray < 0)
    gray = 0;

This works well if the image has lots of color. I experimented a bit with the plasma renderer and adapted it to generate only blue tones, and then this formula does not work so well, as it takes only 11% of the blue component. So for monotone pictures, do not use this formula; simply take the active color component and copy it into the new values for R, G, and B.
Note that this code uses floating point and is therefore relatively slow. It's easy to change it to integer-only arithmetic. If you need more speed, use lookup tables (they cost only 3 x 256 bytes!). I didn't bother, because this code is only used for preprocessing.
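
For what it's worth, a sketch of the integer-only variant with lookup tables could look like this; the weights are the same ones scaled by 256 (77 + 151 + 28 = 256), and the names are mine:

static unsigned char rtab[256], gtab[256], btab[256];

void init_gray_tables(void) {
    int i;

    for (i = 0; i < 256; i++) {
        rtab[i] = (unsigned char)((77  * i + 128) >> 8);   /* 0.30 * 256 */
        gtab[i] = (unsigned char)((151 * i + 128) >> 8);   /* 0.59 * 256 */
        btab[i] = (unsigned char)((28  * i + 128) >> 8);   /* 0.11 * 256 */
    }
}

int gray_from_rgb(int red, int green, int blue) {
    /* three lookups and two additions per pixel; the sum never exceeds 0xff */
    return rtab[red] + gtab[green] + btab[blue];
}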

The second formula is for alpha blending. When you search the net you will find more than one formula for this. Many of them focus on speed, for doing realtime alpha blending in 16-bit pixel formats; this is all old stuff from the late 1990s, when computers weren't as powerful.
Anyway, the formula I used is:

result = (alpha * (srcPixel - destPixel)) / 256 + destPixel

Do this for each of the R, G, and B components. Note that in blending terminology you have a 'source pixel' and a 'dest pixel'; what is really meant is that you combine the R, G, and B components of source image 1 with those of source image 2, and that forms the result.
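
Written out as a loop over two RGBA buffers, the blend could look roughly like this (the function name and the 0-255 alpha parameter are my own choices):

void blend_buffers(unsigned char *dest, const unsigned char *src,
                   int width, int height, int alpha)
{
    int i, c, n = width * height * 4;

    for (i = 0; i < n; i += 4) {
        for (c = 0; c < 3; c++) {           /* R, G, B; leave A alone */
            int d = dest[i + c];
            int s = src[i + c];

            dest[i + c] = (unsigned char)(alpha * (s - d) / 256 + d);
        }
    }
}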
There are many optimizations possible here, like treating the four RGBA bytes as one 32-bit integer and using bit masking to take out the components. This is faster because it does less fetching from memory.
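
The classic masked formulation looks something like this sketch; note that it assumes an alpha that runs from 0 to 256 rather than 0 to 255, so take it as an illustration of the trick rather than a drop-in replacement:

#include <stdint.h>

/* R and B are blended in one multiply, G in another; the 8 bits of
   headroom in the 32-bit word keep the channels from bleeding over */
uint32_t blend_pixel(uint32_t src, uint32_t dst, uint32_t alpha)
{
    uint32_t rb = (((src & 0x00ff00ff) * alpha +
                    (dst & 0x00ff00ff) * (256 - alpha)) >> 8) & 0x00ff00ff;
    uint32_t g  = (((src & 0x0000ff00) * alpha +
                    (dst & 0x0000ff00) * (256 - alpha)) >> 8) & 0x0000ff00;

    return (dst & 0xff000000) | rb | g;     /* keep dest's alpha byte */
}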
Note: if you want to do really good and fast alpha blending, you should probably use OpenGL. The glBlendFunc() function is excellent for all kinds of blending, but in this case it involves some hassle: you have to create some textures first and render them to a resulting texture. Since I'm just using this for preprocessing and I'm not interested in doing realtime blending, I decided to implement the blending 'by hand'.
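
For reference, the usual OpenGL setup for 'source over destination' blending is just:

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);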

The results

For making the nebula, the above procedure is repeated a few times for different colors, and the resulting images are blended together. The pixel buffer is then turned into an OpenGL texture and textured onto a quad (made up of a triangle strip).
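
In fixed-function OpenGL, that last step looks roughly like this (the texture size and vertex coordinates are of course illustrative):

GLuint tex;

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 128, 128, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* the quad is a strip of two triangles */
glBindTexture(GL_TEXTURE_2D, tex);
glBegin(GL_TRIANGLE_STRIP);
glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 0.0f);
glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
glEnd();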

Although it's just a boring gas cloud, I'm quite happy with it. Not everything is well, though: the plasma renders into a square, and therefore there is a square nebula in the sky. To make it perfect, the nebula should be given a random shape and fade away near the edges of the texture square.