Da Fish in Sea

These are the voyages of Captain Observant

Lighting a Sphere, Part 4

So last time we ended up with a chunky-looking ball. Great. Let's spice things up a little by turning it into a planet, and a big one at that: Jupiter. Here is what the end result looks like. (Note: the sliders control the direction of the lighting along the x, y, and z axes. I apologize for the crappy GUI.)

(Flash demo embedded here; Flash 10 required.)

If the planet is not centered, but instead you only see the bottom right part of it in the top left, refresh your browser.

There are two main changes since last time. First, in the Pixel Bender kernel I've applied the shading value to the color of a second input image (called texture). Since all color values in Pixel Bender are between 0 and 1, simply multiplying each channel by a constant in that same range (0-1) lightens or darkens it without changing the hue. I'll just dump out the kernel first, since it is not that long:


<languageVersion : 1.0;>
kernel NormalMap2TextureMap
<   namespace : "com.dafishinsea";
    vendor : "David Wilhelm";
    version : 2;
    description : "kernel to apply shading to a texture map using a normal map";
>
{
    input image4 src;
    input image4 texture;
    output pixel4 dst;

    parameter float xr
    <
        minValue: -1.0;
        maxValue: 1.0;
        defaultValue: 0.1;
    >;
    parameter float yr
    <
        minValue: -1.0;
        maxValue: 1.0;
        defaultValue: 0.0;
    >;
    parameter float zr
    <
        minValue: -1.0;
        maxValue: 1.0;
        defaultValue: 0.0;
    >;

    void evaluatePixel()
    {
        //get the current pixel of the normal map and of the texture
        pixel4 p = sampleNearest(src, outCoord());
        pixel4 tp = sampleNearest(texture, outCoord());
        //pixel4 is a 4-element vector of RGBA channels;
        //we want a vector of the first three as the normal,
        //remapped from [0, 1] back to [-1, 1]
        float xd = 2.0*p.r - 1.0;
        float yd = 2.0*p.g - 1.0;
        float zd = 2.0*p.b - 1.0;
        float3 normal = float3(xd, yd, zd);
        normal = normalize(normal);
        float3 light = float3(xr, yr, zr);
        light = normalize(light);
        float dotp = dot(light, normal);
        //angle between unit vectors = acos(a dot b)
        float angle = acos(dotp);
        //map the angle (0..pi) to a 0..1 shading value
        float shade = angle/3.14159265358;
        //square it to add contrast
        shade = shade*shade;
        dst = pixel4(shade*tp.r, shade*tp.g, shade*tp.b, 1.0);
    }
}


The xr, yr, zr parameters are the direction of the light, which we passed in from Flash. We have now added a second input image called 'texture' - this is a spherical map of Jupiter, which I scaled down a bit to 640 x 640. We sample its pixel color right after sampling the normal map's, then do nothing with it until the end, after the shading value has been calculated. Then, instead of outputting the shading value itself as the result, we multiply each channel of the texture by the shading value. I also fixed the glitch in the lighting by normalizing the light and normal vectors.
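To make the per-pixel math easy to poke at, here is a hypothetical stand-alone JavaScript version of the kernel's shading calculation (not code from the app; the 0-255 RGB arrays stand in for the sampled pixels):

```javascript
// Hypothetical stand-alone version of the kernel's per-pixel shading math.
// normalRgb / textureRgb are [r, g, b] arrays in 0-255; light is an [x, y, z]
// direction vector (the xr, yr, zr parameters).
function normalize(v) {
  var len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
  return [v[0] / len, v[1] / len, v[2] / len];
}

function shadePixel(normalRgb, textureRgb, light) {
  // decode the normal from the normal-map color: [0, 255] -> [-1, 1]
  var n = normalize([
    2 * (normalRgb[0] / 255) - 1,
    2 * (normalRgb[1] / 255) - 1,
    2 * (normalRgb[2] / 255) - 1
  ]);
  var l = normalize(light);
  // angle between the unit vectors, mapped to a 0-1 shading value
  var dotp = n[0] * l[0] + n[1] * l[1] + n[2] * l[2];
  var shade = Math.acos(dotp) / Math.PI;
  shade = shade * shade; // squared for extra contrast, as in the kernel
  // apply the shade to each texture channel
  return textureRgb.map(function (c) { return Math.round(shade * c); });
}
```

Note that with angle/pi, a normal pointing along the light vector shades to 0 and one pointing opposite it shades to 1, so the lit side faces against the light vector.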

Now the other change is that I refactored the createNormalMap() method so that the normal map is generated on a per-pixel basis, by getting the normal of the surface of the sphere at each point (pixel) in the texture map. This results in a much smoother normal map.

/**
 * create normal map -- evaluate the normal for each pixel based on the
 * lat/lon at that position on the sphere
 */
private function createNormalMap():void
{
    var radius:Number = 1;
    var lon:Number = 0;
    var lat:Number = 0;
    for (var v:int = 0; v < mapHeight; ++v)
    {
        lat = (v/mapHeight)*PI;
        var y3d:Number = radius*Math.cos(lat);

        for (var u:int = 0; u < mapWidth; ++u)
        {
            lon = (u/mapWidth)*TWOPI;
            var x3d:Number = radius*Math.cos(lon)*Math.sin(lat);
            var z3d:Number = radius*Math.sin(lon)*Math.sin(lat);
            //normal is the vector from the center of the sphere (= origin) to this point
            var norm:Vector3D = new Vector3D(x3d, y3d, z3d);
            //remap each component from [-1, 1] to [0, 255]
            var r:uint = 255*(norm.x + 1)/2;
            var g:uint = 255*(norm.y + 1)/2;
            var b:uint = 255*(norm.z + 1)/2;
            var normalColor:uint = r << 16 | g << 8 | b;
            //write the encoded normal into the normal map's BitmapData (field name assumed)
            normalMapData.setPixel(u, v, normalColor);
        }
    }
}
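The encoding step can also be sketched as a single stand-alone JavaScript function (hypothetical, not from the app) that maps a (u, v) map coordinate straight to a packed normal color:

```javascript
// Hypothetical stand-alone sketch of the per-pixel normal encoding:
// convert the (u, v) map coordinate to lat/lon on the sphere, take the unit
// vector from the center to that point, and pack it into a 0xRRGGBB color.
function normalColorAt(u, v, mapWidth, mapHeight) {
  var lat = (v / mapHeight) * Math.PI;
  var lon = (u / mapWidth) * 2 * Math.PI;
  // a point on the unit sphere; its normal equals its position vector
  var x = Math.cos(lon) * Math.sin(lat);
  var y = Math.cos(lat);
  var z = Math.sin(lon) * Math.sin(lat);
  // remap each component from [-1, 1] to [0, 255] (truncating, like the uint casts)
  var r = Math.floor(255 * (x + 1) / 2);
  var g = Math.floor(255 * (y + 1) / 2);
  var b = Math.floor(255 * (z + 1) / 2);
  return r << 16 | g << 8 | b;
}
```

The top row of the map (v = 0, the pole), for example, encodes the straight-up normal (0, 1, 0) as 0x7FFF7F.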

And here is the result:

As you can see it is really smooth, which results in smooth lighting, even though our sphere is not a very high-polygon mesh. And I'm getting a reasonable frame rate, though using Pixel Bender always gets my old MacBook's fan whirring, since it does not have a supported graphics card and does everything on the CPU. Even so, the performance is a good deal better than it would have been without Pixel Bender. A further optimization would be to save the normal map out to a file and compile it into the app - this would certainly be worthwhile if you had a lot of 3D objects on stage. Another technique for improving the performance of this sort of thing is to dynamically reduce the resolution of the texture map when the size of the projected 3D object changes: there is no need to process a 640 x 640 texture and normal map if the object itself is only 100 pixels high. This is known as mipmapping, and I may well explore it later.
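A minimal sketch of that mip-level choice (in JavaScript, with hypothetical names): halve the texture resolution until one more halving would drop below the projected size.

```javascript
// Hypothetical sketch of choosing a mip level: level 0 is the full-size
// texture (e.g. 640 x 640) and each level halves the resolution. We halve
// until one more halving would drop below the projected size on screen.
function mipLevel(textureSize, projectedSize) {
  var level = 0;
  while (textureSize / 2 >= projectedSize && textureSize > 1) {
    textureSize /= 2;
    level++;
  }
  return level;
}
```

For the 640 x 640 map and a 100-pixel-high projection this picks level 2, a 160 x 160 texture - still big enough to cover the object, but with 1/16 as many pixels to shade.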

I also want to point out that this is very primitive lighting. The first time round, the lighting seemed flat, so I squared the shade value to make it a bit more contrasty. A lot more can be achieved here, as lighting techniques are a whole science... some really cool things can be done. I also wanted to explore generating a rough surface texture on the sphere, but that proved more complex than I initially thought, so I've decided to try it on a plane first. Coming up next… height maps!

Da Code

NormalMap4.as (GitHub)

NormalMapShader.pbk (GitHub)