
Question:

What is going wrong with the shape of the contours of constant illumination in the picture below?

Detail:

Although Lambertian shading might be ill-regarded because it exposes the hard edges of our low-poly models, and because it models only "matte" surfaces, which it supposes reflect light equally in all directions (in particular, independently of the direction to the camera), I understand that it is still considered a reasonably accurate approximation to how a single planar matte surface might look in space when illuminated by several lights that are either effectively infinitely far away (like the sun) or else nearby. ("A single surface" means no shadows, no illumination from light bouncing off one surface and hitting another, no ambient occlusion, and so on.)

I would like to make the physically correct choice of keeping the surface normal vector constant across the surface of my triangle; that is, I do not want to use any of the various tricks that make the mesh look smoother, such as averaging vertex normals over the triangles that share each vertex.

I believe that the intensity of incident light should fall off as the inverse square of the distance to the light (and should therefore be effectively constant for lights that are effectively infinitely far away, like the sun), but I am not sure. In any case, I would like to ignore this possible effect for now.
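(For reference: the fixed-function pipeline does expose per-light distance attenuation, and its defaults give no attenuation at all, which matches the choice I am making here. A minimal sketch of how an approximate inverse-square falloff would be requested, if I later wanted it:)

// Fixed-function attenuation scales the light's contribution by
// 1 / (kc + kl*d + kq*d*d), where d is the vertex-to-light distance.
// The defaults (kc, kl, kq) = (1, 0, 0) give no distance attenuation.
glLightf(GL_LIGHT0, GL_CONSTANT_ATTENUATION,  0.0f);
glLightf(GL_LIGHT0, GL_LINEAR_ATTENUATION,    0.0f);
glLightf(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, 1.0f);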

I would also like to ignore specular effects, whether produced by raising a cosine to some power (the cosine of the angle between the reflection direction and the view direction in Phong, or between the surface normal and the half-vector in Blinn-Phong), or by more physically accurate models.
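(In the fixed-function pipeline the specular term also depends on the material's GL_SPECULAR colour and GL_SHININESS, which default to black and zero; a sketch of setting them explicitly, to make the "no specular" intent clear despite the white GL_SPECULAR I give the light below:)

// The default material specular reflectance is (0, 0, 0, 1) and the default
// shininess is 0, so no highlight appears even with a white light specular;
// setting them explicitly documents that intent.
GLfloat no_specular[] = {0.0f, 0.0f, 0.0f, 1.0f};
glMaterialfv(GL_FRONT, GL_SPECULAR, no_specular);
glMaterialf(GL_FRONT, GL_SHININESS, 0.0f);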

In spite of all these simplifications, I believe that I still must interpolate across the triangle, since the direction from a point on the triangle to the light source varies with that point (given that my light is not infinitely far away, but rather quite close to the model). My understanding is that, in world coordinates, it is mathematically correct to interpolate that cosine linearly between the three vertices, using for instance barycentric coordinates.
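(To make that precise, the quantity I have in mind is the clamped Lambert cosine below, with $\mathbf{n}$ the constant face normal, $\mathbf{l}$ the light position, and the surface point $\mathbf{p}$ written in barycentric coordinates of the triangle's vertices:)

$$ I(\mathbf{p}) \;=\; \max\!\left(0,\ \mathbf{n}\cdot\frac{\mathbf{l}-\mathbf{p}}{\lVert\mathbf{l}-\mathbf{p}\rVert}\right), \qquad \mathbf{p} \;=\; \alpha\,\mathbf{v}_0+\beta\,\mathbf{v}_1+\gamma\,\mathbf{v}_2, \quad \alpha+\beta+\gamma=1,\ \ \alpha,\beta,\gamma\ge 0. $$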

My understanding of perspective projection is that linear interpolation in world space becomes hyperbolic interpolation after the perspective projection and w-divide transform coordinates into normalised device coordinates (NDC). The GPU automatically uses such hyperbolic interpolation for occlusion testing with the z-buffer, I believe: it takes the reciprocal of z at each vertex, linearly interpolates between those reciprocals for each pixel interior to the triangle, and then takes the reciprocal of the result to get a correct z value. Similarly, any quantity that varies linearly across the triangle in world space must be treated the same way once in NDC.
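(Concretely, my understanding of that hyperbolic rule, for a per-vertex attribute $a_i$ with clip-space $w_i$ and screen-space barycentric weights $\lambda_i$, is:)

$$ a \;=\; \frac{\lambda_0\,\dfrac{a_0}{w_0}+\lambda_1\,\dfrac{a_1}{w_1}+\lambda_2\,\dfrac{a_2}{w_2}}{\lambda_0\,\dfrac{1}{w_0}+\lambda_1\,\dfrac{1}{w_1}+\lambda_2\,\dfrac{1}{w_2}}. $$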

In the picture below, I have a square made from two triangles joined along the top-left to bottom-right diagonal, lying in a horizontal plane. The camera is above the square, looking down on it through a reversed-z, infinite-far-plane perspective projection matrix. A light source sits close to the plane of the square, just above it and (as we look at the image) off the bottom-right corner. As expected, the bottom-right corner of the square is brighter than the top-left corner. Also as expected (if we look very carefully), we see faint Mach bands, due to the quantisation of colour and the characteristics of our eyes and brain.

[Image: Strange shading]

I would expect the Mach bands to form concentric circles, since they are the level sets of the cosine mentioned above, even without any explicit attenuation of intensity with distance to the light (see above). However, they do not; or if they do, then the centre of the circles is much farther away than the light source, and is different for the two triangles. It seems as if something has gone wrong with the interpolation. Perhaps it is being done linearly instead of hyperbolically, even though Blinn explained this potential error in 1992:

Hyperbolic Interpolation - Jim Blinn

I am using the very old-fashioned, deprecated (immediate-mode, fixed-function) style of OpenGL. I am given to understand that with the modern style of OpenGL, using shaders and so on, I could specify the lighting more or less exactly as I would like. But I am trying to find out whether the model described above is achievable in old-fashioned OpenGL. It seems as if it ought to be, since even that kind of OpenGL was capable of techniques fancier than Lambertian shading, for instance specular highlights.

Here are some of my settings:

// Positional light (w = 1): close to the plane of the square, just above it,
// and off the bottom-right corner of the image.
GLfloat light0_position[] = {800.0f, 0.0f, -(float)screendistmm + 800.0f, 1.0f};
GLfloat light0_diffuse[]  = {1.0f, 1.0f, 1.0f, 1.0f};
GLfloat light0_specular[] = {1.0f, 1.0f, 1.0f, 1.0f};

glLightfv(GL_LIGHT0, GL_POSITION, light0_position);
glLightfv(GL_LIGHT0, GL_DIFFUSE,  light0_diffuse);
glLightfv(GL_LIGHT0, GL_SPECULAR, light0_specular);

// Interpolate the per-vertex colours across each triangle (Gouraud),
// rather than flat-shading from the provoking vertex.
glShadeModel(GL_SMOOTH);

glEnable(GL_LIGHT0);

// Keep normals unit-length after the modelview transform.
// (GL_NORMALIZE alone is sufficient; GL_RESCALE_NORMAL is redundant here.)
glEnable(GL_RESCALE_NORMAL);
glEnable(GL_NORMALIZE);

// Enable gamma-correction:
glEnable(GL_FRAMEBUFFER_SRGB);

glEnable(GL_LIGHTING);

// The square: two triangles from a fan, lying in the plane y = -200,
// with a single constant normal pointing up towards the camera.
glBegin(GL_TRIANGLE_FAN);

     GLfloat colour[] = {1.0f, 0.5f, 0.5f, 1.0f};
     glMaterialfv(GL_FRONT, GL_DIFFUSE, colour);
     glNormal3f(0.0f, 1.0f, 0.0f);

     glVertex4f(-500.0f, -200.0f, -500.0f - (float)screendistmm, 1.0f);
     glVertex4f(-500.0f, -200.0f,  500.0f - (float)screendistmm, 1.0f);
     glVertex4f( 500.0f, -200.0f,  500.0f - (float)screendistmm, 1.0f);
     glVertex4f( 500.0f, -200.0f, -500.0f - (float)screendistmm, 1.0f);

glEnd();
  • I am not familiar with deprecated OpenGL so I can't comment on that part. Ignoring the specifics of OpenGL: for a directional light you should get a constant colour, and for a point light you should get the radial attenuation you're looking for.
    – lightxbulb, Jan 4, 2022 at 9:04
  • Thank you, that is reassuring! The "wrong-looking" attenuation in the image above makes the top-left to bottom-right diagonal of the square, which is a "seam" between two triangles, brighter than it ought to be. I feel that excessive brightness along mesh seams is an error that I've seen before.
    – Simon, Jan 4, 2022 at 13:39
  • Maybe it's using Gouraud shading instead of Phong shading, which causes said issue.
    – lightxbulb, Jan 4, 2022 at 13:56
  • Immediate-mode rendering is like a box of chocolates: you never know what you're gonna get.
    – pmw1234, Jan 15, 2022 at 20:27

1 Answer

As far as I remember, without a fragment shader the fixed-function pipeline evaluates the lighting only at the vertices and then interpolates the resulting colours across each triangle. Try subdividing the mesh.
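For example, a minimal immediate-mode sketch of what subdividing the question's square might look like (reusing the plane y = -200 and the screendistmm variable from the question; the grid resolution N, and the assumption that the strip winding matches the original quad, are mine):

// Subdivide the 1000 mm x 1000 mm square into an N x N grid, one triangle
// strip per row, so the fixed-function pipeline evaluates the Lambert term
// at many more vertices before interpolating colours across each triangle.
const int N = 64;                               // grid resolution (assumed)
const float size = 1000.0f;
const float x0 = -500.0f;
const float z0 = -500.0f - (float)screendistmm;
const float step = size / (float)N;

GLfloat colour[] = {1.0f, 0.5f, 0.5f, 1.0f};
glMaterialfv(GL_FRONT, GL_DIFFUSE, colour);
glNormal3f(0.0f, 1.0f, 0.0f);                   // constant face normal, as before

for (int i = 0; i < N; ++i) {
    glBegin(GL_TRIANGLE_STRIP);
    for (int j = 0; j <= N; ++j) {
        glVertex3f(x0 + j * step, -200.0f, z0 + (i + 1) * step);
        glVertex3f(x0 + j * step, -200.0f, z0 + i * step);
    }
    glEnd();
}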

