Simple shader for point lights in fog

I needed a simple and fast shader to render fog illuminated by point lights. To get it, I wrote a screen-space effect whose results are shown below. The pipeline is almost as simple as for ordinary point lights: it requires no volumetric data structures or ray marching, and it can easily be added to an existing lighting shader.

The key idea is that the light scattered toward the camera by fog lit by a point light can be computed in closed form. My solution was to derive that formula and substitute it into the shader.


A small scene with a spaceship rendered in the fog using my technique

The basic setup

Fog model


The shader makes a few assumptions about the fog we are working with. In effect, it treats each fragment of fog as a small, translucent, white scattering surface.

  • The shader assumes that the fog scatters light evenly in all directions. Real fog or smoke does not always have this property, but it is a good approximation for the look we are aiming for.
  • The shader assumes that all wavelengths of light interact with the fog in the same way. In reality this is often not the case; for example, Rayleigh scattering makes the sky blue by scattering blue light more than other wavelengths.
  • The light emitted by the fog falls off with the inverse square of the distance from the light source to the fog. This is not physically exact, but it looks fine. There are many resources that explain how fog really scatters light, and I am sure you could extend my technique with those formulas.

Nonetheless, the results look plausible even with these simplifications.

Analytical solution


Consider an infinitesimal fragment of fog (imagine a tiny cube of fog). In my model, the light emitted by this fragment of fog is:

$$\text{light} = \frac{1}{(\text{distance from light to fog fragment})^2}$$


This formula shows that the light coming from each fragment of the fog decreases as the inverse square of its distance to the light source, as is the case with light coming from a light-scattering surface.

For convenience, we rewrite the equation:

$$d = \text{distance from light to fog fragment}$$


$$\text{light} = \frac{1}{d^2}$$


The total light arriving along a line of sight is this quantity integrated over the view line:

$$\int_{\text{view line}} (\text{light at } x)\, dx$$


Or, more formally:

$$L = \text{light position}$$

$$w = \text{world fragment position}$$

$$c = \text{camera position}$$

$$\text{light arriving at camera} = \int_{c}^{w} \frac{1}{|x - L|^2}\, dx$$



This integral expresses exactly what we need, but computing it directly is more complicated than necessary. Since each of the points above is a three-component vector, we are essentially dealing with 12 variables. I will eliminate most of them with a simple reparameterization.
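To make those 12 components explicit, the same quantity can be written with the view ray parameterized; this is only a restatement of the integral above, not an extra assumption:

$$\text{light arriving at camera} = \int_{0}^{\|w - c\|} \frac{1}{\left|\, c + t\,\hat{v} - L \,\right|^2}\, dt, \qquad \hat{v} = \frac{w - c}{\|w - c\|}$$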

Define a new space called “light space”:

  • The x-axis runs along the line of sight.
  • The light source lies on the y-axis, at a perpendicular distance h from the line of sight.


Now, instead of a line integral in three-dimensional space, I simply integrate along the x-axis from the camera to the world fragment. Next we rewrite the integral. The distance from a point x on the x-axis to the light source is $\sqrt{h^2 + x^2}$, so the line-of-sight integral becomes

$$\int \frac{1}{h^2 + x^2}\, dx$$


Solving it manually or in your favorite computer algebra system, we get:

$$\int \frac{1}{h^2 + x^2}\, dx = \frac{\tan^{-1}\!\left(\frac{x}{h}\right)}{h}$$
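If you want to verify it by hand, the standard substitution $x = h\tan\theta$ (so $dx = h\sec^2\theta\, d\theta$) collapses the integrand to a constant:

$$\int \frac{1}{h^2 + x^2}\, dx = \int \frac{h\sec^2\theta}{h^2\sec^2\theta}\, d\theta = \frac{\theta}{h} = \frac{\tan^{-1}\!\left(\frac{x}{h}\right)}{h}$$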


That is, to get the light coming from the fog along a given line of sight, I evaluate this integral from the camera to the world fragment. If the camera is at x = a and the world fragment is at x = b, then the light entering the pixel from the lit fog along this line of sight is:

$$\int_{a}^{b} \frac{1}{h^2 + x^2}\, dx = \frac{\tan^{-1}\!\left(\frac{b}{h}\right)}{h} - \frac{\tan^{-1}\!\left(\frac{a}{h}\right)}{h}$$
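In code this closed form is tiny. Below is a minimal GLSL sketch; the function name fogScattering and the small clamp on h (to avoid dividing by zero when the view ray passes through the light) are my own additions:

```glsl
// Closed-form in-scattering from one point light along a view ray,
// expressed in light space: the ray runs along the x-axis from x = a
// (camera) to x = b (world fragment), and the light sits at height h.
float fogScattering(float a, float b, float h)
{
    h = max(h, 1e-4); // guard against h -> 0 when the ray passes through the light
    return (atan(b / h) - atan(a / h)) / h;
}
```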


Thoughts on implementation


I was able to implement this shader in my deferred shading pipeline without major modifications; it took about ten lines of GLSL code. If the pipeline already computes diffuse + specular lighting from point lights, then the point-light shader already has access to the camera position, the light position, and the world position of the current pixel, and that is all you need!
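As a rough illustration of those few lines, here is how the fog term might be computed inside a point-light fragment shader. The names (cameraPos, lightPos, worldPos, lightColor, fogDensity) and the fogScattering helper from the sketch above are placeholders of mine, not the author's actual code; a deferred pipeline would typically reconstruct worldPos from the G-buffer:

```glsl
uniform vec3  cameraPos;
uniform vec3  lightPos;
uniform vec3  lightColor;
uniform float fogDensity;   // overall strength of the fog scattering
in vec3 worldPos;           // world-space position of the current pixel

vec3 pointLightFog()
{
    // Build "light space": the x-axis is the view ray from the camera to the fragment.
    vec3  ray     = worldPos - cameraPos;
    float rayLen  = length(ray);
    vec3  viewDir = ray / rayLen;

    // Project the light onto the view ray to find its x coordinate...
    vec3  toLight = lightPos - cameraPos;
    float lightX  = dot(toLight, viewDir);

    // ...and its perpendicular distance h from the ray.
    float h = length(toLight - lightX * viewDir);

    // Camera and fragment x coordinates, measured relative to the light's projection.
    float a = 0.0    - lightX;
    float b = rayLen - lightX;

    return lightColor * fogDensity * fogScattering(a, b, h);
}
```

The returned value is simply added to the same light's usual diffuse + specular contribution.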

Handling multiple light sources is very simple: just sum the fog-scattering contribution of each light. For the system to remain efficient with many lights, the integral should only be evaluated for pixels close enough to a light that it contributes a visually noticeable amount of light to the scene. For example, in a deferred shading pipeline you can render a spherical mesh around each light source so that the shader only computes that light's effect for pixels near it.
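In a simple forward-style shader, that summation is just a loop. This is a sketch under my own assumptions (a small fixed light array, the fogScattering helper from earlier); in the deferred setup described above, each light's sphere-mesh pass would instead add its own term to the pixel:

```glsl
#define NUM_LIGHTS 4
uniform vec3 lightPositions[NUM_LIGHTS];
uniform vec3 lightColors[NUM_LIGHTS];

// Sum of the analytic fog term over all lights for one view ray.
vec3 totalFogLight(vec3 camPos, vec3 fragPos, float density)
{
    // The view ray is the same for every light, so set it up once.
    vec3  ray     = fragPos - camPos;
    float rayLen  = length(ray);
    vec3  viewDir = ray / rayLen;

    vec3 sum = vec3(0.0);
    for (int i = 0; i < NUM_LIGHTS; ++i)
    {
        vec3  toLight = lightPositions[i] - camPos;
        float lightX  = dot(toLight, viewDir);
        float h       = length(toLight - lightX * viewDir);
        sum += lightColors[i] * density * fogScattering(-lightX, rayLen - lightX, h);
    }
    return sum;
}
```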

I found that the hardest part of implementing this shader is correctly placing the camera and the world fragment in light space so that the simplified integral can be used.

Results



A low-detail Generate Worlds level with a little fog lighting


With more light sources, the effect becomes stronger


If you also keep track of the direction of the light sources, you can create beautiful shaped beams of light in the fog

Additional reading


Introduction to Light Scattering: An Imaging Sciences Perspective

A Practical Analytic Single Scattering Model for Real Time Rendering

Atmospheric scattering and volumetric fog algorithm part 1
