How to render outlines in WebGL

Left: boundary outline only. Right: the technique described in this article. Boat model by Google Poly.
Top: stylized two-tone lighting in ThreeJS inspired by Return of the Obra Dinn. Bottom: the same scene with outlines. Ship model from Museovirasto Museiverket Finnish Heritage Agency on Sketchfab.

Live Demo

Below is a link to a live demo of this technique implemented in ThreeJS. You can drag and drop any glTF model (as a single .glb or .gltf file) to see the outline effect on your own test models:

Overview of the technique

Our outline shader needs three inputs (sketched as uniforms after this list):

  1. The depth buffer
  2. The normal buffer
  3. The color buffer (the original scene)
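In a ThreeJS post-process, these three inputs typically arrive as texture uniforms on the outline pass's shader material. The uniform names below are placeholders I'm assuming for illustration, not names from the article:

// Hypothetical uniform names for the outline pass.
const outlineUniforms = {
  tDiffuse: { value: null }, // the color buffer (the original scene)
  tDepth:   { value: null }, // the depth buffer
  tNormal:  { value: null }, // the normal buffer
};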

Overview of the rendering pipeline

Here is how we’re going to set up our effect:

This effect requires three passes: two render passes and one post-process.
Each stage in this rendering pipeline visualized.
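A rough sketch of those three passes in ThreeJS might look like the following. The target and variable names are illustrative, not the article's exact code:

// Pass 1: render the scene normally, keeping its depth buffer as a readable texture.
const colorTarget = new THREE.WebGLRenderTarget(width, height);
colorTarget.depthTexture = new THREE.DepthTexture(width, height);
renderer.setRenderTarget(colorTarget);
renderer.render(scene, camera);

// Pass 2: render the same scene again with every material replaced by a normal material.
const normalTarget = new THREE.WebGLRenderTarget(width, height);
scene.overrideMaterial = new THREE.MeshNormalMaterial();
renderer.setRenderTarget(normalTarget);
renderer.render(scene, camera);
scene.overrideMaterial = null;

// Pass 3: the outline post-process samples all three buffers on a fullscreen quad
// and writes the result to the screen (or to the next pass in the chain).
outlineUniforms.tDiffuse.value = colorTarget.texture;
outlineUniforms.tDepth.value = colorTarget.depthTexture;
outlineUniforms.tNormal.value = normalTarget.texture;
renderer.setRenderTarget(null);
renderer.render(postScene, postCamera); // postScene contains just the fullscreen quad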

Implementation

It’s difficult to describe this technique without reference to a specific engine, since a core part of it is how to set up the rendering pipeline described above. The implementation details here will be specific to ThreeJS, but you can see the PlayCanvas source code along with an editor project here:

1. Get the depth buffer

3D engines will typically draw all opaque objects into a depth buffer to ensure objects are rendered correctly without having to sort them back to front. All we have to do is get a reference to this buffer and pass it to our outline post-process. A few things to keep in mind:

  • You need to know how the values are “packed”. Given the limited precision, does the engine linearly map Z values from camera.near to camera.far? Does it reverse them? Or use a logarithmic depth buffer?
  • The engine most likely already has mechanisms for working with depth values that you can re-use. In ThreeJS, adding #include <packing> to your fragment shader gives you access to helpers like perspectiveDepthToViewZ (see the sketch after this list).
  • To visualize the depth buffer for debugging, you can tighten your camera’s near/far planes around the bounds of the object so the depth values span a more visible range.
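As a sketch of what those re-usable helpers look like in ThreeJS, the outline shader can linearize a depth sample as below. The uniform names tDepth, cameraNear, and cameraFar are assumptions for this sketch, not the article's code:

#include <packing>

uniform sampler2D tDepth;   // depth texture from the main render pass (assumed name)
uniform float cameraNear;
uniform float cameraFar;

// Convert a raw depth sample into a linear 0..1 value between near and far.
float readLinearDepth(vec2 uv) {
    float fragCoordZ = texture2D(tDepth, uv).r;
    float viewZ = perspectiveDepthToViewZ(fragCoordZ, cameraNear, cameraFar);
    return viewZToOrthographicDepth(viewZ, cameraNear, cameraFar);
}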

2. Create a normal buffer

If your engine supports outputting the normals of everything in the scene, you should use that directly. Otherwise, you’ll need to create a second render pass. This needs to be identical to the original render, with the only exception that all materials on all meshes are replaced by a “normal material” that renders the view space normals.

// Temporarily replace every material in the scene with a normal material,
// render into the normal buffer's render target, then restore the originals.
this.renderScene.overrideMaterial = new THREE.MeshNormalMaterial();
renderer.render(this.renderScene, this.renderCamera);
this.renderScene.overrideMaterial = null;

3. Create the outline post process

The outline effect is a post process — we’ve already rendered the scene, now we need to take those buffers, combine them, and render the result onto a fullscreen quad. The result of that will either go directly to the screen or to the next pass in the pipeline (like FXAA).
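The snippet below uses two helpers, getPixelDepth(x, y) and getPixelNormal(x, y), which sample the depth and normal buffers at an offset measured in pixels from the current fragment. They aren't shown in the article; here is one way they could be written, assuming a tNormal uniform, a resolution uniform holding the render target size, the standard vUv varying, and the readLinearDepth helper sketched in step 1:

uniform sampler2D tNormal; // normal buffer from the second render pass (assumed name)
uniform vec2 resolution;   // render target size in pixels (assumed name)
varying vec2 vUv;

// Linear depth of the pixel (x, y) texels away from the current fragment.
float getPixelDepth(int x, int y) {
    return readLinearDepth(vUv + vec2(x, y) / resolution);
}

// Normal of the pixel (x, y) texels away from the current fragment.
// MeshNormalMaterial writes normals remapped to 0..1, which is fine for
// taking differences between neighbors.
vec3 getPixelNormal(int x, int y) {
    return texture2D(tNormal, vUv + vec2(x, y) / resolution).rgb;
}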

float depth = getPixelDepth(0, 0);

// Difference between the depth of neighboring pixels and the current one.
float depthDiff = 0.0;
depthDiff += abs(depth - getPixelDepth(1, 0));
depthDiff += abs(depth - getPixelDepth(-1, 0));
depthDiff += abs(depth - getPixelDepth(0, 1));
depthDiff += abs(depth - getPixelDepth(0, -1));

vec3 normal = getPixelNormal(0, 0);

// Difference between the normals of neighboring pixels and the current one.
float normalDiff = 0.0;
normalDiff += distance(normal, getPixelNormal(1, 0));
normalDiff += distance(normal, getPixelNormal(-1, 0));
normalDiff += distance(normal, getPixelNormal(0, 1));
normalDiff += distance(normal, getPixelNormal(0, -1));

float outline = normalDiff + depthDiff;
gl_FragColor = vec4(vec3(outline), 1.0);
  • We can include the diagonals in our neighbor sampling to get a more accurate outline
  • We can sample neighbors that are one or more pixels further away to get thicker outlines
  • We can multiply normalDiff and depthDiff by a scalar to control their influence on the final outline
  • We can tweak normalDiff and depthDiff so that only really stark differences in depth or normal direction show up as an outline. This is what the “normal bias” and “depth bias” parameters control (see the sketch below).
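As a rough illustration of the last two points, the raw differences can be scaled, clamped, and then sharpened before they're summed. The parameter names and values here are placeholders, not the article's exact implementation:

// Placeholder multiplier/bias values; tune these per scene.
float depthMult = 20.0, depthBias = 1.0;
float normalMult = 1.0, normalBias = 2.0;

// Scale each difference, clamp it to 0..1, then raise it to a power so that
// only sufficiently stark edges survive as outlines.
float d = pow(clamp(depthDiff * depthMult, 0.0, 1.0), depthBias);
float n = pow(clamp(normalDiff * normalMult, 0.0, 1.0), normalBias);

float outline = clamp(d + n, 0.0, 1.0);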

4. Combine the outlines with your final scene

Finally, to combine the outline onto the scene, we mix the scene color with a chosen “outline color”, based on our outline value.

float outline = normalDiff + depthDiff;
vec4 outlineColor = vec4(1.0, 1.0, 1.0, 1.0); // white outline
gl_FragColor = vec4(mix(sceneColor, outlineColor, outline));
Stylized lighting in ThreeJS inspired by Return of the Obra Dinn. Notice that the outlines change color based on the scene’s lighting.
