How to render outlines in WebGL

This article describes how to visualize outlines for a WebGL scene as a post process, with example implementations for ThreeJS & PlayCanvas.

Left — boundary outline only. Right — the technique described in this article. Boat model by Google Poly.

There are a few common approaches that produce boundary-only outlines as shown on the left of the above picture.

Rendering the full outlines of a scene is particularly useful when you need to clearly see the geometry and structure of your scene. For example, the stylized aesthetic of Return of the Obra Dinn would be very hard to navigate without clear outlines.

Top, a stylized two-tone lighting in ThreeJS inspired by Return of the Obra Dinn. Bottom, the same scene with outlines. Ship model from Museovirasto Museiverket Finnish Heritage Agency on Sketchfab.

The technique I describe here is similar to the post process shaders linked above, with the addition of a “normal buffer” in the outline pass that is used to find those inner edges.

Live Demo

Below is a live demo of this technique implemented in ThreeJS. You can drag and drop any glTF model (as a single .glb file) to see the outline effect on your own test models.

Drag and drop any glTF model (single .glb file) to see this outline effect on your own test models. Hosted on CodeSandbox.

You can also find the source code on GitHub:

Overview of the technique

Our outline shader needs 3 inputs:

  1. The depth buffer
  2. The normal buffer
  3. The scene's color buffer

Given these 3 inputs we will compute the difference between the current pixel’s depth value and its neighbors. A large depth difference tells us there’s a distance gap (this will typically give you the outer boundary of an object but not fine details on its surface).

We will do the same with the normal buffer. A difference in normal direction means a sharp corner. This is what gives us the finer details.

We then combine those differences to form the final outline, and combine that with the color buffer to add the outlines to the scene.

Tip: The live demo has a scaling factor for each of the normal and depth differences. Set either one to 0 to see the influence of the other on the final set of outlines.

Overview of the rendering pipeline

Here is how we’re going to set up our effect:

This effect requires 3 passes: two render passes and one post-process pass.

Render pass 1 captures the color of all objects in the scene in the “Scene Buffer”.
It also outputs the depth of every pixel to a separate “Depth Buffer”.

Render pass 2 re-renders all objects in the scene with a normal material that colors each object using its view-space normal at every pixel. This is written to the “Normal Buffer”.

Finally, the outline pass is a post process that takes the 3 buffers and renders the result onto a fullscreen quad.

Each stage in this rendering pipeline visualized.

This can be further optimized by modifying the engine to combine the normal and depth buffers into one “NormalDepth”, similar to how Unity does it, to avoid the need for the 2nd render pass.
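To make the idea concrete, here is a CPU-side sketch of what such a combined texel could look like. This particular encoding (normal x/y remapped into two 8-bit channels, with z reconstructed since view-space normals of visible surfaces face the camera, and a 16-bit depth split across the other two channels) is an illustrative assumption, not Unity's exact format:

```javascript
// Pack a view-space normal (nx, ny) and a depth value in [0, 1]
// into one RGBA texel. z is dropped; it is recovered on unpack.
function packNormalDepth(nx, ny, depth01) {
  const hi = Math.floor(depth01 * 255.0) / 255.0;            // high 8 bits of depth
  const lo = Math.floor((depth01 - hi) * 255.0 * 255.0) / 255.0; // low 8 bits
  return [nx * 0.5 + 0.5, ny * 0.5 + 0.5, hi, lo];
}

function unpackNormalDepth(rgba) {
  const nx = rgba[0] * 2.0 - 1.0;
  const ny = rgba[1] * 2.0 - 1.0;
  // Reconstruct z: view-space normals of front-facing surfaces have z >= 0.
  const nz = Math.sqrt(Math.max(0.0, 1.0 - nx * nx - ny * ny));
  const depth = rgba[2] + rgba[3] / 255.0;
  return { normal: [nx, ny, nz], depth };
}
```

The round trip loses a small amount of depth precision (about 1/65025), which is usually acceptable for edge detection.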

A final step not shown in the diagram is an FXAA pass, which we need because we’re rendering the scene onto an off-screen buffer, which disables the browser’s native antialiasing.


It’s difficult to describe this technique without reference to a specific engine since a core part of it is how to set up the rendering pipeline described above. The implementation details here will be specific to ThreeJS but you can see the PlayCanvas source code along with an editor project here:

1. Get the depth buffer

3D engines will typically draw all opaque objects into a depth buffer to ensure objects are rendered correctly without having to sort them back to front. All we have to do is get a reference to this buffer to pass it to our outline post process.

In ThreeJS, this means setting depthBuffer = true on the render target we’re creating so that we capture the “scene color” and the “depth buffer” at the same time. See:

In our demo this is created here:
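As a rough sketch (the demo's actual setup may differ), attaching a DepthTexture to the render target captures scene depth in the same pass that renders scene color:

```javascript
// Illustrative sketch only, using the three.js API.
import * as THREE from 'three';

const width = window.innerWidth;
const height = window.innerHeight;

// The DepthTexture makes the depth buffer sampleable in the outline pass.
const depthTexture = new THREE.DepthTexture(width, height);
const renderTarget = new THREE.WebGLRenderTarget(width, height, {
  depthBuffer: true,
  depthTexture: depthTexture,
});
```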

There are a few caveats to know when working with the depth buffer:

  • You need to know how the values are “packed”. Given the limited precision, does the engine linearly interpolate Z values from camera.near to camera.far? Does it store them in reverse? Or use a logarithmic depth buffer?
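For example, with a standard (non-reversed) perspective projection, converting a packed depth value back to a linear view-space distance can be sketched like this (plain JavaScript for illustration; the `d * 2 - 1` remap and the function name are assumptions — check how your engine packs depth):

```javascript
// d is the raw depth-buffer value in [0, 1] from a perspective projection.
// near/far are the camera's clip planes.
function linearizeDepth(d, near, far) {
  const ndcZ = d * 2.0 - 1.0; // back to NDC range [-1, 1]
  return (2.0 * near * far) / (far + near - ndcZ * (far - near));
}
```

With near = 0.1 and far = 100, a raw value of 0 maps back to 0.1 and a raw value of 1 maps back to 100, confirming the non-linearity: most of the [0, 1] range is spent close to the camera.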

2. Create a normal buffer

If your engine supports outputting the normals of everything in the scene, you should use that directly. Otherwise, you’ll need to create a second render pass. This needs to be identical to the original render, with the only exception that all materials on all meshes are replaced by a “normal material” that renders the view space normals.

ThreeJS has a convenient scene.overrideMaterial property we can use for exactly this purpose. Instead of creating a new identical scene and a new identical camera, we can directly re-render the same scene with the given override material.

this.renderScene.overrideMaterial = new THREE.MeshNormalMaterial();
renderer.render(this.renderScene, this.renderCamera);
this.renderScene.overrideMaterial = null;

In our ThreeJS implementation this is encapsulated in CustomOutlinePass.js for convenience, but it is a completely separate render pass.

3. Create the outline post process

The outline effect is a post process — we’ve already rendered the scene, now we need to take those buffers, combine them, and render the result onto a fullscreen quad. The result of that will either go directly to the screen or to the next pass in the pipeline (like FXAA).

We need to pass 3 uniforms: sceneBuffer, depthBuffer, and normalBuffer.

We create helper functions to read the depth at an offset from a given pixel. Then we sum up the difference between the current pixel’s depth value and its neighbors.

float depth = getPixelDepth(0, 0);
// Difference between depth of neighboring pixels and current.
float depthDiff = 0.0;
depthDiff += abs(depth - getPixelDepth(1, 0));
depthDiff += abs(depth - getPixelDepth(-1, 0));
depthDiff += abs(depth - getPixelDepth(0, 1));
depthDiff += abs(depth - getPixelDepth(0, -1));

The same thing is done for the normals. Since a normal is a 3-dimensional vector, we measure the difference using the distance function.

vec3 normal = getPixelNormal(0, 0);
// Difference between normals of neighboring pixels and current.
float normalDiff = 0.0;
normalDiff += distance(normal, getPixelNormal(1, 0));
normalDiff += distance(normal, getPixelNormal(-1, 0));
normalDiff += distance(normal, getPixelNormal(0, 1));
normalDiff += distance(normal, getPixelNormal(0, -1));

To render the outline only at this point we would do:

float outline = normalDiff + depthDiff;
gl_FragColor = vec4(vec3(outline), 1.0);

There are a few parameters here to tweak:

  • We can include the diagonals in our neighbor sampling to get a more accurate outline
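The diagonal variant of the neighbor sampling can be sketched on the CPU like this, where `getDepth` stands in for the shader's `getPixelDepth` texture read (function names here are illustrative):

```javascript
// Sum the absolute depth difference against all 8 neighbors,
// including the 4 diagonals, for the pixel at (x, y).
function depthDiffWithDiagonals(getDepth, x, y) {
  const center = getDepth(x, y);
  const offsets = [
    [1, 0], [-1, 0], [0, 1], [0, -1],   // direct neighbors
    [1, 1], [1, -1], [-1, 1], [-1, -1], // diagonal neighbors
  ];
  let diff = 0.0;
  for (const [dx, dy] of offsets) {
    diff += Math.abs(center - getDepth(x + dx, y + dy));
  }
  return diff;
}
```

A flat depth field yields 0, while a depth step crossing the pixel yields a positive value, which is what the outline thresholding picks up.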

This is implemented in CustomOutlinePass.js.

4. Combine the outlines with your final scene

Finally, to combine the outline onto the scene, we mix the scene color with a chosen “outline color”, based on our outline value.

float outline = normalDiff + depthDiff;
vec4 outlineColor = vec4(1.0, 1.0, 1.0, 1.0); // white outline
gl_FragColor = vec4(mix(sceneColor, outlineColor, outline));

This is also where you can create any custom logic for how you combine your outline with your scene.

For example, in Return of the Obra Dinn, the outlines change color based on the lighting. To achieve this effect, we would compare the light direction against the surface normal from our normal buffer, coloring the outline white where the surface is not in direct light, and black where it faces the light source(s).
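That decision can be sketched in plain JavaScript as follows (the shader version would use `dot()` on the sampled normal; `outlineColor` is a hypothetical helper, not code from the demo):

```javascript
// normal is the view-space normal from the normal buffer;
// lightDir points from the surface toward the light. Both are [x, y, z].
function outlineColor(normal, lightDir) {
  const facing =
    normal[0] * lightDir[0] + normal[1] * lightDir[1] + normal[2] * lightDir[2];
  // Lit surfaces get black outlines, shadowed surfaces get white ones.
  return facing > 0.0 ? [0.0, 0.0, 0.0] : [1.0, 1.0, 1.0];
}
```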

Stylized lighting in ThreeJS inspired by Return of the Obra Dinn. Notice that the outlines change color based on the scene’s lighting.

Thanks for reading! If you found this helpful, follow me on Twitter @Omar4ur to see more of my work.

Thanks to Ronja Böhringer whose Outlines via Postprocessing article helped me understand this technique and adapt it for the web.

If you have any suggestions or corrections to the code or technique, open an issue on GitHub or reach out to me directly. You can find my contact info at:

Graphics programmer at Cesium. I absolutely love telling stories and it's why I do what I do, from making games, to teaching & writing.
