How to use WebGPU Timestamp Query


Below is a summary of the workflow for using timestamp queries:

  1. Request the timestamp-query feature when initializing the device
  2. Create a query set with capacity N (the number of timestamps you want to store per frame)
  3. Create a buffer of size N * 8 bytes. This is where the timestamp results are stored, as 64-bit integers in nanoseconds.
  4. Record timestamps by calling commandEncoder.writeTimestamp. This records a timestamp once all previously submitted commands have finished.
  5. Call commandEncoder.resolveQuerySet to write the recorded timestamps into the buffer
  6. Copy the results from the buffer to the CPU and decode them as a BigInt64Array (see BigInt)

Step-by-step guide

A full example implementation can be found in this PR:

0 — Enable timestamp queries

Launch Chrome with the following command line flag:


1 — Queryset & buffer setup

First we add timestamp-query to the list of required features when requesting the device:

const device = await adapter.requestDevice({
  requiredFeatures: ["timestamp-query"],
});

If timestamp queries aren't enabled (see step 0), this call fails with:

Uncaught (in promise) TypeError: Failed to execute 'requestDevice' on 'GPUAdapter': Unsupported feature: timestamp-query

Then we create a query set with room for our timestamps, and a buffer to resolve them into:

const capacity = 3; // Max number of timestamps we can store
const querySet = device.createQuerySet({
  type: "timestamp",
  count: capacity,
});
const queryBuffer = device.createBuffer({
  size: 8 * capacity,
  usage: GPUBufferUsage.QUERY_RESOLVE
    | GPUBufferUsage.STORAGE
    | GPUBufferUsage.COPY_SRC
    | GPUBufferUsage.COPY_DST,
});

2 — Write timestamps

We call commandEncoder.writeTimestamp(querySet, index) at any point in the pipeline where we want to record a timestamp:

// Add timestamps in between GPU commands
commandEncoder.writeTimestamp(querySet, 0); // Initial timestamp
// ... encode render or compute passes here ...
commandEncoder.writeTimestamp(querySet, 1);

3 — Resolve timestamps to buffer

At the end of your frame, call commandEncoder.resolveQuerySet to actually write the timestamps to the buffer:

commandEncoder.resolveQuerySet(
  querySet,
  0,           // index of first query to resolve
  capacity,    // number of queries to resolve
  queryBuffer,
  0);          // destination offset

4 — Read the results

To get the timestamp results we need to copy the queryBuffer data to the CPU. Reading from a WebGPU buffer is explained in more detail here:

// === After `commandEncoder.finish()` is called ===
// Read the buffer data back to the CPU
const arrayBuffer = await readBuffer(device, queryBuffer);
// Decode it into an array of timestamps in nanoseconds
const timingsNanoseconds = new BigInt64Array(arrayBuffer);

// Helper that copies a GPU buffer into a mappable staging buffer and reads it
async function readBuffer(device, buffer) {
  const size = buffer.size;
  const gpuReadBuffer = device.createBuffer({
    size,
    usage: GPUBufferUsage.COPY_DST | GPUBufferUsage.MAP_READ,
  });
  const copyEncoder = device.createCommandEncoder();
  copyEncoder.copyBufferToBuffer(buffer, 0, gpuReadBuffer, 0, size);
  const copyCommands = copyEncoder.finish();
  device.queue.submit([copyCommands]);
  await gpuReadBuffer.mapAsync(GPUMapMode.READ);
  return gpuReadBuffer.getMappedRange();
}

(Optional) 5 — Create labels

To make the output more useful, we can define labels for each timestamp we collect, and then print out the difference between consecutive timestamps so we can see how long each step in our pipeline took. It may also be useful to convert the result from nanoseconds to milliseconds.
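A minimal sketch of this step (the label names and the sample timestamp values below are made up for illustration): pair each interval between consecutive timestamps with a label, and convert nanoseconds to milliseconds.

```javascript
// Convert raw timestamp results into labeled millisecond durations.
// `labels[i]` names the interval between timestamp i and timestamp i + 1.
function labelTimings(timingsNanoseconds, labels) {
  const results = [];
  for (let i = 1; i < timingsNanoseconds.length; i++) {
    // Timestamps are BigInts, so subtract first, then convert to Number
    const diffNs = timingsNanoseconds[i] - timingsNanoseconds[i - 1];
    results.push({
      label: labels[i - 1] ?? `step ${i}`,
      ms: Number(diffNs) / 1e6, // nanoseconds -> milliseconds
    });
  }
  return results;
}

// Example with made-up timestamps (in nanoseconds):
const timings = new BigInt64Array([0n, 2_500_000n, 4_000_000n]);
console.log(labelTimings(timings, ["draw", "post-processing"]));
// -> [{ label: "draw", ms: 2.5 }, { label: "post-processing", ms: 1.5 }]
```

Subtracting before converting matters: the raw 64-bit values can exceed what a Number holds exactly, but the differences between them are small enough to convert safely.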



Omar Shehata

Graphics programmer working on maps. I love telling stories and it's why I do what I do, from making games, to teaching & writing.