
By Harry Roberts
Google Chrome recently introduced the Performance Extensibility API, a means of putting your code front-and-centre in Chrome’s Performance panel. Their own documentation is somewhat dry and doesn’t really state the benefits or outcomes very clearly, so I spent a couple of hours looking into it and here are the key takeaways.
Not sure how to use the Performance Extensibility API? Read up on the User Timing API’s performance.mark() and performance.measure() first.
Chrome’s DevTools are incredibly powerful, and are only ever getting stronger, but this latest update hands a lot of control over to us! We can make our own instrumentation a key part of our performance profiling experience.
performance.mark() and performance.measure()
The Extensibility API is particularly useful in extending the performance.mark() and .measure() User Timings. If you aren’t using these already, you should be. If you aren’t familiar with them, you need to be before this post will make sense to you. My 2022 post Measure What You Impact, Not What You Influence is a relatively decent introduction to the topic, but for now, this simple demo should help:
<script>performance.mark('cssStart');</script>

<link rel=stylesheet href=/app.css>

<script>
  performance.mark('cssEnd');
  performance.measure('cssTime', 'cssStart', 'cssEnd');
</script>
Here, we drop a high-resolution timestamp—cssStart—using performance.mark(). We then fetch a synchronous stylesheet, app.css, before dropping a second high-resolution timestamp, cssEnd, using performance.mark() once more. Lastly, we use performance.measure() to create a measure of the delta between cssStart and cssEnd.
We could log any of the above .mark()s or .measure()s to the console. For example, to get the start time of cssStart, we could do:
console.log(`CSS Start: ${performance.getEntriesByName('cssStart')[0].startTime} ms`);
Or the duration of the cssTime measure:
console.log(`CSS Duration: ${performance.getEntriesByName('cssTime')[0].duration} ms`);
Note that .mark()’s useful property is startTime and .measure()’s is duration.
Tip: use source:console-api in the Console’s filter box to scope your Console messages only to things logged to it. Much cleaner.

We can use performance.mark() and .measure() in JavaScript, too, naturally:
performance.mark('jsStart');

// Simulate expensive JavaScript execution
setTimeout(() => {
  performance.mark('jsEnd');
  performance.measure('jsTime', 'jsStart', 'jsEnd');

  console.log(performance.getEntriesByName('jsStart')[0].startTime);
  console.log(performance.getEntriesByName('jsTime')[0].duration);
}, 1000);
Here we’re simulating an expensive bit of scripting that we might want to instrument and optimise.
These are neat, but the particularly nice thing about them is that Chrome DevTools will automatically pick up these marks and measures, and display them in the Timings track of the Performance panel:
cssTime and jsTime take up a proportional amount of space to their duration, but jsEnd, a .mark(), takes up a thin sliver of the UI as it represents a moment in time. .mark()s are found above .measure()s.

The benefit of the console.log() approach is that it’s much faster—you don’t need to run a full performance profile—but the benefit of the Performance panel method is that you can visualise the times in context of your application’s runtime. The former is great if you just need the number as quickly as possible; the latter is great if you’re trying to contextualise your work.
Honestly, if you’ve never seen that before, I dare say this article has provided a bunch of value already! Next, go and see how to put this into use with my Measure What You Impact, Not What You Influence article, which gives good use-cases and examples for using these bare-metal metrics.
The new Extensibility API allows us to extend this functionality. We can create arbitrary custom tracks in the Performance panel, so we’re no longer limited to only the default Timings track, and we can add our own metadata to these marks and measures to surface additional information in the DevTools UI!
In order to do so, we need to write quite a lot more code than the performance.mark() and .measure()s we’ve just looked at. We begin by extending the .mark() or .measure() with a devtools object that lives in the detail property. The whole point of this post is to cut through the fluff and show you, pragmatically, exactly what you do and don’t need.
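Every extended .mark() or .measure() shares the same basic shape. As a minimal sketch (the myMark name is just a placeholder):

performance.mark('myMark', {
  detail: {
    devtools: {
      // All Extensibility API options (dataType, track, trackGroup,
      // color, tooltipText, properties) live in here.
    }
  }
});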
Before we can do anything, in the Performance panel’s settings, we need to enable Show custom tracks:
In this section, I will show you the bare minimum you need to make a start with the Extensibility API. Some aspects are mandatory and others, like colours, are entirely optional.
Starting with .mark(), the dataType is required; nothing else is. Let’s convert our jsEnd .mark() to use the Extensibility API:
performance.mark('jsEnd', {
  detail: {
    devtools: {
      dataType: 'marker'
    }
  }
});
We’re forgoing anything else for now, but this is the first step to adopting the Extensibility API for performance.mark():

Our jsEnd marker. That’s the extended performance.mark() in action.

Key improvement: the extended marker spans the full height of the Performance panel, which makes spotting our .mark()s infinitely easier.

However, there are two major downsides thus far:
1. It’s easy to mistake a .mark() for a .measure().
2. We can’t see when the .mark() was fired! Even clicking the marker itself doesn’t show us any timestamp information. Neither hovering nor clicking the marker gives us the high-resolution timing that we’d use a .mark() for in the first place:
Jack Franklin, who works on the Performance panel, read this article and, in less than two hours, had a change lined up to add timestamps to extended performance.mark()s in M140. Incredible.
To this end, I’d be inclined to use .mark() less as a timestamp and more as a, well, marker—it can quickly bring your attention to the relevant part of your trace.

That’s your minimum viable .mark().
performance.measure() is a little more useful, though. Let’s convert cssTime to its minimum viable version:
performance.measure('cssTime', {
  start: 'cssStart',
  end: 'cssEnd',
  detail: {
    devtools: {
      track: 'CSS'
    }
  }
});
We still need to pass in the reference start .mark() via the start: property. Our end marker, if omitted, defaults to right now—when the .measure() is being called—or can be provided explicitly via end:. The track property is mandatory, and this forms our minimum viable .measure().
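For example, here’s a sketch that leans on that default, calling .measure() at the moment we want the measurement to stop, with no end: at all (the cssTimeSoFar name is made up):

performance.measure('cssTimeSoFar', {
  start: 'cssStart',
  // No `end:` supplied, so the measure ends right now, at the
  // moment this call executes.
  detail: {
    devtools: {
      track: 'CSS'
    }
  }
});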
Note that we don’t need to supply the dataType property, as its omission defaults to dataType: 'track-entry', which is exactly what we need. Let’s see what this looks like.
Hey! That’s pretty neat!
We now have our first custom track titled CSS, sourced from our track: 'CSS'. This is the least we need to do in order to make use of the Extensibility API for performance.measure(). Next, we’ll take it further.
What I just showed you was the bare minimum to get up and running with the Extensibility API. .mark() is a little underwhelming, in my opinion, but the way we can extend .measure() is very cool. Let’s start with the built-in extensions we have.
With both .mark() and .measure(), we can apply custom colours. Not arbitrary or fully custom, like #f43059, but from DevTools’ own palette: primary, primary-light, primary-dark, secondary, secondary-light, secondary-dark, tertiary, tertiary-light, tertiary-dark, and error.
Let’s extend our .mark() from before a little further:
performance.mark('jsEnd', {
  detail: {
    devtools: {
      dataType: 'marker',
      color: 'secondary-dark'
    }
  }
});
Notice that our jsEnd marker is now a dark pinky-purple: secondary-dark.

Easy enough! Next, let’s add some more descriptive tooltip text:
performance.mark('jsEnd', {
  detail: {
    devtools: {
      dataType: 'marker',
      color: 'secondary-dark',
      tooltipText: 'Simulated JS Execution Complete'
    }
  }
});
To be honest, this is no better, and certainly no more convenient, than just using a different string in place of jsEnd—this text hasn’t produced a tooltip, but rather just replaced the marker’s text. The takeaway so far is that the Extensibility API is less useful for performance.mark().
tooltipText simply replaces the marker’s text and doesn’t actually create a tooltip.

Lastly, for .mark(), we can pass in arbitrary metadata. That could be pretty useful for other developers picking up a project:
performance.mark('jsEnd', {
  detail: {
    devtools: {
      dataType: 'marker',
      color: 'secondary-dark',
      tooltipText: 'Simulated JS Execution Complete',
      properties: [
        ['File', 'app.js'],
        ['Function', 'setTimeout()']
      ]
    }
  }
});
Above, I’ve passed in almost-pointless data to illustrate the point, but I’m sure you can think of your own useful use-cases.
All of the features I just showed you (color, tooltipText, and properties) apply equally to performance.measure(), so let’s leap ahead and bring our performance.measure() example up to date in one go:
performance.measure('cssTime', {
  start: 'cssStart',
  end: 'cssEnd',
  detail: {
    devtools: {
      track: 'CSS',
      color: 'secondary-dark',
      tooltipText: 'External CSS fetched and parsed',
      properties: [
        ['URL', 'app.css'],
        ['Transferred Size', '29.3 KB'],
        ['Decoded Body Size', '311.8 KB'],
        ['Queuing & Latency', '104 ms'],
        ['Download', '380 ms']
      ]
    }
  }
});
I’ve added a color, a tooltipText, and some made-up metadata in properties. Note that I actually built a demo that used the Resource Timing API to grab these numbers for real; that code is in the appendix.
.measure() actually gets a proper tooltip, and we have rich metadata in the Summary pane.

Now this is more like it! tooltipText actually looks and acts like a tooltip. This is where I see the Extensibility API becoming particularly useful. There’s just one more thing I want to show you: track groups.
We created a custom CSS track using track: 'CSS'. We might want to make a track for JS, API calls, you name it. We can then take all of these tracks and group them into a track group.
Track groups are useful if, say, we want to track first- and third-party attribution separately, or if our codebase has different teams who want to isolate their instrumentation from each other. They’re also incredibly easy to set up. Let’s evolve our .measure() a little more:
performance.measure('cssTime', {
  start: 'cssStart',
  end: 'cssEnd',
  detail: {
    devtools: {
      track: 'CSS',
      trackGroup: 'First Party',
      color: 'secondary-dark',
      tooltipText: 'External CSS fetched and parsed',
      properties: [
        ['URL', 'app.css'],
        ['Transferred Size', '29.3 KB'],
        ['Decoded Body Size', '311.8 KB'],
        ['Queuing & Latency', '104 ms'],
        ['Download', '380 ms']
      ]
    }
  }
});
And let’s quickly go back and add trackGroup: 'First Party' to our JS’s .measure():
performance.measure('jsTime', {
  start: 'jsStart',
  end: 'jsEnd',
  detail: {
    devtools: {
      track: 'JS',
      trackGroup: 'First Party',
      color: 'secondary-dark',
      tooltipText: 'Simulated JS Execution Complete',
      properties: [
        ['File', 'app.js'],
        ['Function', 'setTimeout()']
      ]
    }
  }
});
…and what do we get?
Now we have a track group called First Party, which contains both a CSS and a JS sub-track!
I hope you can already begin to imagine use-cases for tracks and track groups. If you’re profiling and instrumenting your application with performance.mark() and performance.measure() already, the idea of getting it all organised surely excites you!
The syntax for all of this is very repetitive and cumbersome, so all I would say is start with as little as you can get away with. Personally, I would not recommend using the Extensibility API for performance.mark(), so I’m not going to confuse folk by recapping it.
For performance.measure(), all you really need to get off the ground is:
performance.measure('<name>', {
  start: '<start>',
  end: '<end>',
  detail: {
    devtools: {
      track: '<track-name>'
    }
  }
});
This will automatically move this measure into its own custom track named, in this case, <track-name>.
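To make that concrete, here’s a sketch instrumenting a hypothetical API call; the apiStart and apiEnd marks, the fetchUser measure, the /api/user endpoint, and the API track are all made-up names:

performance.mark('apiStart');

fetch('/api/user')
  .then((response) => response.json())
  .then(() => {
    performance.mark('apiEnd');

    // Wraps the whole request–parse lifecycle in one measure,
    // surfaced in a custom API track.
    performance.measure('fetchUser', {
      start: 'apiStart',
      end: 'apiEnd',
      detail: {
        devtools: {
          track: 'API'
        }
      }
    });
  });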
Next up, I’d suggest looking into track groups so that you can better organise yourself:
performance.measure('<name>', {
  start: '<start>',
  end: '<end>',
  detail: {
    devtools: {
      track: '<track-name>',
      trackGroup: '<group-name>'
    }
  }
});
Perhaps a group for any code that comes from your design system or your own first-party application, with sub-tracks for JS, API calls, etc. It really is up to you.
Beyond that, we’re mostly thinking about adding metadata and custom colours, but don’t worry about that until you’ve got the mechanism dialled in.
If you work on a framework or a third party that instruments its own User Timings, please consider moving them into your own track group. It would be nice to see, for example, Next.js route-change or hydration timings in their own place.
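As a sketch of what that might look like, a framework could funnel all of its measures through a small helper so that every entry lands in its own group; measureInMyLib, the MyLib group, and the hydration marks are all hypothetical:

// Hypothetical helper a framework might ship so that all of its
// User Timings land in a dedicated track group.
function measureInMyLib(name, startMark, endMark, track) {
  performance.measure(name, {
    start: startMark,
    end: endMark,
    detail: {
      devtools: {
        track,
        trackGroup: 'MyLib' // Hypothetical group name.
      }
    }
  });
}

performance.mark('hydrationStart');
// …hydration work happens here…
performance.mark('hydrationEnd');
measureInMyLib('Hydration', 'hydrationStart', 'hydrationEnd', 'Hydration');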
Appendix

Drop this straight into an HTML file, fire it up in Chrome, and it will Just Work™.
<!doctype html>
<html lang=en-gb>

<meta charset=utf-8>
<meta name=viewport content="width=device-width, minimum-scale=1.0">

<title>Extensibility API</title>

<script>
  performance.mark('cssStart', {
    detail: {
      devtools: {
        dataType: 'marker',
        tooltipText: 'Get Bootstrap CSS from CDN',
        color: 'secondary-light'
      }
    }
  });
</script>

<link rel=stylesheet href=https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css id=jsCSS>

<script>
  performance.mark('cssEnd');

  // Grab stylesheet’s timing metadata.
  const css = document.getElementById('jsCSS');
  const cssURL = css.href;
  const cssTimingInformation = performance.getEntriesByName(cssURL)[0];
  const cssTransferSize = (cssTimingInformation.transferSize / 1024).toFixed(2);
  const cssDecodedBodySize = (cssTimingInformation.decodedBodySize / 1024).toFixed(2);
  const cssLatencyDuration = (cssTimingInformation.responseStart - cssTimingInformation.startTime).toFixed(2);
  const cssDownloadDuration = (cssTimingInformation.responseEnd - cssTimingInformation.responseStart).toFixed(2);
</script>

<script>
  performance.measure('cssTime', {
    start: 'cssStart',
    end: 'cssEnd',
    detail: {
      devtools: {
        dataType: 'track-entry',
        trackGroup: 'Third Party Instrumentation',
        track: 'CSS',
        tooltipText: 'CSS Downloaded and Parsed',
        color: 'secondary-light',
        properties: [
          ['URL', cssURL],
          ['Transferred Size', `${cssTransferSize} KB`],
          ['Decoded Body Size', `${cssDecodedBodySize} KB`],
          ['Queuing & Latency', `${cssLatencyDuration} ms`],
          ['Download', `${cssDownloadDuration} ms`]
        ]
      }
    }
  });
</script>