---
title: Performance Reporting
---

The TypeSpec compiler can report performance statistics after compilation, helping you identify bottlenecks and optimize your build process. Enable this feature by passing the `--stats` flag to the CLI:

```bash
tsp compile . --stats
```
<!-- cspell:disable -->

```ansi frame="terminal"
tsp compile . --stats
Compiler statistics:
linter: 0ms
```

<!-- cspell:enable -->
The report includes:

- **Complexity metrics**: Number of types created and finished during compilation
- **Performance breakdown**: Time spent in each compilation phase (loading, resolving, type-checking, validation, and linting)

## Emitter Performance Reporting

:::note[Since TypeSpec 1.9.0]
:::
Emitters can report their own performance statistics, which are displayed alongside the compiler metrics in the same report.
Use the `EmitContext.perf` API to instrument your emitter code. The API provides several methods depending on your use case.
### `startTimer` - Manual Timer Control
Best for when the start and stop points are in different parts of your code, or when you need conditional timing:

```ts
const timer = context.perf.startTimer("my-task");

// ... do some work across multiple statements

timer.stop();
```
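
The conditional timing mentioned above can be sketched like this. This is a minimal, self-contained illustration, not the TypeSpec implementation: `verbose` is an illustrative flag, and `context.perf` here is a stand-in object shaped like the real API so the snippet runs on its own.

```ts
// Stand-in for EmitContext.perf so this sketch is self-contained; the real
// startTimer comes from the emit context and reports into the --stats output.
interface Timer {
  stop(): void;
}
const context = {
  perf: {
    startTimer(name: string): Timer {
      const start = Date.now();
      return { stop: () => console.log(`${name}: ${Date.now() - start}ms`) };
    },
  },
};

// Conditional timing: only start the timer when the flag is set, and use
// optional chaining so stop() is skipped when no timer was started.
const verbose = process.env.VERBOSE === "1"; // illustrative flag
const timer = verbose ? context.perf.startTimer("optional-task") : undefined;
// ... do some work ...
timer?.stop();
```

With the real `context.perf`, only the last three lines are needed.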

### `time` - Synchronous Function Timing

Best for wrapping synchronous code blocks. Returns the result of the callback function:

```ts
const result = context.perf.time("my-task", () => {
  // ... do some work
  return computedValue;
});
```

### `timeAsync` - Asynchronous Function Timing

Best for wrapping async operations. Returns a promise with the callback's result:

```ts
const result = await context.perf.timeAsync("my-task", async () => {
  // ... do some async work
  return await fetchData();
});
```

### `reportTime` - Report Pre-measured Duration

Best when you already have timing data from another source (e.g., a child process or external tool):

```ts title="emit.ts"
const { duration } = runTask();
context.perf.reportTime("my-task", duration);
```

You can use the standalone `perf` utilities to measure duration in code that doesn't have access to the emit context:
```ts title="task-runner.ts"
import { perf } from "@typespec/compiler/utils";

function runTask(): { duration: number } {
  // ...
}
```
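
If the task runner cannot depend on `@typespec/compiler/utils`, the same duration value can be produced with the standard `performance.now()` timer. This is a sketch: `measureTask` is an illustrative name, not part of the TypeSpec API.

```ts
// Measure a task's wall-clock duration in milliseconds using the global
// performance.now() timer (available in Node.js 16+ and browsers).
function measureTask(): { duration: number } {
  const start = performance.now();
  let sum = 0;
  for (let i = 0; i < 100_000; i++) sum += i; // placeholder workload
  const duration = performance.now() - start;
  return { duration };
}

// The measured value can then be handed to reportTime, e.g.
// context.perf.reportTime("my-task", duration).
const { duration } = measureTask();
```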

## Complete Example
Here's how to instrument a typical emitter with multiple phases:

```ts
import { EmitContext } from "@typespec/compiler";
export async function $onEmit(context: EmitContext) {
  // Manual timer for the preparation phase
  const timer = context.perf.startTimer("prepare");
  prepare();
  timer.stop();

  // Wrap synchronous rendering with automatic timing
  const renderResult = context.perf.time("render", () => render());

  // Wrap async file writing with automatic timing
  await context.perf.timeAsync("write", async () => writeOutput(renderResult));
}
```

Running `tsp compile . --stats` with this instrumented emitter produces:
<!-- cspell:disable -->

```ansi frame="terminal"
tsp compile . --stats
Compiler statistics:
render: 28ms
write: 51ms
```
<!-- cspell:enable -->
The emitter's custom metrics (`prepare`, `render`, `write`) appear nested under the emitter name, giving you a clear breakdown of where time is spent during code generation.
