website/src/content/docs/docs/extending-typespec/performance-reporting.md (58 additions, 25 deletions)
---
title: Performance Reporting
---

The TypeSpec compiler can report performance statistics after compilation, helping you identify bottlenecks and optimize your build process. Enable this feature by passing the `--stats` flag to the CLI:

```bash
tsp compile . --stats
```

<!-- cspell:disable -->

```ansi frame="terminal"
tsp compile . --stats
…
Compiler statistics:
…
[90mlinter[39m: [32m0ms[39m
```
<!-- cspell:enable -->

The report includes:

- **Complexity metrics**: Number of types created and finished during compilation
- **Performance breakdown**: Time spent in each compilation phase (loading, resolving, type-checking, validation, and linting)
## Emitter Performance Reporting

:::note[Since TypeSpec 1.9.0]
:::

Emitters can report their own performance statistics, which are displayed alongside the compiler metrics in the same report.

Use the `EmitContext.perf` API to instrument your emitter code. The API provides several methods depending on your use case.
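To make the shape of these three methods concrete, here is a minimal self-contained sketch of how such a timing API could work. This is **not** the compiler's actual implementation: the `createPerf` factory and the `durations` map are invented for illustration only.

```ts
interface PerfTimer {
  stop(): void;
}

// Hypothetical factory sketching the shape of the perf API (illustration only).
function createPerf() {
  const durations = new Map<string, number>();
  const record = (name: string, start: number) =>
    durations.set(name, performance.now() - start);

  return {
    durations,
    // startTimer: the caller stops the timer explicitly, possibly far away.
    startTimer(name: string): PerfTimer {
      const start = performance.now();
      return { stop: () => record(name, start) };
    },
    // time: wraps a synchronous callback and returns its result.
    time<T>(name: string, fn: () => T): T {
      const start = performance.now();
      try {
        return fn();
      } finally {
        record(name, start);
      }
    },
    // timeAsync: same idea for promises; the duration covers settlement.
    async timeAsync<T>(name: string, fn: () => Promise<T>): Promise<T> {
      const start = performance.now();
      try {
        return await fn();
      } finally {
        record(name, start);
      }
    },
  };
}
```

Note that `time` and `timeAsync` record the duration in a `finally` block, so a task that throws or rejects is still measured.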
### `startTimer` - Manual Timer Control

Best for when the start and stop points are in different parts of your code, or when you need conditional timing:
```ts
const timer = context.perf.startTimer("my-task");

// ... do some work across multiple statements

timer.stop();
```
### `time` - Synchronous Function Timing

Best for wrapping synchronous code blocks. Returns the result of the callback function:
```ts
const result = context.perf.time("my-task", () => {
  // ... do some work
  return computedValue;
});
```
### `timeAsync` - Asynchronous Function Timing

Best for wrapping async operations. Returns a promise with the callback's result:
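The original example for this section is not shown in the source diff. As a stand-in, here is a self-contained sketch of the usage pattern: `timeAsync` below is a local function with the same signature as `context.perf.timeAsync` (in a real emitter you would call the context method instead), and `renderAll` is a hypothetical async workload.

```ts
// Local stand-in with the same shape as context.perf.timeAsync (sketch only).
async function timeAsync<T>(name: string, fn: () => Promise<T>): Promise<T> {
  const start = performance.now();
  try {
    return await fn();
  } finally {
    console.log(`${name}: ${(performance.now() - start).toFixed(0)}ms`);
  }
}

// Hypothetical async workload standing in for real emitter work.
async function renderAll(): Promise<string> {
  return "// generated code";
}

// The returned promise resolves to the callback's result.
timeAsync("render", () => renderAll()).then((output) => {
  console.log(output.length > 0);
});
```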
Running `tsp compile . --stats` with this instrumented emitter produces:
<!-- cspell:disable -->

```ansi frame="terminal"
tsp compile . --stats
…
Compiler statistics:
…
[90mrender[39m: [32m28ms[39m
[90mwrite[39m: [32m51ms[39m
```
<!-- cspell:enable -->
The emitter's custom metrics (`prepare`, `render`, `write`) appear nested under the emitter name, giving you a clear breakdown of where time is spent during code generation.