
Commit e0e3421

refactor: improve demo with better branding and features
- Merged best features from demo.vhs into scripts/demo.tape
- Added title screen with project branding
- Multiple output formats (GIF, MP4, WebM)
- Increased dimensions to 1200x800 for better readability
- Changed to Dracula theme for better aesthetics
- Added verbose encoding demonstration
- Added special tokens decode with --show-special
- Added JSON output with jq for token counting
- Added help command demonstrations
- Added nice ending with thanks and GitHub link
- Removed old demo.vhs to avoid confusion

The demo now provides a comprehensive showcase of the tokenizer's features with better visual presentation and flow.
1 parent 049655a commit e0e3421

File tree

2 files changed: +63 additions, −21 deletions


docs/demo.gif

-290 KB

scripts/demo.tape

Lines changed: 63 additions & 21 deletions
@@ -1,46 +1,88 @@
-# VHS tape file for generating tokenizer demo GIF
+# VHS tape file for generating tokenizer demo
 # Run with: vhs scripts/demo.tape
 
 # Setup
 Output docs/demo.gif
+Output docs/demo.mp4
+Output docs/demo.webm
+
 Set FontSize 14
-Set Width 800
-Set Height 600
-Set Theme "Monokai Pro"
-
-# Show installation command (but don't run it)
-Type "# Install Tokenizer CLI" Sleep 500ms Enter
-Sleep 1s
-Type "# brew install agentstation/tap/tokenizer" Sleep 500ms Enter
+Set Width 1200
+Set Height 800
+Set Theme "Dracula"
+
+# Title screen
+Type "# Tokenizer - High-performance tokenizer implementations in Go" Sleep 500ms Enter
 Sleep 2s
+Clear
+
+# Install tokenizer
+Type "# Install via Homebrew" Sleep 500ms Enter
+Sleep 500ms
+Type "brew install agentstation/tap/tokenizer" Sleep 500ms Enter
+Sleep 5s
 
 Type "# Check version" Sleep 500ms Enter
 Type "tokenizer version" Sleep 500ms Enter
 Sleep 2s
 
-Ctrl+L
+Clear
 
-Type "# Encode text with Llama 3" Sleep 500ms Enter
-Sleep 1s
+# Basic encoding/decoding
+Type "# Encode text to tokens" Sleep 500ms Enter
+Sleep 500ms
 Type "tokenizer llama3 'Hello, world!'" Sleep 500ms Enter
 Sleep 2s
 
 Type "# Decode tokens back to text" Sleep 500ms Enter
-Type "tokenizer llama3 decode 128000 9906 11 1917 0 128001" Sleep 500ms Enter
+Type "tokenizer llama3 decode 128000 9906 11 1917 59 0 128001" Sleep 500ms Enter
+Sleep 2s
+
+Type "# Decode with special tokens shown" Sleep 500ms Enter
+Type "tokenizer llama3 decode --show-special 128000 9906 11 1917 59 0 128001" Sleep 500ms Enter
 Sleep 2s
 
-Ctrl+L
+Clear
 
-Type "# Get tokenizer info" Sleep 500ms Enter
+# Advanced features
+Type "# Encode with verbose output" Sleep 500ms Enter
+Type "tokenizer llama3 encode --verbose 'The quick brown fox jumps over the lazy dog.'" Sleep 500ms Enter
+Sleep 3s
+
+Clear
+
+# Tokenizer info
+Type "# Get tokenizer information" Sleep 500ms Enter
 Type "tokenizer llama3 info | head -15" Sleep 500ms Enter
 Sleep 3s
 
-Ctrl+L
+Clear
 
-Type "# Stream processing" Sleep 500ms Enter
-Type "echo 'The quick brown fox' | tokenizer llama3 stream" Sleep 500ms Enter
+# Pipe and streaming
+Type "# Stream text from stdin" Sleep 500ms Enter
+Type "echo 'Tokenizers are amazing!' | tokenizer llama3" Sleep 500ms Enter
 Sleep 2s
 
-Type "# Direct encoding (shorthand)" Sleep 500ms Enter
-Type "tokenizer llama3 'I love tokenizers!'" Sleep 500ms Enter
-Sleep 2s
+Type "# Count tokens using JSON output" Sleep 500ms Enter
+Type "echo 'This is a test of the tokenizer.' | tokenizer llama3 encode --format json | jq '.token_count'" Sleep 500ms Enter
+Sleep 2s
+
+Clear
+
+# Help commands
+Type "# Get help and see all options" Sleep 500ms Enter
+Type "tokenizer --help" Sleep 500ms Enter
+Sleep 3s
+
+Clear
+
+Type "# See llama3 specific options" Sleep 500ms Enter
+Type "tokenizer llama3 --help" Sleep 500ms Enter
+Sleep 3s
+
+Clear
+
+# Final message
+Type "# Thanks for watching! 🚀" Sleep 500ms Enter
+Type "# Learn more at: https://github.com/agentstation/tokenizer" Sleep 500ms Enter
+Sleep 3s
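
To regenerate the demo assets locally, the tape's own header comment points at the VHS recorder. A minimal sketch, assuming VHS is installed (the Homebrew formula name below is an assumption, not taken from this commit):

# Install the VHS terminal recorder (formula name assumed)
brew install vhs

# Render the tape; per the Output directives above this writes
# docs/demo.gif, docs/demo.mp4, and docs/demo.webm
vhs scripts/demo.tape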
