37 changes: 1 addition & 36 deletions app/Cargo.lock


6 changes: 3 additions & 3 deletions app/Cargo.toml
@@ -19,17 +19,16 @@ icalendar = { version = "0.16.13", features = [
 ] }
 moka = { version = "0.12.10", features = ["future"] }
 openai = { version = "1.1.0", default-features = false, features = ["rustls"] }
-opentelemetry = { version = "0.28.0", features = ["trace"] }
+opentelemetry = { version = "0.28.0", features = ["trace", "metrics"] }
 opentelemetry-http = "0.28.0"
 opentelemetry-otlp = { version = "0.28.0", features = [
     "grpc-tonic",
     "metrics",
     "trace",
 ] }
-opentelemetry-prometheus = "0.28.0"
 opentelemetry-semantic-conventions = "0.28.0"
 opentelemetry-stdout = "0.28.0"
-opentelemetry_sdk = { version = "0.28.0", features = ["rt-async-std", "trace"] }
+opentelemetry_sdk = { version = "0.28.0", features = ["rt-async-std", "trace", "metrics"] }
 poem = { version = "3.1.7", features = [
     "opentelemetry",
     "rustls",
@@ -72,6 +71,7 @@ tracing = { version = "0.1.40", features = ["attributes"] }
 tracing-opentelemetry = { version = "0.29.0" }
 tracing-subscriber = { version = "0.3.19", features = ["env-filter", "fmt"] }
 uuid = { version = "1.17.0", features = ["v4", "serde"] }
+once_cell = "1.19.0"
 jsonwebtoken = "9.3.0"
 urlencoding = "2.1.3"
 url = "2.5.4"
50 changes: 49 additions & 1 deletion app/compose.yml
@@ -16,6 +16,54 @@ services:
       timeout: 5s
       retries: 5
 
+  # OpenTelemetry Collector - receives OTLP data from your Rust app
+  otel-collector:
+    image: otel/opentelemetry-collector-contrib:latest
+    command: ["--config=/etc/otel-collector-config.yaml"]
+    volumes:
+      - ./docker/otel-collector-config.yaml:/etc/otel-collector-config.yaml
+    ports:
+      - "4317:4317" # OTLP gRPC receiver
+      - "4318:4318" # OTLP HTTP receiver
+      - "8889:8889" # Prometheus metrics endpoint
+    depends_on:
+      - prometheus
+
+  # Prometheus - stores metrics
+  prometheus:
+    image: prom/prometheus:latest
+    command:
+      - '--config.file=/etc/prometheus/prometheus.yml'
+      - '--storage.tsdb.path=/prometheus'
+      - '--web.console.libraries=/etc/prometheus/console_libraries'
+      - '--web.console.templates=/etc/prometheus/consoles'
+      - '--storage.tsdb.retention.time=200h'
+      - '--web.enable-lifecycle'
+    volumes:
+      - ./docker/prometheus.yml:/etc/prometheus/prometheus.yml
+      - prometheus-data:/prometheus
+    ports:
+      - "9090:9090"
+
+  # Grafana - visualizes metrics
+  grafana:
+    image: grafana/grafana:latest
+    environment:
+      - GF_SECURITY_ADMIN_PASSWORD=admin
+      - GF_USERS_ALLOW_SIGN_UP=false
+    volumes:
+      - grafana-data:/var/lib/grafana
+      - ./docker/grafana/dashboards:/var/lib/grafana/dashboards
+      - ./docker/grafana/provisioning:/etc/grafana/provisioning
+    ports:
+      - "3001:3000"
+    depends_on:
+      - prometheus
+
 volumes:
   pg-data:
-    driver: local
+    driver: local
+  prometheus-data:
+    driver: local
+  grafana-data:
+    driver: local
66 changes: 66 additions & 0 deletions app/docker/README.md
@@ -0,0 +1,66 @@
# OpenTelemetry Observability Stack

This Docker Compose setup provides a complete observability stack for monitoring your Ethereum Forum application's OpenAI usage metrics.

## Services

### OpenTelemetry Collector
- **Port**: 4317 (gRPC), 4318 (HTTP)
- **Purpose**: Receives OTLP data from your Rust application
- **Config**: `otel-collector-config.yaml`
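
The collector config itself is not included in this diff; a minimal sketch that would match the ports above (OTLP in on 4317/4318, Prometheus-format metrics out on 8889) looks like this. The exact file in `./docker` may differ:

```yaml
# otel-collector-config.yaml (sketch, not the actual repo file)
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

exporters:
  prometheus:
    endpoint: 0.0.0.0:8889

service:
  pipelines:
    metrics:
      receivers: [otlp]
      exporters: [prometheus]
```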

### Prometheus
- **Port**: 9090
- **Purpose**: Stores metrics data
- **Config**: `prometheus.yml`
- **URL**: http://localhost:9090
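
Prometheus pulls metrics from the collector's 8889 endpoint rather than receiving pushes. The `prometheus.yml` in this setup would need a scrape job along these lines (a sketch; the actual file may differ):

```yaml
# prometheus.yml (sketch, not the actual repo file)
global:
  scrape_interval: 15s

scrape_configs:
  - job_name: otel-collector
    static_configs:
      # Compose service name resolves on the default network
      - targets: ['otel-collector:8889']
```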

### Grafana
- **Port**: 3001
- **Purpose**: Visualizes metrics with dashboards
- **Credentials**: admin/admin
- **URL**: http://localhost:3001
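
The dashboard and datasource are wired up through the `./docker/grafana/provisioning` mount from `compose.yml`. A minimal datasource provisioning file for this stack would look roughly like the following (a sketch; the actual provisioning files may differ):

```yaml
# docker/grafana/provisioning/datasources/prometheus.yaml (sketch)
apiVersion: 1
datasources:
  - name: Prometheus
    type: prometheus
    access: proxy
    url: http://prometheus:9090
    isDefault: true
```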

## Getting Started

1. **Start the stack:**

   ```bash
   docker compose up -d
   ```

2. **Configure your Rust app** to send OTLP data to `localhost:4317` (gRPC) or `localhost:4318` (HTTP)

3. **Access Grafana** at http://localhost:3001 (admin/admin)
- The "OpenAI Usage Metrics" dashboard is automatically provisioned
- View real-time token usage, rates, and user breakdowns

4. **Access Prometheus** at http://localhost:9090 for raw metric queries

## Metrics Available

Your Rust application exports these OpenAI usage metrics:
- `openai_prompt_tokens_total` - Total prompt tokens used
- `openai_completion_tokens_total` - Total completion tokens used
- `openai_total_tokens_total` - Total tokens used (prompt + completion)

Each metric includes a `user_id` label for per-user tracking.
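
Because every series carries the `user_id` label, usage can be broken out per user directly in PromQL, for example (the `user_id` value below is hypothetical):

```promql
# Token consumption rate per user over the last 5 minutes
sum by (user_id) (rate(openai_total_tokens_total[5m]))

# Cumulative prompt tokens for one user
openai_prompt_tokens_total{user_id="some-user-id"}
```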

## Environment Variables for Rust App

Make sure your Rust application is configured to send OTLP data:

```bash
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
OTEL_SERVICE_NAME=ethereum-forum
```
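
If `OTEL_EXPORTER_OTLP_ENDPOINT` is unset, a common pattern is to fall back to the local collector's gRPC port. A stdlib-only sketch of that pattern (the function name is illustrative, not taken from the app):

```rust
use std::env;

/// Resolve the OTLP endpoint to export to, falling back to the local
/// collector's gRPC port when the variable is unset. Sketch only; the
/// application may read its configuration differently.
fn otlp_endpoint() -> String {
    env::var("OTEL_EXPORTER_OTLP_ENDPOINT")
        .unwrap_or_else(|_| "http://localhost:4317".to_string())
}

fn main() {
    println!("exporting OTLP data to {}", otlp_endpoint());
}
```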

## Stopping the Stack

```bash
docker compose down
```

To also remove volumes:
```bash
docker compose down -v
```