# OpenTelemetry Observability
Compliance Scanner exports traces and logs via OpenTelemetry Protocol (OTLP) for integration with observability platforms like SigNoz, Grafana (Tempo + Loki), Jaeger, and others.
## Enabling
Set the `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable to enable OTLP export:

```bash
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
```
When this variable is not set, telemetry export is disabled and only console logging is active.
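The gating behavior can be sketched in plain Rust. This is illustrative only; `otlp_mode` is a hypothetical helper, not the application's actual initialization code:

```rust
/// Hypothetical helper mirroring the gating described above:
/// OTLP export is enabled only when OTEL_EXPORTER_OTLP_ENDPOINT is set.
fn otlp_mode(endpoint: Option<&str>) -> &'static str {
    match endpoint {
        Some(_) => "otlp+console",
        None => "console-only",
    }
}

fn main() {
    // Read the same variable the application checks at startup.
    let endpoint = std::env::var("OTEL_EXPORTER_OTLP_ENDPOINT").ok();
    println!("telemetry mode: {}", otlp_mode(endpoint.as_deref()));
}
```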
## What Is Exported
### Traces
Distributed traces are emitted for:
- HTTP request handling (via the `tower-http` `TraceLayer`)
- Database operations
- Scan pipeline phases
- External API calls (LiteLLM, Keycloak, Git providers)
### Logs
All `tracing::info!`, `tracing::warn!`, and `tracing::error!` log events are exported as OTel log records, including their structured fields.
## Configuration
| Variable | Description | Default |
|---|---|---|
| `OTEL_EXPORTER_OTLP_ENDPOINT` | Collector gRPC endpoint | (disabled) |
| `OTEL_SERVICE_NAME` | Service name in traces | `compliance-agent` or `compliance-dashboard` |
| `RUST_LOG` | Log level filter | `info` |
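Putting the three variables together, a shell sketch (the values shown are examples, not required settings):

```shell
# Example settings; adjust the endpoint and service name for your deployment.
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4317
export OTEL_SERVICE_NAME=compliance-agent
export RUST_LOG=info

echo "$OTEL_SERVICE_NAME -> $OTEL_EXPORTER_OTLP_ENDPOINT (RUST_LOG=$RUST_LOG)"
```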
## Docker Compose Setup
The included `docker-compose.yml` provides an OTel Collector service:
```yaml
otel-collector:
  image: otel/opentelemetry-collector-contrib:latest
  ports:
    - "4317:4317"   # gRPC
    - "4318:4318"   # HTTP
  volumes:
    - ./otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml
```
The agent and dashboard are pre-configured to send telemetry to the collector:
```yaml
agent:
  environment:
    OTEL_EXPORTER_OTLP_ENDPOINT: http://otel-collector:4317
    OTEL_SERVICE_NAME: compliance-agent

dashboard:
  environment:
    OTEL_EXPORTER_OTLP_ENDPOINT: http://otel-collector:4317
    OTEL_SERVICE_NAME: compliance-dashboard
```
## Collector Configuration
Edit `otel-collector-config.yaml` to configure your backend. The default configuration exports to the `debug` (stdout) exporter only.
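For reference, a minimal collector configuration using only the `debug` exporter might look like the following. This is a sketch; align the receiver ports with the port mappings in your `docker-compose.yml`:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch:

exporters:
  debug:

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [debug]
```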
### SigNoz
```yaml
exporters:
  otlp/signoz:
    endpoint: "signoz-otel-collector:4317"
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/signoz]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/signoz]
```
### Grafana Tempo (Traces) + Loki (Logs)
```yaml
exporters:
  otlp/tempo:
    endpoint: "tempo:4317"
    tls:
      insecure: true
  loki:
    endpoint: "http://loki:3100/loki/api/v1/push"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/tempo]
    logs:
      receivers: [otlp]
      processors: [batch]
      exporters: [loki]
```
### Jaeger
```yaml
exporters:
  otlp/jaeger:
    endpoint: "jaeger:4317"
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/jaeger]
```
## Verifying
After starting with telemetry enabled, look for this log line on startup:

```
OpenTelemetry OTLP export enabled endpoint=http://otel-collector:4317 service=compliance-agent
```
If the endpoint is unreachable, the application still starts normally — telemetry export fails silently without affecting functionality.
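This fail-open behavior can be illustrated with a small Rust sketch. `init_telemetry` here is hypothetical; the real setup uses the OpenTelemetry crates, but the error-handling shape is the point:

```rust
/// Hypothetical initializer: returns Err when exporter setup fails
/// (e.g. an unreachable collector), which the caller treats as non-fatal.
fn init_telemetry(endpoint: &str) -> Result<(), String> {
    if endpoint.is_empty() {
        return Err("no collector endpoint configured".into());
    }
    // Real code would build and install an OTLP exporter here.
    Ok(())
}

fn main() {
    // A failed init is logged and ignored; the application starts regardless.
    if let Err(e) = init_telemetry("") {
        eprintln!("telemetry disabled: {e}");
    }
    println!("application started");
}
```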