BeeAI Platform includes built-in observability through OpenTelemetry (OTLP), with Arize Phoenix available out-of-the-box for immediate use. Additional backends like Langfuse can be configured for advanced analytics.

Quickstart

Enable Phoenix Observability

1. Start the platform with Phoenix enabled:

   beeai platform start --set phoenix.enabled=true
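If you prefer a configuration file over `--set` flags, the same toggle can be sketched as a values entry. This assumes the key path in the file mirrors the `--set` syntax above, and uses the `-f` flag shown in the Langfuse steps below:

```yaml
# config.yaml — sketch; assumes the key path mirrors the --set flag
# (phoenix.enabled). Start with: beeai platform start -f config.yaml
phoenix:
  enabled: true
```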
2. Access the Phoenix UI

   Navigate to http://localhost:6006 in your browser to view traces, spans, and LLM interactions.

Advanced

Configure Langfuse Integration

Langfuse is an LLM observability platform that can be integrated with the BeeAI Platform through OpenTelemetry.
1. Get Langfuse credentials

   1. Sign up at cloud.langfuse.com
   2. Create a project and generate API keys
   3. Encode your keys: echo -n "public_key:secret_key" | base64
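The same Basic auth string can be built in Python, which avoids shell quoting pitfalls (a stray newline in the encoded value will break the header). A minimal sketch with placeholder keys, not real credentials:

```python
import base64

# Placeholder Langfuse keys — substitute your project's actual keys.
public_key = "pk-lf-1234"
secret_key = "sk-lf-5678"

# Langfuse expects HTTP Basic auth: base64("public_key:secret_key").
auth = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
header = f"Basic {auth}"
print(header)
```

Paste the printed value into the `Authorization` field of the collector config in the next step.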
2. Create a configuration file (config.yaml):

collector:
  exporters:
    otlphttp/langfuse:
      endpoint: "https://cloud.langfuse.com/api/public/otel" # EU data region
      headers:
        Authorization: "Basic <auth-string>"
  pipelines:
    traces:
      receivers: [ otlp ]
      processors: [ memory_limiter, filter/phoenix, batch ]
      exporters: [ otlphttp/langfuse ]
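An OpenTelemetry Collector pipeline can list several exporters, so traces can fan out to Langfuse and the built-in Phoenix backend at the same time. The sketch below assumes a Phoenix exporter named `otlp/phoenix` — that name is a placeholder, so check the platform's default collector configuration for the actual one:

```yaml
collector:
  pipelines:
    traces:
      receivers: [ otlp ]
      processors: [ memory_limiter, batch ]
      # "otlp/phoenix" is an assumed name for the built-in Phoenix exporter.
      exporters: [ otlphttp/langfuse, otlp/phoenix ]
```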
3. Start the platform with the configuration:

   beeai platform start -f config.yaml
4. Access the Langfuse UI

   Check your Langfuse project dashboard for incoming traces and metrics.

Additional Resources
