Quickstart
Enable Phoenix Observability
1. Start platform with Phoenix enabled
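A minimal sketch of the start command, assuming the BeeAI CLI accepts Helm-style `--set` overrides and exposes Phoenix behind a `phoenix.enabled` value; confirm the exact option names with `beeai platform start --help`.

```sh
# Assumed invocation: enable the bundled Phoenix instance at startup.
# Both the --set flag and the phoenix.enabled key are assumptions; verify against the CLI help.
beeai platform start --set phoenix.enabled=true
```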
2. Access Phoenix UI
Navigate to http://localhost:6006 in your browser to view traces, spans, and LLM interactions.
Advanced
Configure Langfuse Integration
Langfuse is an LLM observability platform that can be integrated with the BeeAI Platform through OpenTelemetry.
1. Get Langfuse credentials
- Sign up at cloud.langfuse.com
- Create a project and generate API keys
- Encode your keys (the resulting string is used as the Basic auth value in the configuration below):
echo -n "public_key:secret_key" | base64
2. Create a configuration file (config.yaml):
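The exact schema depends on the platform's Helm values, so treat the snippet below as an illustrative sketch rather than the definitive format. It wires a standard OpenTelemetry Collector `otlphttp` exporter to Langfuse Cloud's OTLP endpoint (EU region shown; US projects use `https://us.cloud.langfuse.com/api/public/otel`) and authenticates with the Base64 string produced in the previous step. The top-level `otelCollector` key, and the assumption that an `otlp` receiver and `batch` processor already exist in the platform's collector config, are mine.

```yaml
# config.yaml -- illustrative sketch; the otelCollector wrapper is an assumed key,
# while the exporter/pipeline entries follow standard OpenTelemetry Collector syntax.
otelCollector:
  exporters:
    otlphttp/langfuse:
      endpoint: "https://cloud.langfuse.com/api/public/otel"  # Langfuse Cloud (EU) OTLP endpoint
      headers:
        Authorization: "Basic <output of the base64 command above>"
  service:
    pipelines:
      traces/langfuse:
        receivers: [otlp]       # assumes the platform already defines an otlp receiver
        processors: [batch]     # and a batch processor
        exporters: [otlphttp/langfuse]
```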
3. Start the platform with the configuration
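How the file is handed to the CLI is another assumption; a plausible invocation, to be checked against `beeai platform start --help`, might look like this:

```sh
# Hypothetical flag -- confirm how your CLI version accepts a custom values/config file.
beeai platform start --values config.yaml
```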
4. Access Langfuse UI
Check your Langfuse project dashboard for incoming traces and metrics.
Additional Resources
- OpenTelemetry Docs: https://opentelemetry.io/docs/
- Langfuse Docs: https://langfuse.com/docs
- Phoenix Docs: https://docs.arize.com/phoenix
- Prometheus Docs: https://prometheus.io/docs/
- Grafana Docs: https://grafana.com/docs/