Observe Agents
Monitor, debug, and instrument your BeeAI agents
BeeAI provides observability tools to monitor and debug your agents through logging, telemetry, and integration with external monitoring systems.
View Agent Logs
Stream real-time logs from any running agent:
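For example, to stream logs from an agent registered as `chat` (the name here is illustrative; use any agent shown by `beeai list`):

```bash
beeai logs chat
```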
What you’ll see:
- Agent startup and initialization
- Request processing steps
- Error messages and stack traces
- Container lifecycle events
Logs are only available for managed (containerized) agents that are currently running.
Telemetry Collection
BeeAI includes OpenTelemetry instrumentation to collect traces and metrics. Telemetry data helps with performance monitoring, error tracking, usage analytics, and debugging agent interactions.
Default Configuration
By default, BeeAI sends telemetry to:
- Local Phoenix instance (if running) for trace visualization
- BeeAI telemetry service for anonymized platform improvement
The telemetry includes:
- Platform version and runtime details
- Agent execution traces
- Performance metrics
- Anonymized usage statistics
No sensitive data like prompts, responses, or personal information is collected.
Configure Telemetry Sharing
To disable external telemetry sharing:
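A minimal sketch, assuming sharing is exposed as a `telemetry.sharing` platform value (the exact key may differ in your version):

```bash
beeai platform start --set telemetry.sharing=false
```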
To re-enable telemetry sharing:
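And, under the same assumption, to turn it back on:

```bash
beeai platform start --set telemetry.sharing=true
```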
This controls whether anonymized usage data is sent to help improve BeeAI.
Integration with Phoenix
Arize Phoenix provides visualization for OpenTelemetry traces from your agents.
Install Arize Phoenix
Install and start Phoenix using the `beeai platform start` command:
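Phoenix is toggled through a platform setting; the `phoenix.enabled` key below is the assumed default (check `beeai platform start --help` if it differs in your version):

```bash
beeai platform start --set phoenix.enabled=true
```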
You can run this command even if the platform is already running; no existing data will be lost.
Check if Phoenix is running
Spinning up Phoenix can take a while, even after `beeai platform start` reports success.
Go to http://localhost:6006 and check if it's running. If not, wait a few minutes or check your internet connection.
Run Agent with Phoenix Configuration
Execute the following command to run an example chat agent:
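A typical invocation, assuming the example `chat` agent is installed (substitute any agent name from `beeai list`):

```bash
beeai run chat "Hi! What can you do?"
```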
View Traces in Phoenix
Go to http://localhost:6006 in your browser and open the default project.
Want richer trace detail? Use the OpenInference standard for custom instrumentation.
Instrumenting with OpenInference
To enable full traceability of your BeeAI agents, you can instrument them using OpenInference. This guide walks you through the installation and setup process; OpenInference provides instrumentation for both the Python and JavaScript versions of the BeeAI Framework, and the sketches below use Python.
This guide only covers frameworks officially supported by the OpenInference ecosystem. If your framework isn’t listed, instrumentation guidance is not currently provided.
Before you begin, make sure the Phoenix server is running.
Install required packages
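A typical install for the Python framework; the package names below assume the current OpenInference and Phoenix distributions:

```bash
pip install beeai-framework openinference-instrumentation-beeai arize-phoenix-otel
```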
Instrument the BeeAI Framework
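A minimal sketch, assuming Phoenix is reachable at its default local endpoint and that the OpenInference package exposes a `BeeAIInstrumentor` entry point:

```python
from openinference.instrumentation.beeai import BeeAIInstrumentor
from phoenix.otel import register

# Register an OTLP tracer provider pointed at the local Phoenix instance;
# "beeai" is an illustrative project name.
tracer_provider = register(project_name="beeai")

# Patch the BeeAI Framework so every agent run emits OpenInference spans.
BeeAIInstrumentor().instrument(tracer_provider=tracer_provider)
```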
Use BeeAI as Usual
You can now run your BeeAI agents as normal. Telemetry data will be captured and exported automatically.
For advanced usage (e.g., running instrumentation outside the BeeAI lifecycle), see: OpenInference Instrumentation for BeeAI