View Agent Logs
Stream real-time logs from any running agent:

- Agent startup and initialization
- Request processing steps
- Error messages and stack traces
- Container lifecycle events
Logs are only available for managed (containerized) agents that are currently
running.
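As a sketch, streaming logs from a managed agent might look like the following. The `beeai logs` subcommand and the agent name `chat` are assumptions, not confirmed CLI syntax; check `beeai --help` for the exact command on your version:

```shell
# Hypothetical: stream logs for a managed agent named "chat"
# (subcommand name is an assumption -- verify with `beeai --help`)
beeai logs chat
```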
Telemetry Collection
BeeAI includes OpenTelemetry instrumentation to collect traces and metrics. Telemetry data helps with performance monitoring, error tracking, usage analytics, and debugging agent interactions.

Default Configuration
By default, BeeAI sends telemetry to:

- Local Phoenix instance (if running) for trace visualization
- Platform version and runtime details
- Agent execution traces
Integration with Phoenix
Arize Phoenix provides visualization for OpenTelemetry traces from your agents.

Important License Notice: Phoenix is disabled by default in BeeAI. When
you enable Phoenix, be aware that Arize Phoenix is licensed under the Elastic
License v2 (ELv2), which has specific terms regarding commercial use and
distribution. By enabling Phoenix, you acknowledge that you are responsible
for ensuring compliance with the ELv2 license terms for your specific use
case. Please review the Phoenix
license before
enabling this feature in production environments.
1. Install Arize Phoenix

Install and start Phoenix using the `beeai platform start` command:

```shell
beeai platform start
```

You can run this even if your platform is already running, without losing data.

2. Check if Phoenix is running
Spinning up Phoenix can take a while, even after `platform start` reports
success. Go to http://localhost:6006 and check whether it is running. If it is
not, wait a few minutes or check your internet connection.

3. Run Agent with Phoenix Configuration
Execute the following command to run an example chat agent:
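The exact command is not reproduced here. As a rough sketch, pointing an agent run at Phoenix's local collector might look like the following; the `beeai run` subcommand, the agent name `chat`, and the OTLP endpoint are all assumptions, so verify them against your CLI help and Phoenix setup:

```shell
# Assumed: Phoenix accepts OTLP traces on localhost:6006
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:6006"
# Hypothetical subcommand and agent name -- verify with `beeai --help`
beeai run chat "Hello!"
```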
4. View Traces in Phoenix
Go to http://localhost:6006 in your browser and open the default project.
Want richer trace detail? Use the
OpenInference standard for
custom instrumentation.
Instrumenting with OpenInference
To enable full traceability of your BeeAI agents, you can instrument them using OpenInference. This guide walks you through the installation and setup process for both the Python and JavaScript frameworks.

This guide only covers frameworks officially supported by the OpenInference
ecosystem. If your framework isn't listed, instrumentation guidance is not
currently provided.
- Python Instrumentation
- JavaScript Instrumentation
1. Install required packages
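For the Python framework, the OpenInference project publishes a BeeAI instrumentation package. The package names below are my understanding of the current names, not taken from this page; verify them on PyPI before installing:

```shell
# Assumed package names -- check PyPI for the current ones
pip install openinference-instrumentation-beeai beeai-framework
```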
2. Instrument the BeeAI Framework
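A minimal setup sketch for the Python framework, assuming the `openinference-instrumentation-beeai` package and its `BeeAIInstrumentor` class (names as documented by the OpenInference project; verify against that package's README). Run this once at startup, before creating agents:

```python
# Sketch: enable OpenInference auto-instrumentation for the BeeAI framework.
# The import path and class name are assumptions based on the
# openinference-instrumentation-beeai package -- verify before use.
from openinference.instrumentation.beeai import BeeAIInstrumentor

# Patches the framework so agent runs emit OpenTelemetry spans
BeeAIInstrumentor().instrument()
```

After this call, spans are exported through whatever OpenTelemetry tracer provider and exporter you have configured (for example, one pointed at your local Phoenix instance).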
3. Use BeeAI as Usual
You can now run your BeeAI agents as normal. Telemetry data will be captured and exported automatically.
For advanced usage (e.g., running instrumentation outside the BeeAI
lifecycle), see: OpenInference Instrumentation for
BeeAI