BeeAI provides observability tools to monitor and debug your agents through logging, telemetry, and integration with external monitoring systems.

View Agent Logs

Stream real-time logs from any running agent:

beeai logs <agent-name>

What you’ll see:

  • Agent startup and initialization
  • Request processing steps
  • Error messages and stack traces
  • Container lifecycle events

Logs are only available for managed (containerized) agents that are currently running.

Telemetry Collection

BeeAI includes OpenTelemetry instrumentation to collect traces and metrics. Telemetry data helps with performance monitoring, error tracking, usage analytics, and debugging agent interactions.

Default Configuration

By default, BeeAI sends telemetry to:

  • Local Phoenix instance (if running) for trace visualization
  • BeeAI telemetry service for anonymized platform improvement

The telemetry includes:

  • Platform version and runtime details
  • Agent execution traces
  • Performance metrics
  • Anonymized usage statistics

No sensitive data like prompts, responses, or personal information is collected.

Configure Telemetry Sharing

To disable external telemetry sharing:

beeai platform start --no-telemetry-sharing

To re-enable telemetry sharing:

# Omit --no-telemetry-sharing
beeai platform start

This controls whether anonymized usage data is sent to help improve BeeAI.

Integration with Phoenix

Arize Phoenix provides visualization for OpenTelemetry traces from your agents.

1

Install Arize Phoenix

Install and start Phoenix using the beeai platform start command:

beeai platform start --set phoenix.enabled=true

You can run this even if the platform is already running; no data will be lost.

2

Check if Phoenix is running

Phoenix can take a while to spin up, even after beeai platform start reports success. Open http://localhost:6006 to check whether it's running. If it isn't, wait a few minutes or check your internet connection.

3

Run Agent with Phoenix Configuration

Execute the following command to run an example chat agent:

beeai run chat "Hello"

4

View Traces in Phoenix

Go to http://localhost:6006 in your browser and open the default project.

Want richer trace detail? Use the OpenInference standard for custom instrumentation.

Instrumenting with OpenInference

To enable full traceability of your BeeAI agents, you can instrument them using OpenInference. This guide walks you through the installation and setup process for both Python and JavaScript frameworks.

This guide only covers frameworks officially supported by the OpenInference ecosystem. If your framework isn’t listed, instrumentation guidance is not currently provided.

Before you begin, make sure the Phoenix server is running.

1

Install required packages

pip install beeai-framework openinference-instrumentation-beeai

2

Instrument the BeeAI Framework

from openinference.instrumentation.beeai import BeeAIInstrumentor

BeeAIInstrumentor().instrument()

3

Use BeeAI as Usual

You can now run your BeeAI agents as normal. Telemetry data will be captured and exported automatically.

For advanced usage (e.g., running instrumentation outside the BeeAI lifecycle), see: OpenInference Instrumentation for BeeAI
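As a hedged sketch of that advanced usage, the instrumentor can be wired to a manually configured tracer provider that exports to a local Phoenix instance. The endpoint below is Phoenix's default OTLP-over-HTTP address; adjust it if your setup exposes a different collector, and treat the tracer_provider argument as an assumption to verify against the OpenInference docs.

```python
# Hedged sketch: manual OpenTelemetry wiring for BeeAIInstrumentor.
# Assumes a local Phoenix instance listening on its default
# OTLP/HTTP endpoint; adjust the endpoint for your environment.
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.beeai import BeeAIInstrumentor

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)

# Passing the provider explicitly keeps agent traces out of the
# global tracer and under your own export pipeline.
BeeAIInstrumentor().instrument(tracer_provider=provider)
```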