The protocol, transport, and APIs are all undergoing fundamental breaking changes. Consider this version experimental, and join the Alpha discussions to help shape it.

Architecture

BeeAI is designed for a local-first experience: all agents run on your own machine, giving you full control of your data and seamless integration with local inference engines such as Ollama.

The platform is made up of several core components, described below:

Agent providers

An agent provider exposes one or more agents via the Agent Communication Protocol. Providers typically define a manifest that instructs BeeAI on how to run them and may include additional features like tools.
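
As an illustration, a minimal provider might look roughly like the sketch below. The SDK surface is still changing during the Alpha, so treat the imports, the Server class, the agent decorator, and the echo agent as illustrative assumptions rather than a stable API.

```python
# Illustrative sketch only: the SDK surface is still evolving during the Alpha,
# so the imports and names below (Server, Context, Message, the echo agent) are
# assumptions, not a stable API.
from collections.abc import AsyncGenerator

from acp_sdk.models import Message
from acp_sdk.server import Context, Server

server = Server()


@server.agent()  # registers "echo" as an agent exposed by this provider
async def echo(
    input: list[Message], context: Context
) -> AsyncGenerator[Message, None]:
    """Trivial agent that streams each incoming message back to the caller."""
    for message in input:
        yield message


# Start the provider; BeeAI reaches the exposed agents over ACP.
server.run()
```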

For details on registering providers, see providers.

BeeAI includes several built-in providers; see our agent catalog.

BeeAI Server

Explore the beeai-server source code.

The BeeAI server manages providers, spawning and destroying provider processes as needed, and exposes a unified endpoint that routes requests and notifications between providers and clients.

It also manages provider configurations and environment variables, offers a simple REST API for agent communication, and forwards telemetry data to the Arize Phoenix OTEL backend.
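
For example, listing the agents registered with a local server might look like the following sketch. The route and response shape are assumptions about the Alpha API; check the beeai-server documentation for the paths your version actually exposes.

```python
# Illustrative sketch: the route and response shape are assumptions for the
# Alpha API; check the beeai-server OpenAPI documentation for your version.
import httpx

BEEAI_SERVER = "http://localhost:8333"

response = httpx.get(f"{BEEAI_SERVER}/api/v1/agents")  # hypothetical "list agents" route
response.raise_for_status()

for agent in response.json().get("agents", []):
    print(agent.get("name"))
```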

BeeAI CLI and UI

Explore the beeai-cli and beeai-ui source code.

These components offer interfaces for interacting with agents.

Agents can have custom input/output interfaces, but the platform provides standard UIs like chat and hands-off.

Use beeai run <name> or the graphical chat at localhost:8333 to interact with agents.

Python and TypeScript clients

Explore the acp-python-sdk and acp-typescript-sdk source code.

You can use the ACP SDK to interact with agents programmatically and to compose multiple agents into a workflow for your application.
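
A hedged sketch of what that might look like with the Python SDK is shown below; the client class, method names, and endpoint URL are assumptions based on the evolving SDK, so check the acp-python-sdk README for the current interface.

```python
# Illustrative sketch: the client class, method names, and URL are assumptions
# based on the evolving ACP SDK; consult the acp-python-sdk README for the
# current API.
import asyncio

from acp_sdk.client import Client
from acp_sdk.models import Message, MessagePart


async def main() -> None:
    # Point the client at the BeeAI server's ACP endpoint (URL is illustrative).
    async with Client(base_url="http://localhost:8333/api/v1/acp") as client:
        run = await client.run_sync(
            agent="chat",  # hypothetical agent name from the catalog
            input=[Message(parts=[MessagePart(content="Hello from my app")])],
        )
        print(run.output)


asyncio.run(main())
```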

Arize Phoenix

BeeAI integrates with Arize Phoenix, an open-source agent tracing tool. For more information, see agent observability.
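
If you want to emit your own traces to Phoenix directly (for example, from a custom agent), a standard OpenTelemetry exporter pointed at Phoenix's local OTLP endpoint is enough. The sketch below assumes Phoenix's default local setup on port 6006; adjust the endpoint for your deployment.

```python
# Minimal OpenTelemetry setup that exports traces to a locally running Phoenix
# instance. The endpoint assumes Phoenix's default local OTLP/HTTP collector;
# adjust it for your deployment.
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

provider = TracerProvider(resource=Resource.create({"service.name": "my-agent"}))
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-agent")
with tracer.start_as_current_span("agent.run"):
    # ... agent logic here; the resulting spans appear in the Phoenix UI ...
    pass
```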