🚀 IMPORTANT UPDATE

ACP is now part of A2A under the Linux Foundation! 👉 Learn more | 🛠️ Migration Guide

BeeAI is an open-source platform that makes it easy to discover, run, and share AI agents across frameworks. Built on the Agent2Agent Protocol (A2A) and hosted by the Linux Foundation, BeeAI bridges the gap between different agent ecosystems.

The Problem BeeAI Solves

Teams building AI agents often run into three main challenges:
  • Framework Fragmentation: Multiple frameworks lead to silos and duplicated work.
  • Deployment Complexity: Each agent needs its own setup, which slows scaling.
  • Discovery Challenges: Finding and reusing agents is hard without a central hub.
BeeAI provides a standardized platform to discover, run, and share agents from any framework.

Key Features

  • Instant web interface: Spin up a shareable front-end in minutes and focus on your agent, not UI frameworks.
  • Complete infrastructure: Deploy your agent container instantly, with database, storage, scaling, monitoring, plus MCP gateway and RAG services.
  • Framework-agnostic: Combine agents from BeeAI, LangChain, CrewAI, and more using A2A.
  • Multi-provider playground: Test agents across OpenAI, Anthropic, Gemini, IBM watsonx, Ollama, and others to compare performance and cost instantly.

How BeeAI Works

Run BeeAI locally, and your agents connect to the platform for discovery, testing, and sharing through a simple web interface. You get:
  • Access to ready-to-use agents from our official catalog
  • Easy testing across multiple LLM providers
  • Framework-agnostic development via the BeeAI SDK powered by A2A
  • Ready-made web interfaces for all your agents
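The discovery and framework-agnostic points above both rest on A2A's agent card: each agent publishes a small JSON document describing its name, skills, and endpoint, so a platform can find and call it without knowing which framework built it. A minimal sketch, assuming the public A2A card fields (the agent name, URL, and skill here are hypothetical, not from the BeeAI catalog):

```python
import json

# Hypothetical A2A-style agent card: the JSON document an agent serves
# (conventionally at /.well-known/agent.json) so a platform like BeeAI
# can discover it regardless of the framework it was built with.
agent_card = {
    "name": "summarizer",                      # hypothetical agent name
    "description": "Summarizes long documents.",
    "url": "http://localhost:9999",            # where the agent listens
    "version": "1.0.0",
    "capabilities": {"streaming": True},
    "defaultInputModes": ["text/plain"],
    "defaultOutputModes": ["text/plain"],
    "skills": [
        {
            "id": "summarize",
            "name": "Summarize text",
            "description": "Condense input text into a short summary.",
        }
    ],
}

# A client (or an agent catalog) reads the card to decide whether this
# agent offers a skill it needs, without any framework-specific code.
card_json = json.dumps(agent_card, indent=2)
skill_ids = [skill["id"] for skill in agent_card["skills"]]
print(skill_ids)
```

Because every agent, whatever its framework, advertises itself through the same card shape, combining BeeAI, LangChain, and CrewAI agents reduces to reading cards and calling endpoints.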
Want to deploy BeeAI for your team? BeeAI can also run as a hosted platform with full Kubernetes infrastructure. Learn more about platform deployment.

Get Started

Join the Community