Deployment
Overview
Deploy the BeeAI Platform for your team with centralized agent management and shared resources
BeeAI Platform can be deployed as a centralized instance for your team, providing a shared environment where developers can publish agents and end users can discover and run them through a unified interface.
Internal Use Only: BeeAI is not secure for public internet exposure. It includes an unprotected LLM API endpoint (/chat/completions) that would let anyone consume your LLM credits. Deploy it only on internal networks behind a firewall.
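One way to enforce the internal-only requirement inside the cluster itself is a Kubernetes NetworkPolicy that limits which pods may reach the platform. The sketch below is illustrative only: the namespace, label selectors, and port are assumptions, not values taken from the BeeAI Helm chart, so adapt them to your deployment.

```yaml
# Hypothetical sketch -- namespace, labels, and port 8333 are assumptions.
# Allows ingress to the BeeAI platform only from pods labeled "role: beeai-client"
# in the same namespace; all other traffic to the platform pods is denied.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: beeai-internal-only
  namespace: beeai
spec:
  podSelector:
    matchLabels:
      app: beeai-platform
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              role: beeai-client
      ports:
        - protocol: TCP
          port: 8333
```

Note that NetworkPolicy only takes effect if your cluster runs a CNI plugin that enforces it (e.g. Calico or Cilium); it is a complement to, not a substitute for, a network-level firewall.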
Why Deploy BeeAI?
Moving from individual BeeAI instances to a team deployment provides several key advantages:
- Centralized Catalog: All team agents in one searchable location
- Shared Configuration: Manage LLM providers and API keys centrally
- Agent Sharing: Developers can easily share and discover agents
- Resource Management: Automatic scaling and resource allocation
Deployment
BeeAI deploys on Kubernetes using Helm charts. See the Kubernetes guide for detailed instructions.
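A typical Helm installation follows the usual install/verify pattern shown below. The chart reference, release name, and values file are placeholders rather than the actual chart coordinates; take the exact command from the Kubernetes guide.

```shell
# Hypothetical sketch -- the chart reference and values file are placeholders.
helm install beeai <beeai-chart-reference> \
  --namespace beeai --create-namespace \
  --values my-values.yaml

# Verify the release and confirm the pods come up
helm status beeai --namespace beeai
kubectl get pods --namespace beeai
```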
Requirements
- Kubernetes 1.24+ with admin access
- Helm 3.8+
- Persistent storage (20GB+ for PostgreSQL)
- LLM provider API access (OpenAI, Anthropic, etc.)
Quick Start
1. Follow the Kubernetes guide
2. Configure your LLM provider using environment variables
3. Enable authentication for shared access
4. Add agents from the public registry or your own sources
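Configuring the LLM provider via environment variables typically means putting the credentials into your Helm values, with the API key sourced from a Kubernetes Secret rather than committed in plain text. The fragment below is a hedged sketch: the variable names (LLM_API_BASE, LLM_MODEL, LLM_API_KEY) and the `env` values layout are assumptions, not confirmed BeeAI settings, so check the platform's configuration reference for the exact names.

```yaml
# Hypothetical values.yaml fragment -- key and variable names are assumptions.
env:
  - name: LLM_API_BASE
    value: "https://api.openai.com/v1"
  - name: LLM_MODEL
    value: "gpt-4o"
  - name: LLM_API_KEY
    valueFrom:
      secretKeyRef:
        name: llm-credentials   # created separately, e.g. via kubectl create secret
        key: api-key
```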