MCP Servers
OpenChoreo provides Model Context Protocol (MCP) servers that enable AI assistants to interact with your OpenChoreo platform.
Overview
OpenChoreo provides two MCP servers:
- Control Plane MCP Server - Provides tools for managing OpenChoreo resources (namespaces, projects, components, builds, deployments, infrastructure)
- Observability Plane MCP Server - Provides tools for accessing observability data (logs, metrics, traces)
Each MCP server is independently accessible and requires separate configuration.
Using Both MCP Servers Together
For the best experience, configure both the Control Plane and Observability Plane MCP servers together. Many observability tools use resource UUIDs (for projects, components, deployments, etc.) as parameters. These UUIDs can be obtained from the Control Plane MCP server, making it much easier to query observability data.
Example workflow:
- Use Control Plane MCP to list components and get their UUIDs
- Use those UUIDs with Observability Plane MCP to fetch logs, metrics, or traces for specific components
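The sketch below illustrates this two-step pattern as raw requests over MCP's Streamable HTTP transport. The tool names (list_components, get_component_logs), their arguments, and the token are hypothetical placeholders; the actual tools can be discovered at runtime with the standard tools/list method, and depending on the server's session handling an initialize handshake may be required before these calls.
# Hypothetical sketch of the workflow. Tool names, arguments, and the token are placeholders;
# the URLs are the single-cluster k3d defaults described below.
CONTROL_PLANE_MCP_URL="http://api.openchoreo.localhost:8080/mcp"
OBSERVER_MCP_URL="http://observer.openchoreo.localhost:11080/mcp"
TOKEN="<your-token>"   # obtained as described in the AI Configuration Guide
# Step 1: list components via the Control Plane MCP server and note the UUIDs in the response.
curl -s "$CONTROL_PLANE_MCP_URL" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"list_components","arguments":{"project":"my-project"}}}'
# Step 2: pass a component UUID from step 1 to the Observability Plane MCP server.
curl -s "$OBSERVER_MCP_URL" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"get_component_logs","arguments":{"componentId":"<component-uuid>"}}}'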
Finding MCP Server URLs
MCP server URLs follow a simple pattern based on the respective service's hostname:
- Control Plane MCP Server: <control-plane-api-hostname>/mcp
- Observability Plane MCP Server: <observer-service-hostname>/mcp
Determining Your Hostnames
The hostnames are constructed from the global.baseDomain value configured during Helm installation:
- Control Plane API: api.<global.baseDomain>
- Observability Observer: observer.<global.baseDomain>
To find your configured base domain:
# Check the base domain from Helm release values
helm get values openchoreo-control-plane -n openchoreo-control-plane | grep baseDomain
# Or derive it from the API hostname (OpenChoreo uses Gateway API HTTPRoutes)
kubectl get httproute openchoreo-api -n openchoreo-control-plane -o jsonpath='{.spec.hostnames[0]}' | sed 's/^api\.//'
Once you have your baseDomain, construct your MCP server URLs:
- Control Plane: http://api.<baseDomain>:<port>/mcp or https://api.<baseDomain>/mcp
- Observability: http://observer.<baseDomain>:<port>/mcp or https://observer.<baseDomain>/mcp
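As a convenience, the two steps can be combined: derive the base domain once, then print both URLs. This is a sketch that assumes the HTTPRoute name and namespace shown above and the plain-HTTP ports used by the single-cluster k3d deployment; adjust the scheme and ports to match your installation.
# Sketch: derive the base domain and print both MCP server URLs.
# Assumes the single-cluster k3d ports (8080/11080); adjust for your installation.
BASE_DOMAIN=$(kubectl get httproute openchoreo-api -n openchoreo-control-plane \
  -o jsonpath='{.spec.hostnames[0]}' | sed 's/^api\.//')
echo "Control Plane MCP Server:       http://api.${BASE_DOMAIN}:8080/mcp"
echo "Observability Plane MCP Server: http://observer.${BASE_DOMAIN}:11080/mcp"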
Example: Self-Hosted Kubernetes Deployment
If you followed the k3d Self-Hosted Kubernetes Deployment guide, your MCP server URLs will be:
Single Cluster Installation:
- Control Plane MCP Server: http://api.openchoreo.localhost:8080/mcp
- Observability Plane MCP Server: http://observer.openchoreo.localhost:11080/mcp
Multi-Cluster Installation:
- Control Plane MCP Server: http://api.openchoreo.localhost:8080/mcp
- Observability Plane MCP Server: http://observer.observability.openchoreo.localhost:11087/mcp
The k3d Self-Hosted Kubernetes Deployment examples above use different ports and hostnames depending on the installation mode:
- Control plane services use port 8080
- Observability plane uses port 11080 (single cluster) or 11087 (multi-cluster)
- Multi-cluster deployments use distinct subdomain patterns for namespace isolation
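Before configuring an AI assistant, it can be useful to confirm the endpoints are reachable. The check below uses the single-cluster URLs as an example; the exact status code returned to an unauthenticated request depends on how authentication is configured.
# Reachability check for the single-cluster example URLs (status code depends on auth configuration).
curl -s -o /dev/null -w 'Control Plane MCP:       %{http_code}\n' http://api.openchoreo.localhost:8080/mcp
curl -s -o /dev/null -w 'Observability Plane MCP: %{http_code}\n' http://observer.openchoreo.localhost:11080/mcp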
Prerequisites
Before configuring MCP servers with your AI assistant, ensure you have:
- Running OpenChoreo instance - Complete the Self-Hosted Kubernetes Deployment
- MCP server hostnames - Use the commands above to determine your hostnames from global.baseDomain
- AI assistant installed - Have one of the supported AI assistants installed (e.g., Claude Desktop, Cline)
Connection Requirements
Both OpenChoreo MCP servers expose HTTP endpoints that AI assistants connect to. You'll need:
- Endpoint URL: The MCP server URL based on your installation (see above)
- Authentication: Bearer token in the Authorization header
- Transport: HTTP-based MCP transport (most AI assistants support this)
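Putting the three requirements together, a raw request looks roughly like the sketch below, which uses the standard MCP tools/list method to discover the tools a server exposes. The token value is a placeholder (see the AI Configuration Guide), and depending on the server's session handling an initialize handshake may be required before other calls.
# Sketch: list the tools exposed by the Control Plane MCP server (single-cluster URL shown).
# $OPENCHOREO_TOKEN is a placeholder; obtain a real token as described in the AI Configuration Guide.
curl -s http://api.openchoreo.localhost:8080/mcp \
  -H "Authorization: Bearer $OPENCHOREO_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'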
Next Steps
To configure your AI assistant to connect to OpenChoreo MCP servers, follow the AI Configuration Guide to obtain an authentication token and set up the connection.