# FinOps Agent
The FinOps Agent is an AI-powered component that analyzes Kubernetes resource usage and cloud cost data from your OpenChoreo components to generate reports with cost optimization recommendations. It integrates with Large Language Models (LLMs) to provide cost attribution, rightsizing suggestions, and actionable insights to help teams reduce cloud spend.
## Prerequisites

Before enabling the FinOps Agent, ensure the following:

- The OpenChoreo Observability Plane is installed with at least a metrics module.
- An LLM API key from OpenAI (support for other providers is coming soon).
- Metrics collection is configured for your components.
## Enabling the FinOps Agent

### Step 1: Create the FinOps Agent Secret
The FinOps Agent requires a Kubernetes Secret named `finops-agent` in the `openchoreo-observability-plane` namespace with the following keys:
| Key | Description |
|---|---|
| `LLM_API_KEY` | Your LLM provider API key |
| `OAUTH_CLIENT_SECRET` | OAuth client secret (only needed when using an external IdP) |
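Any mechanism that produces a Secret with these keys works. For illustration, a minimal manifest sketch (placeholder values; the `OAUTH_CLIENT_SECRET` key applies only when you use an external identity provider):

```yaml
# Sketch: create the Secret directly instead of syncing it from a secret store.
apiVersion: v1
kind: Secret
metadata:
  name: finops-agent
  namespace: openchoreo-observability-plane
type: Opaque
stringData:
  LLM_API_KEY: <YOUR_LLM_API_KEY>
  # Uncomment only when using an external identity provider:
  # OAUTH_CLIENT_SECRET: <YOUR_OAUTH_CLIENT_SECRET>
```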
You can create this Secret using any method you prefer. If you followed the *Try It Out on k3d locally* guide, store the API key in OpenBao and sync it with an ExternalSecret as shown below:
```shell
kubectl exec -n openbao openbao-0 -- \
  env BAO_ADDR=http://127.0.0.1:8200 BAO_TOKEN=root \
  bao kv put secret/finops-llm-api-key value="<YOUR_LLM_API_KEY>"
```
```shell
kubectl apply -f - <<EOF
apiVersion: external-secrets.io/v1
kind: ExternalSecret
metadata:
  name: finops-agent
  namespace: openchoreo-observability-plane
spec:
  refreshInterval: 1h
  secretStoreRef:
    kind: ClusterSecretStore
    name: default
  target:
    name: finops-agent
  data:
    - secretKey: LLM_API_KEY
      remoteRef:
        key: finops-llm-api-key
        property: value
    - secretKey: OAUTH_CLIENT_SECRET
      remoteRef:
        key: finops-agent-oauth-client-secret
        property: value
EOF
```
### Step 2: Upgrade the Observability Plane

Enable the FinOps Agent and configure the LLM model. The `--reuse-values` flag preserves your existing configuration.
```shell
helm upgrade --install openchoreo-observability-plane oci://ghcr.io/openchoreo/helm-charts/openchoreo-observability-plane \
  --version 0.0.0-latest-dev \
  --namespace openchoreo-observability-plane \
  --reuse-values \
  --set finOpsAgent.enabled="true" \
  --set finOpsAgent.llmName=<model-name>
```
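Equivalently, the same settings can live in a values file and be passed with `-f` (a sketch; `<model-name>` is a placeholder):

```yaml
# values.yaml -- pass with: helm upgrade ... --reuse-values -f values.yaml
finOpsAgent:
  enabled: true
  llmName: <model-name>
  # Only when the observability plane and control plane are in separate clusters:
  # openchoreoApiUrl: http://api.openchoreo.localhost:8080
```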
The FinOps Agent currently supports the OpenAI GPT model series (e.g., `gpt-5.4`, `gpt-5.2-pro`, `gpt-5`). Support for additional model providers is coming soon.
If the observability plane and control plane are in separate clusters, also set `finOpsAgent.openchoreoApiUrl` to the control plane API URL (defaults to `http://api.openchoreo.localhost:8080`).
### Step 3: Register with the control plane

Configure `finOpsAgentURL` in the `ClusterObservabilityPlane` resource so the UI knows where to reach the FinOps Agent:
```shell
kubectl patch clusterobservabilityplane default --type=merge -p '{"spec":{"finOpsAgentURL":"http://finops-agent.openchoreo.localhost:11080"}}'
```
### Step 4: Verify the installation

Check that the FinOps Agent pod is running:
```shell
kubectl get pods -n openchoreo-observability-plane -l app.kubernetes.io/component=finops-agent
```
If you are using the default identity provider (Thunder) and the default SQLite report storage, your setup is complete.
## Report Storage
By default, FinOps reports are stored in SQLite with a persistent volume — no external database required.
For production deployments that need horizontal scaling or shared storage, you can use PostgreSQL instead.
Store the PostgreSQL connection URI in OpenBao:
```shell
kubectl exec -n openbao openbao-0 -- \
  env BAO_ADDR=http://127.0.0.1:8200 BAO_TOKEN=root \
  bao kv put secret/finops-sql-backend-uri value="postgresql+asyncpg://<USER>:<PASSWORD>@<HOST>:<PORT>/<DBNAME>"
```
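The connection string follows SQLAlchemy's URL format, where the `postgresql+asyncpg` scheme selects the asyncpg driver. A quick sanity check of a URI with Python's standard library (hostname and credentials below are hypothetical examples):

```python
from urllib.parse import urlsplit

# Hypothetical connection URI in SQLAlchemy's "dialect+driver" URL format.
uri = "postgresql+asyncpg://finops:s3cret@db.internal.example:5432/finops_reports"

# urlsplit breaks the URI into scheme, credentials, host, port, and path,
# which is enough to spot a malformed URI before storing it in OpenBao.
parts = urlsplit(uri)
print(parts.scheme)            # postgresql+asyncpg
print(parts.hostname)          # db.internal.example
print(parts.port)              # 5432
print(parts.path.lstrip("/"))  # finops_reports
```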
Add the `SQL_BACKEND_URI` key to the ExternalSecret from Step 1:
```shell
kubectl patch externalsecret finops-agent -n openchoreo-observability-plane --type=json \
  -p '[{"op":"add","path":"/spec/data/-","value":{"secretKey":"SQL_BACKEND_URI","remoteRef":{"key":"finops-sql-backend-uri","property":"value"}}}]'
```
Then set the report backend in your Helm values:
```yaml
finOpsAgent:
  reportBackend: postgresql
```
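Taken together with Step 2, a sketch of the FinOps Agent values for a PostgreSQL-backed deployment (re-apply with `helm upgrade --reuse-values -f values.yaml` as before; `<model-name>` is a placeholder):

```yaml
finOpsAgent:
  enabled: true
  llmName: <model-name>
  # The connection URI is supplied via the SQL_BACKEND_URI secret key added above.
  reportBackend: postgresql
```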