Overcut ships with a set of system-managed models, but you can bring your own API keys (BYOK) to connect models from any supported provider. Custom models are configured once at the workspace level and can then be assigned to individual agents or workflows, or set as the workspace default.

Overview

Bring Your Own Key

Connect your existing API keys from OpenAI, Anthropic, Azure, AWS Bedrock, or OpenRouter.

Per-Agent Control

Assign a specific model to each agent — use a fast model for triage and a powerful model for code review.

Cascade Defaults

Set defaults at the workspace, workflow, or agent level. Agents inherit the closest configured default automatically.

Encrypted Storage

API keys are encrypted at rest and never exposed in the UI, logs, or to the LLM itself.

Supported Providers

| Provider | Config Type | Notes |
| --- | --- | --- |
| OpenAI | API key + model name | Supports custom baseURL for proxies and compatible APIs |
| Azure OpenAI | API key + endpoint + deployment | Standard Azure OpenAI Service |
| Azure OpenAI (Responses API) | API key + model + endpoint | Azure-native Responses API |
| Anthropic | API key + model name | Claude models via Anthropic's API directly |
| AWS Bedrock | Region + credentials | Claude models and others via AWS Bedrock Converse API |
| OpenRouter | API key + model name | Access 200+ models through a single API key |

Creating a Custom Model

Open LLM Models

Navigate to LLM Models in the main menu.

Create a new model

Click New Model. Enter a display name (e.g. “Claude Sonnet — Production”) and select a provider.

Configure the provider

Fill in the provider-specific configuration. Each provider requires different fields — see the Provider Configuration section below.

Save

Overcut validates the configuration against the provider schema, encrypts your API key, and stores the model. It is now available for assignment.
Once a model is created, the provider and model key cannot be changed. To switch providers, create a new model and reassign your agents.

Provider Configuration

OpenAI

```json
{
  "apiKey": "sk-...",
  "model": "gpt-4.1"
}
```

- `apiKey` (string, required): Your OpenAI API key.
- `model` (string, required): The model identifier (e.g. `gpt-4.1`, `gpt-4o`, `o3`).
- `baseURL` (string, optional): Override the API base URL for proxies or OpenAI-compatible endpoints.
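For example, a model routed through an internal OpenAI-compatible proxy might look like the following. The proxy URL is hypothetical; substitute the endpoint of your own proxy or compatible service:

```json
{
  "apiKey": "sk-...",
  "model": "gpt-4.1",
  "baseURL": "https://llm-proxy.internal.example.com/v1"
}
```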

Azure OpenAI

```json
{
  "apiKey": "your-azure-key",
  "baseURL": "https://your-resource.openai.azure.com",
  "azureOpenAIApiDeploymentName": "gpt-4-deployment",
  "apiVersion": "2024-02-15-preview"
}
```

- `apiKey` (string, required): Your Azure OpenAI resource key.
- `baseURL` (string, required): The Azure OpenAI resource endpoint URL.
- `azureOpenAIApiDeploymentName` (string, required): The name of the deployed model in your Azure resource.
- `apiVersion` (string, required): The Azure OpenAI API version (e.g. `2024-02-15-preview`).

Anthropic

```json
{
  "anthropicApiKey": "sk-ant-...",
  "model": "claude-sonnet-4-20250514"
}
```

- `anthropicApiKey` (string, required): Your Anthropic API key.
- `model` (string, required): The Claude model identifier (e.g. `claude-sonnet-4-20250514`, `claude-opus-4-20250514`).

AWS Bedrock

```json
{
  "region": "us-east-1",
  "credentials": {
    "accessKeyId": "AKIA...",
    "secretAccessKey": "..."
  }
}
```

- `region` (string, required): The AWS region where your Bedrock models are available.
- `credentials` (object, optional): AWS credentials with `accessKeyId` and `secretAccessKey`. If omitted, Overcut uses the default credential chain (IAM role, environment variables).
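If the environment Overcut runs in already provides AWS credentials (for example via an IAM role), the configuration can be region-only and the default credential chain supplies the rest:

```json
{
  "region": "us-east-1"
}
```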

OpenRouter

```json
{
  "apiKey": "sk-or-...",
  "model": "anthropic/claude-sonnet-4"
}
```

- `apiKey` (string, required): Your OpenRouter API key.
- `model` (string, required): The OpenRouter model identifier, in `provider/model` format (e.g. `anthropic/claude-sonnet-4`, `openai/gpt-4.1`).

Assigning Models

Per-Agent

Each agent has a model selector in its settings. Choose a specific model or leave it on Default Model to inherit from the workflow or workspace.

Open the agent

Navigate to Agent Roles and select an agent.

Select a model

Use the Model dropdown to pick a custom model or Default Model.

Per-Workflow

Set a workflow-level default in the Workflow Builder so all agents in that workflow inherit the same model unless they have their own override.

Open Workflow Settings

Click the canvas background to open Workflow Settings.

Set Default LLM Model

Select a model from the Default LLM Model dropdown.

Per-Workspace

Set a workspace-wide default in Settings → General. All workflows and agents that don’t specify their own model will use this default.

Model Cascade

When an agent executes, Overcut resolves its model by walking down this cascade and using the first configured value:
  1. Coordinator override — `coordinatorModelKey` on an `agent.session` step (applies only to the coordinator, never to child agents)
  2. Agent model — the agent’s own model selection
  3. Workflow default — the Default LLM Model set in Workflow Settings
  4. Workspace default — configured in Settings → General
  5. System default — Overcut’s managed default model
For a step-by-step walkthrough of configuring defaults at each level, see the Default Model Configuration guide.
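The cascade amounts to a first-match lookup from most specific to least specific. The sketch below is purely illustrative — the types, field names, and function are hypothetical, not Overcut's actual API:

```typescript
// Hypothetical sketch of model cascade resolution. All names here are
// illustrative; Overcut's real internals are not documented.
type ModelKey = string;

interface ResolutionContext {
  coordinatorModelKey?: ModelKey; // 1. agent.session step override (coordinator only)
  agentModel?: ModelKey;          // 2. the agent's own model selection
  workflowDefault?: ModelKey;     // 3. Workflow Settings → Default LLM Model
  workspaceDefault?: ModelKey;    // 4. Settings → General
}

const SYSTEM_DEFAULT: ModelKey = "system-managed-default"; // 5. Overcut's managed model

function resolveModel(ctx: ResolutionContext, isCoordinator: boolean): ModelKey {
  // The coordinator override never applies to child agents.
  if (isCoordinator && ctx.coordinatorModelKey) {
    return ctx.coordinatorModelKey;
  }
  // Walk down the cascade and use the first configured value.
  return (
    ctx.agentModel ??
    ctx.workflowDefault ??
    ctx.workspaceDefault ??
    SYSTEM_DEFAULT
  );
}
```

A child agent in a workflow with a workflow-level default but no agent-level selection would resolve to the workflow default, while an unconfigured agent falls all the way through to the system default.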

Overcut Agent vs. Claude Code

Overcut supports two execution engines. The engine is selected per workflow step in the Workflow Builder.

Overcut Agent (default)

The default engine. Runs on Overcut’s agent architecture with full model flexibility.
  • Model selection: Follows the cascade described above — you control exactly which model each agent uses
  • Providers: Any supported provider (OpenAI, Azure, Anthropic, AWS Bedrock, OpenRouter)
  • API keys: Uses workspace-configured custom models, or Overcut’s system models if no custom model is set
  • Billing: Through Overcut when using system models, or directly against your provider when using custom models

Claude Code (Claude Agent SDK)

An alternative engine that runs Anthropic’s Claude Code Agent SDK directly.
  • Model selection: Automatic — the Claude SDK chooses the optimal model. No model configuration is available
  • Providers: Anthropic only
  • API keys: Requires your own Anthropic API key, configured in Settings → Secrets (the claudeApiKey field)
  • Billing: Directly against your Anthropic account

When to Use Each

| Scenario | Recommended Engine |
| --- | --- |
| You need to use a specific model or provider (e.g. GPT-4.1, Azure, Bedrock) | Overcut Agent |
| You want model selection controlled per agent/workflow/workspace | Overcut Agent |
| You want to leverage Claude Code's native coding capabilities | Claude Code |
| You already have an Anthropic API key and want direct billing | Claude Code |
Custom model selection (BYOK) applies only to the Overcut Agent engine. The Claude Code engine always uses Claude’s automatic model selection with your Anthropic API key.
Both engines support MCP Servers, the same agent roles, and the same Overcut tools (ticketing, pull requests, git operations). The difference is in model selection and the underlying execution architecture.

Security

  • API keys are encrypted at rest and never returned through the UI or API.
  • Decrypting a model’s configuration requires the llmModel.readEncryptedConfig permission.
  • System models (managed by Overcut) cannot be edited or deleted by workspace users.

Next Steps