---
title: Models
description: Configuring an LLM provider and model.
---

opencode uses the AI SDK and Models.dev to support 75+ LLM providers, and it also supports running local models.


Providers

Most popular providers are preloaded by default. If you've added the credentials for a provider through `opencode auth login`, they'll be available when you start opencode.

Learn more about providers.


Select a model

Once you've configured your provider, you can select the model you want to use by typing:

```
/models
```

Recommended models

There are a lot of models out there, with new models coming out every week.

:::tip
Consider using one of the models we recommend.
:::

However, only a few of them are good at both generating code and tool calling.

Here are the ones we recommend with opencode:

  • Claude Sonnet 4
  • Claude Opus 4
  • Kimi K2
  • Qwen3 Coder
  • GPT 4.1
  • Gemini 2.5 Pro

Set a default

To set one of these as the default model, set the `model` key in your opencode config.

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "lmstudio/google/gemma-3n-e4b"
}
```

Here, the full model ID has the format `provider_id/model_id`.

If you've configured a custom provider, the `provider_id` is the key from the `provider` section of your config, and the `model_id` is a key from `provider.models`.
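For example, a minimal sketch of a custom provider config; the `myprovider` and `my-model` keys here are hypothetical placeholders, not real provider or model IDs:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "models": {
        "my-model": {}
      }
    }
  },
  "model": "myprovider/my-model"
}
```

With this config, the `provider_id` is `myprovider` and the `model_id` is `my-model`, giving the full ID `myprovider/my-model`.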


Loading models

When opencode starts up, it checks for the following:

  1. The model set in the opencode config.

    ```json
    {
      "$schema": "https://opencode.ai/config.json",
      "model": "anthropic/claude-sonnet-4-20250514"
    }
    ```

    The format here is `provider/model`.

  2. The last used model.

  3. The first model using an internal priority.
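The fallback order above can be sketched as a small function. This is only an illustration of the resolution logic, assuming hypothetical names (`resolveModel`, `Config`); it is not opencode's actual API:

```typescript
// Hypothetical sketch of the model-resolution order described above.
interface Config {
  model?: string; // "provider/model" from the opencode config, if set
}

function resolveModel(
  config: Config,
  lastUsed: string | undefined,
  priorityList: string[],
): string | undefined {
  // 1. A model set in the opencode config wins.
  if (config.model) return config.model;
  // 2. Otherwise, fall back to the last used model.
  if (lastUsed) return lastUsed;
  // 3. Finally, pick the first model by internal priority.
  return priorityList[0];
}
```

For instance, `resolveModel({}, "anthropic/claude-sonnet-4-20250514", [])` would return the last used model, since no model is set in the config.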