published on 2026-04-11

Configuration Reference

The configuration file uses TOML format.

Path: ~/.config/myllm/config.toml

If XDG_CONFIG_HOME is set, $XDG_CONFIG_HOME/myllm/config.toml is used instead.
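The lookup above can be sketched in shell (a sketch assuming standard XDG fallback semantics; the actual resolution is internal to myllm):

```shell
# Use $XDG_CONFIG_HOME when set, otherwise fall back to ~/.config.
config_path="${XDG_CONFIG_HOME:-$HOME/.config}/myllm/config.toml"
echo "$config_path"
```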

general

Application-wide common settings.

toml
[general]
default_provider = "ollama"
auto_copy = true
keep_alive = "5m"

default_provider

The provider to use when a task does not specify one.

  • Type: string
  • Valid values: "ollama" / "openai" / "anthropic"
  • Default: — (required)

auto_copy

Whether to automatically copy results to the clipboard (intended for the macOS GUI app myllm). In the CLI, use the --with-clipboard option instead.

  • Type: boolean
  • Default: false

keep_alive

How long Ollama keeps a model loaded in memory. Keeping the model loaded after a request speeds up subsequent requests.

  • Type: string (Ollama duration format)
  • Examples: "5m" (5 minutes), "1h" (1 hour), "0" (unload immediately)
  • Default: "5m"

providers

Connection settings for each provider. Define only the sections for providers you intend to use.

[providers.ollama]

Connection settings for a locally running Ollama server. No API key required.

toml
[providers.ollama]
base_url = "http://localhost:11434"
default_model = "llama3.2"

base_url

The base URL of the Ollama server.

  • Type: string
  • Default: — (required)

default_model

The default model for this provider. Used when a task does not specify a model.

  • Type: string
  • Examples: "llama3.2", "llama3.1:8b", "qwen2.5:14b"
  • Default: — (required)

[providers.openai]

Connection settings for the OpenAI API.

toml
[providers.openai]
base_url = "https://api.openai.com/v1"
default_model = "gpt-4o"
api_key_env = "MYLLM_OPENAI_API_KEY"

base_url

The base URL for the OpenAI API. Change this endpoint when using an OpenAI-compatible API.

  • Type: string
  • Default: — (required)
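For example, to point the client at a self-hosted OpenAI-compatible server (the URL and model name below are illustrative; substitute your server's actual endpoint):

toml
[providers.openai]
# Illustrative: a local OpenAI-compatible server (adjust to your setup)
base_url = "http://localhost:8000/v1"
default_model = "my-local-model"
api_key_env = "MYLLM_OPENAI_API_KEY"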

default_model

The default model for this provider.

  • Type: string
  • Examples: "gpt-4o", "gpt-4o-mini"
  • Default: — (required)

api_key

Specifies the API key directly. It can be set alongside api_key_env; if both are set, api_key takes precedence. Writing the API key directly in the configuration file risks accidentally committing it to a repository, so using api_key_env is recommended.

  • Type: string
  • Default: — (none)
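A sketch of the precedence rule (the key value is a placeholder):

toml
[providers.openai]
api_key = "sk-direct-key"            # takes precedence
api_key_env = "MYLLM_OPENAI_API_KEY" # ignored while api_key is set
base_url = "https://api.openai.com/v1"
default_model = "gpt-4o"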

api_key_env

The name of the environment variable from which to read the API key. Export it in your shell configuration file (e.g., .zshrc).

  • Type: string
  • Example: "MYLLM_OPENAI_API_KEY"
  • Default: — (none)
bash
# ~/.zshrc
export MYLLM_OPENAI_API_KEY="sk-..."

[providers.anthropic]

Connection settings for the Anthropic API.

toml
[providers.anthropic]
base_url = "https://api.anthropic.com/v1"
default_model = "claude-sonnet-4-6"
api_key_env = "MYLLM_ANTHROPIC_API_KEY"

base_url

The base URL for the Anthropic API.

  • Type: string
  • Default: — (required)

default_model

The default model for this provider.

  • Type: string
  • Examples: "claude-opus-4", "claude-sonnet-4-6"
  • Default: — (required)

api_key / api_key_env

Same specification as [providers.openai]. See above for details.


tasks

Settings for user-defined tasks. The task name (key) becomes the subcommand name.

toml
[tasks.polish]
name = "Polish"
instruction = '''
Improve the clarity and correctness of the provided text.
Output only the revised text, no explanation.
Use the same language as the original text.
'''

Run with:

bash
echo "draft text" | myllm polish

name

The display name of the task. Shown by myllm list and myllm info.

  • Type: string
  • Default: — (none)

instruction

The system instruction sent to the model. Use a TOML literal string (''') for multi-line content. The user's input text is appended after the instruction, separated by a blank line.

  • Type: string (TOML literal string '''...''' recommended)
  • Default: — (required)
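The composition described above (instruction, a blank line, then the user's input) can be sketched as follows; this illustrates the documented layout, not myllm's internal code:

```shell
instruction='Improve the clarity and correctness of the provided text.'
user_input='draft text'
# Final prompt: system instruction, a blank line, then the user's input.
prompt=$(printf '%s\n\n%s' "$instruction" "$user_input")
printf '%s\n' "$prompt"
```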

Writing effective instructions

Ending an instruction with "Use the same language as the original text." makes the output language match the input. This is especially useful for tasks other than translation.

provider

The provider to use for this task. Falls back to [general] default_provider when omitted.

  • Type: string
  • Valid values: "ollama" / "openai" / "anthropic"
  • Default: value of [general] default_provider

model

The model to use for this task. Falls back to the provider's default_model when omitted.

  • Type: string
  • Examples: "gpt-4o", "llama3.2"
  • Default: provider's default_model
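For example, a task that overrides both fields (a sketch; the task name, instruction, and model are illustrative):

toml
[tasks.summarize]
name = "Summarize"
instruction = '''
Summarize the provided text in three bullet points.
'''
provider = "openai"   # overrides [general] default_provider
model = "gpt-4o-mini" # overrides the provider's default_model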

hotkey

Hotkey configuration for the macOS GUI app myllm. Not used by the CLI.

  • Type: string
  • Example: "cmd+shift+p"

auto_copy (per-task)

Overrides [general] auto_copy at the task level. Intended for the macOS GUI app.

  • Type: boolean
  • Default: value of [general] auto_copy

translation

Settings for the translate command.

toml
[translation]
enabled = true
provider = "ollama"
model = "translategemma:12b"
default_source = "en"
default_target = "ja"
fallback_target = "en"

enabled

Whether to enable the translation feature.

  • Type: boolean
  • Default: true

provider

The provider to use for translation. Falls back to [general] default_provider when omitted.

  • Type: string
  • Default: value of [general] default_provider

model

The model to use for translation. Using a translation-specialized model (e.g., translategemma:12b) is recommended. Falls back to the provider's default_model when omitted.

  • Type: string
  • Default: provider's default_model

hotkey

Hotkey configuration for the macOS GUI app myllm. Not used by the CLI.

  • Type: string

default_source

The source language assumed when language detection is unavailable (whichlang-cli is not installed).

  • Type: string (ISO 639-1 code)
  • Default: "en"

default_target

The target language when the detected source language matches default_source.

  • Type: string (ISO 639-1 code)
  • Default: "ja"

fallback_target

The target language when the detected source language does not match default_source.

  • Type: string (ISO 639-1 code)
  • Default: "en"
Language detection and target selection logic

Input text
  │
  Detect language with whichlang-cli
  │
  ├─ Cannot detect (whichlang-cli not installed)
  │       → assume default_source
  │
  ├─ Detected; detected language == default_source
  │       → translate to default_target
  │
  └─ Detected; detected language != default_source
          → translate to fallback_target

With the default settings (default_source = "en", default_target = "ja", fallback_target = "en"):

  • English → Japanese
  • Japanese (or any other language) → English
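The automatic selection can be overridden explicitly with the --from / --to options, which accept the language codes listed under Supported Languages below. An illustrative example (the target language is arbitrary):

bash
# Force a German target regardless of detection
echo "Hello" | myllm translate --to de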

Supported Languages (Translation)

whichlang-cli can detect the following 16 languages. These are the same codes accepted by the --from / --to options.

Code  Language
ar    Arabic
nl    Dutch
en    English
fr    French
de    German
hi    Hindi
it    Italian
ja    Japanese
ko    Korean
zh    Mandarin Chinese
pt    Portuguese
ru    Russian
es    Spanish
sv    Swedish
tr    Turkish
vi    Vietnamese