myllm-cli Notes
Reference, configuration, and usage notes for myllm-cli, a command-line tool for running LLMs locally.
Overview
A personal LLM command-line toolkit. Pipe text into named tasks and get processed output. Supports both local models via Ollama and cloud APIs (OpenAI and Anthr…
Command Reference
Text can be passed as an argument or piped via standard input (stdin). If both are provided, the argument takes precedence. Displays the help message. Displays…
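The argument-over-stdin precedence rule can be sketched as follows. This is a minimal illustration, not the tool's actual implementation; the function name resolve_input and the injectable stream parameter are assumptions for the sake of the example.

```python
import io
import sys

def resolve_input(arg, stdin=None):
    """Pick the text to process: an explicit argument takes precedence
    over piped input, per the rule described above.

    `arg` is the positional argument (or None if absent); `stdin` is an
    injectable text stream, defaulting to the real sys.stdin.
    """
    if arg is not None:
        return arg
    stream = stdin if stdin is not None else sys.stdin
    return stream.read()
```

For example, resolve_input("from arg", io.StringIO("from pipe")) returns "from arg", while resolve_input(None, io.StringIO("from pipe")) falls back to the piped text.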
Configuration Reference
The configuration file uses TOML format. Path: ~/.config/myllm/config.toml. If XDG_CONFIG_HOME is set, $XDG_CONFIG_HOME/myllm/config.toml is used instead. Applic…
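The XDG fallback rule above amounts to the following lookup. A minimal sketch; the function name config_path and the env parameter are illustrative assumptions, not part of the tool itself.

```python
import os
from pathlib import Path

def config_path(env=os.environ):
    """Resolve the config file location using the XDG rule:
    $XDG_CONFIG_HOME/myllm/config.toml when the variable is set,
    otherwise ~/.config/myllm/config.toml.
    """
    base = env.get("XDG_CONFIG_HOME")
    if base:
        return Path(base) / "myllm" / "config.toml"
    return Path.home() / ".config" / "myllm" / "config.toml"
```

With XDG_CONFIG_HOME=/tmp/xdg this resolves to /tmp/xdg/myllm/config.toml; with the variable unset it falls back under the home directory.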