Runtime Settings

Models, keys, and provider routing

This is the dedicated configuration surface. Chat stays in the workbook. Keys, base URLs, model picks, and connection testing live here.

Selected provider: Ollama Local (model: `qwen3`)
Runtime status: Ready (requires a local Ollama host)
Anthropic env key: Not set (a browser key or another provider is required)
API Keys & Runtime

Cursor-style provider configuration, but shaped for this app: local Ollama, hosted Ollama, NVIDIA, Anthropic, and generic OpenAI-compatible runtimes.
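A minimal sketch of what a shared provider contract could look like. The type names, provider ids, and field names below are illustrative assumptions, not the app's actual API.

```typescript
// Hypothetical provider contract; names are illustrative, not the app's real types.
type ProviderId =
  | "ollama-local"
  | "ollama-hosted"
  | "nvidia"
  | "anthropic"
  | "openai-compatible";

interface ProviderConfig {
  id: ProviderId;
  baseUrl: string;   // host root or full API URL
  model: string;     // e.g. "qwen3"
  apiKey?: string;   // omitted for keyless local runtimes
}

// The selected entry drives routing for chat requests.
const selected: ProviderConfig = {
  id: "ollama-local",
  baseUrl: "http://localhost:11434",
  model: "qwen3",
};
```

Keeping every runtime behind one config shape is what lets local Ollama, hosted endpoints, and keyed cloud providers swap without touching chat code.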

Each provider entry takes two fields: a Base URL and a Model.

The official Ollama docs expose the API at `http://localhost:11434/api`. This app accepts either the host root or the full `/api` URL. Tool calling is most reliable on models like `qwen3`.
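Accepting both forms means normalizing user input before building requests. A small sketch of one way to do that; the helper name is made up, but `/api/chat` is Ollama's documented chat endpoint.

```typescript
// Normalize a user-entered Ollama address: accept either the host root
// or the /api URL, and return the chat endpoint. Illustrative helper only.
function ollamaChatEndpoint(input: string): string {
  const trimmed = input.replace(/\/+$/, ""); // drop trailing slashes
  const base = trimmed.endsWith("/api") ? trimmed : `${trimmed}/api`;
  return `${base}/chat`;
}
```

Both `http://localhost:11434` and `http://localhost:11434/api` normalize to `http://localhost:11434/api/chat`.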

Keys entered here stay in browser storage for this workspace. They are not written into the repo. The future Excel add-in should reuse this same provider contract against the cloud route.
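Workspace-scoped browser storage can be sketched as below. The storage-key format and function names are assumptions for illustration; in the browser, `window.localStorage` satisfies the `KeyValueStore` shape.

```typescript
// Keys live in browser storage, namespaced per workspace; nothing is
// written to the repo. The key format here is an assumption.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function saveProviderKey(
  store: KeyValueStore,
  workspace: string,
  provider: string,
  key: string,
): void {
  store.setItem(`${workspace}:${provider}:apiKey`, key);
}

function loadProviderKey(
  store: KeyValueStore,
  workspace: string,
  provider: string,
): string | null {
  return store.getItem(`${workspace}:${provider}:apiKey`);
}
```

Because the Excel add-in would target the same contract, only the `KeyValueStore` backing (and the cloud route's base URL) needs to change there.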