Running Claude and OpenCode Locally with One Command
I have wanted to run Claude Code with local models for a while, but I never actually tried. The setup looked like a small project by itself: run a router/proxy, wire environment variables, and keep everything in sync. There are good community tools (for example claude-code-router), but it still felt a little too involved just to test an idea.
That changed with a new Ollama feature: ollama launch. The short version is that it makes local (or cloud) model setup for coding tools feel like a one-command task. The official announcement is here: https://ollama.com/blog/launch
What ollama launch does
According to the Ollama team, ollama launch can set up and run coding tools like Claude Code and OpenCode without any manual environment variables or config files. It walks you through model selection and starts the tool right away.
Claude in one command
If you already have Ollama installed (v0.15+), the flow is straightforward.
# local model (large context needs lots of VRAM)
ollama pull gpt-oss:20b
# or use a cloud model
ollama pull gpt-oss:120b-cloud
ollama launch claude
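Before launching, it can help to confirm the prerequisites. A quick sanity check, assuming the ollama CLI is on your PATH (these are standard Ollama subcommands; the exact version string will differ on your machine):

```shell
# Confirm the installed version meets the requirement (v0.15+)
ollama --version

# Confirm the model you pulled is available locally
ollama list

# Then start Claude Code; launch prompts you to pick a model
ollama launch claude
```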
OpenCode: simpler config flow
OpenCode was already usable with local models, but the setup usually meant editing its JSON config to add models and a provider. ollama launch turns that into a guided flow.
ollama launch opencode
If you only want the configuration without launching the tool immediately:
ollama launch opencode --config
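For a sense of what the guided flow replaces, this is roughly what wiring up a local model by hand looked like in OpenCode's JSON config. The snippet below is an illustrative sketch based on OpenCode's provider/model schema, not the exact file ollama launch writes; the provider name, baseURL, and model entry are assumptions for a default local Ollama install:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "gpt-oss:20b": {
          "name": "gpt-oss:20b"
        }
      }
    }
  }
}
```

Getting the provider block, base URL, and model IDs consistent by hand is exactly the kind of bookkeeping the guided flow removes.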
Why this matters
For me, the main win is lower activation energy. Instead of spending time on proxies and environment setup, I can spend that time actually using the tools. I also get to keep my data local and, for some workflows, save money.
A few extra notes from the announcement:
- Coding tools work best with a large context window (Ollama recommends 64k tokens).
- If local hardware is tight, Ollama's cloud models are an easy fallback.
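If you want to raise the context window for a local model to something near the recommended size, Ollama offers two common mechanisms: a server-wide environment variable, or baking the setting into a model variant with a Modelfile. A sketch (the 64k value follows the recommendation above; whether your hardware can hold it is another matter):

```shell
# Option 1: set the context length server-wide via environment variable
OLLAMA_CONTEXT_LENGTH=65536 ollama serve

# Option 2: create a model variant with the context length baked in.
# Modelfile contents:
#   FROM gpt-oss:20b
#   PARAMETER num_ctx 65536
ollama create gpt-oss-64k -f Modelfile
```

Option 2 is handy when you only want the large window for coding-tool sessions and a smaller default elsewhere.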
Takeaway
ollama launch turns what used to be a fiddly local setup into something I can try in minutes. If you have been on the fence about running Claude Code or OpenCode with local models, this is the simplest entry point I have seen so far.