Provider Setup Guide

CmdForge works with any AI CLI tool that accepts input via stdin or arguments. This guide covers setup for the most popular providers.

Provider Comparison

We profiled 12 providers with a 4-task benchmark (Math, Code, Reasoning, Data Extraction):

Provider            Speed  Score  Cost              Best For
opencode-deepseek   13s    4/4    ~$0.28/M tokens   Best value - daily driver
opencode-pickle     13s    4/4    FREE              Best free - accurate
claude-haiku        14s    4/4    ~$0.25/M tokens   Fast + high quality
codex               14s    4/4    ~$1.25/M tokens   Reliable, auto-routes
claude              18s    4/4    Varies            Auto-routes to best
claude-opus         18s    4/4    ~$15/M tokens     Highest quality
claude-sonnet       21s    4/4    ~$3/M tokens      Balanced
opencode-nano       24s    4/4    Paid              GPT-5 Nano
gemini-flash        28s    4/4    ~$0.075/M tokens  Google, faster
opencode-reasoner   33s    4/4    ~$0.28/M tokens   Complex reasoning
gemini              91s    3/4    ~$1.25/M tokens   1M token context
opencode-grok       11s    2/4    FREE              Fastest but unreliable

Recommendations

  • Daily use: opencode-deepseek or opencode-pickle (free)
  • Quality work: claude-haiku or claude-opus
  • Complex reasoning: opencode-reasoner
  • Large documents: gemini (1M token context window)
  • Budget: opencode-pickle (free) or opencode-deepseek (cheap)
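
These picks can be applied per run: any CmdForge tool accepts a --provider flag (described later in this guide). The tool and file names below are placeholders; substitute your own.

# Cheap daily driver for routine work
cat notes.txt | summarize --provider opencode-deepseek

# Route an important document to the highest-quality model
cat contract.txt | summarize --provider claude-opus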

Provider Setup

OpenCode

OpenCode provides access to multiple models, including free options.

Install:

curl -fsSL https://opencode.ai/install | bash

Authenticate:

~/.opencode/bin/opencode auth

Available Models:

Provider Name       Model              Cost
opencode-deepseek   deepseek-chat      Cheap
opencode-pickle     big-pickle         FREE
opencode-grok       grok-code          FREE
opencode-nano       gpt-5-nano         Paid
opencode-reasoner   deepseek-reasoner  Cheap

Test:

echo "Hello" | ~/.opencode/bin/opencode run --model opencode/big-pickle

Claude CLI

Anthropic's official CLI for Claude models.

Install:

npm install -g @anthropic-ai/claude-code

Authenticate:

claude  # Opens browser for sign-in (auto-saves auth tokens)

Available Models:

Provider Name   Model        Cost
claude          Auto-routes  Varies
claude-haiku    Haiku 4.5    Cheap
claude-sonnet   Sonnet 4.5   Medium
claude-opus     Opus 4.5     Expensive

Test:

echo "Hello" | claude -p

Codex (OpenAI)

OpenAI's Codex CLI with auto-routing.

Install:

npm install -g @openai/codex

Authenticate:

codex  # Opens browser for sign-in (auto-saves auth tokens)

Test:

echo "Hello" | codex exec -

Gemini

Google's Gemini models. Best for large context (1M tokens).

Install:

npm install -g @google/gemini-cli

Authenticate:

gemini  # Opens browser for Google sign-in

Available Models:

Provider Name   Model             Notes
gemini          gemini-2.5-pro    Quality, slow CLI
gemini-flash    gemini-2.5-flash  Faster

Note: Gemini CLI has known performance issues. Use gemini-flash for interactive tasks, gemini for large documents.

Test:

echo "Hello" | gemini --model gemini-2.5-flash

Managing Providers

The easiest way to install providers:

cmdforge providers install

This interactive installer:

  • Shows available AI providers with costs
  • Runs the installation command
  • Updates PATH automatically
  • Shows next steps for authentication

List Providers

cmdforge providers list

Check Availability

cmdforge providers check

Add Custom Provider

cmdforge providers add myname "my-command --args" -d "Description"

Or edit ~/.cmdforge/providers.yaml:

providers:
  - name: my-custom
    command: my-ai-tool --prompt
    description: My custom AI tool
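
Before relying on a custom entry, it helps to confirm the underlying command meets the provider contract on its own (plain-text reply on stdout, exit code 0). my-ai-tool here is the placeholder command from the entry above:

echo "test" | my-ai-tool --prompt
echo $?  # should print 0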

Provider Command Format

The command should:

  1. Accept input via stdin
  2. Output response to stdout
  3. Exit 0 on success

Example commands:

# Claude
claude -p

# OpenCode
$HOME/.opencode/bin/opencode run --model deepseek/deepseek-chat

# Gemini
gemini --model gemini-2.5-flash

# Codex
codex exec -

# Custom (any tool that reads stdin)
my-tool --input -
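
If a tool you want to use only takes the prompt as an argument, a thin wrapper script can adapt it to the stdin/stdout contract above. This is just a sketch around a hypothetical my-ai-tool; adjust the flag and path to match your tool:

#!/usr/bin/env bash
# ~/bin/my-ai-wrapper: read the prompt from stdin, pass it as an argument
set -euo pipefail
prompt=$(cat)
my-ai-tool --prompt "$prompt"   # the tool's exit status becomes the script's exit status

Register the wrapper like any other provider:

cmdforge providers add my-wrapper "$HOME/bin/my-ai-wrapper" -d "Wrapped argument-only tool"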

Using Providers in Tools

In Tool Config

steps:
  - type: prompt
    prompt: "Summarize: {input}"
    provider: opencode-pickle  # Use this provider
    output_var: response

Override at Runtime

# Use a different provider for this run
cat file.txt | summarize --provider claude-opus

Provider Selection Strategy

  1. Tool default - Set in tool's config.yaml
  2. Runtime override - --provider flag
  3. Cost optimization - Use cheap providers for simple tasks
  4. Quality needs - Use opus/sonnet for important work

Troubleshooting

"Provider 'X' not found"

  1. Check that it's in your providers list: cmdforge providers list
  2. Verify the command works: echo "test" | <command>
  3. Add it: cmdforge providers add

"Command 'X' not found"

The AI CLI tool isn't installed or isn't on your PATH:

which claude  # Should show path
which opencode # Might need full path

For OpenCode, use full path in provider:

command: $HOME/.opencode/bin/opencode run
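
Alternatively, put the OpenCode install directory on your PATH so the bare opencode command resolves. Add this line to your shell profile (e.g. ~/.bashrc or ~/.zshrc) and restart the shell:

export PATH="$HOME/.opencode/bin:$PATH"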

Slow Provider

  • Use gemini-flash instead of gemini
  • Use claude-haiku instead of claude-opus
  • Use opencode-deepseek for best speed/quality ratio

Cost Optimization

Free Providers

  • opencode-pickle - Big Pickle model (FREE, accurate)
  • opencode-grok - Grok Code (FREE, fast but less reliable)

Cheap Providers

  • opencode-deepseek - ~$0.28/M tokens
  • opencode-reasoner - ~$0.28/M tokens
  • claude-haiku - ~$0.25/M tokens

Tips

  1. Use opencode-pickle for simple tasks (free + accurate)
  2. Use claude-haiku when you need reliability
  3. Reserve claude-opus for important work
  4. Use gemini only for large document analysis

Adding New Providers

Any CLI tool can be a provider if it:

  • Reads from stdin
  • Writes to stdout
  • Exits 0 on success

Examples:

Local LLM (Ollama):

- name: ollama-llama
  command: ollama run llama3
  description: Local Llama 3
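
Assuming Ollama is installed and the model has been pulled, the same stdin smoke test used for the other providers applies:

ollama pull llama3                # one-time model download
echo "Hello" | ollama run llama3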

Custom API wrapper:

- name: my-api
  command: curl -s -X POST https://my-api.com/chat -d @-
  description: My custom API
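
The command above assumes the endpoint accepts the raw prompt as the request body and replies with plain text. If your API expects JSON and returns JSON, one option is to wrap the call in a small script that builds the payload and extracts the reply. This is only a sketch against a hypothetical endpoint and response field, and it requires jq:

# Save as e.g. ~/bin/my-api-chat and use that path as the provider command
jq -Rs '{prompt: .}' | curl -s -X POST https://my-api.com/chat \
  -H 'Content-Type: application/json' -d @- | jq -r '.response'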

Python script:

- name: my-python
  command: python3 ~/scripts/my_ai.py
  description: Custom Python AI