smarttools/docs/PROVIDERS.md

# Provider Setup Guide

SmartTools works with any AI CLI tool that accepts input via stdin or arguments. This guide covers setup for the most popular providers.

## Provider Comparison

We profiled 12 providers with a 4-task benchmark (Math, Code, Reasoning, Data Extraction):

| Provider | Speed | Score | Cost | Best For |
|---|---|---|---|---|
| opencode-deepseek | 13s | 4/4 | ~$0.28/M tokens | Best value, daily driver |
| opencode-pickle | 13s | 4/4 | FREE | Best free, accurate |
| claude-haiku | 14s | 4/4 | ~$0.25/M tokens | Fast + high quality |
| codex | 14s | 4/4 | ~$1.25/M tokens | Reliable, auto-routes |
| claude | 18s | 4/4 | Varies | Auto-routes to best |
| claude-opus | 18s | 4/4 | ~$15/M tokens | Highest quality |
| claude-sonnet | 21s | 4/4 | ~$3/M tokens | Balanced |
| opencode-nano | 24s | 4/4 | Paid | GPT-5 Nano |
| gemini-flash | 28s | 4/4 | ~$0.075/M tokens | Google, faster |
| opencode-reasoner | 33s | 4/4 | ~$0.28/M tokens | Complex reasoning |
| gemini | 91s | 3/4 | ~$1.25/M tokens | 1M token context |
| opencode-grok | 11s | 2/4 | FREE | Fastest but unreliable |

## Recommendations

- **Daily use:** opencode-deepseek or opencode-pickle (free)
- **Quality work:** claude-haiku or claude-opus
- **Complex reasoning:** opencode-reasoner
- **Large documents:** gemini (1M token context window)
- **Budget:** opencode-pickle (free) or opencode-deepseek (cheap)

## Provider Setup

### OpenCode

OpenCode provides access to multiple models, including free options.

Install:

```bash
curl -fsSL https://opencode.ai/install | bash
```

Authenticate:

```bash
~/.opencode/bin/opencode auth
```

Available models:

| Provider Name | Model | Cost |
|---|---|---|
| opencode-deepseek | deepseek-chat | Cheap |
| opencode-pickle | big-pickle | FREE |
| opencode-grok | grok-code | FREE |
| opencode-nano | gpt-5-nano | Paid |
| opencode-reasoner | deepseek-reasoner | Cheap |

Test:

```bash
echo "Hello" | ~/.opencode/bin/opencode run --model opencode/big-pickle
```

### Claude CLI

Anthropic's official CLI for Claude models.

Install:

```bash
npm install -g @anthropic-ai/claude-code
# or
brew install claude
```

Authenticate:

```bash
claude auth
```

Available models:

| Provider Name | Model | Cost |
|---|---|---|
| claude | Auto-routes | Varies |
| claude-haiku | Haiku 4.5 | Cheap |
| claude-sonnet | Sonnet 4.5 | Medium |
| claude-opus | Opus 4.5 | Expensive |

Test:

```bash
echo "Hello" | claude -p
```

### Codex (OpenAI)

OpenAI's Codex CLI with auto-routing.

Install:

```bash
npm install -g @openai/codex
```

Authenticate:

```bash
codex login  # uses your ChatGPT account
```

Test:

```bash
echo "Hello" | codex exec -
```

### Gemini

Google's Gemini models. Best for large context (1M tokens).

Install:

```bash
npm install -g @google/gemini-cli
```

Authenticate:

```bash
gemini auth  # uses your Google account
```

Available models:

| Provider Name | Model | Notes |
|---|---|---|
| gemini | gemini-2.5-pro | Quality, slow CLI |
| gemini-flash | gemini-2.5-flash | Faster |

Note: the Gemini CLI has known performance issues. Use gemini-flash for interactive tasks, gemini for large documents.

Test:

```bash
echo "Hello" | gemini --model gemini-2.5-flash
```

## Managing Providers

### List Providers

```bash
smarttools providers
```

### Add Custom Provider

```bash
smarttools providers add
```

Or edit `~/.smarttools/providers.yaml`:

```yaml
providers:
  - name: my-custom
    command: my-ai-tool --prompt
    description: My custom AI tool
```
### Provider Command Format

The command should:

1. Accept input via stdin
2. Output the response to stdout
3. Exit 0 on success

Example commands:

```bash
# Claude
claude -p

# OpenCode
$HOME/.opencode/bin/opencode run --model deepseek/deepseek-chat

# Gemini
gemini --model gemini-2.5-flash

# Codex
codex exec -

# Custom (any tool that reads stdin)
my-tool --input -
```
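The three requirements above can be sketched as a minimal provider script. This is a hypothetical example (the `respond` placeholder stands in for a real model call), not part of SmartTools itself:

```python
import sys

def respond(prompt: str) -> str:
    # Placeholder logic; a real provider would call an AI model here.
    return "echo: " + prompt.strip()

def run(stdin=sys.stdin, stdout=sys.stdout) -> int:
    text = stdin.read()                      # 1. accept input via stdin
    stdout.write(respond(text) + "\n")       # 2. output response to stdout
    return 0                                 # 3. return 0 -> exit 0 on success

# As a script: sys.exit(run())
```

Any command with this shape can be dropped into `providers.yaml` as-is.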

### Environment Variables

Provider commands can use environment variables:

```yaml
providers:
  - name: opencode
    command: $HOME/.opencode/bin/opencode run
```

Both `$HOME` and `~` are expanded automatically.
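This expansion behavior can be reproduced with Python's standard library — a sketch of the behavior, not SmartTools' actual implementation:

```python
import os

def expand_command(command: str) -> str:
    # Expand $HOME-style environment variables first, then a leading ~.
    return os.path.expanduser(os.path.expandvars(command))
```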

## Using Providers in Tools

### In Tool Config

```yaml
steps:
  - type: prompt
    prompt: "Summarize: {input}"
    provider: opencode-pickle  # Use this provider
    output_var: response
```

### Override at Runtime

```bash
# Use a different provider for this run
cat file.txt | summarize --provider claude-opus
```

## Provider Selection Strategy

1. **Tool default** - set in the tool's config.yaml
2. **Runtime override** - the --provider flag
3. **Cost optimization** - use cheap providers for simple tasks
4. **Quality needs** - use opus/sonnet for important work
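Since the runtime flag overrides the tool default, the selection amounts to a simple fallback chain. A hypothetical helper (not SmartTools internals; the fallback name is an assumption):

```python
from typing import Optional

def choose_provider(runtime_flag: Optional[str],
                    tool_default: Optional[str],
                    fallback: str = "opencode-pickle") -> str:
    # The --provider flag beats the tool's config.yaml default,
    # which beats a global fallback.
    return runtime_flag or tool_default or fallback
```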

## Troubleshooting

### "Provider 'X' not found"

1. Check that it's in your providers list: `smarttools providers`
2. Verify the command works: `echo "test" | <command>`
3. Add it: `smarttools providers add`

### "Command 'X' not found"

The AI CLI tool isn't installed or isn't on your PATH:

```bash
which claude  # should show a path
which opencode  # may need the full path
```

For OpenCode, use the full path in the provider definition:

```yaml
command: $HOME/.opencode/bin/opencode run
```

### Slow Provider

- Use gemini-flash instead of gemini
- Use claude-haiku instead of claude-opus
- Use opencode-deepseek for the best speed/quality ratio

### Provider Errors

Check that the provider works directly:

```bash
echo "Say hello" | claude -p
echo "Say hello" | ~/.opencode/bin/opencode run
```

If it works directly but not in SmartTools, check:

1. The provider command in `~/.smarttools/providers.yaml`
2. That environment variables are expanded correctly
## Cost Optimization

### Free Providers

- opencode-pickle - Big Pickle model (FREE, accurate)
- opencode-grok - Grok Code (FREE, fast but less reliable)

### Cheap Providers

- opencode-deepseek - ~$0.28/M tokens
- opencode-reasoner - ~$0.28/M tokens
- claude-haiku - ~$0.25/M tokens

### Tips

1. Use opencode-pickle for simple tasks (free + accurate)
2. Use claude-haiku when you need reliability
3. Reserve claude-opus for important work
4. Use gemini only for large document analysis

## Adding New Providers

Any CLI tool can be a provider if it:

- Reads from stdin
- Writes to stdout
- Exits 0 on success

Examples:

Local LLM (Ollama):

```yaml
- name: ollama-llama
  command: ollama run llama3
  description: Local Llama 3
```

Custom API wrapper:

```yaml
- name: my-api
  command: curl -s -X POST https://my-api.com/chat -d @-
  description: My custom API
```

Python script:

```yaml
- name: my-python
  command: python3 ~/scripts/my_ai.py
  description: Custom Python AI
```
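As one sketch of what such a script could look like, here is a stdin-to-stdout wrapper around Ollama's local `/api/generate` HTTP API. The endpoint URL and JSON fields are Ollama defaults; swap in whatever backend your script actually calls:

```python
#!/usr/bin/env python3
"""Hypothetical ~/scripts/my_ai.py: stdin prompt -> Ollama -> stdout reply."""
import json
import sys
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(prompt: str, model: str = "llama3") -> bytes:
    # stream=False asks Ollama for one JSON object instead of chunked output.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def run() -> int:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(sys.stdin.read()),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])  # reply text to stdout
    return 0

# As a script: sys.exit(run())
```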