# Provider Setup Guide
SmartTools works with any AI CLI tool that accepts input via stdin or arguments. This guide covers setup for the most popular providers.
## Provider Comparison
We profiled 12 providers with a 4-task benchmark (Math, Code, Reasoning, Data Extraction):
| Provider | Speed | Score | Cost | Best For |
|---|---|---|---|---|
| opencode-deepseek | 13s | 4/4 | ~$0.28/M tokens | Best value - daily driver |
| opencode-pickle | 13s | 4/4 | FREE | Best free - accurate |
| claude-haiku | 14s | 4/4 | ~$0.25/M tokens | Fast + high quality |
| codex | 14s | 4/4 | ~$1.25/M tokens | Reliable, auto-routes |
| claude | 18s | 4/4 | Varies | Auto-routes to best |
| claude-opus | 18s | 4/4 | ~$15/M tokens | Highest quality |
| claude-sonnet | 21s | 4/4 | ~$3/M tokens | Balanced |
| opencode-nano | 24s | 4/4 | Paid | GPT-5 Nano |
| gemini-flash | 28s | 4/4 | ~$0.075/M tokens | Google, faster |
| opencode-reasoner | 33s | 4/4 | ~$0.28/M tokens | Complex reasoning |
| gemini | 91s | 3/4 | ~$1.25/M tokens | 1M token context |
| opencode-grok | 11s | 2/4 | FREE | Fastest but unreliable |
## Recommendations

- **Daily use:** `opencode-deepseek` or `opencode-pickle` (free)
- **Quality work:** `claude-haiku` or `claude-opus`
- **Complex reasoning:** `opencode-reasoner`
- **Large documents:** `gemini` (1M token context window)
- **Budget:** `opencode-pickle` (free) or `opencode-deepseek` (cheap)
## Provider Setup

### OpenCode (Recommended)
OpenCode provides access to multiple models including free options.
**Install:**

```bash
curl -fsSL https://opencode.ai/install | bash
```

**Authenticate:**

```bash
~/.opencode/bin/opencode auth
```
**Available Models:**

| Provider Name | Model | Cost |
|---|---|---|
| opencode-deepseek | deepseek-chat | Cheap |
| opencode-pickle | big-pickle | FREE |
| opencode-grok | grok-code | FREE |
| opencode-nano | gpt-5-nano | Paid |
| opencode-reasoner | deepseek-reasoner | Cheap |
**Test:**

```bash
echo "Hello" | ~/.opencode/bin/opencode run --model opencode/big-pickle
```
### Claude CLI
Anthropic's official CLI for Claude models.
**Install:**

```bash
npm install -g @anthropic-ai/claude-code
# or
brew install claude
```

**Authenticate:**

```bash
claude auth
```
**Available Models:**

| Provider Name | Model | Cost |
|---|---|---|
| claude | Auto-routes | Varies |
| claude-haiku | Haiku 4.5 | Cheap |
| claude-sonnet | Sonnet 4.5 | Medium |
| claude-opus | Opus 4.5 | Expensive |
**Test:**

```bash
echo "Hello" | claude -p
```
### Codex (OpenAI)
OpenAI's Codex CLI with auto-routing.
**Install:**

```bash
npm install -g @openai/codex
```

**Authenticate:**

```bash
codex auth  # Uses ChatGPT account
```
**Test:**

```bash
echo "Hello" | codex exec -
```
### Gemini
Google's Gemini models. Best for large context (1M tokens).
**Install:**

```bash
npm install -g @google/gemini-cli
# or
pip install google-generativeai
```

**Authenticate:**

```bash
gemini auth  # Uses Google account
```
**Available Models:**

| Provider Name | Model | Notes |
|---|---|---|
| gemini | gemini-2.5-pro | Quality, slow CLI |
| gemini-flash | gemini-2.5-flash | Faster |
**Note:** The Gemini CLI has known performance issues. Use `gemini-flash` for interactive tasks and `gemini` for large documents.
**Test:**

```bash
echo "Hello" | gemini --model gemini-2.5-flash
```
## Managing Providers

### List Providers

```bash
smarttools providers
```

### Add Custom Provider

```bash
smarttools providers add
```
Or edit `~/.smarttools/providers.yaml` directly:

```yaml
providers:
  - name: my-custom
    command: my-ai-tool --prompt
    description: My custom AI tool
```
### Provider Command Format

The command should:

- Accept input via stdin
- Output the response to stdout
- Exit 0 on success
Example commands:

```bash
# Claude
claude -p

# OpenCode
$HOME/.opencode/bin/opencode run --model deepseek/deepseek-chat

# Gemini
gemini --model gemini-2.5-flash

# Codex
codex exec -

# Custom (any tool that reads stdin)
my-tool --input -
```
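This contract can be smoke-tested before registering a command. A minimal sketch in Python (the `check_provider` helper is illustrative, not part of SmartTools; `cat` stands in for a real AI CLI):

```python
import subprocess

def check_provider(command: str, prompt: str = "Say hello") -> bool:
    """Pipe a test prompt to a provider command and verify the contract:
    reads stdin, writes a response to stdout, exits 0."""
    result = subprocess.run(
        command, shell=True, input=prompt,
        capture_output=True, text=True, timeout=120,
    )
    return result.returncode == 0 and result.stdout.strip() != ""

# `cat` trivially satisfies the contract (it echoes stdin to stdout),
# so this prints True.
print(check_provider("cat"))
```

Swap `cat` for any command line from `providers.yaml` to vet it before use.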
### Environment Variables

Provider commands can use environment variables:

```yaml
providers:
  - name: opencode
    command: $HOME/.opencode/bin/opencode run
```

`$HOME` and `~` are expanded automatically.
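That expansion behaves like Python's standard `os.path.expanduser` plus `os.path.expandvars` (a sketch of the equivalent logic, not SmartTools' actual code):

```python
import os

def expand_command(command: str) -> str:
    """Expand a leading ~ and any $VARS in a provider command string."""
    return os.path.expandvars(os.path.expanduser(command))

print(expand_command("$HOME/.opencode/bin/opencode run"))
# Prints the command with $HOME replaced by your home directory.
```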
## Using Providers in Tools

### In Tool Config

```yaml
steps:
  - type: prompt
    prompt: "Summarize: {input}"
    provider: opencode-pickle  # Use this provider
    output_var: response
```
### Override at Runtime

```bash
# Use a different provider for this run
cat file.txt | summarize --provider claude-opus
```
## Provider Selection Strategy

- **Tool default** - set in the tool's `config.yaml`
- **Runtime override** - the `--provider` flag
- **Cost optimization** - use cheap providers for simple tasks
- **Quality needs** - use opus/sonnet for important work
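The first two points form a simple precedence: the runtime flag wins over the tool default, with a global fallback. A sketch of that logic (hypothetical function; SmartTools' internals may differ):

```python
DEFAULT_PROVIDER = "opencode-pickle"

def resolve_provider(cli_flag, tool_config):
    """--provider flag > tool config.yaml > global default."""
    return cli_flag or tool_config.get("provider") or DEFAULT_PROVIDER

print(resolve_provider(None, {"provider": "claude-haiku"}))          # claude-haiku
print(resolve_provider("claude-opus", {"provider": "claude-haiku"})) # claude-opus
print(resolve_provider(None, {}))                                    # opencode-pickle
```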
## Troubleshooting

### "Provider 'X' not found"

- Check it's in your providers list: `smarttools providers`
- Verify the command works: `echo "test" | <command>`
- Add it: `smarttools providers add`
### "Command 'X' not found"

The AI CLI tool isn't installed or isn't in PATH:

```bash
which claude    # Should show a path
which opencode  # Might need the full path
```

For OpenCode, use the full path in the provider command:

```yaml
command: $HOME/.opencode/bin/opencode run
```
### Slow Provider

- Use `gemini-flash` instead of `gemini`
- Use `claude-haiku` instead of `claude-opus`
- Use `opencode-deepseek` for the best speed/quality ratio
### Provider Errors

Check that the provider works directly:

```bash
echo "Say hello" | claude -p
echo "Say hello" | ~/.opencode/bin/opencode run
```

If it works directly but not in SmartTools, check:

- The provider command in `~/.smarttools/providers.yaml`
- That environment variables are expanded correctly
## Cost Optimization

### Free Providers

- `opencode-pickle` - Big Pickle model (FREE, accurate)
- `opencode-grok` - Grok Code (FREE, fast but less reliable)

### Cheap Providers

- `opencode-deepseek` - ~$0.28/M tokens
- `opencode-reasoner` - ~$0.28/M tokens
- `claude-haiku` - ~$0.25/M tokens
### Tips

- Use `opencode-pickle` for simple tasks (free + accurate)
- Use `claude-haiku` when you need reliability
- Reserve `claude-opus` for important work
- Use `gemini` only for large document analysis
## Adding New Providers

Any CLI tool can be a provider if it:

- Reads from stdin
- Writes to stdout
- Exits 0 on success

Examples:
**Local LLM (Ollama):**

```yaml
- name: ollama-llama
  command: ollama run llama3
  description: Local Llama 3
```
**Custom API wrapper:**

```yaml
- name: my-api
  command: curl -s -X POST https://my-api.com/chat -d @-
  description: My custom API
```
**Python script:**

```yaml
- name: my-python
  command: python3 ~/scripts/my_ai.py
  description: Custom Python AI
```