Claude Code can shell out to other AI CLIs. Use cheaper or specialized models as minions for specific tasks while Claude orchestrates.
## Gemini CLI as a Research Minion
Install the Gemini CLI, then tell Claude when to use it by adding this to your CLAUDE.md:
```
When you need to research a topic, search the web, or get a second
opinion, run `gemini` via bash. Use it for broad research tasks.
Handle code generation and reasoning yourself.
```
Then in Claude Code:
```
Research the best pagination strategies for GraphQL APIs.
Use gemini to search the web for current best practices,
then synthesize the findings and implement the best approach.
```
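Under the hood, a delegated query is just a one-shot bash call. A minimal sketch, assuming the Gemini CLI's `-p`/`--prompt` headless flag and an installed, authenticated CLI (the query text and output file name are illustrative):

```shell
# One-shot, non-interactive research query. Skipped gracefully if
# gemini is not on PATH; the output file becomes context Claude reads.
QUERY="Current best practices for GraphQL pagination: cursor vs offset"
if command -v gemini >/dev/null 2>&1; then
  gemini -p "$QUERY" > research.md
fi
```

Writing results to a file keeps the minion's output inspectable and lets Claude fold it into its own context on the next turn.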
## Local Models for Bulk Work
Use Ollama or LM Studio for cheap, fast subtasks:
```
# In CLAUDE.md
For generating boilerplate, test data, or simple translations,
use `ollama run llama3` via bash to save tokens.
```
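Bulk test-data generation, for instance, might look like this; a hypothetical sketch assuming `llama3` has already been pulled with `ollama pull llama3` (the prompt and output file name are illustrative):

```shell
# Cheap local generation instead of spending Claude tokens.
# Skipped gracefully if ollama is not installed.
PROMPT="Generate 20 realistic fake user records as a JSON array, no commentary"
if command -v ollama >/dev/null 2>&1; then
  ollama run llama3 "$PROMPT" > fixtures.json
fi
```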
## The Orchestrator Pattern

Keep Claude in charge of judgment and code; minions only fetch information. A prompt can make that division explicit:
```
You're the orchestrator. For this task:
1. Use gemini to research current React form libraries
2. Use your own judgment to pick the best one
3. Implement it yourself
Don't delegate code writing to other models.
```
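In bash terms, step 1 is the only part that leaves Claude: the minion's research lands in a file, and steps 2 and 3 stay with Claude itself. A sketch, with the query and file name illustrative and `-p` assumed to be the Gemini CLI's non-interactive flag:

```shell
# Step 1: delegate research; the result is an ordinary file that
# Claude reads before making its own choice and writing the code.
RESEARCH_FILE="form-library-research.md"
if command -v gemini >/dev/null 2>&1; then
  gemini -p "Compare current React form libraries: API, bundle size, maintenance" \
    > "$RESEARCH_FILE"
fi
```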
## When This Makes Sense
- Web research: Gemini has built-in search
- Second opinions: get a different model’s take on architecture
- Bulk generation: simple repetitive content at lower cost
- Translation/i18n: send strings to a cheaper model
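The translation case, for example, reduces to a loop over a strings file; a sketch assuming a local `llama3` via Ollama (the file name, strings, and target language are illustrative):

```shell
# Send each source string to a cheap local model. Asking for "only
# the translation" keeps the output machine-usable. Skipped without ollama.
printf '%s\n' "Save changes" "Discard draft" > strings.txt
if command -v ollama >/dev/null 2>&1; then
  while IFS= read -r s; do
    ollama run llama3 "Translate to Spanish. Output only the translation: $s"
  done < strings.txt
fi
```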
## When It Doesn’t
Don’t delegate complex reasoning or multi-file code changes. Claude’s strength is deep code understanding, and splitting that across models creates coordination overhead that isn’t worth it.