Description
Problem
When using Chief with Claude Code configured for local models (e.g., via LM Studio), Chief fails because Claude Code's -p (non-interactive/print) mode ignores the model configured in ~/.claude/settings.json.
Error Message
```
There's an issue with the selected model (claude-sonnet-4-5-20250929). It may not exist or you may not have access to it.
Run --model to pick a different model.
```
Root Cause
Chief spawns Claude Code with:
```go
exec.CommandContext(ctx, "claude",
    "--dangerously-skip-permissions",
    "-p", prompt,
    "--output-format", "stream-json",
    "--verbose",
)
```
In `-p` (non-interactive) mode, Claude Code does not respect the model setting from `~/.claude/settings.json`; the `--model` flag must be passed explicitly.
Environment Setup
For context, this is how local models are configured:
```sh
# LM Studio setup
export ANTHROPIC_BASE_URL=http://localhost:1234/v1
export ANTHROPIC_API_KEY=lm-studio

# Claude Code works fine interactively
claude --model qwen/qwen3-coder-next
```
But Chief's non-interactive calls fail because --model isn't passed.
Requested Solution
Add a way to configure the model that Chief passes to Claude Code. Options:
Option 1: Environment Variable (Preferred)
```sh
export CHIEF_CLAUDE_MODEL=qwen/qwen3-coder-next
chief
```
Option 2: CLI Flag
```sh
chief --claude-model qwen/qwen3-coder-next
```
Option 3: Config File
```yaml
# .chief/config.yaml
claude:
  model: qwen/qwen3-coder-next
```
Affected Code Locations
The --model flag needs to be added in:
- `internal/loop/loop.go:263-268` (main agent loop)
- `internal/prd/generator.go:144-148` (PRD conversion)
- `internal/tui/first_time_setup.go:511` (first-time setup detection)
Workarounds Attempted
- Setting the model in `~/.claude/settings.json`: ❌ Ignored in `-p` mode
- Exporting env vars before running Chief: ❌ Not passed through properly
- Wrapper scripts: ❌ Hacky and fragile
Use Case
Users with powerful local machines (e.g., Mac Studio) want to use quantized models via LM Studio/MLX instead of paying for API calls. This would enable Chief to work with:
- Local models via LM Studio
- Custom endpoints
- Alternative model providers
Priority
This blocks usage of Chief for anyone not using Anthropic's official API/models. Thanks.