Installation¶
Get Attune AI installed and configured in about 2 minutes.
Step 1: Install the Package¶
Choose your installation option:
Includes: CLI tools, VS Code extension support, all workflows, and local telemetry.
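A typical install from PyPI might look like the following. The distribution name `attune-ai` is an assumption inferred from the `attune` module referenced in Troubleshooting below, so check your package index for the exact name:

```shell
# Install into the active environment
# (distribution name assumed — verify on PyPI)
pip install attune-ai
```

Installing inside a virtual environment keeps the CLI tools isolated from your system Python.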
Verify Installation¶
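A quick sanity check is importing the package. Note that the `__version__` attribute is an assumption and may not exist in every release, so the sketch below falls back gracefully:

```shell
# Confirm the module imports; __version__ is assumed, hence the fallback
python -c "import attune; print(getattr(attune, '__version__', 'installed'))"
```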
Step 2: Configure an LLM Provider¶
Attune AI uses Anthropic Claude as its LLM provider.
Option A: Environment Variable (Quick)¶
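For a single shell session, export the key directly (replace the placeholder with your own key):

```shell
# Applies to the current shell session only
export ANTHROPIC_API_KEY="sk-ant-..."
```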
Option B: .env File (Persistent)¶
Create a .env file in your project root:
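A minimal .env needs only the key (placeholder shown):

```shell
# .env — keep this file out of version control
ANTHROPIC_API_KEY=sk-ant-...
```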
The framework auto-detects .env files.
Option C: Interactive Setup¶
Verify Provider¶
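Verification uses the provider CLI (the same command shown in Troubleshooting below):

```shell
python -m attune.models.cli provider --check
```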
Expected output:
```
Current provider: anthropic
Available models: claude-opus-4-6, claude-sonnet-4-6, claude-haiku-4-5
API key configured
```
Step 3: Optional - Redis for Memory¶
Redis enables multi-agent coordination and session persistence. Skip this for now if you just want to try the framework.
The framework works without Redis (falls back to in-memory storage).
You can add Redis later when you need:
- Multi-agent coordination
- Session persistence
- Pattern staging
See Redis Setup for production configuration.
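If you do want Redis locally, one quick option is Docker (an assumption here, not a requirement of the framework); the default port 6379 is what most clients expect:

```shell
# Start a local Redis on the default port 6379
docker run -d --name attune-redis -p 6379:6379 redis:7
```

Stop and remove it with `docker rm -f attune-redis` when you no longer need it.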
Troubleshooting¶
"No API key configured"¶
```shell
# Check your environment
env | grep API_KEY

# Set it
export ANTHROPIC_API_KEY="sk-ant-..."

# Verify
python -m attune.models.cli provider --check
```
"ModuleNotFoundError: attune"¶
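This usually means the package was installed into a different interpreter than the one you are running. A quick diagnosis (the `attune-ai` distribution name is an assumption — substitute the name you installed):

```shell
# Confirm which interpreter is active and whether the package is visible to it
which python
pip show attune-ai || pip install attune-ai
```

If `which python` points outside your virtual environment, activate the environment and reinstall.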
Python version too old¶
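Check which interpreter version is active; the framework's minimum supported version is not stated here, so consult the project's requirements:

```shell
# Print the active Python version; upgrade if it is below the minimum
python3 --version
```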
See Also¶
- Redis Setup - Configure Redis for production use
- MCP Integration - Connect to Claude Code
- Configuration Reference - Customize settings
- First Steps - Run your first workflow
Next Step¶
You're installed! Continue to First Steps to run your first workflow.