API Keys

Fliiq needs at least one LLM API key. Add your key (or keys) to ~/.fliiq/.env:
.env
# LLM providers (at least one required)
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=...
Priority order: Anthropic > OpenAI > Gemini. If your primary provider fails, Fliiq automatically falls back to the next available one. To change the default, see Provider Switching below.

For best results with Anthropic (the recommended provider):

Model              Best for           Notes
Claude Sonnet 4.5  Most tasks         Best balance of speed and capability. Recommended default.
Claude Opus 4.6    Heavy computation  Complex refactors, deep research, multi-step reasoning. Overkill for most use cases; Sonnet handles the vast majority of tasks well.
Fliiq works with any model your provider supports. Sonnet 4.5 is the sweet spot for daily use.

Provider Switching

When multiple API keys are set, Fliiq uses the default priority order (Anthropic > OpenAI > Gemini). Override this per-command with the --provider flag:
fliiq --provider openai               # REPL with OpenAI/GPT-4o
fliiq run "task" --provider gemini     # Single-shot with Gemini
fliiq tui --provider anthropic         # TUI with Claude
For a persistent default, set FLIIQ_PROVIDER in your .env:
.env
FLIIQ_PROVIDER=openai
Resolution order:
  1. --provider flag (highest priority)
  2. FLIIQ_PROVIDER environment variable
  3. First API key found (Anthropic > OpenAI > Gemini)
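The resolution order above can be sketched as a short function. This is an illustrative sketch of the documented behavior, not Fliiq's actual source; the function and variable names are hypothetical.

```python
import os

# Default priority when no explicit choice is made
PRIORITY = ["anthropic", "openai", "gemini"]
KEY_VARS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "gemini": "GEMINI_API_KEY",
}

def resolve_provider(flag=None, env=None):
    env = os.environ if env is None else env
    # 1. --provider flag wins outright
    if flag:
        return flag
    # 2. FLIIQ_PROVIDER environment variable
    if env.get("FLIIQ_PROVIDER"):
        return env["FLIIQ_PROVIDER"]
    # 3. First provider with an API key set, in priority order
    for provider in PRIORITY:
        if env.get(KEY_VARS[provider]):
            return provider
    raise RuntimeError("No LLM API key configured")
```

Note that step 3 means setting a second provider's key never changes behavior on its own; you opt in with the flag or the env var.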
See CLI Reference for the full list of commands that accept --provider.

Model Selection

Switch models per-command with --model (or -m):
fliiq run "build a Jira skill" --model opus-4.6    # Use Opus for complex tasks
fliiq run "check my inbox" --model sonnet-4.5       # Use Sonnet for daily tasks
fliiq run "task" --model gpt-4.1                    # Use OpenAI GPT-4.1
View all available aliases with:
fliiq models

Built-in Aliases

Default aliases shipped with fliiq init:
Anthropic    OpenAI        Gemini
opus-4.6     gpt-4.1       gemini-2.5-pro
sonnet-4.5   gpt-4.1-mini  gemini-2.5-flash
sonnet-4.0   gpt-4o
haiku-4.5    gpt-4o-mini
             o3
             o3-mini
             o4-mini

Custom Aliases

Aliases are defined in ~/.fliiq/models.yaml (created by fliiq init). Add your own for Ollama, Mistral, DeepSeek, or any other provider:
# ~/.fliiq/models.yaml
aliases:
  # ... default aliases above ...

  # Custom aliases — add yours here
  mistral-large: mistral-large-latest
  mistral-small: mistral-small-latest
  mistral-codestral: codestral-latest
  deepseek-v3: deepseek-chat
  deepseek-coder: deepseek-coder
  llama-3.2: llama3.2:latest
  qwen-2.5: qwen2.5:latest
Then use them like any built-in alias:
fliiq run "task" --model mistral-large --provider openai
Any string not found in aliases is passed through as-is to the provider API.
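The lookup described above amounts to a dictionary lookup with pass-through for unknown names. A minimal sketch, with an illustrative (not actual) alias table:

```python
# Hypothetical alias table; the real one lives in ~/.fliiq/models.yaml
ALIASES = {
    "opus-4.6": "claude-opus-4-6",
    "sonnet-4.5": "claude-sonnet-4-5",
    "mistral-large": "mistral-large-latest",
}

def expand_alias(name: str) -> str:
    # Unknown strings go to the provider API exactly as given
    return ALIASES.get(name, name)
```

This is why a typo in --model surfaces as a provider-side "unknown model" error rather than a Fliiq error: the unrecognized string is forwarded verbatim.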

Persistent Model Override

Override the default model for a provider via environment variable in your .env:
.env
ANTHROPIC_MODEL=claude-opus-4-6
OPENAI_MODEL=gpt-4.1
GEMINI_MODEL=gemini-2.5-flash
Resolution order: --model flag (highest) > env var (ANTHROPIC_MODEL, etc.) > hardcoded default.
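The three-step chain can be sketched as follows; the function name and fallback value are illustrative, not Fliiq's actual implementation.

```python
import os

# Hypothetical sketch of the model override chain: the --model flag
# beats the per-provider env var, which beats the hardcoded default.
def resolve_default_model(provider, model_flag=None, env=None,
                          fallback="claude-sonnet-4-5"):
    env = os.environ if env is None else env
    if model_flag:
        return model_flag                       # 1. --model flag
    override = env.get(f"{provider.upper()}_MODEL")
    if override:
        return override                         # 2. ANTHROPIC_MODEL etc.
    return fallback                             # 3. hardcoded default
```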

Self-Hosted LLMs

Fliiq can connect to any OpenAI-compatible server — Ollama, vLLM, llama.cpp, LM Studio, LocalAI, or any other server that implements the OpenAI API. Set OPENAI_BASE_URL to point Fliiq at your local server:
.env
OPENAI_API_KEY=not-needed
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_MODEL=llama3.2
OPENAI_API_KEY must still be set (most local servers ignore it — any non-empty value works).
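Concretely, "OpenAI-compatible" means the server answers chat requests at the standard path under your base URL. A minimal sketch of how the endpoint is derived (Ollama's default port is assumed in the example value):

```python
# Join the configured base URL with the standard OpenAI chat path.
def chat_endpoint(base_url: str) -> str:
    return base_url.rstrip("/") + "/chat/completions"
```

So with OPENAI_BASE_URL=http://localhost:11434/v1, requests go to http://localhost:11434/v1/chat/completions, which Ollama, vLLM, llama.cpp, and the others listed all serve.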

Quick Start with Ollama

# 1. Install Ollama (https://ollama.com) and start it
ollama serve

# 2. Pull a model
ollama pull llama3.2
3. Add to ~/.fliiq/.env:
.env
OPENAI_API_KEY=not-needed
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_MODEL=llama3.2
# 4. Run Fliiq with your local model
fliiq --provider openai
If you also have a cloud API key set, switch between local and cloud with --provider:
fliiq --provider openai      # Local Ollama
fliiq --provider anthropic   # Cloud Claude

Anthropic-Compatible Proxies

If you use a proxy that implements the Anthropic API, set ANTHROPIC_BASE_URL:
.env
ANTHROPIC_BASE_URL=https://your-proxy.example.com
Self-hosted models vary in capability. Fliiq’s agent loop works best with models that support tool/function calling. Most large models (Llama 3.2+, Mistral, Qwen) support it via the OpenAI-compatible API.
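For reference, a tool definition in the OpenAI-compatible request format looks roughly like this; the web_search example and its parameters are illustrative, not one of Fliiq's actual tool schemas:

```json
{
  "type": "function",
  "function": {
    "name": "web_search",
    "description": "Search the web for a query",
    "parameters": {
      "type": "object",
      "properties": {
        "query": { "type": "string" }
      },
      "required": ["query"]
    }
  }
}
```

A model that supports tool calling returns structured calls against schemas like this instead of plain text; models without this support will still chat but cannot drive Fliiq's skills reliably.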

Integration Credentials

Optional — add these when you need the corresponding skills:

Google (OAuth)

.env
GOOGLE_CLIENT_ID=your-client-id.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=GOCSPX-your-client-secret
Required for fliiq google auth — the OAuth flow that authorizes Fliiq to access Gmail, Google Calendar, Drive, Sheets, Docs, and Slides. See Google Integration for full setup. Used by: send_email, receive_emails, mark_email_read, archive_email, delete_email, google_calendar, google_drive, google_sheets, google_docs, google_slides

Gmail — Fliiq’s Bot Email

.env
FLIIQ_GMAIL_ADDRESS=my-fliiq@gmail.com
FLIIQ_GMAIL_APP_PASSWORD=xxxx-xxxx-xxxx-xxxx
Gives Fliiq its own email address as a channel. The app password is optional if you’ve authorized this account via OAuth instead.

Gmail — App Password (legacy)

.env
GMAIL_ADDRESS=you@gmail.com
GMAIL_APP_PASSWORD=xxxx-xxxx-xxxx-xxxx
Fallback for email skills when OAuth is not configured. Requires a Google App Password. We recommend OAuth instead — it’s more secure and unlocks Calendar access.

Twilio (SMS)

.env
TWILIO_ACCOUNT_SID=AC...
TWILIO_AUTH_TOKEN=...
TWILIO_PHONE_NUMBER=+1234567890
Get these from the Twilio Console. Used by: send_sms, receive_sms

Telegram

.env
TELEGRAM_BOT_TOKEN=123456:ABC-DEF...
TELEGRAM_ALLOWED_CHAT_IDS=12345678,87654321
Create a bot via @BotFather on Telegram. TELEGRAM_ALLOWED_CHAT_IDS restricts which chats the bot responds to. Used by: send_telegram, send_telegram_audio, Telegram real-time listener (daemon)
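The allowlist behavior can be sketched as parsing the comma-separated value and checking each incoming chat against it. This is a hypothetical sketch of the documented behavior; the function names are illustrative.

```python
# Parse TELEGRAM_ALLOWED_CHAT_IDS ("12345678,87654321") into a set.
def parse_allowed_chats(value: str) -> set[int]:
    return {int(part) for part in value.split(",") if part.strip()}

def is_allowed(chat_id: int, allowed: set[int]) -> bool:
    # An empty allowlist means "allow nothing": the safe default
    return chat_id in allowed
```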

MiniMax (Text-to-Speech)

.env
MINIMAX_API_KEY=...
MINIMAX_GROUP_ID=...
Get these from the MiniMax Platform. The Group ID is found in your account settings. Used by: text_to_speech

Brave Search

.env
BRAVE_API_KEY=BSA...
Get a free key from Brave Search API. Used by: web_search

Spotify

.env
SPOTIFY_CLIENT_ID=...
SPOTIFY_CLIENT_SECRET=...
SPOTIFY_ACCESS_TOKEN=...
Create an app at the Spotify Developer Dashboard. Used by: spotify

Directory Structure

Global (~/.fliiq/)

Created by fliiq init. Used from any terminal.
Path                Purpose
.env                API keys and credentials
user.yaml           User profile: name, labeled email accounts
models.yaml         Model aliases for the --model flag
google_tokens.json  OAuth tokens for authorized Google accounts (auto-managed)
memory/             Persistent memory files
memory/MEMORY.md    Curated memory (always loaded into the prompt)
audit/              Audit trails from every agent run
skills/             User-generated skills (available globally)

Project (.fliiq/) — optional

Created by fliiq init --project. Overrides global for this project.
Path        Purpose
SOUL.md     Agent personality overrides
playbooks/  Custom domain playbooks
mcp.json    MCP server connections
memory/     Project-specific memory
audit/      Project-specific audit trails
jobs/       Scheduled job definitions (YAML)
skills/     Project-specific skills

Bundled (inside the package)

Ships with Fliiq. Cannot be modified directly.
Path                 Purpose
data/skills/core/    33 bundled skills
data/soul/SOUL.md    Default agent personality
playbooks/coding.md  Default coding playbook

Config Resolution

When Fliiq looks for a resource (skills, SOUL.md, playbooks, memory):
  1. Project .fliiq/ — checked first
  2. Global ~/.fliiq/ — fallback
  3. Bundled defaults — final fallback
This lets you customize per-project without affecting your global setup, and ensures sensible defaults when nothing is configured.