Providers
Quark is provider-agnostic via litellm, which exposes 2,600+ models across 140+ providers. Pass any litellm model string to Agent(model=...) and set the corresponding provider API key as an environment variable.
OpenAI
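A minimal sketch in the style of the other sections; the model names below are standard OpenAI model strings, but substitute any model litellm supports:

```python
# Requires OPENAI_API_KEY in the environment, e.g.:
#   export OPENAI_API_KEY=sk-...
agent = Agent(model="gpt-4o")
agent = Agent(model="gpt-4o-mini")
```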
Anthropic
agent = Agent(model="claude-opus-4-6")
agent = Agent(model="claude-sonnet-4-6")
agent = Agent(model="claude-haiku-4-5")
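The examples above assume the Anthropic key is exported (this is the env var litellm reads):

```shell
export ANTHROPIC_API_KEY=sk-ant-...
```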
AWS Bedrock
Option 1 — Bearer token (preferred).
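A sketch of the bearer-token setup, assuming litellm's support for Bedrock API keys via AWS_BEARER_TOKEN_BEDROCK (check your litellm version; the region variable name is litellm's):

```shell
export AWS_BEARER_TOKEN_BEDROCK=...   # Bedrock API key used as a bearer token
export AWS_REGION_NAME=us-east-1      # region where the model is hosted
```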
Option 2 — Boto3. Uses boto3's default credential chain: environment variables, ~/.aws/credentials, IAM roles, or SSO.
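For the environment-variable leg of that chain, the standard AWS variables apply:

```shell
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
export AWS_REGION_NAME=us-east-1
```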
agent = Agent(model="bedrock/anthropic.claude-3-5-haiku-20241022-v1:0")
agent = Agent(model="bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0")
agent = Agent(model="bedrock/amazon.nova-pro-v1:0")
Google Gemini
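A sketch using litellm's gemini/ prefix (Google AI Studio); the model names below are current Gemini model strings:

```python
# Requires GEMINI_API_KEY (a Google AI Studio key) in the environment, e.g.:
#   export GEMINI_API_KEY=...
agent = Agent(model="gemini/gemini-1.5-pro")
agent = Agent(model="gemini/gemini-1.5-flash")
```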
Ollama (local)
No API key needed. Run Ollama locally first: ollama serve
agent = Agent(model="ollama/llama3")
agent = Agent(model="ollama/mistral")
agent = Agent(model="ollama/deepseek-r1")
Azure OpenAI
export AZURE_API_KEY=...
export AZURE_API_BASE=https://your-resource.openai.azure.com
export AZURE_API_VERSION=2024-02-01
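With those variables set, the azure/ model string references your deployment name rather than the underlying model; the deployment name below is a placeholder:

```python
agent = Agent(model="azure/my-gpt-4o-deployment")
```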
OpenRouter
Access hundreds of models through a single API key.
Try for free: OpenRouter has a free auto-router that picks from available free models — no cost, great for testing. Free models may be rate-limited; if you hit a 429, try again shortly or switch to a specific free model.
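A sketch using litellm's openrouter/ prefix; the model paths below are illustrative, and the auto-router id should be checked against OpenRouter's current model list:

```python
# Requires OPENROUTER_API_KEY in the environment, e.g.:
#   export OPENROUTER_API_KEY=sk-or-...
agent = Agent(model="openrouter/anthropic/claude-3.5-sonnet")
agent = Agent(model="openrouter/openrouter/auto")  # free auto-router
```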
Mixing providers in a pipeline
Each agent in a pipeline can use a different provider:
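For instance, a hypothetical three-stage pipeline with one provider per stage; how the agents are composed depends on Quark's pipeline API, so only the per-agent model strings below are definitive:

```python
# Each stage picks the provider best suited to its job.
researcher = Agent(model="gpt-4o")             # OpenAI
writer = Agent(model="claude-sonnet-4-6")      # Anthropic
reviewer = Agent(model="ollama/llama3")        # local Ollama, no API key
```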