Overview
The AnthropicAgent is a powerful and flexible agent class in the Multi-Agent Orchestrator System. It leverages the Anthropic API to interact with various Large Language Models (LLMs) provided by Anthropic, such as Claude.
This agent can handle a wide range of processing tasks, making it suitable for diverse applications such as conversational AI, question-answering systems, and more.
Key Features
Integration with Anthropic’s API
Support for multiple LLM models available on Anthropic’s platform
Streaming and non-streaming response options
Customizable inference configuration
Ability to set and update custom system prompts
Optional integration with retrieval systems for enhanced context
Support for tool use within the conversation flow
Creating an AnthropicAgent
Here are various examples showing different ways to create and configure an AnthropicAgent:
Python Package
If you haven’t already installed the Anthropic-related dependencies, make sure to install them:
pip install " multi-agent-orchestrator[anthropic] "
Basic Examples
1. Minimal Configuration
TypeScript:

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'A versatile AI assistant',
  apiKey: 'your-anthropic-api-key'
});

Python:

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='A versatile AI assistant',
    api_key='your-anthropic-api-key'
))
2. Using Custom Client
TypeScript:

import { Anthropic } from '@anthropic-ai/sdk';

const customClient = new Anthropic({ apiKey: 'your-anthropic-api-key' });

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'A versatile AI assistant',
  client: customClient // pass the pre-configured Anthropic client
});

Python:

from anthropic import Anthropic

custom_client = Anthropic(api_key='your-anthropic-api-key')

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='A versatile AI assistant',
    client=custom_client  # pass the pre-configured Anthropic client
))
3. Custom Model and Streaming
TypeScript:

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'A streaming-enabled assistant',
  apiKey: 'your-anthropic-api-key',
  modelId: 'claude-3-opus-20240229',
  streaming: true
});

Python:

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='A streaming-enabled assistant',
    api_key='your-anthropic-api-key',
    model_id='claude-3-opus-20240229',
    streaming=True
))
4. With Inference Configuration
TypeScript:

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with custom inference settings',
  apiKey: 'your-anthropic-api-key',
  inferenceConfig: {
    maxTokens: 500,
    temperature: 0.7,
    topP: 0.9,
    stopSequences: ['Human:', 'AI:']
  }
});

Python:

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with custom inference settings',
    api_key='your-anthropic-api-key',
    inference_config={
        'maxTokens': 500,
        'temperature': 0.7,
        'topP': 0.9,
        'stopSequences': ['Human:', 'AI:']
    }
))
5. With Simple System Prompt
TypeScript:

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with custom prompt',
  apiKey: 'your-anthropic-api-key',
  customSystemPrompt: {
    template: 'You are a helpful AI assistant focused on technical support.'
  }
});

Python:

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with custom prompt',
    api_key='your-anthropic-api-key',
    custom_system_prompt={
        'template': 'You are a helpful AI assistant focused on technical support.'
    }
))
6. With System Prompt Variables
TypeScript:

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with variable prompt',
  apiKey: 'your-anthropic-api-key',
  customSystemPrompt: {
    template: 'You are an AI assistant specialized in {{DOMAIN}}. Always use a {{TONE}} tone.',
    variables: {
      DOMAIN: 'customer support',
      TONE: 'friendly and helpful'
    }
  }
});

Python:

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with variable prompt',
    api_key='your-anthropic-api-key',
    custom_system_prompt={
        'template': 'You are an AI assistant specialized in {{DOMAIN}}. Always use a {{TONE}} tone.',
        'variables': {
            'DOMAIN': 'customer support',
            'TONE': 'friendly and helpful'
        }
    }
))
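The {{DOMAIN}}-style placeholders are plain string substitutions applied to the template before it is used as the system prompt. The standalone sketch below is not part of the library; render_prompt is a hypothetical helper, and joining list-valued variables line by line is an assumption that mirrors the list-style variables in the complete example further down:

# Hypothetical helper (not a library API): illustrates the intended effect of
# template variables. Assumption: list values are joined line by line.
def render_prompt(template: str, variables: dict) -> str:
    rendered = template
    for key, value in variables.items():
        if isinstance(value, list):
            value = "\n".join(value)
        rendered = rendered.replace("{{" + key + "}}", str(value))
    return rendered

print(render_prompt(
    "You are an AI assistant specialized in {{DOMAIN}}. Always use a {{TONE}} tone.",
    {"DOMAIN": "customer support", "TONE": "friendly and helpful"},
))
# -> "You are an AI assistant specialized in customer support. Always use a friendly and helpful tone."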
7. With Custom Retriever
TypeScript:

const retriever = new CustomRetriever({
  // Retriever configuration
});

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with retriever',
  apiKey: 'your-anthropic-api-key',
  retriever: retriever
});

Python:

retriever = CustomRetriever(
    # Retriever configuration
)

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with retriever',
    api_key='your-anthropic-api-key',
    retriever=retriever
))
8. With Tool Configuration
TypeScript:

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with tool support',
  apiKey: 'your-anthropic-api-key',
  toolConfig: {
    tool: [{
      name: 'weather_tool',
      description: 'Get current weather data',
      input_schema: {
        type: 'object',
        properties: { location: { type: 'string', description: 'City name' } },
        required: ['location']
      }
    }],
    useToolHandler: (response, conversation) => ({
      role: ParticipantRole.USER,
      content: [{ type: 'tool_result', tool_use_id: 'weather_tool', content: 'Current weather data for the location' }]
    })
  }
});
Python:

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with tool support',
    api_key='your-anthropic-api-key',
    tool_config={
        'tool': [{
            'name': 'weather_tool',
            'description': 'Get current weather data',
            'input_schema': {
                'type': 'object',
                'properties': {'location': {'type': 'string', 'description': 'City name'}},
                'required': ['location']
            }
        }],
        'useToolHandler': lambda response, conversation: {
            'role': ParticipantRole.USER.value,
            'content': [{'type': 'tool_result', 'tool_use_id': 'weather_tool', 'content': 'Current weather data for the location'}]
        }
    }
))
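The handler above returns canned text to keep the example short. In practice, the handler usually inspects the tool_use blocks in the model's response, runs the matching local function, and returns the results as tool_result content. The sketch below shows one possible shape, under the assumption that the handler receives the raw Anthropic API message; get_current_weather is a hypothetical local function:

from multi_agent_orchestrator.types import ParticipantRole

def get_current_weather(location: str) -> str:
    # Hypothetical local implementation; replace with a real weather lookup.
    return f"Sunny, 22°C in {location}"

def weather_tool_handler(response, conversation):
    # Assumption: `response` carries Anthropic-style content blocks, where a
    # tool request has type 'tool_use' plus an id, a name, and an input dict.
    tool_results = []
    for block in response.content:
        if getattr(block, 'type', None) == 'tool_use' and block.name == 'weather_tool':
            tool_results.append({
                'type': 'tool_result',
                'tool_use_id': block.id,
                'content': get_current_weather(block.input['location']),
            })
    return {
        'role': ParticipantRole.USER.value,
        'content': tool_results,
    }

If you want the tool to actually run, pass a function like weather_tool_handler as the 'useToolHandler' value instead of the lambda shown in the example.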
9. Complete Example with All Options
TypeScript:

import { AnthropicAgent, ParticipantRole } from 'multi-agent-orchestrator';

const agent = new AnthropicAgent({
  name: 'Advanced Anthropic Assistant',
  description: 'A fully configured AI assistant powered by Anthropic models',
  apiKey: 'your-anthropic-api-key',
  modelId: 'claude-3-opus-20240229', // Choose Anthropic model
  streaming: true, // Enable streaming responses
  retriever: customRetriever, // Custom retriever for additional context

  // Inference configuration
  inferenceConfig: {
    maxTokens: 500, // Maximum tokens to generate
    temperature: 0.7, // Control randomness (0-1)
    topP: 0.9, // Control diversity via nucleus sampling
    stopSequences: ['Human:', 'AI:'] // Sequences that stop generation
  },

  // Tool configuration
  toolConfig: {
    tool: [{
      name: 'get_weather', // illustrative tool name
      description: 'Get the current weather for a given location',
      input_schema: {
        type: 'object',
        properties: {
          latitude: { type: 'number', description: 'Geographical WGS84 latitude' },
          longitude: { type: 'number', description: 'Geographical WGS84 longitude' }
        },
        required: ['latitude', 'longitude']
      }
    }],
    useToolHandler: (response, conversation) => ({
      role: ParticipantRole.USER,
      content: [{ type: 'tool_result', tool_use_id: 'tool_user_id', content: 'Response from the tool' }]
    })
  },

  // Custom system prompt with variables
  customSystemPrompt: {
    template: `You are an AI assistant specialized in {{DOMAIN}}.
Your key strengths include:
{{SKILLS}}
Always:
- Maintain a {{TONE}} tone
- Focus on {{FOCUS}}
- Prioritize {{PRIORITY}}`,
    variables: {
      DOMAIN: 'scientific research',
      SKILLS: ['- Advanced data analysis', '- Statistical methodology'],
      TONE: 'professional and academic',
      FOCUS: 'accuracy and clarity',
      PRIORITY: 'evidence-based insights'
    }
  }
});
Python:

from multi_agent_orchestrator import AnthropicAgent, AnthropicAgentOptions
from multi_agent_orchestrator.types import ParticipantRole

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Advanced Anthropic Assistant',
    description='A fully configured AI assistant powered by Anthropic models',
    api_key='your-anthropic-api-key',
    model_id='claude-3-opus-20240229',  # Choose Anthropic model
    streaming=True,  # Enable streaming responses
    retriever=custom_retriever,  # Custom retriever for additional context

    # Inference configuration
    inference_config={
        'maxTokens': 500,  # Maximum tokens to generate
        'temperature': 0.7,  # Control randomness (0-1)
        'topP': 0.9,  # Control diversity via nucleus sampling
        'stopSequences': ['Human:', 'AI:']  # Sequences that stop generation
    },

    # Tool configuration
    tool_config={
        'tool': [{
            'name': 'get_weather',  # illustrative tool name
            'description': 'Get the current weather for a given location',
            'input_schema': {
                'type': 'object',
                'properties': {
                    'latitude': {'type': 'number', 'description': 'Geographical WGS84 latitude'},
                    'longitude': {'type': 'number', 'description': 'Geographical WGS84 longitude'}
                },
                'required': ['latitude', 'longitude']
            }
        }],
        'useToolHandler': lambda response, conversation: {
            'role': ParticipantRole.USER.value,
            'content': [{'type': 'tool_result', 'tool_use_id': 'tool_user_id', 'content': 'Response from the tool'}]
        }
    },

    # Custom system prompt with variables
    custom_system_prompt={
        'template': """You are an AI assistant specialized in {{DOMAIN}}.
Your key strengths include:
{{SKILLS}}
Always:
- Maintain a {{TONE}} tone
- Focus on {{FOCUS}}
- Prioritize {{PRIORITY}}""",
        'variables': {
            'DOMAIN': 'scientific research',
            'SKILLS': ['- Advanced data analysis', '- Statistical methodology'],
            'TONE': 'professional and academic',
            'FOCUS': 'accuracy and clarity',
            'PRIORITY': 'evidence-based insights'
        }
    }
))
Option Explanations
name and description: Identify and describe the agent's purpose.
apiKey: Your Anthropic API key for authentication.
modelId: Specifies the LLM model to use (e.g., Claude 3 Sonnet).
streaming: Enables streaming responses for real-time output.
inferenceConfig: Fine-tunes the model's output characteristics.
retriever: Integrates a retrieval system for enhanced context.
toolConfig: Defines tools the agent can use and how to handle their responses.
Setting a New Prompt
You can dynamically set or update the system prompt for the agent:
TypeScript:

agent.setSystemPrompt(
  `You are an AI assistant specialized in {{DOMAIN}}.
   Your main goal is to {{GOAL}}.
   Always maintain a {{TONE}} tone in your responses.`,
  {
    DOMAIN: "cybersecurity",
    GOAL: "help users understand and mitigate potential security threats",
    TONE: "professional and reassuring"
  }
);

Python:

agent.set_system_prompt(
    """You are an AI assistant specialized in {{DOMAIN}}.
    Your main goal is to {{GOAL}}.
    Always maintain a {{TONE}} tone in your responses.""",
    {
        "DOMAIN": "cybersecurity",
        "GOAL": "help users understand and mitigate potential security threats",
        "TONE": "professional and reassuring"
    }
)
This method allows you to dynamically change the agent’s behavior and focus without creating a new instance.
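For instance, a single long-lived agent could be refocused per conversation domain at runtime. This is a minimal sketch that reuses the set_system_prompt call shown above; the domains, goals, and the focus_agent helper are illustrative:

# Illustrative domains and goals; reuses the agent created earlier.
DOMAIN_GOALS = {
    "cybersecurity": "help users understand and mitigate potential security threats",
    "billing": "help users understand charges and resolve billing issues",
}

def focus_agent(agent, domain: str) -> None:
    # Re-point the same AnthropicAgent instance at a new domain.
    agent.set_system_prompt(
        """You are an AI assistant specialized in {{DOMAIN}}.
        Your main goal is to {{GOAL}}.""",
        {"DOMAIN": domain, "GOAL": DOMAIN_GOALS[domain]},
    )

focus_agent(agent, "billing")  # switch focus without creating a new instance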
Adding the Agent to the Orchestrator
To integrate the AnthropicAgent into your orchestrator, follow these steps:
First, ensure you have created an instance of the orchestrator:
TypeScript:

import { MultiAgentOrchestrator } from "multi-agent-orchestrator";

const orchestrator = new MultiAgentOrchestrator();

Python:

from multi_agent_orchestrator.orchestrator import MultiAgentOrchestrator

orchestrator = MultiAgentOrchestrator()
Then, add the agent to the orchestrator:
TypeScript:

orchestrator.addAgent(agent);

Python:

orchestrator.add_agent(agent)
Now you can use the orchestrator to route requests to the appropriate agent, including your Anthropic agent:
TypeScript:

const response = await orchestrator.routeRequest(
  "What is the base rate interest for 30 years?",
  "user123",    // userId (placeholder)
  "session456"  // sessionId (placeholder)
);

Python:

response = await orchestrator.route_request(
    "What is the base rate interest for 30 years?",
    "user123",     # user_id (placeholder)
    "session456"   # session_id (placeholder)
)
By leveraging the AnthropicAgent, you can create sophisticated, context-aware AI agents capable of handling a wide range of tasks and interactions, all powered by the latest LLM models available through Anthropic's platform.
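Putting the pieces together, a minimal end-to-end flow might look like the following sketch. The user and session identifiers are placeholders, and the exact shape of the returned response (plain output versus a streaming iterator) depends on your configuration and library version, so inspect the object before relying on specific attributes:

import asyncio

from multi_agent_orchestrator import AnthropicAgent, AnthropicAgentOptions
from multi_agent_orchestrator.orchestrator import MultiAgentOrchestrator

async def main():
    orchestrator = MultiAgentOrchestrator()
    orchestrator.add_agent(AnthropicAgent(AnthropicAgentOptions(
        name='Anthropic Assistant',
        description='A versatile AI assistant for general questions',
        api_key='your-anthropic-api-key',
    )))

    # "user123" and "session456" are placeholder identifiers.
    response = await orchestrator.route_request(
        "What is the base rate interest for 30 years?",
        "user123",
        "session456",
    )
    # Inspect the returned object for the agent that handled the request and
    # its output; attribute names may differ across library versions.
    print(response)

asyncio.run(main())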