Anthropic Agent
Overview
The AnthropicAgent is a powerful and flexible agent class in the Agent Squad System.
It leverages the Anthropic API to interact with various Large Language Models (LLMs) provided by Anthropic, such as Claude.
This agent can handle a wide range of processing tasks, making it suitable for applications such as conversational AI and question-answering systems.
Key Features
- Integration with Anthropic’s API
- Support for multiple LLMs available on Anthropic’s platform
- Streaming and non-streaming response options
- Customizable inference configuration
- Ability to set and update custom system prompts
- Optional integration with retrieval systems for enhanced context
- Support for Tool use within the conversation flow
Creating an AnthropicAgent
Here are various examples showing different ways to create and configure an AnthropicAgent:
Python Package
If you haven’t already installed the Anthropic-related dependencies, make sure to install them:
pip install "agent-squad[anthropic]"
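With the dependency installed, the Python snippets below assume imports along these lines (a sketch; the exact module paths can vary between versions, so adjust them to your installed package):

from agent_squad.agents import AnthropicAgent, AnthropicAgentOptions
from agent_squad.types import ParticipantRole  # used in the tool examples below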
Basic Examples
1. Minimal Configuration
TypeScript:
const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'A versatile AI assistant',
  apiKey: 'your-anthropic-api-key'
});
Python:
agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='A versatile AI assistant',
    api_key='your-anthropic-api-key'
))
2. Using a Custom Client
TypeScript:
import { Anthropic } from '@anthropic-ai/sdk';

const customClient = new Anthropic({ apiKey: 'your-anthropic-api-key' });

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'A versatile AI assistant',
  client: customClient
});
Python:
from anthropic import Anthropic

custom_client = Anthropic(api_key='your-anthropic-api-key')

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='A versatile AI assistant',
    client=custom_client
))
3. Custom Model and Streaming
TypeScript:
const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'A streaming-enabled assistant',
  apiKey: 'your-anthropic-api-key',
  modelId: 'claude-3-opus-20240229',
  streaming: true
});
Python:
agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='A streaming-enabled assistant',
    api_key='your-anthropic-api-key',
    model_id='claude-3-opus-20240229',
    streaming=True
))
4. With Inference Configuration
TypeScript:
const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with custom inference settings',
  apiKey: 'your-anthropic-api-key',
  inferenceConfig: {
    maxTokens: 500,
    temperature: 0.7,
    topP: 0.9,
    stopSequences: ['Human:', 'AI:']
  }
});
Python:
agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with custom inference settings',
    api_key='your-anthropic-api-key',
    inference_config={
        'maxTokens': 500,
        'temperature': 0.7,
        'topP': 0.9,
        'stopSequences': ['Human:', 'AI:']
    }
))
5. With Simple System Prompt
TypeScript:
const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with custom prompt',
  apiKey: 'your-anthropic-api-key',
  customSystemPrompt: {
    template: 'You are a helpful AI assistant focused on technical support.'
  }
});
Python:
agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with custom prompt',
    api_key='your-anthropic-api-key',
    custom_system_prompt={
        'template': 'You are a helpful AI assistant focused on technical support.'
    }
))
6. With System Prompt Variables
TypeScript:
const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with variable prompt',
  apiKey: 'your-anthropic-api-key',
  customSystemPrompt: {
    template: 'You are an AI assistant specialized in {{DOMAIN}}. Always use a {{TONE}} tone.',
    variables: {
      DOMAIN: 'customer support',
      TONE: 'friendly and helpful'
    }
  }
});
Python:
agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with variable prompt',
    api_key='your-anthropic-api-key',
    custom_system_prompt={
        'template': 'You are an AI assistant specialized in {{DOMAIN}}. Always use a {{TONE}} tone.',
        'variables': {
            'DOMAIN': 'customer support',
            'TONE': 'friendly and helpful'
        }
    }
))
7. With Custom Retriever
TypeScript:
const retriever = new CustomRetriever({
  // Retriever configuration
});

const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with retriever',
  apiKey: 'your-anthropic-api-key',
  retriever: retriever
});
Python:
retriever = CustomRetriever(
    # Retriever configuration
)

agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with retriever',
    api_key='your-anthropic-api-key',
    retriever=retriever
))
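The CustomRetriever used above is only a placeholder. As a rough Python sketch of what a concrete retriever might look like, assuming the package exposes a Retriever base class under agent_squad.retrievers with the async methods shown here (treat the import path and method names as assumptions and check the Retrievers documentation for the actual interface):

from agent_squad.retrievers import Retriever  # assumed import path

class CustomRetriever(Retriever):
    """Sketch of a retriever backed by your own data store."""

    async def retrieve(self, text: str):
        # 'search_my_store' is a hypothetical async helper that queries your data store
        return await search_my_store(text)

    async def retrieve_and_combine_results(self, text: str):
        results = await self.retrieve(text)
        # Collapse the hits into a single context string handed to the agent
        return "\n".join(str(item) for item in results)

    async def retrieve_and_generate(self, text: str):
        # Simplest sketch: reuse the combined results
        return await self.retrieve_and_combine_results(text)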
8. With Tool Configuration
TypeScript:
const agent = new AnthropicAgent({
  name: 'Anthropic Assistant',
  description: 'An assistant with tool support',
  apiKey: 'your-anthropic-api-key',
  toolConfig: {
    tool: [
      {
        name: "Weather_Tool",
        description: "Get current weather data",
        input_schema: {
          type: "object",
          properties: {
            location: {
              type: "string",
              description: "City name"
            }
          },
          required: ["location"]
        }
      }
    ],
    useToolHandler: (response, conversation) => {
      return {
        role: ParticipantRole.USER,
        content: {
          "type": "tool_result",
          "tool_use_id": "weather_tool",
          "content": "Current weather data for the location"
        }
      }
    }
  }
});
Python:
agent = AnthropicAgent(AnthropicAgentOptions(
    name='Anthropic Assistant',
    description='An assistant with tool support',
    api_key='your-anthropic-api-key',
    tool_config={
        'tool': [{
            'name': 'Weather_Tool',
            'description': 'Get current weather data',
            'input_schema': {
                'type': 'object',
                'properties': {
                    'location': {
                        'type': 'string',
                        'description': 'City name'
                    }
                },
                'required': ['location']
            }
        }],
        'useToolHandler': lambda response, conversation: {
            'role': ParticipantRole.USER.value,
            'content': {
                'type': 'tool_result',
                'tool_use_id': 'weather_tool',
                'content': 'Current weather data for the location'
            }
        }
    }
))
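The handlers above return canned data with a hard-coded tool_use_id. In a real handler you would typically locate the tool_use block the model emitted, run the tool, and echo back that block's actual id. A hedged Python sketch, assuming response is the raw Anthropic Messages API response (whose content blocks expose type, id, and input) and where get_weather is a hypothetical helper:

def weather_tool_handler(response, conversation):
    # Find the tool_use block the model produced (Anthropic Messages API shape)
    tool_block = next(block for block in response.content if block.type == 'tool_use')

    # Run the actual tool; get_weather is a hypothetical helper you would implement
    weather_report = get_weather(tool_block.input['location'])

    # Return the result in the same shape as the examples above,
    # echoing back the real tool_use_id so the model can match it
    return {
        'role': ParticipantRole.USER.value,
        'content': {
            'type': 'tool_result',
            'tool_use_id': tool_block.id,
            'content': weather_report
        }
    }

You would then pass 'useToolHandler': weather_tool_handler in tool_config instead of the inline lambda.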
9. Complete Example with All Options
TypeScript:
import { AnthropicAgent, ParticipantRole } from 'agent-squad';

const agent = new AnthropicAgent({
  // Required fields
  name: 'Advanced Anthropic Assistant',
  description: 'A fully configured AI assistant powered by Anthropic models',
  apiKey: 'your-anthropic-api-key',

  // Optional fields
  modelId: 'claude-3-opus-20240229',  // Choose Anthropic model
  streaming: true,                    // Enable streaming responses
  retriever: customRetriever,         // Custom retriever for additional context (see example 7)

  // Inference configuration
  inferenceConfig: {
    maxTokens: 500,                   // Maximum tokens to generate
    temperature: 0.7,                 // Control randomness (0-1)
    topP: 0.9,                        // Control diversity via nucleus sampling
    stopSequences: ['Human:', 'AI:']  // Sequences that stop generation
  },

  // Tool configuration
  toolConfig: {
    tool: [{
      name: "Weather_Tool",
      description: "Get the current weather for a given location",
      input_schema: {
        type: "object",
        properties: {
          latitude: { type: "string", description: "Geographical WGS84 latitude" },
          longitude: { type: "string", description: "Geographical WGS84 longitude" }
        },
        required: ["latitude", "longitude"]
      }
    }],
    useToolHandler: (response, conversation) => ({
      role: ParticipantRole.USER,
      content: {
        type: "tool_result",
        tool_use_id: "tool_user_id",
        content: "Response from the tool"
      }
    })
  },

  // Custom system prompt with variables
  customSystemPrompt: {
    template: `You are an AI assistant specialized in {{DOMAIN}}.
Your core competencies:
{{SKILLS}}

Communication style:
- Maintain a {{TONE}} tone
- Focus on {{FOCUS}}
- Prioritize {{PRIORITY}}`,
    variables: {
      DOMAIN: 'scientific research',
      SKILLS: [
        '- Advanced data analysis',
        '- Statistical methodology',
        '- Research design',
        '- Technical writing'
      ],
      TONE: 'professional and academic',
      FOCUS: 'accuracy and clarity',
      PRIORITY: 'evidence-based insights'
    }
  }
});
Python:
from agent_squad.agents import AnthropicAgent, AnthropicAgentOptions
from agent_squad.types import ParticipantRole

agent = AnthropicAgent(AnthropicAgentOptions(
    # Required fields
    name='Advanced Anthropic Assistant',
    description='A fully configured AI assistant powered by Anthropic models',
    api_key='your-anthropic-api-key',

    # Optional fields
    model_id='claude-3-opus-20240229',  # Choose Anthropic model
    streaming=True,                     # Enable streaming responses
    retriever=custom_retriever,         # Custom retriever for additional context (see example 7)

    # Inference configuration
    inference_config={
        'maxTokens': 500,                   # Maximum tokens to generate
        'temperature': 0.7,                 # Control randomness (0-1)
        'topP': 0.9,                        # Control diversity via nucleus sampling
        'stopSequences': ['Human:', 'AI:']  # Sequences that stop generation
    },

    # Tool configuration
    tool_config={
        'tool': [{
            'name': 'Weather_Tool',
            'description': 'Get the current weather for a given location',
            'input_schema': {
                'type': 'object',
                'properties': {
                    'latitude': {'type': 'string', 'description': 'Geographical WGS84 latitude'},
                    'longitude': {'type': 'string', 'description': 'Geographical WGS84 longitude'}
                },
                'required': ['latitude', 'longitude']
            }
        }],
        'useToolHandler': lambda response, conversation: {
            'role': ParticipantRole.USER.value,
            'content': {
                'type': 'tool_result',
                'tool_use_id': 'tool_user_id',
                'content': 'Response from the tool'
            }
        }
    },

    # Custom system prompt with variables
    custom_system_prompt={
        'template': """You are an AI assistant specialized in {{DOMAIN}}.
Your core competencies:
{{SKILLS}}

Communication style:
- Maintain a {{TONE}} tone
- Focus on {{FOCUS}}
- Prioritize {{PRIORITY}}""",
        'variables': {
            'DOMAIN': 'scientific research',
            'SKILLS': [
                '- Advanced data analysis',
                '- Statistical methodology',
                '- Research design',
                '- Technical writing'
            ],
            'TONE': 'professional and academic',
            'FOCUS': 'accuracy and clarity',
            'PRIORITY': 'evidence-based insights'
        }
    }
))
Option Explanations
- name and description: Identify and describe the agent’s purpose.
- apiKey: Your Anthropic API key for authentication.
- modelId: Specifies the model to use (e.g., Claude 3 Sonnet).
- streaming: Enables streaming responses for real-time output.
- inferenceConfig: Fine-tunes the model’s output characteristics.
- retriever: Integrates a retrieval system for enhanced context.
- toolConfig: Defines the tools the agent can use and how to handle their responses (see AgentTools for seamless tool definition).
The Python options use the snake_case equivalents shown in the examples above (api_key, model_id, inference_config, custom_system_prompt, tool_config).
Setting a New Prompt
You can dynamically set or update the system prompt for the agent:
TypeScript:
agent.setSystemPrompt(
  `You are an AI assistant specialized in {{DOMAIN}}.
  Your main goal is to {{GOAL}}.
  Always maintain a {{TONE}} tone in your responses.`,
  {
    DOMAIN: "cybersecurity",
    GOAL: "help users understand and mitigate potential security threats",
    TONE: "professional and reassuring"
  }
);
Python:
agent.set_system_prompt(
    """You are an AI assistant specialized in {{DOMAIN}}.
    Your main goal is to {{GOAL}}.
    Always maintain a {{TONE}} tone in your responses.""",
    {
        "DOMAIN": "cybersecurity",
        "GOAL": "help users understand and mitigate potential security threats",
        "TONE": "professional and reassuring"
    }
)
This method allows you to dynamically change the agent’s behavior and focus without creating a new instance.
Adding the Agent to the Orchestrator
To integrate the Anthropic Agent into your orchestrator, follow these steps:
- First, ensure you have created an instance of the orchestrator:
TypeScript:
import { AgentSquad } from "agent-squad";

const orchestrator = new AgentSquad();
Python:
from agent_squad.orchestrator import AgentSquad

orchestrator = AgentSquad()
- Then, add the agent to the orchestrator:
TypeScript:
orchestrator.addAgent(agent);
Python:
orchestrator.add_agent(agent)
- Now you can use the orchestrator to route requests to the appropriate agent, including your Anthropic agent:
TypeScript:
const response = await orchestrator.routeRequest(
  "What is the base rate interest for 30 years?",
  "user123",
  "session456"
);
Python:
response = await orchestrator.route_request(
    "What is the base rate interest for 30 years?",
    "user123",
    "session456"
)
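The value returned by route_request is the orchestrator's response wrapper rather than raw text. As a hedged Python sketch of consuming it (the attribute names streaming and output are assumptions about the orchestrator's response type; check the AgentSquad documentation for the exact shape):

import asyncio

async def ask(orchestrator, question: str, user_id: str, session_id: str):
    response = await orchestrator.route_request(question, user_id, session_id)

    # Assumed attributes on the response wrapper: 'streaming' flag and 'output' payload
    if getattr(response, "streaming", False):
        # Assumption: streamed output can be iterated asynchronously chunk by chunk
        async for chunk in response.output:
            print(chunk, end="", flush=True)
    else:
        # Non-streaming: 'output' may be a plain string or a message object
        print(response.output)

asyncio.run(ask(orchestrator, "What is the base rate interest for 30 years?",
                "user123", "session456"))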
By leveraging the AnthropicAgent, you can create sophisticated, context-aware AI agents capable of handling a wide range of tasks and interactions, all powered by the latest LLMs available through Anthropic’s platform.