Overview
The LLMClient lets you connect to the ATP Agent Server, retrieve toolkit context, and execute tools or workflows using JSON payloads, which makes it a good fit for LLM-based agents.
It supports:
- OpenAI (GPT-4, GPT-3.5, etc.)
- Anthropic (Claude 3 Opus, Sonnet, Haiku)
- Mistral AI (Mistral Large, Medium, Small)
Constructor
from atp_sdk.clients import LLMClient

llm_client = LLMClient(
    api_key: str,
    protocol: str = "ws",
    base_url: str = "https://api.chat-atp.com/ws/v1/atp/llm-client/"
)
Parameters
api_key
string
required
Your ATP LLM client API key.
protocol
string
default:"ws"
Protocol to use ("ws" for WebSocket or "http" for HTTP).
base_url
string
default:"https://api.chat-atp.com/ws/v1/atp/llm-client/"
ATP server URL. Use the default unless you're running a custom ATP server.
Methods
get_toolkit_context
Retrieves the toolkit context and system instructions for a given toolkit and user prompt.
context = llm_client.get_toolkit_context(
    toolkit_id: str,
    provider: str,
    user_prompt: str
)
Parameters
toolkit_id
string
required
Unique ID of the toolkit you want to use.
provider
string
required
The LLM provider: "openai", "anthropic", or "mistralai".
user_prompt
string
required
The user's prompt or task description.
Returns
A dictionary containing the toolkit context, including provider-specific tool schemas.
Example response:
{
  "toolkit_id": "your_toolkit_id",
  "toolkit_name": "Example Toolkit",
  "caption": "Example Caption",
  "provider": "openai",
  "tools": [
    {
      "type": "function",
      "name": "hello_world",
      "description": "Returns a greeting.",
      "parameters": {
        "type": "object",
        "properties": {
          "name": {
            "type": "string",
            "description": "Name to greet"
          }
        },
        "required": ["name"]
      }
    }
  ],
  "user_prompt": "What do you want to achieve?"
}
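The returned `tools` list uses the provider's native function-calling schema, so you can inspect it with plain dict access before handing it to an LLM. A minimal sketch, using a hard-coded `context` dict that mirrors the example response above (in practice the dict comes from `get_toolkit_context`):

```python
# Stand-in for the dict returned by get_toolkit_context, matching the
# example response above.
context = {
    "toolkit_id": "your_toolkit_id",
    "provider": "openai",
    "tools": [
        {
            "type": "function",
            "name": "hello_world",
            "description": "Returns a greeting.",
            "parameters": {
                "type": "object",
                "properties": {
                    "name": {"type": "string", "description": "Name to greet"}
                },
                "required": ["name"],
            },
        }
    ],
}

# Summarize each tool and the arguments it requires.
for tool in context["tools"]:
    required = tool["parameters"].get("required", [])
    print(f"{tool['name']}: requires {required}")
```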
call_tool
Executes a tool or workflow on the ATP server.
response = llm_client.call_tool(
    toolkit_id: str,
    tool_calls: str | dict,
    provider: str,
    user_prompt: str
)
Parameters
toolkit_id
string
required
Unique ID of the toolkit.
tool_calls
string | dict
required
JSON payload from an LLM containing the tool call. Can be a string or a dict. Example:
{
  "function": "hello_world",
  "parameters": {"name": "Alice"}
}
provider
string
required
The LLM provider: "openai", "anthropic", or "mistralai".
user_prompt
string
required
Additional user input to include in the execution.
Returns
The result of the tool execution.
Example response:
{
  "result": {
    "message": "Hello, Alice!"
  }
}
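Because `tool_calls` accepts either a dict or a JSON string, the same payload can be passed in both forms. A minimal sketch of building the payload shape documented above (the `call_tool` lines are commented out because they require a live server and a real toolkit ID):

```python
import json

# The payload shape call_tool expects, as documented above.
payload_dict = {"function": "hello_world", "parameters": {"name": "Alice"}}
payload_str = json.dumps(payload_dict)

# Both forms are accepted:
# llm_client.call_tool(toolkit_id="your_toolkit_id", tool_calls=payload_dict,
#                      provider="openai", user_prompt="Greet Alice.")
# llm_client.call_tool(toolkit_id="your_toolkit_id", tool_calls=payload_str,
#                      provider="openai", user_prompt="Greet Alice.")
```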
OAuth2 Methods
initiate_oauth_connection
Starts the OAuth flow and returns an authorization URL for the user.
connection = llm_client.initiate_oauth_connection(
    platform_id: str,
    external_user_id: str,
    developer_redirect_url: str
)
Parameters
platform_id
string
required
The platform ID (e.g., "hubspot", "google", "salesforce").
external_user_id
string
required
Your user's unique identifier (e.g., email address).
developer_redirect_url
string
required
The URL to redirect the user to after OAuth authorization.
Returns
{
  "authorization_url": "https://oauth-provider.com/authorize?..."
}
wait_for_connection
Polls for OAuth connection completion and retrieves the integration ID.
account = llm_client.wait_for_connection(
    platform_id: str,
    external_user_id: str
)
Parameters
platform_id
string
required
The platform ID (e.g., "hubspot", "google", "salesforce").
external_user_id
string
required
Your user's unique identifier (e.g., email address).
Returns
{
  "integration_id": "uuid"
}
get_user_tokens
Fetches the user’s access and refresh tokens for use in tool calls.
tokens = llm_client.get_user_tokens(
    platform_id: str,
    external_user_id: str
)
Parameters
platform_id
string
required
The platform ID (e.g., "hubspot", "google", "salesforce").
external_user_id
string
required
Your user's unique identifier (e.g., email address).
Returns
{
  "access_token": "ACCESS_TOKEN",
  "refresh_token": "REFRESH_TOKEN"
}
Integration Examples
OpenAI
import openai
from atp_sdk.clients import LLMClient
openai_client = openai.OpenAI(api_key="YOUR_OPENAI_API_KEY")
llm_client = LLMClient(api_key="YOUR_ATP_LLM_CLIENT_API_KEY")
# Get toolkit context
context = llm_client.get_toolkit_context(
    toolkit_id="your_toolkit_id",
    provider="openai",
    user_prompt="Create a company and then list contacts."
)

# Use OpenAI to generate tool calls
response = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Create a company and then list contacts."}
    ],
    tools=context["tools"],
    tool_choice="auto"
)

# Extract and execute tool calls
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    result = llm_client.call_tool(
        toolkit_id="your_toolkit_id",
        tool_calls=tool_calls,
        provider="openai",
        user_prompt="Create a company and then list contacts."
    )
    print(f"Tool call result: {result}")
Anthropic
import anthropic
from atp_sdk.clients import LLMClient
anthropic_client = anthropic.Anthropic(api_key="YOUR_ANTHROPIC_API_KEY")
llm_client = LLMClient(api_key="YOUR_ATP_LLM_CLIENT_API_KEY")
# Get toolkit context
context = llm_client.get_toolkit_context(
    toolkit_id="your_toolkit_id",
    provider="anthropic",
    user_prompt="Create a company and then list contacts."
)

# Use Anthropic to generate tool calls
response = anthropic_client.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Create a company and then list contacts."}
    ],
    tools=context["tools"]
)
# Extract and execute tool calls (Anthropic returns tool calls as
# "tool_use" content blocks with .name and .input attributes)
tool_use_blocks = [block for block in response.content if block.type == "tool_use"]
if tool_use_blocks:
    block = tool_use_blocks[0]
    result = llm_client.call_tool(
        toolkit_id="your_toolkit_id",
        tool_calls={"function": block.name, "parameters": block.input},
        provider="anthropic",
        user_prompt="Create a company and then list contacts."
    )
    print(f"Tool call result: {result}")
Mistral AI
from mistralai.client import MistralClient
from atp_sdk.clients import LLMClient
mistral_client = MistralClient(api_key="YOUR_MISTRAL_API_KEY")
llm_client = LLMClient(api_key="YOUR_ATP_LLM_CLIENT_API_KEY")
# Get toolkit context
context = llm_client.get_toolkit_context(
    toolkit_id="your_toolkit_id",
    provider="mistralai",
    user_prompt="Create a company and then list contacts."
)

# Use Mistral to generate tool calls
response = mistral_client.chat(
    model="mistral-large-latest",
    messages=[{"role": "user", "content": "Create a company and then list contacts."}],
    tools=context["tools"]
)

# Extract and execute tool calls
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    result = llm_client.call_tool(
        toolkit_id="your_toolkit_id",
        tool_calls=tool_calls,
        provider="mistralai",
        user_prompt="Create a company and then list contacts."
    )
    print(f"Tool call result: {result}")
OAuth2 Flow Example
from atp_sdk.clients import LLMClient
llm_client = LLMClient(api_key="YOUR_ATP_LLM_CLIENT_API_KEY")
# Step 1: Initiate OAuth connection
connection = llm_client.initiate_oauth_connection(
    platform_id="hubspot",
    external_user_id="user@example.com",
    developer_redirect_url="https://your-app.com/oauth/callback"
)
print("Authorize at:", connection["authorization_url"])

# Step 2: Wait for connection
account = llm_client.wait_for_connection(
    platform_id="hubspot",
    external_user_id="user@example.com"
)
print("Integration ID:", account["integration_id"])

# Step 3: Fetch tokens
tokens = llm_client.get_user_tokens(
    platform_id="hubspot",
    external_user_id="user@example.com"
)
print("Access token:", tokens["access_token"])
The ATP SDK will automatically inject tokens into tool calls as needed. You only need to handle OAuth and fetch tokens once.
Next Steps