
Overview

Custom Functions let your agent call external APIs during a live conversation — fetching data, triggering actions, or integrating with your systems. When the LLM decides a function is needed, TalkifAI makes an HTTP request to your webhook URL and injects the result back into the conversation.

How It Works

  1. User: "What's my order status?"
  2. LLM decides to call check_order_status()
  3. TalkifAI → POST https://your-api.com/check-order
     Body: { "order_id": "ORD-123" }
     Headers: Content-Type, custom headers
  4. Your API returns: { "status": "shipped", "eta": "tomorrow" }
  5. Agent says: "Your order is shipped and arrives tomorrow!"

Creating a Custom Function

Step 1: Define the function in Studio

  1. Go to your agent → Settings → Functions → Custom Functions
  2. Click Create Function
  3. Fill in:
  • Name: snake_case identifier (e.g., check_order_status)
  • Label: Human-readable name shown in Studio
  • Description: What the function does (the LLM reads this to decide when to call it)
  • Endpoint: Your HTTPS endpoint URL
  • Method: HTTP method: POST (default), GET, PUT, or PATCH
  • Headers: Custom headers sent with every call (JSON format)
  • Parameters: JSON Schema defining the input

Step 2: Define parameters (JSON Schema)

{
  "type": "object",
  "properties": {
    "order_id": {
      "type": "string",
      "description": "The customer's order ID"
    },
    "customer_email": {
      "type": "string",
      "description": "Customer email for verification"
    }
  },
  "required": ["order_id"]
}
Parameter types: string, number, boolean, array, and object are all supported.
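Server-side, it's worth re-checking the arguments against the same contract before trusting them, since the LLM can occasionally send malformed values. A minimal hand-rolled check mirroring the schema above (a sketch; in production you'd typically use a full JSON Schema validator such as the jsonschema package):

```python
# Minimal argument check mirroring the order-status schema above.
REQUIRED = ["order_id"]
TYPES = {"order_id": str, "customer_email": str}

def validate_args(args):
    """Return an error message if args don't satisfy the schema, else None."""
    for key in REQUIRED:
        if key not in args:
            return f"missing required parameter: {key}"
    for key, expected in TYPES.items():
        if key in args and not isinstance(args[key], expected):
            return f"parameter {key} must be a {expected.__name__}"
    return None
```

Returning the error message to the agent (rather than raising) lets it apologize or re-ask the user for the missing detail.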

Step 3: Implement your webhook

Your endpoint receives a POST request with the function arguments. Request from TalkifAI:
POST https://your-api.com/check-order
Content-Type: application/json

{
  "order_id": "ORD-123",
  "customer_email": "user@example.com"
}
Headers:
  • Content-Type: application/json (always included)
  • Your custom headers (from function configuration)
No automatic user context headers: Contrary to what some documentation suggests, TalkifAI does NOT automatically inject X-User-ID or X-Org-ID headers into custom function calls. If you need user context, you must:
  1. Include user ID as a function parameter, OR
  2. Use internal endpoints (/internal/*) which DO receive context headers
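For external endpoints, option 1 is the usual pattern: declare a user_id parameter in the function's JSON Schema and read it from the request body. A sketch (field names are illustrative):

```python
import json

def handle_check_order(body_bytes):
    """Pull user context from the function parameters themselves, since
    no X-User-ID header is present on external endpoints. (Sketch;
    user_id must be declared as a parameter in the function schema.)"""
    args = json.loads(body_bytes)
    user_id = args.get("user_id")
    order_id = args.get("order_id")
    if not user_id:
        return {"result": "I can't verify the account for this request."}
    return {"result": f"Looking up order {order_id} for user {user_id}."}
```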

Step 4: Return a response

Your API must respond within 30 seconds with a JSON response:
# Flask example
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/check-order', methods=['POST'])
def check_order():
    data = request.get_json()
    order_id = data.get('order_id')
    customer_email = data.get('customer_email')

    # Your business logic (db is your own data layer)
    order = db.get_order(order_id)

    # Return result in one of these formats:
    return jsonify({
        "result": f"Order {order_id} is {order.status}. Expected delivery: {order.eta}."
    })

    # OR return data directly:
    # return jsonify({
    #     "status": order.status,
    #     "eta": order.eta,
    #     "carrier": order.carrier
    # })
Response formats recognized:
  • { "result": "..." } — Used directly as agent response
  • { "data": {...} } — Converted to text for agent
  • { "message": "..." } — Used as response
  • Any other JSON — Converted to string
The response is converted to text and read by the agent. Keep it conversational and concise.
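As an illustration of that precedence (an assumption about the conversion, not TalkifAI's actual code), the formats above collapse to agent text roughly like this:

```python
import json

def response_to_text(payload):
    """Illustrative sketch of how the recognized response formats
    might collapse to agent text, following the precedence above."""
    if "result" in payload:
        return str(payload["result"])       # used directly
    if "data" in payload:
        return json.dumps(payload["data"])  # structured data becomes text
    if "message" in payload:
        return str(payload["message"])      # used as response
    return json.dumps(payload)              # any other JSON is stringified
```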

Enable on an Agent

  1. Go to your agent → Settings → Functions
  2. Under Custom Functions, toggle on the functions you want available
  3. Save the agent
The agent will only call functions that are explicitly enabled on it.
After enabling functions, mention them in your system prompt so the agent knows when to use them. Example: "To check order status, use the check_order_status function."

Built-in Functions

TalkifAI also provides built-in functions you can enable without any webhook:
  • web_search: Search the internet in real-time
  • end_call: Programmatically end the conversation
  • search_knowledge_base: Search linked knowledge bases
Enable them in Agent Settings → Functions → Built-in Functions.

Security

Custom Headers

You can configure custom headers in the function definition for authentication:
{
  "Authorization": "Bearer your-secret-key",
  "X-API-Key": "your-api-key",
  "X-Source": "talkifai"
}
These headers are sent with every function call.
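On your side, verify those headers before doing any work. A minimal sketch, assuming you configured an X-API-Key header on the function:

```python
import hmac

EXPECTED_KEY = "your-api-key"  # the same value configured in the function's headers

def is_authorized(headers):
    """Check the X-API-Key header before running any business logic.
    hmac.compare_digest avoids timing side-channels on the comparison."""
    provided = headers.get("X-API-Key", "")
    return hmac.compare_digest(provided, EXPECTED_KEY)
```

Reject unauthorized requests with a 401 before touching your data layer.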

User Context Injection

For internal endpoints only (/internal/* path): If your endpoint path contains /internal/, TalkifAI automatically injects:
  • X-User-ID — The authenticated user’s ID
  • X-Org-ID — The active organization ID
@app.route('/internal/check-order', methods=['POST'])
def check_order():
    user_id = request.headers.get('X-User-ID')
    org_id = request.headers.get('X-Org-ID')

    if not user_id or not org_id:
        return jsonify({"error": "Unauthorized"}), 401

    # Proceed safely with user context
External endpoints don’t receive user context: For security, external custom functions (not under /internal/) do NOT receive user context headers. Pass user ID as a function parameter if needed.

Best Practices

  • ✅ Use HTTPS-only endpoints
  • ✅ Validate all input parameters
  • ✅ Implement rate limiting
  • ✅ Use custom headers for API keys
  • ✅ Log all function calls for auditing
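As one of the practices above in code form, a minimal sliding-window rate limiter (a single-process sketch; a shared store like Redis is needed across multiple instances):

```python
import time
from collections import deque

class RateLimiter:
    """Allow at most `limit` calls per `window` seconds (sliding window).
    `now` is injectable for testing; defaults to a monotonic clock."""
    def __init__(self, limit=10, window=60.0):
        self.limit = limit
        self.window = window
        self.calls = deque()  # timestamps of recent calls

    def allow(self, now=None):
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()
        if len(self.calls) < self.limit:
            self.calls.append(now)
            return True
        return False
```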

Timeout Behavior

Functions must respond within 30 seconds. If your endpoint times out:
  • The agent receives an error message
  • The agent says: “I’m having trouble accessing that information right now.”
For slow operations:
  • Return a quick acknowledgment
  • Use async patterns (webhook callbacks coming soon)
  • Consider breaking into multiple faster functions
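The quick-acknowledgment pattern can be sketched like this (handle_slow_lookup and start_job are illustrative names, not TalkifAI APIs):

```python
import threading

def handle_slow_lookup(order_id, start_job=None):
    """Return an acknowledgment immediately, well under the 30-second
    budget, and run the slow work in the background. `start_job` is
    injectable for testing; by default a daemon thread is started.
    Delivering the late result would use the callback mechanism noted
    above as coming soon."""
    def slow_work():
        pass  # long-running lookup; deliver the result out of band

    runner = start_job or (lambda fn: threading.Thread(target=fn, daemon=True).start())
    runner(slow_work)
    return {"result": f"I'm looking into order {order_id} now; one moment."}
```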

Testing Your Function

Test with curl

curl -X POST https://your-api.com/check-order \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-key" \
  -d '{
    "order_id": "ORD-123",
    "customer_email": "user@example.com"
  }'

Test in Studio

  1. Go to your agent → Test Agent
  2. Start a conversation
  3. Trigger the function naturally: “What’s my order status?”
  4. Watch the function call in real-time
  5. Check logs for any errors

Troubleshooting

Problem: Agent never calls the function.
Check:
  1. Is function calling enabled on the agent?
  2. Is the function enabled in agent settings?
  3. Does your system prompt mention the function?
  4. Is the function description clear about when to use it?
Fix: Add explicit instructions: "Use the check_order_status function when the user asks about their order."
Problem: Function calls fail.
Check:
  1. Is your endpoint accessible (no firewall issues)?
  2. Are headers correct (auth keys valid)?
  3. Is the request format correct (JSON, parameters)?
  4. Check function logs in Studio
Test: Use curl to test your endpoint directly.
Problem: Endpoint taking too long to respond.
Solution:
  • Optimize your API response time
  • Return partial results quickly
  • Use caching for frequently called functions
  • Consider async patterns for long operations
Problem: Agent says something odd about the result.
Solution:
  • Return clearer, more conversational responses
  • Include context in the result: "Order ORD-123 is shipped and will arrive on January 20th."
  • Add instructions to system prompt about interpreting results