Overview
Custom Functions let your agent call external APIs during a live conversation — fetching data, triggering actions, or integrating with your systems. When the LLM decides a function is needed, TalkifAI makes an HTTP request to your webhook URL and injects the result back into the conversation.
Creating a Custom Function
Step 1: Define the function in Studio
- Go to your agent → Settings → Functions → Custom Functions
- Click Create Function
- Fill in:
| Field | Description |
|---|---|
| Name | snake_case identifier (e.g., check_order_status) |
| Label | Human-readable name shown in Studio |
| Description | What the function does (LLM reads this to decide when to call it) |
| Endpoint | Your HTTPS endpoint URL |
| Method | HTTP method: POST (default), GET, PUT, PATCH |
| Headers | Custom headers sent with every call (JSON format) |
| Parameters | JSON Schema defining the input |
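The Headers field takes a JSON object. A hypothetical example passing a bearer token (the token value is a placeholder):

```json
{
  "Authorization": "Bearer YOUR_API_KEY"
}
```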
Step 2: Define parameters (JSON Schema)
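Parameters are described with a standard JSON Schema object. A sketch for the check_order_status example (the field name and description are illustrative):

```json
{
  "type": "object",
  "properties": {
    "order_id": {
      "type": "string",
      "description": "The customer's order ID, e.g. ORD-123"
    }
  },
  "required": ["order_id"]
}
```

The LLM reads the description of each property to decide what value to extract from the conversation, so make descriptions specific.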
Parameter types:
string, number, boolean, array, and object are all supported.
Step 3: Implement your webhook
Your endpoint receives a POST request containing the function arguments. TalkifAI sends the following headers:
- Content-Type: application/json (always included)
- Your custom headers (from the function configuration)
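For the check_order_status example, the request body carries the arguments the LLM produced. The exact shape shown here is an assumption based on the description above:

```json
{
  "order_id": "ORD-123"
}
```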
Step 4: Return a response
Your API must respond within 30 seconds with a JSON body. Supported shapes:
- { "result": "..." } — Used directly as the agent response
- { "data": {...} } — Converted to text for the agent
- { "message": "..." } — Used as the response
- Any other JSON — Converted to a string
The response is converted to text and read by the agent. Keep it conversational and concise.
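Putting Steps 3 and 4 together, a minimal webhook sketch using only the Python standard library. The endpoint logic and reply text are illustrative, not TalkifAI-provided code:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def handle_check_order_status(args: dict) -> dict:
    # Business logic: look up the order and build a conversational reply.
    order_id = args.get("order_id", "unknown")
    return {"result": f"Order {order_id} is shipped and will arrive on January 20th."}


class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Parse the JSON arguments TalkifAI sends in the request body.
        length = int(self.headers.get("Content-Length", 0))
        args = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(handle_check_order_status(args)).encode()
        # Respond with {"result": "..."} so the agent can read it verbatim.
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


# To serve: HTTPServer(("0.0.0.0", 8080), WebhookHandler).serve_forever()
```

Returning a single conversational sentence in `result` keeps the agent's reply natural without any extra post-processing.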
Enable on an Agent
- Go to your agent → Settings → Functions
- Under Custom Functions, toggle on the functions you want available
- Save the agent
After enabling functions, mention them in your system prompt so the agent knows when to use them. Example:
"To check order status, use the check_order_status function."
Built-in Functions
TalkifAI also provides built-in functions you can enable without any webhook:
| Function | Description |
|---|---|
| web_search | Search the internet in real-time |
| end_call | Programmatically end the conversation |
| search_knowledge_base | Search linked knowledge bases |
Security
Custom Headers
You can configure custom headers in the function definition for authentication.
User Context Injection
For internal endpoints only (/internal/* path):
If your endpoint path contains /internal/, TalkifAI automatically injects:
- X-User-ID — The authenticated user's ID
- X-Org-ID — The active organization ID
Best Practices
- ✅ Use HTTPS-only endpoints
- ✅ Validate all input parameters
- ✅ Implement rate limiting
- ✅ Use custom headers for API keys
- ✅ Log all function calls for auditing
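A minimal sketch of the first practices combined: authenticating via a custom header and validating parameters before doing any work. The header name X-API-Key and the ORD- prefix are hypothetical:

```python
import hmac
from typing import Optional

# In production, load this from an environment variable or secret store.
EXPECTED_API_KEY = "change-me"


def validate_request(headers: dict, args: dict) -> Optional[str]:
    """Return an error string, or None if the request passes validation."""
    # Authenticate using the custom header configured on the function;
    # compare_digest avoids leaking timing information.
    if not hmac.compare_digest(headers.get("X-API-Key", ""), EXPECTED_API_KEY):
        return "invalid API key"
    # Validate input parameters before touching backend systems.
    order_id = args.get("order_id")
    if not isinstance(order_id, str) or not order_id.startswith("ORD-"):
        return "invalid order_id"
    return None
```

Rejecting bad requests up front keeps malformed or unauthenticated calls from ever reaching your order system.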
Timeout Behavior
Functions must respond within 30 seconds. If your endpoint times out:
- The agent receives an error message
- The agent says: "I'm having trouble accessing that information right now."
For slow operations:
- Return a quick acknowledgment
- Use async patterns (webhook callbacks coming soon)
- Consider breaking into multiple faster functions
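The quick-acknowledgment pattern above can be sketched as follows. The slow lookup and its delivery channel are placeholders; how you surface the eventual result (database, follow-up message) is up to your system:

```python
import threading


def do_slow_lookup(order_id: str) -> None:
    # Placeholder for work that may exceed the 30-second limit,
    # e.g. querying a slow warehouse system. Deliver the outcome
    # through your own channel once it completes.
    ...


def handle_function_call(args: dict) -> dict:
    order_id = args.get("order_id", "unknown")
    # Kick off the slow work without blocking the webhook response.
    threading.Thread(target=do_slow_lookup, args=(order_id,), daemon=True).start()
    # Acknowledge immediately so the agent can keep the conversation moving.
    return {"result": f"I'm looking into order {order_id} now; one moment."}
```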
Testing Your Function
Test with curl
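A command template for exercising your endpoint directly, assuming the hypothetical check_order_status webhook above (substitute your own URL, headers, and parameters):

```shell
curl -X POST https://your-api.example.com/webhooks/check_order_status \
  -H "Content-Type: application/json" \
  -H "X-API-Key: YOUR_API_KEY" \
  -d '{"order_id": "ORD-123"}'
```

Confirm the response is valid JSON in one of the shapes listed under Step 4 before testing in Studio.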
Test in Studio
- Go to your agent → Test Agent
- Start a conversation
- Trigger the function naturally: “What’s my order status?”
- Watch the function call in real-time
- Check logs for any errors
Troubleshooting
Function not being called
Check:
- Is function calling enabled on the agent?
- Is the function enabled in agent settings?
- Does your system prompt mention the function?
- Is the function description clear about when to use it?
"Use the check_order_status function when the user asks about their order."
Function returns error
Check:
- Is your endpoint accessible (no firewall issues)?
- Are headers correct (auth keys valid)?
- Is the request format correct (JSON, parameters)?
- Check function logs in Studio
Timeout errors
Problem: Endpoint taking too long to respond.
Solution:
- Optimize your API response time
- Return partial results quickly
- Use caching for frequently called functions
- Consider async patterns for long operations
Agent misinterprets function result
Problem: The agent says something odd about the result.
Solution:
- Return clearer, more conversational responses
- Include context in the result:
"Order ORD-123 is shipped and will arrive on January 20th."
- Add instructions to the system prompt about interpreting results
Related Documentation
- Custom Functions API — Full API reference
- Function Calling — Enable/disable functions
- Advanced Configuration — Complete agent configuration