Tool Calling
❓What is Tool Calling?
Tool Calling is a feature that allows the AI to interact with external tools, APIs, or functions to perform tasks beyond simple text generation. Instead of just providing an answer, the LLM can:
- Recognize when a task requires external computation.
- Generate structured requests (e.g. API calls, function calls).
- Trigger their execution and use the results.
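The "structured request" mentioned above can be pictured as a small JSON payload. A minimal sketch, where the tool name and parameter schema are hypothetical (not part of AI Pals Engine):

```python
import json

# A hypothetical structured request the LLM might emit instead of plain text.
# "get_weather" and its parameters are illustrative, not a real built-in tool.
tool_call = {
    "tool": "get_weather",
    "parameters": {"location": "Tokyo"},
}

# The host application parses this payload and dispatches it to the real tool.
payload = json.dumps(tool_call)
print(payload)
```

Because the request is plain JSON, any host application can validate it against the tool's declared schema before executing anything.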
⭐ 3 Things You Need to Know
- Why Tool Calling?
- Extends AI Capabilities: Combines AI with specialized tools.
- Precision: Ensures accurate results (e.g. real-time data, complex computations).
- Automation: Enables workflows where the AI interacts with other software.
- Add MCP Servers via Models → MCP Server (or use the built-in functionality if you only want to test it)
- Who can call Tools?
- Chars (via Char Config)
- Pals (via Char and via Behaviors)
- You (via Widget Action Buttons)
⚙️ How does it work?
- User Request – You ask the LLM a question or give it a task.
- Tool Identification – The LLM determines, based on the tool descriptions, whether an external tool (e.g., a calculator or weather API) is needed.
- Structured Request – The model outputs a formatted call (e.g., JSON) specifying:
- The tool/function to use.
- Required parameters (e.g. {"location": "Tokyo"} for a weather API).
- Execution – Your system processes the request using the specified tool.
- Response – The result is fed back to the LLM.
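The five steps above can be sketched as a minimal dispatch loop. The tool registry and call format here are hypothetical illustrations, not the AI Pals Engine API:

```python
import json

# Hypothetical tool registry: a single calculator tool the model can call.
# The empty __builtins__ dict keeps eval restricted to plain arithmetic.
TOOLS = {
    "calculator": lambda expression: eval(expression, {"__builtins__": {}}),
}

def handle_model_output(output: str) -> str:
    """Dispatch a structured tool call emitted by the LLM (steps 3-5)."""
    call = json.loads(output)               # Step 3: parse the formatted call
    tool = TOOLS[call["tool"]]              # identify the requested tool
    result = tool(**call["parameters"])     # Step 4: execute with its parameters
    return json.dumps({"result": result})   # Step 5: feed the result back to the LLM

# Example: the model asked for 2 + 3 * 4 and emitted this structured call.
reply = handle_model_output(
    '{"tool": "calculator", "parameters": {"expression": "2 + 3 * 4"}}'
)
print(reply)  # → {"result": 14}
```

A real host would add schema validation and error handling before execution, but the round trip (parse, dispatch, execute, return) is the same.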
The standard in AI Pals Engine will be MCP (Model Context Protocol). Python tools are also supported, but this may change, so the documentation will focus on MCP.
More Info: Devlog #3