
Function Calling

Definition

Function calling is an LLM capability that generates structured JSON output matching predefined function schemas, enabling reliable tool invocation and data extraction with validated parameters.

Why It Matters

Before function calling, integrating LLMs with external systems required parsing free-form text, which was fragile and error-prone. The model might describe what it wants to do, but extracting structured parameters from natural language was unreliable.

Function calling solves this by training models to output structured JSON that conforms to your defined schemas. When you need the model to call a weather API, it doesn’t say “please look up weather for New York”; instead it returns {"location": "New York", "units": "celsius"}, which your code can pass straight to the API.

For AI engineers, function calling is foundational to building reliable agents. It’s the difference between a demo that works sometimes and a production system that handles thousands of requests without breaking. Every major LLM provider now supports function calling because it’s essential for real-world applications.

Implementation Basics

Function calling works as a three-step loop of schema definition, structured model output, and execution:

1. Schema Definition: You provide the model with function schemas in JSON format, describing each function’s name, purpose, and parameters. Good descriptions are critical because the model uses them to decide when and how to call functions.

{
  "name": "get_weather",
  "description": "Get current weather for a city",
  "parameters": {
    "type": "object",
    "properties": {
      "location": {"type": "string"},
      "units": {"enum": ["celsius", "fahrenheit"]}
    },
    "required": ["location"]
  }
}
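
Providers wrap this schema slightly differently; with the OpenAI Python SDK, for instance, it goes in the tools array of a chat completion request. The sketch below is illustrative only, with the model name and user prompt as placeholder assumptions:

# Sketch: passing the schema above as a tool with the OpenAI Python SDK.
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "units": {"enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": "What's the weather in New York?"}],
    tools=tools,
)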

2. Model Output: When the model decides to call a function, it returns structured JSON matching your schema instead of regular text. The response includes the function name and its arguments.
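
Continuing the sketch above, the call appears on the response message as a tool call whose arguments arrive as a JSON string for your code to parse:

import json

# The assistant message carries tool_calls instead of plain text content.
message = response.choices[0].message
tool_call = message.tool_calls[0]

print(tool_call.function.name)                   # e.g. "get_weather"
args = json.loads(tool_call.function.arguments)  # e.g. {"location": "New York"}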

3. Execution and Response: Your code executes the actual function with the provided arguments, then sends the results back to the model for incorporation into its response.
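
In the same sketch, a hypothetical local get_weather implementation stands in for the real API call; its result goes back to the model as a tool message keyed to the call id:

# Hypothetical stand-in for a real weather API call.
def get_weather(location: str, units: str = "celsius") -> dict:
    return {"location": location, "temperature": 21, "units": units}

result = get_weather(**args)

# Return the result in a "tool" message tied to the call id so the
# model can fold it into a natural-language reply.
followup = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "user", "content": "What's the weather in New York?"},
        message,  # the assistant message containing the tool call
        {
            "role": "tool",
            "tool_call_id": tool_call.id,
            "content": json.dumps(result),
        },
    ],
)

print(followup.choices[0].message.content)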

Modern providers also offer “parallel function calling,” where the model requests several tool calls in one response, and “forced function calling,” where you require it to call a particular function instead of answering in free text. Use strict mode to guarantee that arguments always match your schema exactly.
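
In the OpenAI-style API, for example, forcing a call and strict schema adherence look roughly like the sketch below; strict mode there requires every property to be listed as required and additionalProperties set to false, and other providers expose similar switches under different names:

# Sketch: force a get_weather call and require arguments to match the
# schema exactly (strict mode). Model name and prompt are placeholders.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Weather in Paris, in celsius"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather for a city",
            "strict": True,
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "units": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location", "units"],
                "additionalProperties": False,
            },
        },
    }],
    tool_choice={"type": "function", "function": {"name": "get_weather"}},
)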

Source

Function calling allows models to generate structured outputs matching user-defined schemas, enabling reliable integration with external tools and APIs.

https://platform.openai.com/docs/guides/function-calling