A Coding Guide to Different Function Calling Methods to Create Real-Time, Tool-Enabled Conversational AI Agents


Function calling lets an LLM act as a bridge between natural-language requests and real-world code or APIs. Instead of merely generating text, the model decides when to invoke a predefined function, emits a structured JSON call with the function name and arguments, and then waits for your application to execute that call and return the results. This back-and-forth can loop, potentially invoking multiple functions in sequence, enabling rich, multi-step interactions entirely under conversational control. In this tutorial, we'll implement a weather assistant with Gemini 2.0 Flash to demonstrate how to set up and manage that function-calling cycle. We will implement different variants of function calling. By integrating function calls, we transform a chat interface into a dynamic tool for real-time tasks, whether fetching live weather data, checking order statuses, scheduling appointments, or updating databases. Users no longer fill out complex forms or navigate multiple screens; they simply describe what they need, and the LLM orchestrates the underlying actions seamlessly. This natural-language automation enables the easy construction of AI agents that can access external data sources, perform transactions, or trigger workflows, all within a single conversation.
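Stripped of any SDK specifics, that round trip looks like this: the model emits a structured call, the application dispatches it through a registry of real functions, and the result is handed back. Here is a minimal, framework-agnostic sketch of the pattern; the get_time tool and the simulated model output are hypothetical, for illustration only:

```python
import json

# A registry mapping tool names to real Python callables.
def get_time(city: str) -> str:
    # Hypothetical tool: a real app would query a time API here.
    return f"12:00 in {city}"

TOOLS = {"get_time": get_time}

# Simulated model output: a structured function call, of the kind an LLM emits.
model_output = json.loads('{"name": "get_time", "args": {"city": "Berlin"}}')

# The application, not the model, executes the call and returns the result.
result = TOOLS[model_output["name"]](**model_output["args"])
print(result)  # 12:00 in Berlin
```

The key design point is that the model only ever produces data (a name plus arguments); your code stays in control of what actually runs.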

Function Calling with Google Gemini 2.0 Flash

!pip install "google-genai>=1.0.0" geopy requests

We install the Gemini Python SDK (google-genai ≥ 1.0.0), along with geopy for converting location names to coordinates and requests for making HTTP calls, ensuring all the core dependencies for our Colab weather assistant are in place.

import os
from google import genai


GEMINI_API_KEY = "Use_Your_API_Key"


client = genai.Client(api_key=GEMINI_API_KEY)


model_id = "gemini-2.0-flash"

We import the Gemini SDK, set the API key, and create a genai.Client instance configured to use the "gemini-2.0-flash" model, establishing the foundation for all subsequent function-calling requests.

res = client.models.generate_content(
    model=model_id,
    contents=["Tell me 1 good fact about Nuremberg."]
)
print(res.text)

We send a user prompt ("Tell me 1 good fact about Nuremberg.") to the Gemini 2.0 Flash model via generate_content, then print the model's text reply, demonstrating a basic end-to-end text-generation call using the SDK.

Function Calling with a JSON Schema

weather_function = {
    "name": "get_weather_forecast",
    "description": "Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g., San Francisco, CA"
            },
            "date": {
                "type": "string",
                "description": "The forecast date in the format yyyy-mm-dd"
            }
        },
        "required": ["location", "date"]
    }
}

Here, we define a JSON schema for our get_weather_forecast tool, specifying its name, a description to guide Gemini on when to use it, and the exact input parameters (location and date) with their types, descriptions, and required fields, so the model can emit valid function calls.
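Before wiring the schema into Gemini, it can be worth sanity-checking that a given set of arguments actually satisfies it. Below is a small, hand-rolled check of the required fields and string types, purely for illustration; a full validator such as the jsonschema package would cover the complete JSON Schema spec:

```python
# Same schema as defined above, repeated so this snippet is self-contained.
weather_function = {
    "name": "get_weather_forecast",
    "description": "Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "The city and state, e.g., San Francisco, CA"},
            "date": {"type": "string", "description": "The forecast date in the format yyyy-mm-dd"},
        },
        "required": ["location", "date"],
    },
}

def check_args(schema: dict, args: dict) -> list:
    """Return a list of problems; an empty list means the args satisfy the schema."""
    params = schema["parameters"]
    problems = [f"missing required field: {f}" for f in params["required"] if f not in args]
    for key, value in args.items():
        if params["properties"].get(key, {}).get("type") == "string" and not isinstance(value, str):
            problems.append(f"{key} should be a string")
    return problems

print(check_args(weather_function, {"location": "Berlin", "date": "2025-03-04"}))  # []
print(check_args(weather_function, {"location": "Berlin"}))  # ['missing required field: date']
```

A check like this is handy in tests, since argument names and types in the schema must line up exactly with the Python function that will eventually execute the call.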

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.",
    tools=[{"function_declarations": [weather_function]}],
)

We create a GenerateContentConfig that tells Gemini it is acting as a weather-retrieval assistant and registers the weather function under tools. As a result, the model knows how to generate structured calls when asked for forecast data.

response = client.models.generate_content(
    model=model_id,
    contents="What's the weather in Berlin today?"
)
print(response.text)

This call sends the bare prompt ("What's the weather in Berlin today?") without passing config (and thus no function definitions), so Gemini falls back to plain text completion, offering generic advice instead of invoking your weather-forecast tool.

response = client.models.generate_content(
    model=model_id,
    config=config,
    contents="What's the weather in Berlin today?"
)


for part in response.candidates[0].content.parts:
    print(part.function_call)

By passing in config (which includes your JSON-schema tool), Gemini recognizes it should call get_weather_forecast rather than reply in plain text. The loop over response.candidates[0].content.parts then prints each part's .function_call object, showing you exactly which function the model decided to invoke (with its name and arguments).

from google.genai import types
from geopy.geocoders import Nominatim
import requests


geolocator = Nominatim(user_agent="weather-app")

def get_weather_forecast(location, date):
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}


functions = {
    "get_weather_forecast": get_weather_forecast
}


def call_function(function_name, **kwargs):
    return functions[function_name](**kwargs)


def function_call_loop(prompt):
    contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
    response = client.models.generate_content(
        model=model_id,
        config=config,
        contents=contents
    )
    for part in response.candidates[0].content.parts:
        contents.append(types.Content(role="model", parts=[part]))
        if part.function_call:
            print("Tool call detected")
            function_call = part.function_call
            print(f"Calling tool: {function_call.name} with args: {function_call.args}")
            tool_result = call_function(function_call.name, **function_call.args)
            function_response_part = types.Part.from_function_response(
                name=function_call.name,
                response={"result": tool_result},
            )
            contents.append(types.Content(role="user", parts=[function_response_part]))
            print("Calling LLM with tool results")
            func_gen_response = client.models.generate_content(
                model=model_id, config=config, contents=contents
            )
            # Append the model's content (not the raw response object) to the history.
            contents.append(func_gen_response.candidates[0].content)
    return contents[-1].parts[0].text.strip()


result = function_call_loop("What's the weather in Berlin today?")
print(result)

We implement a complete "agentic" loop: it sends your prompt to Gemini, inspects the response for a function call, executes get_weather_forecast (using Geopy plus an Open-Meteo HTTP request), and then feeds the tool's result back into the model to produce and return the final conversational reply.
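The loop above handles a single tool call per response. The same control flow generalizes to a while loop that keeps executing tools until the model stops emitting function calls. The sketch below substitutes a stubbed model so the flow can be followed without an API key; the stub, the fake tool, and the plain-dict message format are hypothetical stand-ins for client.models.generate_content, get_weather_forecast, and the SDK's Content/Part types:

```python
# Stubbed "model": first turn requests a tool, second turn answers in text.
_turns = [
    {"function_call": {"name": "get_weather_forecast",
                       "args": {"location": "Berlin", "date": "2025-03-04"}}},
    {"text": "It is 7°C in Berlin today."},
]

def fake_generate_content(contents):
    # Pick a turn based on how many tool results are already in the history.
    return _turns[sum(1 for m in contents if m["role"] == "tool")]

def fake_tool(location, date):
    return {"2025-03-04T12:00": 7.0}

TOOLS = {"get_weather_forecast": fake_tool}

def agent_loop(prompt: str) -> str:
    contents = [{"role": "user", "text": prompt}]
    while True:
        reply = fake_generate_content(contents)
        call = reply.get("function_call")
        if call is None:                  # no tool requested: final answer
            return reply["text"]
        result = TOOLS[call["name"]](**call["args"])   # the app executes the tool
        contents.append({"role": "tool", "name": call["name"], "result": result})

print(agent_loop("What's the weather in Berlin today?"))
```

In production, the same while-loop shape would also enforce a maximum number of iterations so a misbehaving model cannot request tools forever.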

Function Calling using Python functions

from geopy.geocoders import Nominatim
import requests


geolocator = Nominatim(user_agent="weather-app")


def get_weather_forecast(location: str, date: str) -> dict:
    """
    Retrieves the weather using the Open-Meteo API for a given location (city) and a date (yyyy-mm-dd). Returns a dictionary with the time and temperature for each hour.

    Args:
        location (str): The city and state, e.g., San Francisco, CA
        date (str): The forecast date in the format yyyy-mm-dd
    Returns:
        Dict[str, float]: A dictionary with the time as key and the temperature as value
    """
    location = geolocator.geocode(location)
    if location:
        try:
            response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={location.latitude}&longitude={location.longitude}&hourly=temperature_2m&start_date={date}&end_date={date}")
            data = response.json()
            return {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
        except Exception as e:
            return {"error": str(e)}
    else:
        return {"error": "Location not found"}

The get_weather_forecast function first uses Geopy's Nominatim to convert a city-and-state string into coordinates, then sends an HTTP request to the Open-Meteo API to retrieve hourly temperature data for the given date, returning a dictionary that maps each timestamp to its corresponding temperature. It also handles errors gracefully, returning an error message if the location isn't found or the API call fails.
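The dictionary comprehension at the heart of the function simply zips Open-Meteo's parallel time and temperature_2m arrays. Applied to a trimmed sample of the API's JSON shape (the temperature values below are invented for illustration):

```python
# Trimmed example of the Open-Meteo hourly response structure; values are made up.
data = {
    "hourly": {
        "time": ["2025-03-04T00:00", "2025-03-04T01:00", "2025-03-04T02:00"],
        "temperature_2m": [2.3, 2.1, 1.9],
    }
}

# zip pairs each timestamp with the temperature at the same index.
forecast = {time: temp for time, temp in zip(data["hourly"]["time"], data["hourly"]["temperature_2m"])}
print(forecast["2025-03-04T01:00"])  # 2.1
```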

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that can help with weather-related questions. Today is 2025-03-04.",  # gives the LLM context on the current date
    tools=[get_weather_forecast],
    automatic_function_calling={"disable": True}
)

This config registers your Python get_weather_forecast function as a callable tool. It sets a clear system prompt (including the date) for context, while disabling automatic_function_calling so that Gemini emits the function-call payload instead of invoking the function internally.

r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="What's the weather in Berlin today?"
)
for part in r.candidates[0].content.parts:
    print(part.function_call)

By sending the prompt with your custom config (which includes the Python tool but with automatic calls disabled), this snippet captures Gemini's raw function-call decision. It then loops over each response part to print the .function_call object, letting you inspect exactly which tool the model wants to invoke and with what arguments.

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API. Today is 2025-03-04.",  # gives the LLM context on the current date
    tools=[get_weather_forecast],
)


r = client.models.generate_content(
    model=model_id,
    config=config,
    contents="What's the weather in Berlin today?"
)


print(r.text)

With this config (which includes your get_weather_forecast function and leaves automatic function calling enabled by default), calling generate_content will have Gemini invoke your weather tool behind the scenes and then return a natural-language reply. Printing r.text outputs that final response, including the actual temperature forecast for Berlin on the specified date.

from google.genai.types import GenerateContentConfig


config = GenerateContentConfig(
    system_instruction="You are a helpful assistant that uses tools to access and retrieve information from a weather API.",
    tools=[get_weather_forecast],
)


prompt = """
Today is 2025-03-04. You are chatting with Andrew; you have access to more information about him.


User Context:
- name: Andrew
- location: Nuremberg


User: Can I wear a T-shirt later today?"""


r = client.models.generate_content(
    model=model_id,
    config=config,
    contents=prompt
)


print(r.text)

We extend the assistant with personal context, telling Gemini Andrew's name and location (Nuremberg) and asking if it's T-shirt weather, while still using the get_weather_forecast tool under the hood. It then prints the model's natural-language recommendation based on the actual forecast for that day.
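Under the hood, answering "Can I wear a T-shirt?" amounts to reasoning over the hourly dictionary the tool returns. A hypothetical helper illustrating the kind of decision the model makes from that data (the 18 °C threshold and the sample temperatures are assumptions, chosen purely for illustration):

```python
def tshirt_weather(forecast: dict, threshold: float = 18.0) -> bool:
    """True if any afternoon hour (12:00-18:00) reaches the threshold temperature.

    Keys are assumed to look like "2025-03-04T14:00", as in the Open-Meteo data;
    characters 11-12 of each key hold the hour.
    """
    afternoon = [temp for time, temp in forecast.items() if 12 <= int(time[11:13]) <= 18]
    return bool(afternoon) and max(afternoon) >= threshold

# Invented sample forecast for the afternoon in Nuremberg.
sample = {"2025-03-04T09:00": 8.0, "2025-03-04T14:00": 19.5, "2025-03-04T17:00": 16.0}
print(tshirt_weather(sample))  # True
```

The LLM performs this judgment implicitly from the raw tool output, which is exactly what makes combining personal context with function results so convenient.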

In conclusion, we now know how to define functions (via a JSON schema or Python signatures), configure Gemini 2.0 Flash to detect and emit function calls, and implement the "agentic" loop that executes those calls and composes the final response. With these building blocks, we can extend any LLM into a capable, tool-enabled assistant that automates workflows, retrieves live data, and interacts with your code or APIs as easily as talking to a colleague.


Here is the Colab Notebook.



Asif Razzaq is the CEO of Marktechpost Media Inc. As a visionary entrepreneur and engineer, Asif is committed to harnessing the potential of Artificial Intelligence for social good. His most recent venture is the launch of an Artificial Intelligence media platform, Marktechpost, which stands out for its in-depth coverage of machine learning and deep learning news that is both technically sound and easily understandable by a wide audience. The platform boasts over 2 million monthly views, illustrating its popularity among audiences.



