Function calling reference

This guide provides a reference for using function calling with the Gemini API. It covers the following topics:

  • Supported models: Lists the models that support function calling.
  • Syntax example: Shows the basic structure of a function calling API request.
  • API parameters: Describes the parameters used in function calling, such as FunctionDeclaration and FunctionCallingConfig.
  • Examples: Provides code samples for sending function declarations and for configuring function calling behavior.

Function calling improves the LLM's ability to provide relevant and contextually appropriate answers.

With the Function Calling API, you can provide custom functions to a generative AI model. The model doesn't invoke these functions directly. Instead, it generates structured data output that specifies the function name and suggested arguments. You can use this output to call external APIs or information systems, such as databases, customer relationship management (CRM) systems, and document repositories. You can then provide the resulting API output back to the model to improve the quality of its response.

For a conceptual overview of function calling, see Function calling.

Supported models

Limitations:

  • You can provide a maximum of 128 function declarations with each request.

Syntax example

The following example shows the syntax for a function calling API request.

curl

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  https://${LOCATION}-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/${LOCATION}/publishers/google/models/${MODEL_ID}:generateContent \
  -d '{
    "contents": [{
      ...
    }],
    "tools": [{
      "function_declarations": [
        {
          ...
        }
      ]
    }]
  }'

API parameters

This section describes the parameters for function calling. For implementation details, see the Examples section.

FunctionDeclaration

A FunctionDeclaration defines a function that the model can generate JSON inputs for, based on the OpenAPI 3.0 specification.

Parameters

name

string

The name of the function to call. It must start with a letter or an underscore. It can contain letters (a-z, A-Z), numbers (0-9), underscores, dots, or dashes, with a maximum length of 64 characters.

description

Optional: string

A description of the function's purpose. The model uses this description to decide how and whether to call the function. For best results, we recommend that you include a description.

parameters

Optional: Schema

The parameters of the function, described in OpenAPI JSON Schema Object format.

response

Optional: Schema

The output of the function, described in OpenAPI JSON Schema Object format.
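
For example, the get_current_weather function used in the examples later in this guide could be declared as follows. This is a minimal sketch; the name, description, and parameter schema are illustrative:

{
  "name": "get_current_weather",
  "description": "Get the current weather in a given location",
  "parameters": {
    "type": "object",
    "properties": {
      "location": {
        "type": "string",
        "description": "The city and state, e.g. San Francisco, CA"
      }
    },
    "required": ["location"]
  }
}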

For more information, see Function calling.

Schema

A Schema defines the format of the input and output data in a function call, based on the OpenAPI 3.0 Schema specification.

Parameters

type

string

The data type. Must be one of the following:

  • STRING
  • INTEGER
  • BOOLEAN
  • NUMBER
  • ARRAY
  • OBJECT

description

Optional: string

The description of the data.

enum

Optional: string[]

The possible values of an element of primitive type.

items

Optional: Schema[]

The schema of the elements of type ARRAY.

properties

Optional: Schema

The schemas of the properties of type OBJECT.

required

Optional: string[]

The required properties of type OBJECT.

nullable

Optional: bool

Indicates whether the value can be null.
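
For example, the following fragment from the album-sales example later in this guide describes an ARRAY whose items are OBJECT elements with STRING and INTEGER properties:

{
  "type": "ARRAY",
  "description": "List of albums",
  "items": {
    "type": "OBJECT",
    "description": "Album and its sales",
    "properties": {
      "album_name": {
        "type": "STRING",
        "description": "Name of the music album"
      },
      "copies_sold": {
        "type": "INTEGER",
        "description": "Number of copies sold"
      }
    }
  }
}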

FunctionCallingConfig

FunctionCallingConfig lets you control the model's behavior and determine which function to call.

Parameters

mode

Optional: enum/string[]

  • AUTO: This is the default behavior. The model decides whether to call a function or respond with natural language based on the context.
  • NONE: The model doesn't call any functions.
  • ANY: The model is constrained to always predict a function call. If you don't provide allowed_function_names, the model picks from all of the available function declarations. If you provide allowed_function_names, the model picks from that set of functions.

allowed_function_names

Optional: string[]

A list of function names to call. You can set this only when the mode is ANY. The function names must match FunctionDeclaration.name. When the mode is ANY, the model predicts a function call from the list of function names that you provide.
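
For example, the following toolConfig fragment, taken from the REST example later in this guide, constrains the model to predicting a call to the get_product_sku function:

"toolConfig": {
  "functionCallingConfig": {
    "mode": "ANY",
    "allowedFunctionNames": ["get_product_sku"]
  }
}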

functionCall

A functionCall is a prediction returned by the model. It contains the name of the function to call (functionDeclaration.name) and a structured JSON object with the parameters and their values.

Parameters

name

string

The name of the function to call.

args

Struct

The function parameters and their values in JSON object format.
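
For example, with the get_current_weather declaration used in the examples in this guide, the model might return a part similar to the following (the argument values depend on the prompt):

{
  "functionCall": {
    "name": "get_current_weather",
    "args": {
      "location": "Boston"
    }
  }
}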

See Function calling for parameter details.

functionResponse

A functionResponse is the output of a FunctionCall. It contains the name of the function that was called and a structured JSON object with the function's output. You provide this response back to the model to use as context.

Parameters

name

string

The name of the function that was called.

response

Struct

The function response in JSON object format.
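
For example, after calling your own weather API, you might send a part like the following back to the model. The fields inside response are illustrative:

{
  "functionResponse": {
    "name": "get_current_weather",
    "response": {
      "temperature": "38",
      "temperature_unit": "F",
      "description": "Cold and cloudy"
    }
  }
}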

Examples

Send function declarations

The following example shows how to send a query and function declarations to the model.

REST

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • MODEL_ID: The ID of the model that processes the request.
  • ROLE: The identity of the entity that creates the message.
  • TEXT: The prompt to send to the model.
  • NAME: The name of the function to call.
  • DESCRIPTION: A description of the function and its purpose.
  • For the other fields, see the Parameter list table.

HTTP method and URL:

POST https://aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/global/publishers/google/models/MODEL_ID:generateContent

Request JSON body:

 {   "contents": [{     "role": "ROLE",     "parts": [{       "text": "TEXT"     }]   }],   "tools": [{     "function_declarations": [       {         "name": "NAME",         "description": "DESCRIPTION",         "parameters": {           "type": "TYPE",           "properties": {             "location": {               "type": "TYPE",               "description": "DESCRIPTION"             }           },           "required": [             "location"           ]         }       }     ]   }] } 

To send your request, choose one of these options:

curl

Save the request body in a file named request.json, and run the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/global/publishers/google/models/MODEL_ID:generateContent"

PowerShell

Save the request body in a file named request.json, and run the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/global/publishers/google/models/MODEL_ID:generateContent" | Select-Object -Expand Content

Example curl command

PROJECT_ID=myproject
LOCATION=us-central1
MODEL_ID=gemini-2.5-flash

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  https://${LOCATION}-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/${LOCATION}/publishers/google/models/${MODEL_ID}:generateContent \
  -d '{
    "contents": [{
      "role": "user",
      "parts": [{
        "text": "What is the weather in Boston?"
      }]
    }],
    "tools": [{
      "functionDeclarations": [
        {
          "name": "get_current_weather",
          "description": "Get the current weather in a given location",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616"
              }
            },
            "required": [
              "location"
            ]
          }
        }
      ]
    }]
  }'

Gen AI SDK for Python

from google import genai
from google.genai.types import GenerateContentConfig, HttpOptions

def get_current_weather(location: str) -> str:
    """Example method. Returns the current weather.

    Args:
        location: The city and state, e.g. San Francisco, CA
    """
    weather_map: dict[str, str] = {
        "Boston, MA": "snowing",
        "San Francisco, CA": "foggy",
        "Seattle, WA": "raining",
        "Austin, TX": "hot",
        "Chicago, IL": "windy",
    }
    return weather_map.get(location, "unknown")

client = genai.Client(http_options=HttpOptions(api_version="v1"))
model_id = "gemini-2.5-flash"

response = client.models.generate_content(
    model=model_id,
    contents="What is the weather like in Boston?",
    config=GenerateContentConfig(
        tools=[get_current_weather],
        temperature=0,
    ),
)

print(response.text)
# Example response:
# The weather in Boston is sunny.

Node.js

const {
  VertexAI,
  FunctionDeclarationSchemaType,
} = require('@google-cloud/vertexai');

const functionDeclarations = [
  {
    function_declarations: [
      {
        name: 'get_current_weather',
        description: 'get weather in a given location',
        parameters: {
          type: FunctionDeclarationSchemaType.OBJECT,
          properties: {
            location: {type: FunctionDeclarationSchemaType.STRING},
            unit: {
              type: FunctionDeclarationSchemaType.STRING,
              enum: ['celsius', 'fahrenheit'],
            },
          },
          required: ['location'],
        },
      },
    ],
  },
];

/**
 * TODO(developer): Update these variables before running the sample.
 */
async function functionCallingBasic(
  projectId = 'PROJECT_ID',
  location = 'us-central1',
  model = 'gemini-2.0-flash-001'
) {
  // Initialize Vertex with your Cloud project and location
  const vertexAI = new VertexAI({project: projectId, location: location});

  // Instantiate the model
  const generativeModel = vertexAI.preview.getGenerativeModel({
    model: model,
  });

  const request = {
    contents: [
      {role: 'user', parts: [{text: 'What is the weather in Boston?'}]},
    ],
    tools: functionDeclarations,
  };
  const result = await generativeModel.generateContent(request);
  console.log(JSON.stringify(result.response.candidates[0].content));
}

Java

import com.google.cloud.vertexai.VertexAI;
import com.google.cloud.vertexai.api.Content;
import com.google.cloud.vertexai.api.FunctionDeclaration;
import com.google.cloud.vertexai.api.GenerateContentResponse;
import com.google.cloud.vertexai.api.Schema;
import com.google.cloud.vertexai.api.Tool;
import com.google.cloud.vertexai.api.Type;
import com.google.cloud.vertexai.generativeai.ChatSession;
import com.google.cloud.vertexai.generativeai.ContentMaker;
import com.google.cloud.vertexai.generativeai.GenerativeModel;
import com.google.cloud.vertexai.generativeai.PartMaker;
import com.google.cloud.vertexai.generativeai.ResponseHandler;
import java.io.IOException;
import java.util.Arrays;
import java.util.Collections;

public class FunctionCalling {
  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-google-cloud-project-id";
    String location = "us-central1";
    String modelName = "gemini-2.0-flash-001";

    String promptText = "What's the weather like in Paris?";

    whatsTheWeatherLike(projectId, location, modelName, promptText);
  }

  // A request involving the interaction with an external tool
  public static String whatsTheWeatherLike(String projectId, String location,
                                           String modelName, String promptText)
      throws IOException {
    // Initialize client that will be used to send requests.
    // This client only needs to be created once, and can be reused for multiple requests.
    try (VertexAI vertexAI = new VertexAI(projectId, location)) {

      FunctionDeclaration functionDeclaration = FunctionDeclaration.newBuilder()
          .setName("getCurrentWeather")
          .setDescription("Get the current weather in a given location")
          .setParameters(
              Schema.newBuilder()
                  .setType(Type.OBJECT)
                  .putProperties("location", Schema.newBuilder()
                      .setType(Type.STRING)
                      .setDescription("location")
                      .build()
                  )
                  .addRequired("location")
                  .build()
          )
          .build();

      System.out.println("Function declaration:");
      System.out.println(functionDeclaration);

      // Add the function to a "tool"
      Tool tool = Tool.newBuilder()
          .addFunctionDeclarations(functionDeclaration)
          .build();

      // Start a chat session from a model, with the use of the declared function.
      GenerativeModel model = new GenerativeModel(modelName, vertexAI)
          .withTools(Arrays.asList(tool));
      ChatSession chat = model.startChat();

      System.out.println(String.format("Ask the question: %s", promptText));
      GenerateContentResponse response = chat.sendMessage(promptText);

      // The model will most likely return a function call to the declared
      // function `getCurrentWeather` with "Paris" as the value for the
      // argument `location`.
      System.out.println("\nPrint response: ");
      System.out.println(ResponseHandler.getContent(response));

      // Provide an answer to the model so that it knows what the result
      // of a "function call" is.
      Content content =
          ContentMaker.fromMultiModalData(
              PartMaker.fromFunctionResponse(
                  "getCurrentWeather",
                  Collections.singletonMap("currentWeather", "sunny")));
      System.out.println("Provide the function response: ");
      System.out.println(content);
      response = chat.sendMessage(content);

      // See what the model replies now
      System.out.println("Print response: ");
      String finalAnswer = ResponseHandler.getText(response);
      System.out.println(finalAnswer);

      return finalAnswer;
    }
  }
}

Go

import (
	"context"
	"fmt"
	"io"

	genai "google.golang.org/genai"
)

// generateWithFuncCall shows how to submit a prompt and a function declaration to the model,
// allowing it to suggest a call to the function to fetch external data. Returning this data
// enables the model to generate a text response that incorporates the data.
func generateWithFuncCall(w io.Writer) error {
	ctx := context.Background()

	client, err := genai.NewClient(ctx, &genai.ClientConfig{
		HTTPOptions: genai.HTTPOptions{APIVersion: "v1"},
	})
	if err != nil {
		return fmt.Errorf("failed to create genai client: %w", err)
	}

	weatherFunc := &genai.FunctionDeclaration{
		Description: "Returns the current weather in a location.",
		Name:        "getCurrentWeather",
		Parameters: &genai.Schema{
			Type: "object",
			Properties: map[string]*genai.Schema{
				"location": {Type: "string"},
			},
			Required: []string{"location"},
		},
	}
	config := &genai.GenerateContentConfig{
		Tools: []*genai.Tool{
			{FunctionDeclarations: []*genai.FunctionDeclaration{weatherFunc}},
		},
		Temperature: genai.Ptr(float32(0.0)),
	}

	modelName := "gemini-2.5-flash"
	contents := []*genai.Content{
		{Parts: []*genai.Part{
			{Text: "What is the weather like in Boston?"},
		},
			Role: "user"},
	}

	resp, err := client.Models.GenerateContent(ctx, modelName, contents, config)
	if err != nil {
		return fmt.Errorf("failed to generate content: %w", err)
	}

	var funcCall *genai.FunctionCall
	for _, p := range resp.Candidates[0].Content.Parts {
		if p.FunctionCall != nil {
			funcCall = p.FunctionCall
			fmt.Fprint(w, "The model suggests to call the function ")
			fmt.Fprintf(w, "%q with args: %v\n", funcCall.Name, funcCall.Args)
			// Example response:
			// The model suggests to call the function "getCurrentWeather" with args: map[location:Boston]
		}
	}
	if funcCall == nil {
		return fmt.Errorf("model did not suggest a function call")
	}

	// Use synthetic data to simulate a response from the external API.
	// In a real application, this would come from an actual weather API.
	funcResp := &genai.FunctionResponse{
		Name: "getCurrentWeather",
		Response: map[string]any{
			"location":         "Boston",
			"temperature":      "38",
			"temperature_unit": "F",
			"description":      "Cold and cloudy",
			"humidity":         "65",
			"wind":             `{"speed": "10", "direction": "NW"}`,
		},
	}

	// Return conversation turns and API response to complete the model's response.
	contents = []*genai.Content{
		{Parts: []*genai.Part{
			{Text: "What is the weather like in Boston?"},
		},
			Role: "user"},
		{Parts: []*genai.Part{
			{FunctionCall: funcCall},
		}},
		{Parts: []*genai.Part{
			{FunctionResponse: funcResp},
		}},
	}

	resp, err = client.Models.GenerateContent(ctx, modelName, contents, config)
	if err != nil {
		return fmt.Errorf("failed to generate content: %w", err)
	}

	respText := resp.Text()

	fmt.Fprintln(w, respText)

	// Example response:
	// The weather in Boston is cold and cloudy with a temperature of 38 degrees Fahrenheit. The humidity is ...

	return nil
}

REST (OpenAI)

You can call the Function Calling API by using the OpenAI library. For more information, see Call Vertex AI models by using the OpenAI library.

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • MODEL_ID: The ID of the model that processes the request.

HTTP method and URL:

POST https://aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/endpoints/openapi/chat/completions

Request JSON body:

 {   "model": "google/MODEL_ID",   "messages": [     {       "role": "user",       "content": "What is the weather in Boston?"     }   ],   "tools": [     {       "type": "function",       "function": {         "name": "get_current_weather",         "description": "Get the current weather in a given location",         "parameters": {           "type": "OBJECT",           "properties": {             "location": {               "type": "string",               "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616"             }            },           "required": ["location"]         }       }     }   ] } 

To send your request, choose one of these options:

curl

Save the request body in a file named request.json, and run the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/endpoints/openapi/chat/completions"

PowerShell

Save the request body in a file named request.json, and run the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/endpoints/openapi/chat/completions" | Select-Object -Expand Content

Python (OpenAI)

You can call the Function Calling API by using the OpenAI library. For more information, see Call Vertex AI models by using the OpenAI library.

import vertexai
import openai

from google.auth import default, transport

# TODO(developer): Update & uncomment below line
# PROJECT_ID = "your-project-id"
location = "us-central1"

vertexai.init(project=PROJECT_ID, location=location)

# Programmatically get an access token
credentials, _ = default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
auth_request = transport.requests.Request()
credentials.refresh(auth_request)

# OpenAI Client
client = openai.OpenAI(
    base_url=f"https://{location}-aiplatform.googleapis.com/v1beta1/projects/{PROJECT_ID}/locations/{location}/endpoints/openapi",
    api_key=credentials.token,
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616",
                    },
                },
                "required": ["location"],
            },
        },
    }
]

messages = []
messages.append(
    {
        "role": "system",
        "content": "Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.",
    }
)
messages.append({"role": "user", "content": "What is the weather in Boston?"})

response = client.chat.completions.create(
    model="google/gemini-2.0-flash-001",
    messages=messages,
    tools=tools,
)

print("Function:", response.choices[0].message.tool_calls[0].id)
print("Arguments:", response.choices[0].message.tool_calls[0].function.arguments)
# Example response:
# Function: get_current_weather
# Arguments: {"location":"Boston"}

Configure function calling behavior

The following examples show how to pass a FunctionCallingConfig to the model.

You can use functionCallingConfig to require the model to generate a specific function call. To configure this behavior:

  • Set the function calling mode to ANY.
  • Specify the function names that you want to use in allowed_function_names. If allowed_function_names is empty, any of the provided functions can be returned.

REST

PROJECT_ID=myproject
LOCATION=us-central1
MODEL_ID=gemini-2.5-flash

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  https://${LOCATION}-aiplatform.googleapis.com/v1beta1/projects/${PROJECT_ID}/locations/${LOCATION}/publishers/google/models/${MODEL_ID}:generateContent \
  -d '{
    "contents": [{
      "role": "user",
      "parts": [{
        "text": "Do you have the White Pixel 8 Pro 128GB in stock in the US?"
      }]
    }],
    "tools": [{
      "functionDeclarations": [
        {
          "name": "get_product_sku",
          "description": "Get the available inventory for a Google products, e.g: Pixel phones, Pixel Watches, Google Home etc",
          "parameters": {
            "type": "object",
            "properties": {
              "product_name": {"type": "string", "description": "Product name"}
            }
          }
        },
        {
          "name": "get_store_location",
          "description": "Get the location of the closest store",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {"type": "string", "description": "Location"}
            }
          }
        }
      ]
    }],
    "toolConfig": {
      "functionCallingConfig": {
        "mode": "ANY",
        "allowedFunctionNames": ["get_product_sku"]
      }
    },
    "generationConfig": {
      "temperature": 0.95,
      "topP": 1.0,
      "maxOutputTokens": 8192
    }
  }'

Gen AI SDK for Python

from google import genai
from google.genai.types import (
    FunctionDeclaration,
    GenerateContentConfig,
    HttpOptions,
    Tool,
)

client = genai.Client(http_options=HttpOptions(api_version="v1"))
model_id = "gemini-2.5-flash"

get_album_sales = FunctionDeclaration(
    name="get_album_sales",
    description="Gets the number of albums sold",
    # Function parameters are specified in JSON schema format
    parameters={
        "type": "OBJECT",
        "properties": {
            "albums": {
                "type": "ARRAY",
                "description": "List of albums",
                "items": {
                    "description": "Album and its sales",
                    "type": "OBJECT",
                    "properties": {
                        "album_name": {
                            "type": "STRING",
                            "description": "Name of the music album",
                        },
                        "copies_sold": {
                            "type": "INTEGER",
                            "description": "Number of copies sold",
                        },
                    },
                },
            },
        },
    },
)

sales_tool = Tool(
    function_declarations=[get_album_sales],
)

response = client.models.generate_content(
    model=model_id,
    contents='At Stellar Sounds, a music label, 2024 was a rollercoaster. "Echoes of the Night," a debut synth-pop album, '
    'surprisingly sold 350,000 copies, while veteran rock band "Crimson Tide\'s" latest, "Reckless Hearts," '
    'lagged at 120,000. Their up-and-coming indie artist, "Luna Bloom\'s" EP, "Whispers of Dawn," '
    'secured 75,000 sales. The biggest disappointment was the highly-anticipated rap album "Street Symphony" '
    "only reaching 100,000 units. Overall, Stellar Sounds moved over 645,000 units this year, revealing unexpected "
    "trends in music consumption.",
    config=GenerateContentConfig(
        tools=[sales_tool],
        temperature=0,
    ),
)

print(response.function_calls)
# Example response:
# [FunctionCall(
#     id=None,
#     name="get_album_sales",
#     args={
#         "albums": [
#             {"album_name": "Echoes of the Night", "copies_sold": 350000},
#             {"copies_sold": 120000, "album_name": "Reckless Hearts"},
#             {"copies_sold": 75000, "album_name": "Whispers of Dawn"},
#             {"copies_sold": 100000, "album_name": "Street Symphony"},
#         ]
#     },
# )]

Node.js

const {
  VertexAI,
  FunctionDeclarationSchemaType,
} = require('@google-cloud/vertexai');

const functionDeclarations = [
  {
    function_declarations: [
      {
        name: 'get_product_sku',
        description:
          'Get the available inventory for a Google products, e.g: Pixel phones, Pixel Watches, Google Home etc',
        parameters: {
          type: FunctionDeclarationSchemaType.OBJECT,
          properties: {
            productName: {type: FunctionDeclarationSchemaType.STRING},
          },
        },
      },
      {
        name: 'get_store_location',
        description: 'Get the location of the closest store',
        parameters: {
          type: FunctionDeclarationSchemaType.OBJECT,
          properties: {
            location: {type: FunctionDeclarationSchemaType.STRING},
          },
        },
      },
    ],
  },
];

const toolConfig = {
  function_calling_config: {
    mode: 'ANY',
    allowed_function_names: ['get_product_sku'],
  },
};

const generationConfig = {
  temperature: 0.95,
  topP: 1.0,
  maxOutputTokens: 8192,
};

/**
 * TODO(developer): Update these variables before running the sample.
 */
async function functionCallingAdvanced(
  projectId = 'PROJECT_ID',
  location = 'us-central1',
  model = 'gemini-2.0-flash-001'
) {
  // Initialize Vertex with your Cloud project and location
  const vertexAI = new VertexAI({project: projectId, location: location});

  // Instantiate the model
  const generativeModel = vertexAI.preview.getGenerativeModel({
    model: model,
  });

  const request = {
    contents: [
      {
        role: 'user',
        parts: [
          {text: 'Do you have the White Pixel 8 Pro 128GB in stock in the US?'},
        ],
      },
    ],
    tools: functionDeclarations,
    tool_config: toolConfig,
    generation_config: generationConfig,
  };
  const result = await generativeModel.generateContent(request);
  console.log(JSON.stringify(result.response.candidates[0].content));
}

Go

import (
	"context"
	"encoding/json"
	"errors"
	"fmt"
	"io"

	"cloud.google.com/go/vertexai/genai"
)

// functionCallsChat opens a chat session and sends 4 messages to the model:
// - convert a first text question into a structured function call request
// - convert the first structured function call response into natural language
// - convert a second text question into a structured function call request
// - convert the second structured function call response into natural language
func functionCallsChat(w io.Writer, projectID, location, modelName string) error {
	// location := "us-central1"
	// modelName := "gemini-2.0-flash-001"
	ctx := context.Background()
	client, err := genai.NewClient(ctx, projectID, location)
	if err != nil {
		return fmt.Errorf("unable to create client: %w", err)
	}
	defer client.Close()

	model := client.GenerativeModel(modelName)

	// Build an OpenAPI schema, in memory
	paramsProduct := &genai.Schema{
		Type: genai.TypeObject,
		Properties: map[string]*genai.Schema{
			"productName": {
				Type:        genai.TypeString,
				Description: "Product name",
			},
		},
	}
	fundeclProductInfo := &genai.FunctionDeclaration{
		Name:        "getProductSku",
		Description: "Get the SKU for a product",
		Parameters:  paramsProduct,
	}
	paramsStore := &genai.Schema{
		Type: genai.TypeObject,
		Properties: map[string]*genai.Schema{
			"location": {
				Type:        genai.TypeString,
				Description: "Location",
			},
		},
	}
	fundeclStoreLocation := &genai.FunctionDeclaration{
		Name:        "getStoreLocation",
		Description: "Get the location of the closest store",
		Parameters:  paramsStore,
	}
	model.Tools = []*genai.Tool{
		{FunctionDeclarations: []*genai.FunctionDeclaration{
			fundeclProductInfo,
			fundeclStoreLocation,
		}},
	}
	model.SetTemperature(0.0)

	chat := model.StartChat()

	// Send a prompt for the first conversation turn that should invoke the getProductSku function
	prompt := "Do you have the Pixel 8 Pro in stock?"
	fmt.Fprintf(w, "Question: %s\n", prompt)
	resp, err := chat.SendMessage(ctx, genai.Text(prompt))
	if err != nil {
		return err
	}
	if len(resp.Candidates) == 0 ||
		len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("empty response from model")
	}

	// The model has returned a function call to the declared function `getProductSku`
	// with a value for the argument `productName`.
	jsondata, err := json.MarshalIndent(resp.Candidates[0].Content.Parts[0], "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "function call generated by the model:\n\t%s\n", string(jsondata))

	// Create a function call response, to simulate the result of a call to a
	// real service
	funresp := &genai.FunctionResponse{
		Name: "getProductSku",
		Response: map[string]any{
			"sku":      "GA04834-US",
			"in_stock": "yes",
		},
	}
	jsondata, err = json.MarshalIndent(funresp, "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "function call response sent to the model:\n\t%s\n\n", string(jsondata))

	// And provide the function call response to the model
	resp, err = chat.SendMessage(ctx, funresp)
	if err != nil {
		return err
	}
	if len(resp.Candidates) == 0 ||
		len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("empty response from model")
	}

	// The model has taken the function call response as input, and has
	// reformulated the response to the user.
	jsondata, err = json.MarshalIndent(resp.Candidates[0].Content.Parts[0], "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "Answer generated by the model:\n\t%s\n\n", string(jsondata))

	// Send a prompt for the second conversation turn that should invoke the getStoreLocation function
	prompt2 := "Is there a store in Mountain View, CA that I can visit to try it out?"
	fmt.Fprintf(w, "Question: %s\n", prompt2)

	resp, err = chat.SendMessage(ctx, genai.Text(prompt2))
	if err != nil {
		return err
	}
	if len(resp.Candidates) == 0 ||
		len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("empty response from model")
	}

	// The model has returned a function call to the declared function `getStoreLocation`
	// with a value for the argument `store`.
	jsondata, err = json.MarshalIndent(resp.Candidates[0].Content.Parts[0], "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "function call generated by the model:\n\t%s\n", string(jsondata))

	// Create a function call response, to simulate the result of a call to a
	// real service
	funresp = &genai.FunctionResponse{
		Name: "getStoreLocation",
		Response: map[string]any{
			"store": "2000 N Shoreline Blvd, Mountain View, CA 94043, US",
		},
	}
	jsondata, err = json.MarshalIndent(funresp, "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "function call response sent to the model:\n\t%s\n\n", string(jsondata))

	// And provide the function call response to the model
	resp, err = chat.SendMessage(ctx, funresp)
	if err != nil {
		return err
	}
	if len(resp.Candidates) == 0 ||
		len(resp.Candidates[0].Content.Parts) == 0 {
		return errors.New("empty response from model")
	}

	// The model has taken the function call response as input, and has
	// reformulated the response to the user.
	jsondata, err = json.MarshalIndent(resp.Candidates[0].Content.Parts[0], "\t", "  ")
	if err != nil {
		return fmt.Errorf("json.MarshalIndent: %w", err)
	}
	fmt.Fprintf(w, "Answer generated by the model:\n\t%s\n\n", string(jsondata))
	return nil
}

REST (OpenAI)

You can call the Function Calling API by using the OpenAI library. For more information, see Call Vertex AI models by using the OpenAI library.

Before using any of the request data, make the following replacements:

  • PROJECT_ID: Your project ID.
  • MODEL_ID: The ID of the model that processes the request.

HTTP method and URL:

POST https://aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/endpoints/openapi/chat/completions

Request JSON body:

 {   "model": "google/MODEL_ID",   "messages": [   {     "role": "user",     "content": "What is the weather in Boston?"   } ], "tools": [   {     "type": "function",     "function": {       "name": "get_current_weather",       "description": "Get the current weather in a given location",       "parameters": {         "type": "OBJECT",         "properties": {           "location": {             "type": "string",             "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616"           }          },         "required": ["location"]       }     }   } ], "tool_choice": "auto" } 

To send your request, choose one of these options:

curl

Save the request body in a file named request.json, and run the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/endpoints/openapi/chat/completions"

PowerShell

Save the request body in a file named request.json, and run the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/global/endpoints/openapi/chat/completions" | Select-Object -Expand Content

Python (OpenAI)

You can call the Function Calling API by using the OpenAI library. For more information, see Call Vertex AI models by using the OpenAI library.

import vertexai
import openai

from google.auth import default, transport

# TODO(developer): Update & uncomment below line
# PROJECT_ID = "your-project-id"
location = "us-central1"

vertexai.init(project=PROJECT_ID, location=location)

# Programmatically get an access token
credentials, _ = default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
auth_request = transport.requests.Request()
credentials.refresh(auth_request)

# OpenAI Client
client = openai.OpenAI(
    base_url=f"https://{location}-aiplatform.googleapis.com/v1beta1/projects/{PROJECT_ID}/locations/{location}/endpoints/openapi",
    api_key=credentials.token,
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA or a zip code e.g. 95616",
                    },
                },
                "required": ["location"],
            },
        },
    }
]

messages = []
messages.append(
    {
        "role": "system",
        "content": "Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.",
    }
)
messages.append({"role": "user", "content": "What is the weather in Boston, MA?"})

response = client.chat.completions.create(
    model="google/gemini-2.0-flash-001",
    messages=messages,
    tools=tools,
    tool_choice="auto",
)

print("Function:", response.choices[0].message.tool_calls[0].id)
print("Arguments:", response.choices[0].message.tool_calls[0].function.arguments)
# Example response:
# Function: get_current_weather
# Arguments: {"location":"Boston"}

What's next

To learn more, see the following documentation: