How to Create a Custom Model Context Protocol (MCP) Client Using Gemini


In this tutorial, we will implement a custom Model Context Protocol (MCP) client using Gemini. By the end of this tutorial, you will be able to connect your own AI applications to MCP servers, unlocking powerful new capabilities to supercharge your projects.

Gemini API

We'll be using the Gemini 2.0 Flash model for this tutorial.

To get your Gemini API key, visit Google's Gemini API Key page and follow the instructions.

Once you have the key, store it safely; you'll need it later.

Node.js

Some of the MCP servers require Node.js to run. Download the latest version of Node.js from nodejs.org

  • Run the installer.
  • Leave all settings as default and complete the installation.

National Park Services API

For this tutorial, we will be exposing the National Park Services MCP server to our client. To use the National Park Service API, you can request an API key by visiting this link and filling out a short form. Once submitted, the API key will be sent to your email.

Make sure to keep this key handy; we'll be using it shortly.
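Before wiring the key into the MCP server config, it can be worth sanity-checking it against the public NPS API. The sketch below builds a request URL for the /parks endpoint; `build_parks_url` is a hypothetical helper for illustration, not part of the tutorial's client.

```python
# Sketch: build a request URL for the public NPS /parks endpoint,
# which accepts the key as an `api_key` query parameter.
# build_parks_url is a hypothetical helper, not part of client.py.

def build_parks_url(api_key: str, park_code: str = "yose") -> str:
    base = "https://developer.nps.gov/api/v1/parks"
    return f"{base}?parkCode={park_code}&api_key={api_key}"

if __name__ == "__main__":
    # With the `requests` library installed, you could verify the key with:
    #   import requests
    #   print(requests.get(build_parks_url("YOUR_NPS_API_KEY")).status_code)
    print(build_parks_url("YOUR_NPS_API_KEY"))
```

A 200 status code on that request confirms the key works before you debug anything at the MCP layer.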

Installing Python libraries

In the command prompt, enter the following command to install the Python libraries:

pip install mcp python-dotenv google-genai

Creating the mcp.json file

Next, create a file named mcp.json.

This file will store configuration information about the MCP servers your client will connect to.

Once the file is created, add the following initial content:

{
    "mcpServers": {
      "nationalparks": {
        "command": "npx",
        "args": ["-y", "mcp-server-nationalparks"],
        "env": {
            "NPS_API_KEY": "YOUR_NPS_API_KEY"
        }
      }
    }
}

Replace YOUR_NPS_API_KEY with the key you generated.

Creating the .env file

Create a .env file in the same directory as the mcp.json file and enter the following line:

GEMINI_API_KEY=YOUR_GEMINI_API_KEY

Replace YOUR_GEMINI_API_KEY with the key you generated.
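Since the client reads this key from the environment, it helps to fail fast when the variable is missing rather than pass None to the Gemini client. A minimal sketch (the `require_api_key` helper is hypothetical, not part of the tutorial's client.py):

```python
import os

def require_api_key(name: str = "GEMINI_API_KEY") -> str:
    # Read the key from the environment (load_dotenv() populates it
    # from .env) and raise a clear error if it is absent.
    key = os.getenv(name)
    if not key:
        raise RuntimeError(f"{name} is not set; add it to your .env file")
    return key
```

Calling a helper like this once at startup turns a confusing downstream authentication failure into an immediate, readable error.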

We will now create a client.py file to implement our MCP client. Make sure this file is in the same directory as mcp.json and .env

Basic Client Structure

We will first import the necessary libraries and create a basic client class:

import asyncio
import json
import os
from typing import List, Optional
from contextlib import AsyncExitStack
import warnings

from google import genai
from google.genai import types
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from dotenv import load_dotenv

load_dotenv()
warnings.filterwarnings("ignore", category=ResourceWarning)

def clean_schema(schema):  # Cleans the schema by keeping only allowed keys
    allowed_keys = {"type", "properties", "required", "description", "title", "default", "enum"}
    return {k: v for k, v in schema.items() if k in allowed_keys}

class MCPGeminiAgent:
    def __init__(self):
        self.session: Optional[ClientSession] = None
        self.exit_stack = AsyncExitStack()
        self.genai_client = genai.Client(api_key=os.getenv("GEMINI_API_KEY"))
        self.model = "gemini-2.0-flash"
        self.tools = None
        self.server_params = None
        self.server_name = None

The __init__ method initializes the MCPGeminiAgent by setting up an asynchronous exit stack, loading the Gemini API client, and preparing placeholders for the model configuration, tools, and server details.

It lays the foundation for managing server connections and interacting with the Gemini model.

Selecting the MCP Server

    async def select_server(self):
        with open('mcp.json', 'r') as f:
            mcp_config = json.load(f)
        servers = mcp_config['mcpServers']
        server_names = list(servers.keys())
        print("Available MCP servers:")
        for idx, name in enumerate(server_names):
            print(f"  {idx+1}. {name}")
        while True:
            try:
                choice = int(input(f"Please select a server by number [1-{len(server_names)}]: "))
                if 1 <= choice <= len(server_names):
                    break
                else:
                    print("That number is not valid. Please try again.")
            except ValueError:
                print("Please enter a valid number.")
        self.server_name = server_names[choice-1]
        server_cfg = servers[self.server_name]
        command = server_cfg['command']
        args = server_cfg.get('args', [])
        env = server_cfg.get('env', None)
        self.server_params = StdioServerParameters(
            command=command,
            args=args,
            env=env
        )

This method prompts the user to choose a server from the available options listed in mcp.json. It loads and prepares the selected server's connection parameters for later use.

Connecting to the MCP Server

    async def connect(self):
        await self.select_server()
        self.stdio_transport = await self.exit_stack.enter_async_context(stdio_client(self.server_params))
        self.stdio, self.write = self.stdio_transport
        self.session = await self.exit_stack.enter_async_context(ClientSession(self.stdio, self.write))
        await self.session.initialize()
        print(f"Successfully connected to: {self.server_name}")
        # List available tools for this server
        mcp_tools = await self.session.list_tools()
        print("\nAvailable MCP tools for this server:")
        for tool in mcp_tools.tools:
            print(f"- {tool.name}: {tool.description}")

This establishes an asynchronous connection to the selected MCP server using stdio transport. It initializes the MCP session and retrieves the available tools from the server.

Handling user queries and tool calls

    async def agent_loop(self, prompt: str):
        contents = [types.Content(role="user", parts=[types.Part(text=prompt)])]
        mcp_tools = await self.session.list_tools()
        tools = types.Tool(function_declarations=[
            {
                "name": tool.name,
                "description": tool.description,
                "parameters": clean_schema(getattr(tool, "inputSchema", {}))
            }
            for tool in mcp_tools.tools
        ])
        self.tools = tools
        response = await self.genai_client.aio.models.generate_content(
            model=self.model,
            contents=contents,
            config=types.GenerateContentConfig(
                temperature=0,
                tools=[tools],
            ),
        )
        contents.append(response.candidates[0].content)
        turn_count = 0
        max_tool_turns = 5
        while response.function_calls and turn_count < max_tool_turns:
            turn_count += 1
            tool_response_parts: List[types.Part] = []
            for fc_part in response.function_calls:
                tool_name = fc_part.name
                args = fc_part.args or {}
                print(f"Invoking MCP tool '{tool_name}' with arguments: {args}")
                tool_response: dict
                try:
                    tool_result = await self.session.call_tool(tool_name, args)
                    print(f"Tool '{tool_name}' completed.")
                    if tool_result.isError:
                        tool_response = {"error": tool_result.content[0].text}
                    else:
                        tool_response = {"result": tool_result.content[0].text}
                except Exception as e:
                    tool_response = {"error": f"Tool execution failed: {type(e).__name__}: {e}"}
                tool_response_parts.append(
                    types.Part.from_function_response(
                        name=tool_name, response=tool_response
                    )
                )
            contents.append(types.Content(role="user", parts=tool_response_parts))
            print(f"Added {len(tool_response_parts)} tool response(s) to the conversation.")
            print("Requesting updated response from Gemini...")
            response = await self.genai_client.aio.models.generate_content(
                model=self.model,
                contents=contents,
                config=types.GenerateContentConfig(
                    temperature=1.0,
                    tools=[tools],
                ),
            )
            contents.append(response.candidates[0].content)
        if turn_count >= max_tool_turns and response.function_calls:
            print(f"Stopped after {max_tool_turns} tool calls to avoid infinite loops.")
        print("All tool calls complete. Displaying Gemini's final response.")
        return response

This method sends the user's prompt to Gemini, processes any tool calls returned by the model, executes the corresponding MCP tools, and iteratively refines the response. It manages multi-turn interactions between Gemini and the server tools.

Interactive Chat Loop

    async def chat(self):
        print(f"\nMCP-Gemini Assistant is ready and connected to: {self.server_name}")
        print("Enter your question below, or type 'quit' to exit.")
        while True:
            try:
                query = input("\nYour query: ").strip()
                if query.lower() == 'quit':
                    print("Session ended. Goodbye!")
                    break
                print("Processing your request...")
                res = await self.agent_loop(query)
                print("\nGemini's answer:")
                print(res.text)
            except KeyboardInterrupt:
                print("\nSession interrupted. Goodbye!")
                break
            except Exception as e:
                print(f"\nAn error occurred: {str(e)}")

This provides a command-line interface where users can submit queries and receive answers from Gemini, continuing until they exit the session.

Cleaning up resources

    async def cleanup(self):
        await self.exit_stack.aclose()

This closes the asynchronous context and gracefully cleans up all open resources, such as the session and connection stack.
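To see why a single aclose() is enough, here is a self-contained sketch of how AsyncExitStack unwinds its contexts in reverse order; the `resource` context manager is an illustrative stand-in for the stdio transport and MCP session, not real MCP code.

```python
import asyncio
from contextlib import AsyncExitStack, asynccontextmanager

events = []

@asynccontextmanager
async def resource(name):
    # Stand-in for stdio_client(...) / ClientSession(...).
    events.append(f"open {name}")
    try:
        yield name
    finally:
        events.append(f"close {name}")

async def demo():
    stack = AsyncExitStack()
    await stack.enter_async_context(resource("transport"))
    await stack.enter_async_context(resource("session"))
    # aclose() unwinds in reverse: the session closes before
    # the transport it depends on.
    await stack.aclose()

asyncio.run(demo())
print(events)
```

This reverse-order unwinding is why the client can register the transport and session separately yet tear both down safely with one call.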

Main entry point

async def main():
    agent = MCPGeminiAgent()
    try:
        await agent.connect()
        await agent.chat()
    finally:
        await agent.cleanup()

if __name__ == "__main__":
    import sys
    import os
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print("Session interrupted. Goodbye!")
    finally:
        sys.stderr = open(os.devnull, "w")

This is the main execution logic.

Other than main(), all the other methods are part of the MCPGeminiAgent class. You can find the entire client.py file here.

Run the following command in the terminal to start your client:

python client.py

The client will:

  • Read the mcp.json file to list the different available MCP servers.
  • Prompt the user to select one of the listed servers.
  • Connect to the selected MCP server using the provided configuration and environment settings.
  • Interact with the Gemini model through a series of queries and responses.
  • Allow users to issue prompts, execute tools, and process responses iteratively with the model.
  • Provide a command-line interface for users to engage with the system and receive real-time results.
  • Ensure proper cleanup of resources after the session ends, closing connections and freeing memory.


I’m a Civil Engineering Graduate (2022) from Jamia Millia Islamia, New Delhi, and I have a keen interest in Data Science, especially Neural Networks and their application in various areas.


