
2 posts tagged with "chatgpt"


How to Use MCP Servers with Custom GPTs

· 6 min read
Lutra AI Team
Building the future of AI infrastructure

Why Bring MCP into Your GPT Workflow?

Custom GPTs become powerful when they can interact with external tools and services. The Model Context Protocol (MCP) provides a standardized way for servers to expose tools, resources, and prompts that AI assistants can discover and invoke. By connecting MCP servers to Custom GPTs, you can unlock access to a growing ecosystem of MCP-compatible tools without building custom integrations for each one.

The Architecture Challenge

Custom GPTs and MCP servers operate on fundamentally different principles. Custom GPTs interact with tools through individual REST endpoints (/api/weather, /api/search, /api/database), each representing a specific capability. In contrast, MCP servers expose all their tools through a single JSON-RPC method called tools/call, where the tool name is passed as a parameter.

This architectural difference means you can't directly connect an MCP server to a Custom GPT. Instead, you need an HTTP gateway to bridge the gap.
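To make the difference concrete, here is a side-by-side sketch of the two request shapes. The `get_weather` tool and the `/api/weather` path are hypothetical examples, not part of any real API:

```python
# A Custom GPT Action calls a dedicated REST endpoint per capability:
rest_request = {
    "method": "POST",
    "path": "/api/weather",
    "body": {"city": "Berlin"},
}

# An MCP client routes every tool invocation through one JSON-RPC
# method, tools/call, passing the tool name as a parameter:
mcp_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "Berlin"},
    },
}
```

The gateway's job is to map the first shape onto the second, and the response back again.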

System Architecture

How the Gateway Translates Between Protocols

The HTTP gateway performs protocol translation between REST and MCP. On the Custom GPT side, it exposes individual REST endpoints like /api/weather or /api/database-query. On the MCP side, it consolidates all requests through the single tools/call JSON-RPC method.

This translation is essential: without it, Custom GPTs have no way to communicate with MCP's unified tool interface.
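A minimal sketch of the translation layer might look like the following. The `/api/<tool>` path convention and the error-to-status mapping are assumptions for illustration, not a prescribed design:

```python
def rest_to_mcp(path: str, body: dict, request_id: int = 1) -> dict:
    """Translate a REST call such as POST /api/database-query into an
    MCP tools/call JSON-RPC request. Assumes the gateway names each
    endpoint after its tool, with hyphens for underscores."""
    tool_name = path.removeprefix("/api/").replace("-", "_")
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": body},
    }


def mcp_to_rest(mcp_response: dict) -> tuple[int, dict]:
    """Map a JSON-RPC result or error object back to an HTTP status
    code and response body."""
    if "error" in mcp_response:
        return 502, {"error": mcp_response["error"].get("message", "MCP error")}
    return 200, mcp_response.get("result", {})
```

In a real gateway these two functions would sit inside an HTTP handler that forwards the JSON-RPC request to the MCP server's transport.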

Tool Discovery Mechanisms

| Custom GPT (OpenAPI) | MCP Server |
| --- | --- |
| Tools defined statically in OpenAPI spec | Tools discovered dynamically via `tools/list` method |
| ChatGPT reads spec once when Action is added | AI agents can query available tools at runtime |
| Fixed set of endpoints | Tools can change based on context/permissions |

MCP servers expose a tools/list method that returns available tools and their schemas dynamically. This enables AI clients to discover new capabilities at runtime. Custom GPTs, however, require static OpenAPI specifications defined when the Action is configured.
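As a sketch of what dynamic discovery yields, the helper below indexes a `tools/list` result by tool name. The response shape (a `tools` array of objects with `name`, `description`, and `inputSchema`) follows the MCP specification; the `get_weather` tool itself is a made-up example:

```python
def index_tools(tools_list_result: dict) -> dict:
    """Build a name -> definition lookup from a tools/list result,
    so the gateway can route incoming REST requests to known tools."""
    return {tool["name"]: tool for tool in tools_list_result["tools"]}


# Example tools/list result with one hypothetical tool:
discovered = index_tools({
    "tools": [
        {
            "name": "get_weather",
            "description": "Current weather for a city",
            "inputSchema": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    ]
})
```

A gateway can re-run this query periodically to pick up tools that appear or disappear, something a static OpenAPI spec cannot do on its own.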

Putting It All Together

To connect MCP servers to Custom GPTs, deploy an HTTP gateway that:

  1. Exposes each MCP tool as a separate REST endpoint
  2. Translates incoming REST requests to MCP tools/call invocations
  3. Converts MCP responses back to HTTP format

Once deployed, generate an OpenAPI specification for the gateway's REST endpoints and configure it as a Custom GPT Action. From ChatGPT's perspective, it is interacting with a standard REST API; the underlying MCP protocol remains transparent.
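The OpenAPI spec can be generated mechanically from the discovered tool definitions, since each tool's `inputSchema` is already JSON Schema. The sketch below shows one way to do it; the path naming convention and spec details are assumptions, not requirements:

```python
def openapi_from_tools(tools: list[dict], base_url: str) -> dict:
    """Emit a minimal OpenAPI 3.1 spec with one POST path per MCP tool,
    reusing each tool's inputSchema as the request body schema."""
    paths = {}
    for tool in tools:
        path = "/api/" + tool["name"].replace("_", "-")
        paths[path] = {
            "post": {
                "operationId": tool["name"],
                "description": tool.get("description", ""),
                "requestBody": {
                    "required": True,
                    "content": {"application/json": {
                        "schema": tool.get("inputSchema", {"type": "object"}),
                    }},
                },
                "responses": {"200": {"description": "Tool result"}},
            }
        }
    return {
        "openapi": "3.1.0",
        "info": {"title": "MCP Gateway", "version": "1.0.0"},
        "servers": [{"url": base_url}],
        "paths": paths,
    }
```

Pasting the resulting document into the Custom GPT Action editor is then enough for ChatGPT to see each MCP tool as an ordinary endpoint.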

Tips for Developing Custom Actions in ChatGPT

· 3 min read
Lutra AI Team
Building the future of AI infrastructure

Custom actions let ChatGPT call your HTTP endpoints. They are ChatGPT's version of tool access, similar to the Model Context Protocol (MCP). However, there are some important things to note when building ChatGPT custom actions. Here we share what we learned to help you design actions that behave well in chat.