MCP Integration Patterns

Integration patterns are essential for successfully deploying Model Context Protocol (MCP) in enterprise environments. Effective integration enables MCP to work seamlessly with existing AI frameworks, data sources, authentication systems, and infrastructure while maintaining security, performance, and reliability.

What are MCP Integration Patterns?

MCP integration patterns are proven architectural approaches and implementation strategies for connecting MCP servers and clients with enterprise systems, AI frameworks, and existing infrastructure. These patterns address common integration challenges while ensuring security and privacy, optimal performance, and adherence to implementation best practices.

Common Integration Scenarios

1. AI Framework Integration

Integrate MCP with popular AI frameworks to enhance context quality and capabilities.

  • LangChain integration: Connect MCP servers as LangChain tools or retrievers
  • LlamaIndex integration: Use MCP as a data source for LlamaIndex indices
  • Semantic Kernel integration: Expose MCP capabilities as Semantic Kernel plugins
  • Haystack integration: Integrate MCP context into Haystack pipelines

2. Enterprise System Integration

Connect MCP to enterprise data sources and business systems.

  • CRM systems: Access customer data from Salesforce, HubSpot, or custom CRMs
  • ERP systems: Integrate with SAP, Oracle, or Microsoft Dynamics
  • Knowledge bases: Connect to Confluence, SharePoint, or documentation systems
  • Ticketing systems: Access Jira, ServiceNow, or support platforms
  • Database systems: Query SQL and NoSQL databases for context

3. Authentication Integration

Integrate MCP with enterprise authentication systems so existing security and privacy controls carry over.

  • OAuth 2.0 integration: Use OAuth flows for API access
  • SAML integration: Connect with enterprise identity providers
  • API key management: Securely manage and rotate API credentials
  • Certificate-based auth: Use client certificates for mutual TLS
  • SSO integration: Enable single sign-on with corporate directories

Foundational Integration Patterns

1. Direct Integration Pattern

Connect MCP clients directly to MCP servers for simple deployments.

When to use:

  • Simple, single-tenant deployments
  • Low-latency requirements
  • Direct client-server relationships
  • Development and testing environments

Implementation considerations:

  • Minimal infrastructure overhead
  • Client-side configuration management
  • Direct authentication handling
  • Limited scalability for multi-tenant scenarios

Example architecture:

AI Agent (Claude Desktop, etc.)
    ↓ stdio/SSE
MCP Server (Node.js, Python, etc.)
    ↓ API calls
Enterprise System (Database, API, etc.)
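
In a stdio deployment like the one above, the client typically launches the MCP server as a subprocess based on its configuration file. A minimal sketch of such a configuration, in the JSON format used by Claude Desktop's `claude_desktop_config.json` (server name, command, and environment values are placeholders):

```json
{
  "mcpServers": {
    "enterprise-db": {
      "command": "node",
      "args": ["dist/server.js"],
      "env": {
        "DATABASE_URL": "postgresql://localhost:5432/app"
      }
    }
  }
}
```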

2. Gateway Integration Pattern

Route MCP traffic through a centralized gateway for policy enforcement, observability, and simplified client configuration.

When to use:

  • Multi-tenant environments
  • Centralized authentication and authorization
  • Rate limiting and throttling requirements
  • Monitoring and observability needs
  • Tool filtering requirements

Implementation considerations:

  • Added network hop (minimal latency impact)
  • Centralized policy enforcement
  • Unified observability and monitoring
  • Simplified client configuration

Example architecture:

AI Agents (Multiple clients)
    ↓ HTTP/WebSocket
MCP Gateway (TARS, Custom Gateway)
    ↓ Protocol translation
MCP Servers (Multiple servers)
    ↓ API calls
Enterprise Systems

3. Sidecar Integration Pattern

Deploy MCP servers as sidecars alongside AI agents for low-latency, isolated communication.

When to use:

  • Kubernetes/container environments
  • Low-latency requirements
  • Resource isolation needs
  • Service mesh deployments

Implementation considerations:

  • Container orchestration required
  • Local network communication (minimal latency)
  • Resource overhead per agent instance
  • Simplified deployment and scaling

Example architecture:

Pod/Container Group:
    ├─ AI Agent Container
    └─ MCP Server Sidecar
         ↓ Network calls
    Enterprise Systems
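
On Kubernetes, this pattern maps directly onto a multi-container Pod. A minimal sketch, with placeholder image names and port:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: ai-agent
spec:
  containers:
    - name: agent
      image: example.com/ai-agent:latest        # placeholder image
      env:
        - name: MCP_SERVER_URL
          value: "http://localhost:8080"        # sidecar shares the Pod network
    - name: mcp-server
      image: example.com/mcp-server:latest      # placeholder image
      ports:
        - containerPort: 8080
```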

4. Proxy Integration Pattern

Intercept and transform MCP requests through a proxy layer for protocol translation, caching, and legacy system integration.

When to use:

  • Protocol translation requirements
  • Request/response transformation
  • Legacy system integration
  • Caching and optimization needs

Implementation considerations:

  • Protocol translation overhead
  • Request/response modification capabilities
  • Caching opportunities for performance optimization
  • Complex error handling scenarios

Example architecture:

AI Agent
    ↓ MCP Protocol
MCP Proxy
    ↓ Transform/Cache
    ↓ Legacy Protocol
Legacy Enterprise System
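
A minimal proxy sketch, assuming a hypothetical legacy backend that speaks an `op`/`payload` envelope: it caches responses by request key and translates MCP-style tool calls into the legacy shape:

```python
import json
import time
from typing import Any, Callable

class CachingProxy:
    def __init__(self, backend: Callable[[dict], dict], ttl: float = 30.0):
        self.backend = backend      # function speaking the legacy protocol
        self.ttl = ttl
        self._cache: dict[str, tuple[float, dict]] = {}

    def call_tool(self, name: str, arguments: dict[str, Any]) -> dict:
        key = json.dumps({"name": name, "args": arguments}, sort_keys=True)
        hit = self._cache.get(key)
        if hit and time.monotonic() - hit[0] < self.ttl:
            return hit[1]
        # Transform: MCP tool call -> hypothetical legacy "op"/"payload" envelope
        legacy_request = {"op": name.upper(), "payload": arguments}
        response = self.backend(legacy_request)
        self._cache[key] = (time.monotonic(), response)
        return response

calls = []
def legacy_backend(request: dict) -> dict:
    calls.append(request)
    return {"status": "ok", "op": request["op"]}

proxy = CachingProxy(legacy_backend)
proxy.call_tool("get_customer", {"id": 1})
proxy.call_tool("get_customer", {"id": 1})  # second call served from cache
print(len(calls))  # only one backend round trip
```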

AI Framework Integration Patterns

1. LangChain Integration

Integrate MCP servers as LangChain tools so agents can invoke MCP capabilities alongside their other tools.

Pattern: MCP as LangChain Tool

from langchain.tools import Tool
from mcp import MCPClient

def make_mcp_tool(mcp_client: MCPClient, tool_name: str) -> Tool:
    # Wrap an MCP server tool as a LangChain Tool via a closure.
    # (Tool is a pydantic model, so assigning undeclared attributes
    # like self.mcp_client in a subclass __init__ would fail.)
    def _call_mcp(query: str) -> str:
        result = mcp_client.call_tool(
            name=tool_name,
            arguments={"query": query}
        )
        return result.content

    return Tool(
        name=tool_name,
        func=_call_mcp,
        description=f"Access {tool_name} via MCP"
    )

Pattern: MCP as LangChain Retriever

from langchain.schema import BaseRetriever, Document
from mcp import MCPClient

class MCPRetriever(BaseRetriever):
    # Declared as a pydantic field: BaseRetriever is a pydantic model,
    # so assigning undeclared attributes in __init__ would fail
    mcp_client: MCPClient

    class Config:
        arbitrary_types_allowed = True

    def get_relevant_documents(self, query: str) -> list[Document]:
        # Retrieve context from the MCP server
        resources = self.mcp_client.read_resource(
            uri=f"search://{query}"
        )
        return [
            Document(page_content=r.text, metadata=r.metadata)
            for r in resources
        ]

2. LlamaIndex Integration

Use MCP as a data source for building LlamaIndex indices.

Pattern: MCP Data Loader

from llama_index import Document, VectorStoreIndex
from mcp import MCPClient

class MCPDataLoader:
    def __init__(self, mcp_client: MCPClient):
        self.mcp_client = mcp_client

    def load_documents(self) -> list[Document]:
        # Load every resource exposed by the MCP server
        documents = []
        for resource in self.mcp_client.list_resources():
            content = self.mcp_client.read_resource(resource.uri)
            documents.append(
                Document(text=content.text, metadata=content.metadata)
            )
        return documents

    def create_index(self) -> VectorStoreIndex:
        return VectorStoreIndex.from_documents(self.load_documents())

3. Custom Agent Integration

Build custom AI agents with MCP integration following implementation best practices.

Pattern: Agent with MCP Context

from mcp import MCPClient
from openai import OpenAI

class MCPAgent:
    def __init__(self, mcp_client: MCPClient, openai_client: OpenAI):
        self.mcp = mcp_client
        self.openai = openai_client

    async def process_query(self, user_query: str):
        # Get relevant context from MCP
        context = await self.mcp.read_resource(
            uri=f"context://{user_query}"
        )

        # Get available tools from MCP
        tools = await self.mcp.list_tools()

        # Build prompt with context and tools
        messages = [
            {"role": "system", "content": context.text},
            {"role": "user", "content": user_query}
        ]

        # Call LLM with tools
        response = self.openai.chat.completions.create(
            model="gpt-4",
            messages=messages,
            tools=[self._convert_mcp_tool(t) for t in tools]
        )

        # Handle tool calls via MCP (dispatch helper elided for brevity)
        if response.choices[0].message.tool_calls:
            return await self._execute_tool_calls(
                response.choices[0].message.tool_calls
            )

        return response.choices[0].message.content

    def _convert_mcp_tool(self, mcp_tool):
        # Convert MCP tool schema to OpenAI format
        return {
            "type": "function",
            "function": {
                "name": mcp_tool.name,
                "description": mcp_tool.description,
                "parameters": mcp_tool.inputSchema
            }
        }


Enterprise Integration Patterns

1. Database Integration

Connect MCP servers to databases so agents can retrieve structured context on demand.

Pattern: SQL Database Integration

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { Pool } from "pg";

class DatabaseMCPServer {
  private pool: Pool;

  constructor(connectionString: string) {
    this.pool = new Pool({ connectionString });
  }

  async handleResourceRead(uri: string) {
    // Parse URI like: db://users/123
    const match = uri.match(/db:\/\/(\w+)\/(\w+)/);
    if (!match) throw new Error("Invalid URI");

    const [, table, id] = match;

    // The table name is constrained to \w+ by the URI regex above,
    // and the id value is passed as a bound parameter
    const result = await this.pool.query(
      `SELECT * FROM ${table} WHERE id = $1`,
      [id]
    );

    return {
      contents: [{
        uri,
        mimeType: "application/json",
        text: JSON.stringify(result.rows[0])
      }]
    };
  }

  async handleToolCall(name: string, args: any) {
    if (name === "query_database") {
      // Caller-supplied SQL should be gated by strict authorization
      // and, ideally, executed under a read-only database role
      const result = await this.pool.query(args.sql, args.params);
      return {
        content: [{
          type: "text",
          text: JSON.stringify(result.rows)
        }]
      };
    }
    throw new Error(`Unknown tool: ${name}`);
  }
}

2. REST API Integration

Integrate MCP with REST APIs for enterprise system access ensuring security and privacy.

Pattern: REST API Wrapper

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import axios, { AxiosInstance } from "axios";

class RestApiMCPServer {
  private apiClient: AxiosInstance;

  constructor(baseURL: string, apiKey: string) {
    this.apiClient = axios.create({
      baseURL,
      headers: {
        'Authorization': `Bearer ${apiKey}`,
        'Content-Type': 'application/json'
      }
    });
  }

  async handleToolCall(name: string, args: any) {
    switch (name) {
      case "get_customer": {
        const response = await this.apiClient.get(
          `/customers/${args.customer_id}`
        );
        return {
          content: [{
            type: "text",
            text: JSON.stringify(response.data)
          }]
        };
      }

      case "create_ticket": {
        const ticket = await this.apiClient.post('/tickets', {
          title: args.title,
          description: args.description,
          priority: args.priority
        });
        return {
          content: [{
            type: "text",
            text: `Created ticket ${ticket.data.id}`
          }]
        };
      }

      default:
        throw new Error(`Unknown tool: ${name}`);
    }
  }
}

3. Authentication Integration

Implement secure authentication patterns following security best practices.

Pattern: OAuth 2.0 Integration

import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { OAuth2Client } from "google-auth-library";

class OAuth2MCPServer {
  private oauth2Client: OAuth2Client;
  private accessToken: string | null = null;

  constructor(clientId: string, clientSecret: string, redirectUri: string) {
    this.oauth2Client = new OAuth2Client(
      clientId,
      clientSecret,
      redirectUri
    );
  }

  async authenticate(authCode: string) {
    // Exchange authorization code for access token
    const { tokens } = await this.oauth2Client.getToken(authCode);
    this.accessToken = tokens.access_token!;
    this.oauth2Client.setCredentials(tokens);
  }

  async handleToolCall(name: string, args: any) {
    if (!this.accessToken) {
      throw new Error("Not authenticated");
    }

    // Use access token for API calls
    const response = await fetch(
      `https://api.example.com/${name}`,
      {
        headers: {
          'Authorization': `Bearer ${this.accessToken}`
        },
        method: 'POST',
        body: JSON.stringify(args)
      }
    );

    return {
      content: [{
        type: "text",
        text: await response.text()
      }]
    };
  }
}

Best Practices

1. Design for Failure

Implement robust error handling and retry logic so transient backend failures do not break agent workflows.
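
A minimal retry sketch with exponential backoff and jitter; `operation` and the retryable exception types are placeholders for real MCP backend calls:

```python
import random
import time

def call_with_retries(operation, max_attempts=4, base_delay=0.5,
                      retryable=(ConnectionError, TimeoutError)):
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except retryable:
            if attempt == max_attempts:
                raise  # exhausted: surface the failure to the caller
            delay = base_delay * (2 ** (attempt - 1))
            # Jitter spreads out retries from concurrent clients
            time.sleep(delay + random.uniform(0, delay / 2))

attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(call_with_retries(flaky, base_delay=0.01))  # succeeds on the third try
```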

2. Secure Credentials

Never hardcode credentials; use environment variables, secret managers, or centralized configuration.
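
One lightweight sketch: resolve credentials from the environment at startup and fail loudly when they are absent (the variable name is an example):

```python
import os

def require_env(name: str) -> str:
    # Fail fast with a clear message instead of passing None downstream
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required credential: {name}")
    return value

# Normally set by the deployment (secret manager, orchestrator, etc.)
os.environ["EXAMPLE_API_KEY"] = "demo-value"
api_key = require_env("EXAMPLE_API_KEY")
print(api_key)  # demo-value
```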

3. Implement Rate Limiting

Protect backend systems with rate limiting and throttling appropriate to their capacity.
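
A token bucket is a common way to do this; a minimal sketch:

```python
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens refilled per second
        self.capacity = capacity
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1.0, capacity=2)
print(bucket.allow(), bucket.allow(), bucket.allow())
# The initial burst of two passes; the immediate third call is rejected
```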

4. Monitor Integrations

Track integration health and performance through comprehensive monitoring.

5. Version APIs

Use API versioning to manage integration changes over time.

6. Test Thoroughly

Implement comprehensive testing and quality assurance for all integration points.

7. Document Integrations

Maintain clear documentation of integration patterns, authentication flows, and error handling.

8. Optimize Performance

Use caching, connection pooling, and batch operations to ensure optimal performance.
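
Batching is often the cheapest win: collapse many small lookups into one backend round trip. A sketch, with `fetch_many` standing in for a hypothetical bulk endpoint:

```python
def fetch_many(ids: list[int]) -> dict[int, str]:
    # One round trip for the whole batch (placeholder backend)
    return {i: f"record-{i}" for i in ids}

def batched(ids: list[int], batch_size: int = 100) -> dict[int, str]:
    results: dict[int, str] = {}
    for start in range(0, len(ids), batch_size):
        results.update(fetch_many(ids[start:start + batch_size]))
    return results

# 250 records retrieved in 3 backend calls instead of 250
print(len(batched(list(range(250)))))  # 250
```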

9. Filter Tools Appropriately

Implement tool filtering to reduce context overhead in integrated environments.
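
A simple approach is a per-role allowlist applied before tool schemas are sent to the model; the role map below is hypothetical:

```python
# Hypothetical mapping from agent role to the tools it actually needs
ROLE_TOOLS = {
    "support": {"get_customer", "create_ticket"},
    "analyst": {"query_database"},
}

def filter_tools(all_tools: list[dict], role: str) -> list[dict]:
    # Unknown roles get no tools rather than everything
    allowed = ROLE_TOOLS.get(role, set())
    return [t for t in all_tools if t["name"] in allowed]

tools = [{"name": "get_customer"}, {"name": "query_database"},
         {"name": "create_ticket"}]
print([t["name"] for t in filter_tools(tools, "analyst")])  # ['query_database']
```

Filtering shrinks the tool schemas injected into every model call, which reduces both token cost and the chance of the model invoking an irrelevant tool.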

10. Balance Cost and Quality

Make informed tradeoffs between integration complexity, cost, and context quality.

TARS Integration

Tetrate Agent Router Service (TARS) simplifies MCP integration with enterprise systems through built-in connectors, authentication management, and protocol translation. TARS handles the complexity of integration patterns, allowing teams to focus on business logic rather than infrastructure concerns.

Conclusion

Successful MCP integration requires careful consideration of architectural patterns, security requirements, and operational needs. By following these proven integration patterns and best practices, organizations can build robust, scalable MCP implementations that work seamlessly with existing enterprise infrastructure and AI frameworks.
