bluefly/mcp_registry

Registry and management system for Model Context Protocol (MCP) servers and resources


This package is auto-updated.

Last update: 2025-08-20 02:07:29 UTC


README

Overview

This guide documents the integration of Model Context Protocol (MCP) servers into the LLM Platform, enabling connections to the broader MCP ecosystem for enhanced AI tools and capabilities.

Architecture

┌─────────────────────────────────────────────────────────┐
│                    LLM Platform                         │
├─────────────────────────────────────────────────────────┤
│                                                         │
│  ┌──────────────┐     ┌─────────────────────┐           │
│  │   Drupal     │────▶│  MCP Integration    │           │
│  │   Modules    │     │      Module         │           │
│  └──────────────┘     └─────────────────────┘           │
│                              │                          │
└──────────────────────────────┼──────────────────────────┘
                               │
                    ┌──────────▼──────────┐
                    │   MCP Servers       │
                    ├─────────────────────┤
                    │ • LLM MCP (3100)    │
                    │ • Vector (3101)     │
                    │ • TDDAI (3102)      │
                    │ • API (3103)        │
                    └─────────────────────┘
                               │
                    ┌──────────▼──────────┐
                    │  External MCP Tools │
                    ├─────────────────────┤
                    │ • GitHub            │
                    │ • GitLab            │
                    │ • Filesystem        │
                    │ • Database          │
                    │ • Search            │
                    │ • Memory            │
                    └─────────────────────┘

Components

1. MCP Servers

LLM MCP Server (Port 3100)

  • Location: /common_npm/llm_mcp
  • Purpose: Main MCP server for LLM Platform operations
  • Features:
    • AI agent orchestration
    • Vector database integration
    • Marketplace connectivity
    • Tool discovery and management

Vector Server (Port 3101)

  • Purpose: Specialized vector operations and RAG
  • Features:
    • Qdrant integration
    • Semantic search
    • Embedding management
    • Document chunking

TDDAI Server (Port 3102)

  • Location: /common_npm/tddai
  • Purpose: Test-driven development AI assistance
  • Features:
    • Code generation
    • Test creation
    • Quality analysis
    • Refactoring suggestions

API Server (Port 3103)

  • Purpose: RESTful API for MCP operations
  • Features:
    • Tool execution endpoints
    • Server management
    • Health monitoring
    • Authentication

2. Drupal Integration Module

Module: mcp_integration
Location: /web/modules/custom/mcp_integration

Services:

  • mcp_integration.client: Main client for MCP communication
  • mcp_integration.server_manager: Manages MCP server connections

Key Features:

  • Tool discovery
  • Tool execution
  • Connection testing
  • Configuration management

3. External MCP Tools

The platform can connect to various MCP servers from the ecosystem:

  • Filesystem: Access and manipulate local files
  • GitHub/GitLab: Repository management and CI/CD
  • PostgreSQL: Database operations
  • Brave Search: Web search capabilities
  • Memory: Persistent context storage

Installation & Setup

1. Prerequisites

# Ensure Node.js 20+ is installed
node --version

# Install required npm packages
cd ${LLM_COMMON_NPM_PATH:-../../common_npm}/llm_mcp
npm install

cd ${LLM_COMMON_NPM_PATH:-../../common_npm}/tddai
npm install

2. Build MCP Servers

# Build LLM MCP
cd ${LLM_COMMON_NPM_PATH:-../../common_npm}/llm_mcp
npm run build

# Build TDDAI
cd ${LLM_COMMON_NPM_PATH:-../../common_npm}/tddai
npm run build

3. Start MCP Servers

# Navigate to infrastructure directory
cd ${LLM_WORKSPACE_PATH:-../../../}llm-platform/infrastructure

# Start all MCP servers
./start-mcp-servers.sh

# Or start individually:
PORT=3100 npm run start:mcp     # LLM MCP Server
PORT=3101 npm run start:vector  # Vector Server
PORT=3102 npm run mcp:server    # TDDAI Server
PORT=3103 npm run start:api     # API Server

4. Enable Drupal Module

# Enable the MCP integration module
ddev drush en mcp_integration -y

# Clear cache
ddev drush cr

Configuration

MCP Server Configuration

Edit /llm-platform/infrastructure/mcp-config.json:

{
  "mcpServers": {
    "llm-platform": {
      "command": "node",
      "args": ["/path/to/llm_mcp/dist/index.js"],
      "env": {
        "MCP_SERVER_NAME": "llm-platform",
        "API_BASE_URL": "https://llm-platform.ddev.site"
      }
    }
  }
}
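
External ecosystem servers can be registered in the same file. The entry below is illustrative only — the package name follows the @modelcontextprotocol/server-* convention used by the reference servers, so verify the exact name and arguments against that server's own documentation:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```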

Drupal Configuration

Configure MCP servers in Drupal:

  1. Navigate to /admin/config/llm/mcp
  2. Add server endpoints:
    • LLM MCP: http://localhost:3100
    • Vector: http://localhost:3101
    • TDDAI: http://localhost:3102
    • API: http://localhost:3103

Usage

PHP (Drupal)

// Get MCP client service
$mcp_client = \Drupal::service('mcp_integration.client');

// Call a tool
$result = $mcp_client->callTool('llm-platform', 'search', [
  'query' => 'machine learning',
  'limit' => 10
]);

// Get available tools
$tools = $mcp_client->getAvailableTools('llm-platform');

// Test connection
$connected = $mcp_client->testConnection('llm-platform');

JavaScript/TypeScript

import { McpClient } from "@bluefly/llm-mcp";

const client = new McpClient({
  serverUrl: "http://localhost:3100",
});

// Call a tool
const result = await client.callTool("search", {
  query: "machine learning",
  limit: 10,
});

// Get available tools
const tools = await client.getTools();

CLI

# Using llmcli
llmcli mcp call llm-platform search --query "machine learning"

# Using tddai
tddai mcp tools list
tddai mcp call filesystem read --path "/path/to/file"

Testing

Test Connection

# Run connection test
node ${LLM_WORKSPACE_PATH:-../../../}llm-platform/infrastructure/test-mcp-connection.js

# Check server health
curl http://localhost:3100/health
curl http://localhost:3101/health
curl http://localhost:3102/health
curl http://localhost:3103/health
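
The four curl checks above can also be scripted. This sketch assumes the /health endpoint returns a 2xx status when healthy; server names and ports are taken from the architecture diagram:

```typescript
// Health sweep across the four MCP servers. A failed fetch (connection
// refused) is treated the same as a non-2xx response: server down.
const MCP_PORTS = {
  "llm-mcp": 3100,
  vector: 3101,
  tddai: 3102,
  api: 3103,
} as const;

function healthUrl(port: number, host = "localhost"): string {
  return `http://${host}:${port}/health`;
}

async function sweep(): Promise<Record<string, boolean>> {
  const status: Record<string, boolean> = {};
  for (const [name, port] of Object.entries(MCP_PORTS)) {
    try {
      const res = await fetch(healthUrl(port));
      status[name] = res.ok;
    } catch {
      status[name] = false;
    }
  }
  return status;
}
```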

View Logs

# View server logs
tail -f /tmp/llm-mcp.log
tail -f /tmp/llm-mcp-vector.log
tail -f /tmp/tddai-mcp.log
tail -f /tmp/llm-mcp-api.log

Available MCP Tools

LLM Platform Tools

  • search: Semantic search across documents
  • embed: Generate embeddings for text
  • agent.create: Create AI agent instance
  • agent.execute: Execute agent task
  • workflow.run: Run automated workflow

TDDAI Tools

  • test.generate: Generate test cases
  • code.analyze: Analyze code quality
  • refactor.suggest: Get refactoring suggestions
  • coverage.check: Check test coverage

Vector Tools

  • vector.store: Store vectors in database
  • vector.search: Semantic similarity search
  • vector.delete: Remove vectors
  • collection.create: Create vector collection

Troubleshooting

Server Won't Start

# Check if port is already in use
lsof -i :3100

# Kill existing process
kill -9 $(lsof -t -i:3100)

# Restart server
./start-mcp-servers.sh

Connection Refused

  1. Ensure servers are running:
ps aux | grep "mcp"
  2. Check firewall settings
  3. Verify correct ports in configuration

Module Errors

# Clear Drupal cache
ddev drush cr

# Rebuild container
ddev restart

Security Considerations

  1. Authentication: Configure API keys for production
  2. Network: Use HTTPS in production environments
  3. Access Control: Limit MCP server access to trusted sources
  4. Secrets: Store sensitive data in environment variables

Performance Optimization

  1. Connection Pooling: Reuse HTTP connections
  2. Caching: Cache tool responses when appropriate
  3. Async Operations: Use background jobs for long-running tasks
  4. Rate Limiting: Implement rate limits for API endpoints
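
A minimal sketch of point 2 (caching tool responses): an in-memory TTL cache keyed by tool name plus serialized parameters. This is a hypothetical helper for illustration, not an API of @bluefly/llm-mcp:

```typescript
// TTL cache for MCP tool responses. Entries expire ttlMs after insertion;
// expired entries read back as undefined, signalling a fresh call is needed.
class ToolResponseCache {
  private store = new Map<string, { value: unknown; expires: number }>();

  constructor(private ttlMs: number) {}

  private key(tool: string, params: Record<string, unknown>): string {
    return `${tool}:${JSON.stringify(params)}`;
  }

  get(tool: string, params: Record<string, unknown>): unknown | undefined {
    const entry = this.store.get(this.key(tool, params));
    if (!entry || entry.expires < Date.now()) return undefined;
    return entry.value;
  }

  set(tool: string, params: Record<string, unknown>, value: unknown): void {
    this.store.set(this.key(tool, params), {
      value,
      expires: Date.now() + this.ttlMs,
    });
  }
}
```

Only cache tools whose results are stable for the TTL window (e.g. search over a slowly changing index), never side-effecting tools like agent.execute.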

Extending MCP

Adding New Tools

  1. Create tool handler in MCP server:
export const myTool = {
  name: "my_tool",
  description: "My custom tool",
  parameters: {
    input: { type: "string", required: true },
  },
  handler: async (params) => {
    // Tool implementation
    return { result: "success" };
  },
};
  2. Register the tool in the server
  3. Restart the MCP server
  4. The tool is now available to all clients
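
The registration flow above can be sketched as a name-keyed registry with a dispatcher. Names and shapes here are illustrative, not the actual llm_mcp server internals:

```typescript
// Minimal tool registry: register() stores a handler under its tool name,
// dispatch() looks it up and invokes it with the caller's parameters.
interface Tool {
  name: string;
  description: string;
  handler: (params: Record<string, unknown>) => Promise<unknown>;
}

const registry = new Map<string, Tool>();

function registerTool(tool: Tool): void {
  registry.set(tool.name, tool);
}

async function dispatch(name: string, params: Record<string, unknown>): Promise<unknown> {
  const tool = registry.get(name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(params);
}
```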

Support

For issues or questions:

  • Create issue in GitLab
  • Contact LLM Platform team
  • Check #mcp-integration Slack channel