MCP support for various hosts and AI providers
Anthropic's Model Context Protocol (MCP) is reshaping the integration of AI models with external tools and data sources. While most available demo implementations are based on Python or TypeScript, this C#/.NET project shows how MCP integrates cleanly into the Microsoft ecosystem.
This demo project includes a complete MCP host implementation in .NET 9.0 that connects to both commercial AI providers (OpenAI, Azure OpenAI, IONOS) and local LLM runtimes (Ollama, LMStudio). The architecture lets developers create both STDIO- and SSE-based MCP servers, embed them in different MCP hosts, and use different AI providers.
Highlights of the C#/.NET implementation:
- Microsoft.Extensions.AI for unified AI provider abstraction
- Dependency Injection following established .NET patterns
- Entity Framework Core for database-backed tools
- OpenTelemetry for comprehensive observability
- Docker Compose for containerised deployment
Project overview
What's in this project?
The evanto MCP host system is a demo .NET application that implements the Model Context Protocol (MCP) to provide a unified interface between AI chat providers and specialised business tools. It enables seamless integration between several AI providers (OpenAI, Azure, Ollama, etc.) and custom business tools, demonstrated here with a support-ticket management system and support-document processing.
Core capabilities
- Multi-provider AI integration: support for OpenAI, Azure OpenAI, Ollama, LMStudio, and IONOS
- Support-ticket management: full CRUD operations for support requests with SQLite storage
- Document vectorisation: PDF processing and semantic search with the Qdrant vector database for support-documentation PDFs
- Interactive chat client: command-line interface for AI conversations with tool integration
- Containerised deployment: Docker Compose setup for easy deployment and scaling
- Comprehensive testing: built-in testing framework for MCP servers and tools with automatically generated call parameters
Audience
This system is designed for C# developers who want to:
- Integrate AI capabilities into their applications via Microsoft.Extensions.AI
- Build MCP-compliant tools and servers
- Implement semantic search and document processing
- Build multi-provider AI chat systems
- Deploy containerised AI infrastructure
AI developer support
Certain parts of the project (vectorisation and PDF document analysis) were initially produced with Claude Code and then edited manually. The instructions for the coding AI live in CLAUDE.md and CodingRules.md (see the GitHub repository).
System architecture
High-level architecture
The following overview groups the various projects of the demo system:
┌─────────────────────────────────────────────────────────────────┐
│                     Evanto MCP Host System                      │
├─────────────────────────────────────────────────────────────────┤
│                       Applications Layer                        │
│  ┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐  │
│  │ cmd-mcp-host    │  │ cmd-vectorize   │  │ MCP Servers     │  │
│  │ (Interactive    │  │ (PDF            │  │ (SSE/STDIO)     │  │
│  │  Client)        │  │  Processing)    │  │                 │  │
│  └─────────────────┘  └─────────────────┘  └─────────────────┘  │
├─────────────────────────────────────────────────────────────────┤
│                      Core Libraries Layer                       │
│  ┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐  │
│  │ Evanto.Mcp.Host │  │ Evanto.Mcp.Apps │  │ Evanto.Mcp.     │  │
│  │ (Factories &    │  │ (App Helpers)   │  │ Common          │  │
│  │  Testing)       │  │                 │  │ (Settings)      │  │
│  └─────────────────┘  └─────────────────┘  └─────────────────┘  │
├─────────────────────────────────────────────────────────────────┤
│                   External Integration Layer                    │
│  ┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐  │
│  │ Evanto.Mcp.     │  │ Evanto.Mcp.     │  │ Evanto.Mcp.     │  │
│  │ Embeddings      │  │ Pdfs            │  │ QdrantDB        │  │
│  │ (Multi-Provider)│  │ (iText7 Wrapper)│  │ (Vector DB)     │  │
│  └─────────────────┘  └─────────────────┘  └─────────────────┘  │
├─────────────────────────────────────────────────────────────────┤
│                         MCP Tools Layer                         │
│  ┌─────────────────┐  ┌─────────────────┐                       │
│  │ SupportWizard   │  │ SupportDocs     │                       │
│  │ (Ticket System) │  │ (Doc Search)    │                       │
│  └─────────────────┘  └─────────────────┘                       │
└─────────────────────────────────────────────────────────────────┘
Core components
- Applications: standalone executables
- Core libraries: business logic and infrastructure
- External integration: wrappers for external dependencies
- MCP tools: domain-specific tool implementations
Technology stack
- .NET 9.0: current C# features
- Microsoft.Extensions.AI: unified AI provider abstractions
- Model Context Protocol: official MCP client/server implementation
- Entity Framework Core: database abstraction with SQLite
- OpenTelemetry: observability and telemetry
- Docker & Docker Compose: containerisation and orchestration
- iText7: PDF processing and text extraction
- Qdrant: vector database for semantic search
Design patterns
- Factory pattern: AI client and MCP client creation
- Repository pattern: data access abstraction
- Dependency injection: Microsoft.Extensions.DependencyInjection
- Clean architecture: separation of concerns across layers
- Configuration-driven: comprehensive use of appsettings.json
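To illustrate the factory pattern used for client creation, here is a minimal provider-keyed factory sketch. The types `ChatClientFactorySketch` and `IChatClientStub` are hypothetical stand-ins, not the project's actual `EvChatClientFactory` implementation:

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch of a provider-keyed factory: callers register a creation
// delegate per provider name and resolve clients by name at runtime.
// Illustrative only; the project's real factory types differ.
public interface IChatClientStub { string Provider { get; } }

public sealed class ChatClientFactorySketch
{
    private readonly Dictionary<string, Func<IChatClientStub>> _creators =
        new(StringComparer.OrdinalIgnoreCase);

    public void Register(string provider, Func<IChatClientStub> creator) =>
        _creators[provider] = creator;

    public IChatClientStub Create(string provider) =>
        _creators.TryGetValue(provider, out var create)
            ? create()
            : throw new ArgumentException($"Unknown provider: {provider}");
}
```

Keying the dictionary case-insensitively means configuration values like "openai" and "OpenAI" resolve to the same provider.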
Project structure
Overview of the GitHub project:
public-ai/
├── app/                               # Standalone applications
│   ├── cmd-mcp-host/                  # Interactive MCP client
│   └── cmd-vectorize/                 # PDF vectorisation utility
├── lib/                               # Core libraries
│   ├── Evanto.Mcp.Common/             # Shared utilities
│   ├── Evanto.Mcp.Host/               # MCP hosting infrastructure
│   ├── Evanto.Mcp.Apps/               # Application helpers
│   ├── Evanto.Mcp.Embeddings/         # Text embedding services
│   ├── Evanto.Mcp.Pdfs/               # PDF processing
│   ├── Evanto.Mcp.QdrantDB/           # Vector database
│   ├── Evanto.Mcp.Tools.SupportWizard/  # Support ticket system
│   └── Evanto.Mcp.Tools.SupportDocs/    # Document search tools
├── srv/                               # MCP servers (SSE / STDIO)
├── db/                                # Database files
├── pdfs/                              # PDF documents
├── run/                               # Runtime configurations
├── Directory.Packages.props           # Central package management
├── docker-compose.yaml                # Container orchestration
├── .env.example                       # Environment variables template
└── CLAUDE.md                          # AI assistant instructions
Core libraries
Evanto.Mcp.Host
Purpose: core MCP hosting infrastructure with factories and a testing framework.
Key components:
- EvMcpClientFactory: creates MCP clients for various transport types (STDIO, SSE, HTTP)
- EvChatClientFactory: creates AI chat clients for multiple providers
- EvMcpServerTester: comprehensive testing framework for MCP servers and tools
Usage example:
// Create MCP client
var mcpClient = await mcpClientFactory.CreateAsync(serverSettings);
// Create chat client
var chatClient = chatClientFactory.Create("OpenAI");
// Test MCP server
var testResult = await mcpTester.TestServerAsync(serverSettings);
Evanto.Mcp.Common
Purpose: shared configuration models, settings, and utilities.
Key components:
- EvHostAppSettings: main application configuration
- EvChatClientSettings: AI provider configurations
- EvMcpServerSettings: MCP server configurations
- EvMcpToolBase: base class for MCP tool implementations
Evanto.Mcp.Embeddings
Purpose: multi-provider text embedding services with Microsoft.Extensions.AI.
Key features:
- Multi-provider support: OpenAI, Azure, Ollama, LMStudio, IONOS
- Unified interface: a single API regardless of provider
- Performance optimisation: built-in caching and rate limiting
- Configuration-driven: provider selection via settings
Usage example:
// Register embedding service
services.AddEmbeddingService(settings);
// Use embedding service
var embeddings = await embeddingService.GenerateEmbeddingsAsync(texts);
Evanto.Mcp.Pdfs
Purpose: PDF text extraction services with iText7.
Key features:
- Enterprise PDF processing: handles complex PDF structures
- Service abstraction: clean interface that hides iText7 complexity
- Error handling: robust handling of corrupt PDFs
- Performance-optimised: efficient text extraction
Usage example:
// Register PDF service
services.AddPdfTextExtractor();
// Extract text from PDF
var text = await pdfExtractor.ExtractTextAsync(pdfPath);
Evanto.Mcp.QdrantDB
Purpose: unified repository for Qdrant vector database operations.
Key features:
- Unified document model: a single EvDocument for all operations
- Advanced search: vector, text, and combined search queries
- Metadata support: rich document metadata and filtering
- Repository pattern: clean data-access abstraction
Usage example:
// Register Qdrant repository
services.AddQdrantDocumentRepository(settings);
// Store document
await repository.StoreDocumentAsync(document);
// Search documents
var results = await repository.SearchDocumentsAsync(query);
Evanto.Mcp.Tools.SupportWizard
Purpose: support-ticket management system with SQLite database.
Key features:
- Full CRUD operations: create, read, update, delete support requests
- User management: support staff with topic assignments
- Status tracking: ticket lifecycle management
- Entity Framework Core: code-first database approach
Demo database schema (customisable):
-- Support requests
CREATE TABLE SupportRequests (
Id UNIQUEIDENTIFIER PRIMARY KEY,
CustomerEmail TEXT NOT NULL,
CustomerName TEXT NOT NULL,
Subject TEXT NOT NULL,
Description TEXT NOT NULL,
Status INTEGER NOT NULL,
Priority INTEGER NOT NULL,
CreatedAt DATETIME NOT NULL,
UpdatedAt DATETIME NOT NULL
);
-- Users
CREATE TABLE Users (
Id UNIQUEIDENTIFIER PRIMARY KEY,
Name TEXT NOT NULL,
Email TEXT NOT NULL,
Topic TEXT NOT NULL,
IsActive BOOLEAN NOT NULL
);
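A code-first entity matching the SupportRequests table might look like the following sketch. The property names mirror the schema above; the enum members and default values are illustrative assumptions, not the project's actual definitions:

```csharp
using System;

// Sketch of a plausible EF Core code-first entity for the SupportRequests
// table. Enum members and defaults are assumptions for illustration.
public enum RequestStatus { Open = 0, InProgress = 1, Resolved = 2, Closed = 3 }
public enum RequestPriority { Low = 0, Normal = 1, High = 2 }

public class SupportRequest
{
    public Guid Id { get; set; } = Guid.NewGuid();   // UNIQUEIDENTIFIER PK
    public required string CustomerEmail { get; set; }
    public required string CustomerName { get; set; }
    public required string Subject { get; set; }
    public required string Description { get; set; }
    public RequestStatus Status { get; set; } = RequestStatus.Open;       // INTEGER column
    public RequestPriority Priority { get; set; } = RequestPriority.Normal; // INTEGER column
    public DateTime CreatedAt { get; set; } = DateTime.UtcNow;
    public DateTime UpdatedAt { get; set; } = DateTime.UtcNow;
}
```

With EF Core's code-first approach, a migration generated from such an entity produces the CREATE TABLE statement shown above.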
Evanto.Mcp.Tools.SupportDocs
Purpose: document search and management with semantic similarity.
Key features:
- Semantic search: find documents by meaning, not just keywords
- Document management: store and organise documentation
- Vector integration: uses Qdrant for high-performance search
- Multi-provider embeddings: flexible embedding-provider support
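At the heart of semantic search is ranking stored embeddings by similarity to the query embedding; Qdrant performs this internally. As a standalone illustration, the cosine-similarity ranking step looks like this generic sketch (not the project's actual code):

```csharp
using System;

// Cosine similarity between two embedding vectors: 1.0 means same direction,
// 0.0 means unrelated. Semantic search ranks each stored document's embedding
// by this score against the query embedding. Generic sketch, not project code.
static double CosineSimilarity(float[] a, float[] b)
{
    if (a.Length != b.Length) throw new ArgumentException("Vector lengths differ");
    double dot = 0, normA = 0, normB = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot   += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
}
```

Because similarity is computed between dense vectors rather than keywords, a query about "embeddings" can match a document that only mentions "vector representations".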
Applications
cmd-mcp-host
Purpose: interactive MCP client with AI chat integration.
Key features:
- Multi-provider chat: switch between OpenAI, Azure, Ollama, etc.
- MCP tool integration: access to SupportWizard and SupportDocs tools
- Interactive interface: rich console experience with Spectre.Console
- Configuration management: supports several AI providers at the same time
Usage:
# Run interactive client
dotnet run --project app/cmd-mcp-host
# Show help
dotnet run --project app/cmd-mcp-host -- --help
# List available providers
dotnet run --project app/cmd-mcp-host -- --list
# Run server tests
dotnet run --project app/cmd-mcp-host -- --test
cmd-vectorize
Purpose: PDF processing and vectorisation utility.
Key features:
- Batch PDF processing: process several PDFs in one run
- Text chunking: configurable chunk sizes and overlap
- Vector storage: store embeddings in the Qdrant database
- File tracking: avoid reprocessing with a JSON tracking file
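The chunking step can be pictured as a sliding window with overlap, as in this sketch. The parameter names are illustrative; the project's actual chunking settings live in appsettings.json:

```csharp
using System;
using System.Collections.Generic;

// Sliding-window text chunking sketch: each chunk is chunkSize characters,
// and consecutive chunks share `overlap` characters so that sentences cut
// at a boundary still appear whole in at least one chunk.
// Parameter names are illustrative, not the project's actual settings.
static List<string> ChunkText(string text, int chunkSize, int overlap)
{
    if (chunkSize <= overlap)
        throw new ArgumentException("chunkSize must be larger than overlap");

    var chunks = new List<string>();
    for (int start = 0; start < text.Length; start += chunkSize - overlap)
    {
        int length = Math.Min(chunkSize, text.Length - start);
        chunks.Add(text.Substring(start, length));
        if (start + length >= text.Length) break;   // last chunk reached
    }
    return chunks;
}
```

Each chunk is then embedded and stored individually, so search results point to the relevant passage rather than the whole PDF.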
MCP servers
sse-mcp-server
Purpose: HTTP/SSE-based MCP server for web integration.
- ASP.NET Core: modern web server infrastructure
- Server-Sent Events: real-time communication
- Health checks: built-in endpoint monitoring
- Docker support: containerised deployment
stdio-mcp-server
Purpose: STDIO-based MCP server for command-line integration.
- Standard I/O: works with any MCP client
- Console host: lightweight deployment
- Docker support: container-based execution
- Interactive mode: TTY support for debugging
Prerequisites & setup
System requirements
- .NET 9.0 SDK or later
- Docker 20.10 or later
- Docker Compose 2.0 or later
- Git for version control
Required external services
- Qdrant vector database: provided via Docker Compose
- AI provider API keys: at least one of:
- OpenAI API key
- Azure OpenAI credentials
- Ollama (local install)
- LMStudio (local install)
- IONOS AI API key
Recommended development tools
- Visual Studio 2022 (current) or VS Code with the C# extension
- Docker Desktop for container management
- Postman or similar for API testing
- DB Browser for SQLite for database inspection
Installation steps
- Clone the repository:
git clone https://github.com/svkaenel/public-ai
cd public-ai
- Verify the .NET install:
dotnet --version
# Should show 9.0.x or later
- Restore packages:
dotnet restore
- Build the solution:
dotnet build
- Set up environment variables (see the configuration guide).
Configuration guide
Environment variable setup
The system uses environment variables for API keys and sensitive configuration. This keeps secrets out of the source code and supports different environments.
Step 1: Create a .env file
# Copy the example file
cp .env.example .env
# Edit with your values
nano .env # or your preferred editor
Step 2: Configure API keys
Edit the .env file with your API keys:
# OpenAI API key
OPENAI_API_KEY=your-openai-api-key-here
# IONOS AI API key (JWT token)
IONOS_API_KEY=your-ionos-jwt-token-here
# Azure AI API key
AZURE_API_KEY=your-azure-ai-api-key-here
# Azure OpenAI API key
AZUREOAI_API_KEY=your-azure-openai-api-key-here
# LMStudio API key (usually empty for local)
LMSTUDIO_API_KEY=
# Ollama API key (usually empty for local)
OLLAMA_API_KEY=
# Docker Compose configuration
SSE_PORT=5561
SSE_CONFIG_PATH=./run/sse/appsettings.json
STDIO_CONFIG_PATH=./run/stdio/appsettings.json
Step 3: Verify the configuration
The system loads environment variables automatically in this priority order:
- Command-line environment variables (highest priority)
- System environment variables
- .env file variables
- appsettings.json values (lowest priority)
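With Microsoft.Extensions.Configuration, this priority order falls out of the registration order, because later providers override earlier ones. A hedged sketch of such a builder setup (the project's concrete wiring may differ; loading .env into process environment variables via a helper like DotNetEnv is an assumption):

```csharp
using Microsoft.Extensions.Configuration;

// Later providers override earlier ones, so the registration order encodes
// the priority list: appsettings.json < .env < environment < command line.
// Sketch only; the project's actual builder setup may differ.
var config = new ConfigurationBuilder()
    .AddJsonFile("appsettings.json", optional: true)  // lowest priority
    // a .env loader (e.g. DotNetEnv) would export its values into process
    // environment variables before this point
    .AddEnvironmentVariables()                        // overrides JSON values
    .AddCommandLine(args)                             // highest priority
    .Build();

var apiKey = config["OPENAI_API_KEY"];                // resolved per priority
```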
AI provider configuration
The ChatClients section in appsettings.json configures AI providers:
{
"DefaultChatClient": "OpenAI",
"ChatClients": [
{
"ProviderName": "OpenAI",
"Endpoint": "https://api.openai.com/v1",
"DefaultModel": "o4-mini",
"AvailableModels": ["o4-mini", "gpt-4.1-mini", "gpt-4.1", "o1"]
},
{
"ProviderName": "Azure",
"Endpoint": "https://your-resource.services.ai.azure.com/models",
"DefaultModel": "DeepSeek-R1",
"AvailableModels": ["DeepSeek-R1"]
},
{
"ProviderName": "Ollama",
"Endpoint": "http://localhost:11434",
"DefaultModel": "qwen3:14b",
"AvailableModels": ["qwen3:4b", "qwen3:14b", "gemma3:12b"]
}
]
}
Database configuration
SQLite databases are configured via connection strings:
{
"ConnectionStrings": {
"SupportWizardDB": "Filename=db/ev-supportwizard.db"
}
}
OpenTelemetry configuration
Configure observability and telemetry:
{
"Telemetry": {
"Enabled": true,
"ServiceName": "cmd-mcp-host",
"OtlpEndpoint": "http://localhost:4317",
"EnableConsoleExporter": false,
"EnableOtlpExporter": true,
"LogSensitiveData": false,
"ActivitySources": ["Microsoft.Extensions.AI"]
}
}
Docker setup & deployment
Docker Compose overview
The system uses Docker Compose to orchestrate several services:
- qdrantdb: vector database for document embeddings
- aspire-dashboard: .NET Aspire dashboard for telemetry
- sse-mcp-server: HTTP/SSE MCP server
- stdio-mcp-server: STDIO MCP server
Step-by-step deployment
# Build and start all services
docker-compose up -d
# Watch logs
docker-compose logs -f
# Check service status
docker-compose ps
Build the MCP servers separately (from the public-ai directory):
# SSE MCP server
docker build -f srv/sse-mcp-server/Dockerfile -t sse-mcp-server .
# STDIO MCP server
docker build -f srv/stdio-mcp-server/Dockerfile -t stdio-mcp-server .
Verify the services:
# Check Qdrant is running
curl http://localhost:6335/
# Check the SSE MCP server
curl http://localhost:5561/
# Check the Aspire dashboard
# Open in browser: http://localhost:4316
The MCP servers create and migrate the SQLite database automatically on first start.
Getting started
# Clone the repository
git clone https://github.com/svkaenel/public-ai
cd public-ai
# Build
dotnet build
# Set up environment
cp .env.example .env
nano .env
# Set at least one AI provider:
# OPENAI_API_KEY=your-api-key
# or OLLAMA_API_KEY= # for local Ollama
# Start Docker services
docker-compose up -d
# Place PDFs in the pdfs/ directory and run vectorisation
dotnet run --project app/cmd-vectorize
# Test all MCP servers and tools
dotnet run --project app/cmd-mcp-host -- --test
# Run the interactive chat client
dotnet run --project app/cmd-mcp-host
Example interaction:
> Hello, can you help me find documents about embeddings?
> Create a new support ticket for customer john@example.com
> Show me all support tickets with high priority
Using the MCP host client
The cmd-mcp-host application offers a rich interactive experience for AI conversations with integrated MCP tools.
Welcome to the Evanto MCP Host Client
Current provider: OpenAI (o4-mini)
Available tools: SupportWizard, SupportDocs
Type 'help' for commands, 'exit' to quit
> Hello, can you help me with support tickets?
The client integrates automatically with MCP tools. You can use natural language to interact with them:
# SupportWizard tool examples
> "Create a new support ticket for customer john@example.com with subject 'Login issues'"
> "Show me all support tickets with high priority"
> "List all users who can handle technical issues"
> "Update support ticket ID 123 to resolved"
# SupportDocs tool examples
> "Search for documentation about embeddings"
> "Find information about API authentication"
> "Look up troubleshooting guides for database connections"
For more technical detail, full configuration options, and advanced usage scenarios, please see the full README.md in our GitHub repository.
The project is available under the MIT licence on GitHub: https://github.com/svkaenel/public-ai
This C#/.NET MCP demo project shows how an MCP integration can be implemented in the Microsoft ecosystem and offers a foundation for your own MCP-based applications.