# MCP_Host
MCP_Host is a Go library for managing and simplifying communication with multiple Model Context Protocol (MCP) servers. It provides connection management, tool invocation, and integration with large language models.
## TODO
- [ ] Improve documentation
- [x] Support tool invocation
- [ ] Support resource lookup
## Features
- **Multi-server connection management**: Unified management of multiple MCP server connections
- **Various connection methods**: Supports SSE, Stdio, and in-process connection methods
- **Large language model integration**: Built-in integration with LLMs such as OpenAI
- **Tool invocation management**: Simplifies the tool invocation process and supports automatic tool execution
- **Flexible notification system**: Supports handling and forwarding of server notifications
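The multi-server workflow implied by the features above can be sketched as follows. This is a minimal, hedged sketch: the server ID, URL, and tool name are placeholders, and it assumes only the `ConnectSSE`, `ExecuteTool`, and `DisconnectAll` calls shown later in this README.

```go
package main

import (
	"context"
	"fmt"

	"github.com/TIANLI0/MCP_Host"
)

func main() {
	host := MCP_Host.NewMCPHost()
	defer host.DisconnectAll() // close every managed connection at once

	ctx := context.Background()

	// Manage several servers under one host; IDs are arbitrary labels
	// that later route tool calls to the right connection.
	if _, err := host.ConnectSSE(ctx, "remote", "http://example.com/sse"); err != nil {
		fmt.Printf("connect failed: %v\n", err)
		return
	}

	// Route a tool call to a specific server by its ID.
	result, err := host.ExecuteTool(ctx, "remote", "get_current_time", nil)
	if err != nil {
		fmt.Printf("tool call failed: %v\n", err)
		return
	}
	fmt.Println(result.Content)
}
```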
## Installation
```bash
go get github.com/TIANLI0/MCP_Host
```
## Basic Usage
### Connect to MCP Server
```go
import (
	"context"
	"fmt"

	"github.com/TIANLI0/MCP_Host"
)

func main() {
	host := MCP_Host.NewMCPHost()
	defer host.DisconnectAll()

	ctx := context.Background()

	// Connect to an SSE server
	conn, err := host.ConnectSSE(ctx, "server1", "http://your-mcp-server-url/sse")
	if err != nil {
		panic(err)
	}

	fmt.Printf("Connected to: %s (version %s)\n",
		conn.ServerInfo.ServerInfo.Name,
		conn.ServerInfo.ServerInfo.Version)
}
```
### Execute Tool Invocation
```go
func executeTools(host *MCP_Host.MCPHost, ctx context.Context) {
	// List available tools
	tools, err := host.ListTools(ctx, "server1")
	if err != nil {
		fmt.Printf("Unable to list tools: %v\n", err)
		return
	}
	fmt.Printf("Found %d available tools\n", len(tools.Tools))

	// Execute a tool
	result, err := host.ExecuteTool(ctx, "server1", "get_current_time", nil)
	if err != nil {
		fmt.Printf("Tool execution failed: %v\n", err)
		return
	}
	fmt.Printf("Tool execution result: %v\n", result.Content)
}
```
## Integration with Large Language Models
MCP_Host has built-in integration with LLMs, supporting tool use in both text mode and function-call mode:
```go
import (
	"context"
	"fmt"

	"github.com/TIANLI0/MCP_Host"
	"github.com/TIANLI0/MCP_Host/llm"
)

func useLLM(host *MCP_Host.MCPHost, ctx context.Context) {
	// Create an OpenAI client
	openaiClient, err := llm.NewOpenAIClient(
		llm.WithToken("your-api-key"),
		llm.WithOpenAIModel("gpt-4"),
	)
	if err != nil {
		panic(err)
	}

	// Create the MCP client wrapper
	mcpClient := llm.NewMCPClient(openaiClient, host)

	// Automatically execute tool invocations
	gen, err := mcpClient.Generate(ctx, "What time is it now?",
		llm.WithMCPWorkMode(llm.TextMode),
		llm.WithMCPAutoExecute(true),
	)
	if err != nil {
		panic(err)
	}
	fmt.Println(gen.Content)
}
```
## Advanced Features
### Streaming Output
```go
// Set up a streaming output callback
gen, err := mcpClient.Generate(ctx, "Task to be executed",
	llm.WithMCPAutoExecute(true),
	llm.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
		fmt.Print(string(chunk))
		return nil
	}),
)
```
### Status Notifications
```go
gen, err := mcpClient.Generate(ctx, "Complex task",
	llm.WithMCPAutoExecute(true),
	llm.WithStateNotifyFunc(func(ctx context.Context, state llm.MCPExecutionState) error {
		switch state.Type {
		case "tool_call":
			fmt.Printf("\n[Calling tool: %s.%s]\n", state.ServerID, state.ToolName)
		case "tool_result":
			fmt.Printf("\n[Tool result]\n")
		}
		return nil
	}),
)
```
## Custom MCP Server Connections
In addition to SSE connections, MCP_Host also supports other connection methods:
### Standard Input/Output Connection
```go
conn, err := host.ConnectStdio(ctx, "local-server", "./mcp-server",
	[]string{"ENV=production"}, "--debug")
```
### In-Process Connection
```go
import (
	"context"
	"time"

	"github.com/mark3labs/mcp-go/server"
)

// Create a server instance (named srv to avoid shadowing the server package)
srv := server.NewMCPServer("embedded", "1.0.0")

// Register a tool
srv.RegisterTool("get_time", func(ctx context.Context, args map[string]any) (any, error) {
	return time.Now().String(), nil
})

// Connect in-process
conn, err := host.ConnectInProcess(ctx, "embedded", srv)
```
## Examples
Complete examples can be found in the `examples` directory:
- `examples/simple/main.go` - Basic connection example
- `examples/chat_simple/main.go` - Basic example of integration with LLM
- `examples/auto_exec/main.go` - Advanced example of automatic tool execution
## License
MIT License