# MCP Client
## Contents
- Implement a basic client to integrate DeepSeek into the MCP ecosystem
- Core Architecture
- JSON-RPC 2.0
- What is RPC?
- **Differences between MCP and Function Calling**
- Differences in APIs among various providers
- OpenAI / Anthropic / DeepSeek
- **Refer to the 5ire source code for smoothing out API differences**
## MCP Architecture
https://modelcontextprotocol.io/docs/concepts/architecture
https://www.jsonrpc.org/
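Every MCP message is a JSON-RPC 2.0 object. A minimal sketch of the request/response shapes (the method name `tools/list` comes from the MCP spec; the dispatcher here is a stand-in for illustration, not a real server):

```typescript
// A JSON-RPC 2.0 request: "jsonrpc" is always "2.0", "id" correlates the
// response with the request, "method" names the remote procedure.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number | string;
  method: string;
  params?: unknown;
}

// A response carries either "result" (success) or "error", never both.
interface JsonRpcResponse {
  jsonrpc: "2.0";
  id: number | string;
  result?: unknown;
  error?: { code: number; message: string };
}

// Stand-in dispatcher: a real MCP server would route "tools/list",
// "tools/call", etc. to its registered handlers.
function handle(req: JsonRpcRequest): JsonRpcResponse {
  if (req.method === "tools/list") {
    return { jsonrpc: "2.0", id: req.id, result: { tools: [] } };
  }
  // -32601 is the spec-defined "Method not found" error code.
  return {
    jsonrpc: "2.0",
    id: req.id,
    error: { code: -32601, message: "Method not found" },
  };
}

const res = handle({ jsonrpc: "2.0", id: 1, method: "tools/list" });
console.log(JSON.stringify(res));
```

The `id` field is what lets a client send several requests over one connection and match responses as they arrive out of order.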
## Implementation of MCP Client
https://modelcontextprotocol.io/quickstart/client
https://github.com/modelcontextprotocol/typescript-sdk
- MCP Server: https://github.com/pskill9/hn-server
- Inspector: https://github.com/modelcontextprotocol/inspector
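The TypeScript SDK handles the wire protocol for you, but it helps to see what actually crosses the stdio pipe. A sketch of the three JSON-RPC messages a client sends to a server such as hn-server: handshake, tool discovery, tool invocation. Method names and the `protocolVersion` value are from the MCP spec; the tool name `get_stories` and its arguments are hypothetical, for illustration only:

```typescript
// Build JSON-RPC 2.0 request envelopes with auto-incrementing ids.
let nextId = 0;
const rpc = (method: string, params: object) => ({
  jsonrpc: "2.0" as const,
  id: ++nextId,
  method,
  params,
});

// 1. Handshake: the client announces its protocol version and identity.
const initialize = rpc("initialize", {
  protocolVersion: "2024-11-05",
  capabilities: {},
  clientInfo: { name: "demo-client", version: "0.1.0" },
});

// 2. Discovery: ask the server which tools it has registered.
const listTools = rpc("tools/list", {});

// 3. Invocation: call one tool by name with JSON arguments
//    ("get_stories" is a made-up name for this sketch).
const callTool = rpc("tools/call", {
  name: "get_stories",
  arguments: { type: "top" },
});

console.log([initialize, listTools, callTool].map((m) => m.method).join(" -> "));
```

The Inspector linked above shows exactly this traffic interactively, which makes it a good way to debug a client before wiring in an LLM.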
## Differences between MCP and Function Calling
### Tool Invocation in MCP
```mermaid
graph LR
LLM <--> MCP_Client <--> MCP_Server
```
- There is a strict communication protocol between MCP Client and MCP Server
- The server provides a unified interface specification and service registration mechanism
- The client knows how to communicate with the server and can discover the list of available services
- Due to the **unified protocol specification**, tools developed by different developers can be easily reused by other MCP clients
- The server's registration mechanism simplifies tool discovery and sharing
### Tool Invocation in Function Calling
```mermaid
graph LR
LLM <--> AnyClient <--> AnyServer
```
- Function calling is more of a **basic capability** of the model: the LLM is only responsible for identifying which function to call and with what arguments
- The client decides how to implement this function call
- There is no unified service discovery and communication standard
- The plugin ecosystem of different clients cannot be compatible with each other
- https://chat-plugin-sdk.lobehub.com/quick-start/get-start
- https://www.librechat.ai/docs/development/tools_and_plugins
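The split of responsibilities above can be sketched in code. The tool definition follows OpenAI's chat-completions format; the `get_weather` tool and its stub implementation are made up for illustration. Note that the model never executes anything, it only names a function, and the client does the rest:

```typescript
// The model only ever sees this JSON schema, sent with the chat request.
const weatherTool = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the current weather for a city",
    parameters: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
    },
  },
};

// The client owns the implementations; each client is free to implement
// (or route) them however it likes -- hence the incompatible plugin ecosystems.
const implementations: Record<string, (args: any) => string> = {
  get_weather: ({ city }) => `Sunny in ${city}`, // stub implementation
};

// When the model's reply contains a tool call (name + JSON-string arguments),
// the client looks up the implementation and runs it locally.
const toolCall = { name: "get_weather", arguments: '{"city":"Berlin"}' };
const result = implementations[toolCall.name](JSON.parse(toolCall.arguments));
console.log(result); // → Sunny in Berlin
```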
## API Differences
https://platform.openai.com/docs/api-reference/chat
https://docs.anthropic.com/en/api/messages
https://api-docs.deepseek.com/api/create-chat-completion
- Key differences
1. The format used to define functions for function calling
2. The data structure the model returns when it requests a function call
3. The message structure used to send the result back after the call completes
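Differences 1 and 2 can be shown side by side. DeepSeek's API is OpenAI-compatible, so the contrast is OpenAI/DeepSeek vs. Anthropic; the `get_weather` tool is made up for illustration:

```typescript
// The same tool, defined for two provider APIs.
// OpenAI and DeepSeek share the chat-completions format: the definition is
// nested under "function" and the schema field is called "parameters".
const openaiTool = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the weather for a city",
    parameters: { type: "object", properties: { city: { type: "string" } } },
  },
};

// Anthropic flattens the definition and calls the schema "input_schema".
const anthropicTool = {
  name: "get_weather",
  description: "Get the weather for a city",
  input_schema: { type: "object", properties: { city: { type: "string" } } },
};

// The returned call also differs: OpenAI/DeepSeek put a JSON *string* in
// message.tool_calls[].function.arguments, while Anthropic returns a
// "tool_use" content block whose "input" is already a parsed object.
const openaiCall = {
  id: "call_1",
  type: "function",
  function: { name: "get_weather", arguments: '{"city":"Berlin"}' },
};
const anthropicCall = {
  type: "tool_use",
  id: "toolu_1",
  name: "get_weather",
  input: { city: "Berlin" },
};

console.log(openaiTool.function.name === anthropicTool.name); // true
```

Difference 3 follows the same pattern: OpenAI/DeepSeek expect the result as a `role: "tool"` message referencing `tool_call_id`, while Anthropic expects a `tool_result` content block inside a user message referencing `tool_use_id`.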
## How 5ire Smooths Out API Differences
- Each provider's API returns a different data structure; how can the MCP client smooth out these differences in LLM output?
- Since we know they are different → we can convert them to be the same
- https://github.com/nanbingxyz/5ire
1. Different function definition methods → handled by ChatService's makeTool
2. Different function call results → handled by Reader's parseReply and parseTools
3. Different message formats → handled by ChatService's makeToolMessages
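The three conversion points above can be sketched as a pair of adapter functions. This is a simplified illustration of the idea, not 5ire's actual code; only the helper names `makeTool` and `parseTools` are borrowed from its source:

```typescript
// A neutral internal tool representation the client works with.
interface UnifiedTool {
  name: string;
  description: string;
  schema: object;
}

// Point 1 (cf. 5ire's makeTool): render the unified tool into each
// provider's definition format at request time.
function makeTool(tool: UnifiedTool, provider: "openai" | "anthropic") {
  if (provider === "anthropic") {
    return { name: tool.name, description: tool.description, input_schema: tool.schema };
  }
  // OpenAI and DeepSeek share the chat-completions format.
  return {
    type: "function",
    function: { name: tool.name, description: tool.description, parameters: tool.schema },
  };
}

// Point 2 (cf. parseTools): normalize each provider's tool-call reply
// into one internal shape, { name, args }, so the rest of the client
// never sees provider-specific structures.
function parseTools(reply: any, provider: "openai" | "anthropic") {
  if (provider === "anthropic") {
    return reply.content
      .filter((b: any) => b.type === "tool_use")
      .map((b: any) => ({ name: b.name, args: b.input }));
  }
  return reply.tool_calls.map((c: any) => ({
    name: c.function.name,
    args: JSON.parse(c.function.arguments),
  }));
}

const tool: UnifiedTool = { name: "search", description: "Search HN", schema: { type: "object" } };
console.log(makeTool(tool, "anthropic"));
```

Point 3 (5ire's `makeToolMessages`) is the same trick in reverse: take the internal tool result and render it into each provider's expected follow-up message format.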