# MCP Client
A Node.js client implementation of the Model Context Protocol (MCP), using Function Calling
## Project Introduction
MCP Client is a Node.js client implementation based on the Model Context Protocol, allowing your application to connect to various MCP servers and interact with them through large language models (LLMs). MCP (Model Context Protocol) is an open protocol designed to standardize the way applications provide context to LLMs.
## Core Features
- Supports connection to any server compliant with the MCP standard
- Supports any LLM compatible with the OpenAI API format
- Automatically discovers and utilizes tools provided by the server
- Comprehensive logging system, including API requests and tool calls
- Interactive command-line interface
- Supports tool calls and result handling
## System Requirements
- Node.js version 17 or higher
- LLM API key (Ollama is not currently supported)
- Disk space for storing log files (located in the `logs/` directory)
## Installation
### 1. Clone the Repository
```bash
git clone https://github.com/yourusername/mcp-client.git
cd mcp-client
```
### 2. Install Dependencies
```bash
npm install
```
Required dependencies:
- @modelcontextprotocol/sdk
- openai
- dotenv
- typescript (development dependency)
- @types/node (development dependency)
### 3. Configure Environment Variables
Copy the example environment variable file and set your LLM API key:
```bash
cp .env.example .env
```
Then edit the `.env` file to include your LLM API key, model provider API address, and model name:
```
OPENAI_API_KEY=your_api_key_here
MODEL_NAME=xxx
BASE_URL=xxx
```
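For reference, here is a minimal sketch of how these three variables are typically consumed through `dotenv` and the OpenAI SDK. This is not code from this repository; the model name shown is only an illustration:
```typescript
// Sketch only: loads .env from the current working directory and builds an
// OpenAI-compatible client from the three variables above.
import "dotenv/config";
import OpenAI from "openai";

const llm = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: process.env.BASE_URL, // omit to use the official OpenAI endpoint
});

const response = await llm.chat.completions.create({
  model: process.env.MODEL_NAME ?? "gpt-4o-mini", // illustrative fallback only
  messages: [{ role: "user", content: "ping" }],
});
console.log(response.choices[0].message.content);
```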
### 4. Compile the Project
```bash
npm run build
```
## Usage
To start the MCP client, you can use the following methods:
### 1. Directly Specify the Server Script Path
```bash
node build/index.js <server_script_path>
```
Where `<server_script_path>` refers to the path of the MCP server script, which can be a JavaScript (.js) or Python (.py) file.
### 2. Use a Configuration File
```bash
node build/index.js <server_identifier> <config_file_path>
```
Where `<server_identifier>` is the server name defined in the configuration file, and `<config_file_path>` is the path to a JSON file containing the server definitions, for example:
```json
{
  "mcpServers": {
    "time": {
      "command": "node",
      "args": [
        "/Users/xxx/Desktop/github/mcp/dist/index.js"
      ],
      "description": "Custom Node.js MCP server"
    },
    "mongodb": {
      "command": "npx",
      "args": [
        "mcp-mongo-server",
        "mongodb://localhost:27017/studentManagement?authSource=admin"
      ]
    }
  },
  "defaultServer": "mongodb",
  "system": "Custom system prompt"
}
```
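Conceptually, the `<server_identifier>` is looked up under `mcpServers`, with `defaultServer` presumably used when no identifier is given. A rough sketch of that lookup, under those assumptions (the type and helper names here are hypothetical, not the repository's actual identifiers):
```typescript
import { readFileSync } from "node:fs";

interface ServerEntry {
  command: string;
  args: string[];
  description?: string;
}

interface McpConfig {
  mcpServers: Record<string, ServerEntry>;
  defaultServer?: string;
  system?: string;
}

// Pick the server entry named on the command line, or fall back to defaultServer.
function resolveServer(configPath: string, identifier?: string): ServerEntry {
  const config: McpConfig = JSON.parse(readFileSync(configPath, "utf-8"));
  const name = identifier ?? config.defaultServer;
  if (!name || !config.mcpServers[name]) {
    throw new Error(`Server "${name}" not found in ${configPath}`);
  }
  return config.mcpServers[name];
}
```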
### 3. Use npm Package (npx)
You can run this package directly via npx, without cloning and building it locally:
```bash
# Directly connect to the script
$ npx mcp-client-nodejs /path/to/mcp-server.js
# Connect via configuration file
$ npx mcp-client-nodejs mongodb ./mcp-servers.json
```
> Note: Model-related settings must be configured in a `.env` file in the current working directory.
### Example
Directly connect to a JavaScript MCP server:
```bash
node build/index.js /path/to/weather-server/build/index.js
```
Directly connect to a Python MCP server:
```bash
node build/index.js /path/to/mcp-server.py
```
Connect to the server using a configuration file:
```bash
node build/index.js mongodb ./mcp-servers.json
```
Run using npx:
```bash
# Directly connect to the script
$ npx mcp-client-nodejs /path/to/mcp-server.js
# Connect via configuration file
$ npx mcp-client-nodejs mongodb ./mcp-servers.json
```

## How It Works
1. **Server Connection**: The client connects to the specified MCP server.
2. **Tool Discovery**: Automatically retrieves the list of available tools provided by the server.
3. **Query Processing** (see the sketch after this list):
- Sends user queries to the LLM.
- The LLM decides whether to use tools.
- If needed, the client executes tool calls through the server.
- Returns tool results to the LLM.
- The LLM provides the final response.
4. **Interactive Loop**: Users can continuously input queries until they type "quit" to exit.
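The core of step 3 is a standard OpenAI Function Calling round trip: the model is offered the server's tools, any tool calls it requests are executed through the MCP client, and the results are fed back until it produces a plain answer. The following is a simplified sketch of that loop, not the repository's actual implementation (names such as `processQuery`, `mcp`, and `openaiTools` are placeholders):
```typescript
import OpenAI from "openai";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Assumes `llm` (OpenAI client), `mcp` (connected MCP client) and `openaiTools`
// (MCP tool list converted to OpenAI function definitions) already exist.
async function processQuery(
  llm: OpenAI,
  mcp: Client,
  openaiTools: OpenAI.Chat.Completions.ChatCompletionTool[],
  query: string,
): Promise<string> {
  const messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[] = [
    { role: "user", content: query },
  ];

  while (true) {
    const completion = await llm.chat.completions.create({
      model: process.env.MODEL_NAME!,
      messages,
      tools: openaiTools,
    });

    const message = completion.choices[0].message;
    messages.push(message);

    // No tool calls requested: the LLM's answer is final.
    if (!message.tool_calls?.length) return message.content ?? "";

    // Execute each requested tool through the MCP server and feed results back.
    for (const call of message.tool_calls) {
      if (call.type !== "function") continue;
      const result = await mcp.callTool({
        name: call.function.name,
        arguments: JSON.parse(call.function.arguments),
      });
      messages.push({
        role: "tool",
        tool_call_id: call.id,
        content: JSON.stringify(result.content),
      });
    }
  }
}
```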
## Logging System
MCP Client includes a comprehensive logging system that records all key operations and communications in detail. Log files are stored in the `logs/` directory in JSON format for easy querying and analysis.

### Log Types
- **Requests and Responses to/from LLM** - Records all communications with the LLM API.
- **Tool Calls and Results** - Records all tool call parameters and return results.
- **Error Messages** - Records any errors that occur during system operation.
### Log Naming and Format
Log files are sequentially named as `[index] [log_type] YYYY-MM-DD HH:MM:SS.json`, containing the index, log type, and timestamp, making it easy to view the entire session in chronological order.
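As an illustration of that naming scheme (this helper is hypothetical; the actual logger may differ), a file per log entry could be produced like this:
```typescript
import { writeFileSync, mkdirSync } from "node:fs";
import { join } from "node:path";

// Hypothetical helper mirroring the naming scheme described above.
let logIndex = 0;

function writeLog(logType: string, payload: unknown): void {
  mkdirSync("logs", { recursive: true });
  const timestamp = new Date()
    .toISOString()      // e.g. 2024-01-01T12:00:00.000Z
    .replace("T", " ")
    .slice(0, 19);      // e.g. 2024-01-01 12:00:00
  const fileName = `[${logIndex++}] [${logType}] ${timestamp}.json`;
  writeFileSync(join("logs", fileName), JSON.stringify(payload, null, 2));
}
```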
## Architectural Design
MCP Client is based on a modular client-server architecture:
- **Transport Layer**: Communicates with the server over the stdio transport mechanism.
- **Protocol Layer**: Handles request/response and tool calls using the MCP protocol.
- **Model Layer**: Provides LLM capabilities through the OpenAI SDK.
### Core Components
- **MCPClient Class**: Manages server connections, query handling, and tool calls.
- **Client Object**: Client implementation provided by the MCP SDK.
- **StdioClientTransport**: Transport implementation based on standard input/output.
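For orientation, here is a minimal connection sketch using the MCP TypeScript SDK, greatly simplified relative to the actual `MCPClient` class (the client name, version, and server path are placeholders):
```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server script as a child process and talk to it over stdio.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/mcp-server.js"],
});

const client = new Client(
  { name: "mcp-client-nodejs", version: "1.0.0" },
  { capabilities: {} },
);

await client.connect(transport);

// Discover the tools the server exposes; these are later translated into
// OpenAI function definitions for the LLM.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```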
## Best Practices
- **Error Handling**: Use TypeScript's type system for better error detection.
- **Security**: Securely store API keys in the .env file.
- **Tool Permissions**: Be mindful of tool permissions and security.
## Troubleshooting
### Server Path Issues
- Ensure the server script path is correct.
- If relative paths do not work, use absolute paths.
- Windows users should use forward slashes (/) or escaped backslashes (\\) in paths.
- Verify that the server file has the correct extension (.js or .py).
### Response Time
- The first response may take up to 30 seconds to return.
- This is normal and occurs during server initialization, query processing, and tool execution.
- Subsequent responses are usually faster.
### Common Error Messages
- `Error: Cannot find module`: Check the build folder and ensure TypeScript compiled successfully.
- `Connection refused`: Ensure the server is running and the path is correct.
- `Tool execution failed`: Verify that the required environment variables for the tool are set.
- `OPENAI_API_KEY is not set`: Check your .env file and environment variables.
- `TypeError`: Ensure the correct types are used for tool parameters.
## License
Apache License 2.0