# Mianshiya MCP Server
## Introduction
The question search API of [Mianshiya](https://mianshiya.com/) now supports the MCP (Model Context Protocol), making Mianshiya the first interview question practice website in China to support MCP. For more information about MCP, see the official [documentation](https://modelcontextprotocol.io/).
The server is built with the `MCP Java SDK`, so any intelligent assistant that supports MCP (such as `Claude`, `Cursor`, and `Qianfan AppBuilder`) can integrate with it quickly.
Detailed integration instructions follow below.
## Tool List
#### Question Search `questionSearch`
- Searches Mianshiya for interview questions and returns them as links
- Input: `Question` (the question text or keywords to search for)
- Output: `[Question](Link)`, a Markdown link to the matching question on Mianshiya
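
For illustration, the sketch below calls `questionSearch` directly over stdio using the MCP Java SDK client. This is a minimal sketch, not part of the repository: the class names come from the `io.modelcontextprotocol` SDK and may vary between SDK versions, and the argument key `question` and the jar path are assumptions; check the actual input schema with `client.listTools()`.

```java
import java.util.Map;

import io.modelcontextprotocol.client.McpClient;
import io.modelcontextprotocol.client.McpSyncClient;
import io.modelcontextprotocol.client.transport.ServerParameters;
import io.modelcontextprotocol.client.transport.StdioClientTransport;
import io.modelcontextprotocol.spec.McpSchema;

public class QuestionSearchExample {
    public static void main(String[] args) {
        // Launch the built server jar as a child process and talk to it over stdio
        ServerParameters params = ServerParameters.builder("java")
                .args("-Dspring.ai.mcp.server.stdio=true",
                        "-Dspring.main.web-application-type=none",
                        "-Dlogging.pattern.console=",
                        "-jar", "/yourPath/mcp-server-0.0.1-SNAPSHOT.jar")
                .build();

        try (McpSyncClient client = McpClient.sync(new StdioClientTransport(params)).build()) {
            client.initialize();
            // Tool name from the tool list above; the argument key "question" is an assumption
            McpSchema.CallToolResult result = client.callTool(new McpSchema.CallToolRequest(
                    "questionSearch",
                    Map.of("question", "How does HashMap work in Java?")));
            // Expected to contain text in the form [Question](Link)
            System.out.println(result.content());
        }
    }
}
```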
## Quick Start
The Mianshiya MCP Server is used primarily through the `Java SDK`.
### Java Integration
> Prerequisites: a Java runtime and Maven (the build step below uses `mvn`)
#### Installation
```bash
git clone https://github.com/gulihua10010/mcp-mianshiya-server
```
#### Build
```bash
cd mcp-mianshiya-server
mvn clean package
```
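The build outputs the server jar to `target/` (for example `target/mcp-server-0.0.1-SNAPSHOT.jar`); the MCP client configurations below point to this file.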
#### Usage
1) Open `Settings` in `Cherry Studio` (used here as the example MCP client) and click `MCP Server`.

2) Click `Edit JSON` and add the following configuration to the configuration file, replacing `/yourPath/mcp-server-0.0.1-SNAPSHOT.jar` with the absolute path to the jar you built.
```json
{
  "mcpServers": {
    "mianshiyaServer": {
      "command": "java",
      "args": [
        "-Dspring.ai.mcp.server.stdio=true",
        "-Dspring.main.web-application-type=none",
        "-Dlogging.pattern.console=",
        "-jar",
        "/yourPath/mcp-server-0.0.1-SNAPSHOT.jar"
      ],
      "env": {}
    }
  }
}
```

3) In `Settings` under `Model Services`, select a model, enter its API key, and enable the tool (function call) capability in the model settings.

4) Enable the MCP server using the checkbox below the chat input box.

5) Configuration is complete, and you can now query interview questions.

#### Code Invocation
1) Add dependencies
```xml
<dependency>
    <groupId>com.alibaba.cloud.ai</groupId>
    <artifactId>spring-ai-alibaba-starter</artifactId>
    <version>1.0.0-M6.1</version>
</dependency>
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-mcp-client-spring-boot-starter</artifactId>
    <version>1.0.0-M6</version>
</dependency>
```
2) Configure the MCP server
In `application.yml`, point the MCP client at the server configuration file:
```yaml
spring:
  ai:
    mcp:
      client:
        stdio:
          # Specify the MCP server configuration file
          servers-configuration: classpath:/mcp-servers-config.json
  mandatory-file-encoding: UTF-8
```
The configuration for `mcp-servers-config.json` is as follows:
```json
{
  "mcpServers": {
    "mianshiyaServer": {
      "command": "java",
      "args": [
        "-Dspring.ai.mcp.server.stdio=true",
        "-Dspring.main.web-application-type=none",
        "-Dlogging.pattern.console=",
        "-jar",
        "/Users/gulihua/Documents/mcp-server/target/mcp-server-0.0.1-SNAPSHOT.jar"
      ],
      "env": {}
    }
  }
}
```
For the client we use Alibaba's Tongyi Qianwen model, so the `spring-ai-alibaba-starter` dependency is included. If you are using another model, include the corresponding dependency instead; for OpenAI, for example, use `spring-ai-openai-spring-boot-starter`.
Configure the API key and other options for the model:
```yaml
spring:
  ai:
    dashscope:
      api-key: ${Tongyi Qianwen key}
      chat:
        options:
          model: qwen-max
```
You can apply for a Tongyi Qianwen API key on the [official website](https://help.aliyun.com/zh/model-studio/developer-reference/get-api-key?spm=a2c4g.11186623.0.0.7399482394LUBH). The model used here is Tongyi Qianwen Max (`qwen-max`).
3) Initialize the chat client. The `ToolCallbackProvider` is auto-configured by the MCP client starter from the servers configured above and exposes their tools (including `questionSearch`) to the model:
```java
@Bean
public ChatClient initChatClient(ChatClient.Builder chatClientBuilder,
        ToolCallbackProvider mcpTools) {
    return chatClientBuilder
            .defaultTools(mcpTools)
            .build();
}
```
4) API call
```java
@PostMapping(value = "/ai/answer/sse", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> generateStreamAsString(@RequestBody AskRequest request) {
    Flux<String> content = chatClient.prompt()
            .user(request.getContent())
            .stream()
            .content();
    return content.concatWith(Flux.just("[complete]"));
}
```
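
The endpoint above relies on a `chatClient` field and an `AskRequest` request body that are not shown. The sketch below fills in a minimal surrounding controller; the class and field names here are assumptions for illustration, and the actual repository may define them differently.

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical request body: only the content field used above is assumed
class AskRequest {
    private String content;

    public String getContent() { return content; }
    public void setContent(String content) { this.content = content; }
}

@RestController
public class AiAnswerController {

    // The ChatClient bean created in step 3 is injected here
    private final ChatClient chatClient;

    public AiAnswerController(ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    // The /ai/answer/sse endpoint shown above belongs in this controller
}
```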