# <img src="https://mermaid.js.org/favicon.svg" height="24"/> MCP Mermaid  [](https://github.com/hustcc/mcp-mermaid/actions/workflows/build.yml) [](https://www.npmjs.com/package/mcp-mermaid) [](https://smithery.ai/server/@hustcc/mcp-mermaid) [](https://www.npmjs.com/package/mcp-mermaid) [](https://archestra.ai/mcp-catalog/hustcc__mcp-mermaid)
Dynamically generate <img src="https://mermaid.js.org/favicon.svg" height="14"/> [mermaid](https://mermaid.js.org/) diagrams and charts with AI via MCP. You can also use <img src="https://mdn.alipayobjects.com/huamei_qa8qxu/afts/img/A*ZFK8SrovcqgAAAAAAAAAAAAAemJ7AQ/original" height="14"/> [mcp-server-chart](https://github.com/antvis/mcp-server-chart) to generate charts, graphs, and maps.
## ✨ Features
- Fully supports all `Mermaid` features and syntax.
- Supports configuring `backgroundColor` and `theme`, so AI models can output rich style configurations.
- Supports exporting to `png`, `svg`, and `mermaid` formats, with `Mermaid` syntax validation so the model can correct its output over multiple rounds and produce valid diagrams (see the example below).
<img width="720" alt="mcp-mermaid" src="https://mermaid.js.org/header.png" />
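For example, when a client asks for a login flow diagram, the model emits Mermaid source like the sketch below (the prompt and diagram content are illustrative only), which the server validates and renders to `png` or `svg`:
```mermaid
%% Illustrative diagram only: a minimal flowchart the server can validate and render
flowchart TD
    A[User submits form] --> B{Credentials valid?}
    B -- yes --> C[Create session]
    B -- no --> D[Show error message]
```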
## 🤖 Usage
To use with a desktop app such as Claude Desktop, VSCode, Cline, or Cherry Studio, add the MCP server config below. On macOS:
```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-mermaid"
      ]
    }
  }
}
```
On Windows:
```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "-y",
        "mcp-mermaid"
      ]
    }
  }
}
```
You can also use it on hosted platforms such as Aliyun, ModelScope, glama.ai, or smithery.ai over the SSE or Streamable HTTP transports.
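As a rough sketch, a client that can connect to a remote MCP server over SSE might be configured like the snippet below; the `type` and `url` field names are assumptions that vary by client, and the host shown is the local default from the next section:
```json
{
  "mcpServers": {
    "mcp-mermaid": {
      "type": "sse",
      "url": "http://localhost:3033/sse"
    }
  }
}
```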
## 🚰 Run with SSE or Streamable transport
Install the package globally.
```bash
npm install -g mcp-mermaid
```
Run the server with your preferred transport option:
```bash
# For SSE transport (default endpoint: /sse)
mcp-mermaid -t sse
# For Streamable transport (default endpoint: /mcp)
mcp-mermaid -t streamable
```
Then you can access the server at:
- SSE transport: `http://localhost:3033/sse`
- Streamable transport: `http://localhost:3033/mcp`
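To quickly check that the server is up, you can probe the SSE endpoint with `curl` (a simple smoke test; the stream stays open until you interrupt it):
```bash
# Probe the SSE endpoint on the default port; -N disables buffering.
# The connection stays open streaming events; press Ctrl+C to stop.
curl -N http://localhost:3033/sse
```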
## 🎮 CLI Options
The following CLI options are available when running the MCP server; run the CLI with `-h` to print them.
```plain
MCP Mermaid CLI
Options:
  --transport, -t  Specify the transport protocol: "stdio", "sse", or "streamable" (default: "stdio")
  --port, -p       Specify the port for SSE or streamable transport (default: 3033)
  --endpoint, -e   Specify the endpoint for the transport:
                   - For SSE: default is "/sse"
                   - For streamable: default is "/mcp"
  --help, -h       Show this help message
```
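For example, to run the Streamable transport on a different port with a custom endpoint:
```bash
# Streamable transport on port 8080, served at /mermaid instead of the default /mcp
mcp-mermaid -t streamable -p 8080 -e /mermaid
```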
## 🔨 Development
Install dependencies:
```bash
npm install
```
Build the server:
```bash
npm run build
```
Start the MCP server:
```bash
npm run start
```
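To try the local build from an MCP client over stdio, a config like this sketch can work; the build output path is an assumption, so check `package.json` for the actual entry point:
```json
{
  "mcpServers": {
    "mcp-mermaid-dev": {
      "command": "node",
      "args": ["/path/to/mcp-mermaid/build/index.js"]
    }
  }
}
```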
## 📄 License
MIT@[hustcc](https://github.com/hustcc).