# LarkAgentX - Your Feishu AI Assistant 🚀
[Python](https://www.python.org/)
[Node.js](https://nodejs.org/zh-cn/)
An AI agent built on Feishu (Lark) that enables large language models to call functions and process messages through Feishu.
**No need to configure a Feishu bot; your Feishu account serves as the AI assistant.**
**Simply define functions and comments, and your Feishu bot will automatically call them based on the context.**
## Project Overview 🌟
LarkAgentX is a modern Python application that can:
- 📊 Reverse-engineer Feishu's Protobuf-based WebSocket and HTTP APIs to listen for and log messages
- 🤖 Provide custom functions for large models to call
- 🔄 Implement a function calling framework based on MCP (Model Context Protocol)
- 💾 Use SQLAlchemy to store messages in a MySQL database
## Effect Diagrams 🧸
<div align="center">
<img src="static/resource/back_end.png" width="600" alt="Backend Logs">
<br>
<em>Figure 1: Backend Logs</em>
</div>
<div align="center">
<img src="static/resource/front_end_1.png" width="600" alt="Chat Database Query">
<br>
<em>Figure 2: Chat Database Query</em>
</div>
<div align="center">
<img src="static/resource/front_end_2.png" width="600" alt="Weather Query">
<br>
<em>Figure 3: Weather Query</em>
</div>
<div align="center">
<img src="static/resource/functions.png" width="600" alt="Register Functions">
<br>
<em>Figure 4: Simple Function Registration, just define the function and comments</em>
</div>
## ✨ Features
- **Function Registration Mechanism**: Simple and intuitive function registration decorators
- **Automatic Message Handling**: Logs all received messages (private and group chats)
- **Asynchronous Processing**: Uses async/await for asynchronous communication
- **Data Persistence**: Stores messages in a MySQL database using SQLAlchemy
- **Flexible Configuration**: Configurable via environment variables
- **Containerized Deployment**: Supports quick deployment with Docker
- **Intelligent Function Calling**: The AI automatically analyzes user input and calls the most relevant function; developers only need to add functions and their descriptive comments
## 📦 Currently Supported Functions
The project currently includes the following built-in functions for large models to call:
| Function Name | Description |
|---------------|-------------|
| `tell_joke` | Tell a random joke |
| `get_time` | Get the current time |
| `fortune` | Draw a random fortune |
| `get_weather` | Get the weather for a city |
| `count_daily_speakers` | Get the count of speakers for today |
| `get_top_speaker_today` | Get the user who spoke the most today |
| `send_message` | Send a message to a specified user |
| `list_tools` | List all available tools and their descriptions |
| `extra_order_from_content` | Extract order information from text, including order number, product name, quantity, etc. |
You can call these functions in Feishu by sending the trigger command followed by the action you want to perform, for example: `/run tell a joke`
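The trigger-prefix dispatch can be sketched roughly as follows. This is an illustration only (the helper names `should_invoke_llm` and `extract_request` are assumptions, not the project's actual code); it shows the intended behavior: only messages starting with `FUNCTION_TRIGGER_FLAG` are handed to the LLM, while every message is still logged to the database.

```python
import os

# FUNCTION_TRIGGER_FLAG comes from the .env configuration described below.
FUNCTION_TRIGGER_FLAG = os.getenv("FUNCTION_TRIGGER_FLAG", "/run")


def should_invoke_llm(text: str) -> bool:
    """Only messages starting with the trigger prefix go to the LLM."""
    return text.startswith(FUNCTION_TRIGGER_FLAG)


def extract_request(text: str) -> str:
    """Strip the trigger prefix, leaving the user's natural-language request."""
    return text[len(FUNCTION_TRIGGER_FLAG):].strip()
```

For example, `extract_request("/run tell a joke")` yields `"tell a joke"`, which is then passed to the model for function selection.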
## 📂 Project Structure
```
project/
├── app/ # Application module
│ ├── api/ # API related modules
│ │ ├── auth.py # Authentication module
│ │ └── lark_client.py # Feishu client
│ ├── config/ # Configuration module
│ │ └── settings.py # Application settings
│ ├── core/ # Core business logic
│ │ ├── mcp_server.py # MCP server (function registration and processing)
│ │ ├── llm_service.py # LLM service
│ │ └── message_service.py # Message processing service
│ ├── db/ # Database related
│ │ ├── models.py # Data models
│ │ └── session.py # Database session management
│ └── utils/ # Utility functions
├── builder/ # Request builder
├── extension/ # Extended features
│ └── weather_api/ # Weather API integration
├── static/ # Static resources
│ ├── resource/ # Image resources
│ ├── proto_pb2.py # Protocol definitions
│ └── lark_decrypt.js # Feishu decryption tool
├── .env # Environment variables
├── main.py # Application entry point
├── requirements.txt # Project dependencies
├── docker-compose.yml # Docker Compose configuration
└── Dockerfile # Docker configuration
```
## 🛠️ Custom Function Development
In the `app/core/mcp_server.py` file, you can add your own custom functions using the `@register_tool` decorator:
```python
import random

# Imports based on the project layout shown above
from app.api.auth import get_auth
from app.api.lark_client import LarkClient


@register_tool(name="tell_joke", description="Tell a random joke")
def tell_joke() -> str:
    jokes = [
        "Why do programmers prefer dark mode? Because they don't like bugs.",
        "What do Python and snakes have in common? Once they wrap around you, you can't let go.",
        "Why are Java developers rarely invited to parties? Because they always throw exceptions.",
    ]
    return random.choice(jokes)


@register_tool(name="send_message", description="Send a message to a specified user {user:username content:message content}")
def send_message(user: str, content: str) -> str:
    """Send a private message to a specified user"""
    lark_client = LarkClient(get_auth())
    # ... implement logic ...
    return f"Successfully sent a private message to {user}: '{content}'"
```
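For intuition, a decorator like `@register_tool` can be little more than a dictionary registry that the LLM service later searches by description. The sketch below is a minimal illustration of that idea, not the project's actual `mcp_server.py`:

```python
import inspect
from typing import Callable

# Hypothetical registry, for illustration only.
TOOL_REGISTRY: dict[str, dict] = {}


def register_tool(name: str, description: str) -> Callable:
    """Store the function with its metadata so the LLM can pick it by description."""
    def decorator(func: Callable) -> Callable:
        TOOL_REGISTRY[name] = {
            "func": func,
            "description": description,
            "signature": str(inspect.signature(func)),
        }
        return func  # the decorated function is returned unchanged
    return decorator


@register_tool(name="echo", description="Return the input text unchanged")
def echo(text: str) -> str:
    return text
```

At call time, the LLM is given the names, descriptions, and signatures from the registry, chooses the best match for the user's request, and the server invokes the stored function with the extracted arguments.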
**Important**: Just add the function and its corresponding description; the AI will automatically analyze user text and call the most relevant function without needing to manually implement function matching logic.
## 🔧 Environment Requirements
- Python 3.10+
- Node.js 18+
- MySQL database
## 📦 Installation Instructions
### Using Local Environment
1. Install dependencies:
```bash
pip install -r requirements.txt
```
2. Note for Windows users:
Windows systems require the installation of the following additional dependency:
```bash
pip install win-inet-pton==1.1.0
```
### Using Docker
Method 1: Build the image separately
```bash
# Build the image
docker build -t feishuapp .

# Run the container. Requires an external MySQL instance; connect to the
# host's MySQL via the Docker gateway. Passing the configuration with
# --env-file is recommended:
docker run -it --env-file .env feishuapp
```
Method 2: Use Docker Compose (recommended)
```bash
# Start all services (application and database)
docker-compose up -d
# View logs
docker-compose logs -f
# Stop all services
docker-compose down
```
Docker Compose starts the entire application environment, including the MySQL database and the application service, in a single step.
## 🛠️ Configuration Instructions
Copy the `.env.example` file to `.env` and fill in the following settings:
```
# Database settings
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_PASSWORD=123456
DB_NAME=lark_messages
# Feishu Cookie settings - only configure LARK_COOKIE, no need for a Feishu bot
LARK_COOKIE=""
# Trigger prefix for function calls (messages starting with FUNCTION_TRIGGER_FLAG will be parsed by the large model; all messages will be logged to the database, regardless of whether they start with this prefix)
FUNCTION_TRIGGER_FLAG="/run"
# Bot speaking prefix (not yet used)
AI_BOT_PREFIX="Lark AI Bot:"
# OpenAI API configuration, defaults to Tongyi Qianwen, compatible with any OpenAI large model vendor
OPENAI_API_KEY=""
OPENAI_API_BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1"
OPENAI_API_MODEL="qwen-plus"
```
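As a rough illustration of how these variables might be consumed (the project's actual `app/config/settings.py` may differ), a minimal environment-backed settings object looks like this:

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class Settings:
    """Illustrative settings loader; defaults mirror the .env example above."""
    db_host: str = os.getenv("DB_HOST", "localhost")
    db_port: int = int(os.getenv("DB_PORT", "3306"))
    db_user: str = os.getenv("DB_USER", "root")
    db_name: str = os.getenv("DB_NAME", "lark_messages")
    function_trigger_flag: str = os.getenv("FUNCTION_TRIGGER_FLAG", "/run")
    openai_api_base_url: str = os.getenv(
        "OPENAI_API_BASE_URL",
        "https://dashscope.aliyuncs.com/compatible-mode/v1",
    )


settings = Settings()
```

In practice you would load the `.env` file first (for example with `python-dotenv`) so that `os.getenv` picks up your values instead of the defaults.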
## 🚀 Usage Guide
### Running the Application
Method 1: Run directly
```bash
python main.py
```
Method 2: Use Docker Compose
```bash
docker-compose up -d
```
The application will:
1. Initialize the MCP server
2. Connect to the Feishu API and use your Feishu account as the AI assistant
3. Listen for incoming messages
4. Process and execute function calls initiated by the large model through Feishu
5. Store messages in the MySQL database
## 🗄️ Database Structure
The application stores messages in the `messages` table, which has the following structure:
| Column Name | Type | Description |
|------------------|----------------|----------------------------------|
| id | INT (PK) | Primary key |
| user_name | VARCHAR(255) | Name of the message sender |
| user_id | VARCHAR(255) | Sender's Feishu user ID |
| content | TEXT | Message content |
| is_group_chat | BOOLEAN | Whether the message is from a group chat |
| group_name | VARCHAR(255) | Group chat name (if applicable) |
| chat_id | VARCHAR(255) | Chat ID |
| message_time | DATETIME | Message sending time |
| created_at | DATETIME | Record creation time |
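The table above maps naturally onto a SQLAlchemy declarative model. This is a sketch of what such a model could look like; the project's actual `app/db/models.py` may differ in details such as indexes and defaults:

```python
from sqlalchemy import Boolean, Column, DateTime, Integer, String, Text
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Message(Base):
    """Illustrative ORM model for the messages table described above."""
    __tablename__ = "messages"

    id = Column(Integer, primary_key=True)
    user_name = Column(String(255))
    user_id = Column(String(255))
    content = Column(Text)
    is_group_chat = Column(Boolean, default=False)
    group_name = Column(String(255), nullable=True)
    chat_id = Column(String(255))
    message_time = Column(DateTime)
    created_at = Column(DateTime)
```

With a model like this, `Base.metadata.create_all(engine)` creates the table, and each incoming Feishu message becomes one `Message` row.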
## 🤝 Contribution Guidelines
Contributions are welcome! Feel free to submit a Pull Request.
1. Fork this repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing features'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## 🐛 Issues and Support
If you encounter any issues or have questions, please [submit an issue](https://github.com/cv-cat/LarkAgentX/issues) or visit our [discussion forum](https://github.com/cv-cat/LarkAgentX/discussions).
## 📈 Star Trend
<a href="https://www.star-history.com/#cv-cat/LarkAgentX&Date">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=cv-cat/LarkAgentX&type=Date&theme=dark" />
<source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=cv-cat/LarkAgentX&type=Date" />
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=cv-cat/LarkAgentX&type=Date" />
</picture>
</a>