# LarkAgentX
[Python](https://www.python.org/)
[Node.js](https://nodejs.org/)
An AI Agent based on Lark, enabling large models to call functions and process messages through Lark.
**No need to configure a Lark bot; your Lark account is your AI assistant.**
**Simply define functions and annotations, and your Lark bot will automatically call them based on scenarios.**
## Project Overview
LarkAgentX is a modern Python application that:
- Reverse-engineers Lark's Protobuf-based WebSocket transmission and API to listen for and record messages
- Provides custom functions for large models to call
- Implements a function call framework based on MCP (Model Context Protocol)
- Uses SQLAlchemy to store messages in a MySQL database
## Screenshots
<div align="center">
<img src="static/resource/back_end.png" width="600" alt="Backend logs">
<br>
<em>Figure 1: Backend Logs</em>
</div>
<div align="center">
<img src="static/resource/front_end_1.png" width="600" alt="Chat database query">
<br>
<em>Figure 2: Chat Database Query</em>
</div>
<div align="center">
<img src="static/resource/front_end_2.png" width="600" alt="Weather query">
<br>
<em>Figure 3: Weather Query</em>
</div>
<div align="center">
<img src="static/resource/functions.png" width="600" alt="Function registration">
<br>
<em>Figure 4: Simple function registration, just define functions and annotations</em>
</div>
## Features
- **Function Registration Mechanism**: Simple and intuitive function registration decorator
- **Automatic Message Processing**: Records all received messages (private and group chats)
- **Asynchronous Processing**: Uses async/await mode for asynchronous communication
- **Data Persistence**: Stores messages in a MySQL database using SQLAlchemy
- **Flexible Configuration**: Configures through environment variables
- **Containerized Deployment**: Supports Docker for quick deployment
- **Intelligent Function Calling**: AI automatically analyzes user input and calls the most matching function; developers only need to add functions and their annotation descriptions
## Supported Functions
The project currently has the following built-in functions for large models to call:
| Function Name | Description |
|-------|------|
| `tell_joke` | Tell a random joke |
| `get_time` | Get the current time |
| `fortune` | Draw a random fortune |
| `get_weather` | Get city weather |
| `count_daily_speakers` | Get today's speaker count statistics |
| `get_top_speaker_today` | Get today's top speaker |
| `send_message` | Send a message to a specified user |
| `list_tools` | List all available tools and their descriptions |
| `extra_order_from_content` | Extract order information from text, including order number, product name, quantity, etc. |
You can call these functions from Lark by entering the trigger command followed by the operation to perform, for example: `/run tell a joke`
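As a sketch of how the trigger prefix might be handled (the function name `extract_command` below is hypothetical; the actual parsing lives in the project's message service), a message is forwarded to the large model only when it starts with the configured trigger flag:

```python
# Hypothetical helper illustrating trigger-prefix parsing.
# FUNCTION_TRIGGER_FLAG mirrors the .env setting of the same name.
FUNCTION_TRIGGER_FLAG = "/run"


def extract_command(message: str, flag: str = FUNCTION_TRIGGER_FLAG):
    """Return the text after the trigger prefix, or None if the prefix is absent."""
    text = message.strip()
    if not text.startswith(flag):
        return None  # still recorded in the database, just not sent to the LLM
    return text[len(flag):].strip()
```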
## Project Structure
```
project/
├── app/ # Application module
│ ├── api/ # API-related module
│ │ ├── auth.py # Authentication module
│ │ └── lark_client.py # Lark client
│ ├── config/ # Configuration module
│ │ └── settings.py # Application configuration
│ ├── core/ # Core business logic
│ │ ├── mcp_server.py # MCP server (function registration and processing)
│ │ ├── llm_service.py # LLM service
│ │ └── message_service.py # Message processing service
│ ├── db/ # Database-related
│ │ ├── models.py # Data models
│ │ └── session.py # Database session management
│ └── utils/ # Utility functions
├── builder/ # Request builder
├── extension/ # Extension functions
│ └── weather_api/ # Weather API integration
├── static/ # Static resources
│ ├── resource/ # Image resources
│ ├── proto_pb2.py # Protocol definition
│ └── lark_decrypt.js # Lark decryption tool
├── .env # Environment variables
├── main.py # Application entry
├── requirements.txt # Project dependencies
├── docker-compose.yml # Docker Compose configuration
└── Dockerfile # Docker configuration
```
## Custom Function Development
In the `app/core/mcp_server.py` file, you can add your own custom functions using the `@register_tool` decorator:
```python
import random

from app.api.auth import get_auth          # import paths assumed from the project structure
from app.api.lark_client import LarkClient


@register_tool(name="tell_joke", description="Tell a random joke")
def tell_joke() -> str:
    jokes = [
        "Why do programmers prefer dark mode? Because they don't like bugs shining.",
        "What do Python and a snake have in common? Once you get tangled up, you can't get out.",
        "Why do Java developers rarely get invited to parties? Because they always throw exceptions.",
    ]
    return random.choice(jokes)


@register_tool(name="send_message", description="Send a message to a specified user {user: username content: message content}")
def send_message(user: str, content: str) -> str:
    """Send a private message to a specified user"""
    lark_client = LarkClient(get_auth())
    # ... implementation logic ...
    return f"Successfully sent a private message to {user}: '{content}'"
```
**Important**: Just add functions and their descriptions, and the AI will automatically analyze user input and call the most matching function; no need to manually implement function matching logic.
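A minimal sketch of how such a decorator-based registry could work (the names `TOOL_REGISTRY` and `call_tool` are illustrative, not the project's actual internals):

```python
from datetime import datetime
from typing import Any, Callable, Dict

# Hypothetical registry: maps a tool name to its description and callable.
TOOL_REGISTRY: Dict[str, Dict[str, Any]] = {}


def register_tool(name: str, description: str) -> Callable:
    """Decorator that records a function and its description for the LLM to choose from."""
    def decorator(func: Callable) -> Callable:
        TOOL_REGISTRY[name] = {"description": description, "func": func}
        return func
    return decorator


def call_tool(name: str, **kwargs: Any) -> str:
    """Dispatch a call to a registered tool by name."""
    return TOOL_REGISTRY[name]["func"](**kwargs)


@register_tool(name="get_time", description="Get the current time")
def get_time() -> str:
    return datetime.now().isoformat(timespec="seconds")
```

The descriptions in the registry are what the large model sees when deciding which function matches the user's request.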
## Requirements
- Python 3.10+
- Node.js 18+
- MySQL database
## Installation
### Using Local Environment
1. Install dependencies:
```bash
pip install -r requirements.txt
```
2. Note for Windows users:
Windows additionally requires the following dependency:
```bash
pip install win-inet-pton==1.1.0
```
### Using Docker
Method 1: Build image separately
```bash
# Build image
docker build -t feishuapp .
# Run container (connect to the host's MySQL through the Docker gateway; passing configuration with --env-file is recommended)
docker run -it --env-file .env feishuapp bash
```
Method 2: Using Docker Compose (recommended)
```bash
# Start all services (application and database)
docker-compose up -d
# View logs
docker-compose logs -f
# Stop all services
docker-compose down
```
Docker Compose starts the entire application environment, including the MySQL database and the application service, with a single command.
## Configuration
Copy the `.env.example` file, rename it to `.env`, and fill in the following configuration:
```
# Database settings
DB_HOST=localhost
DB_PORT=3306
DB_USER=root
DB_PASSWORD=123456
DB_NAME=lark_messages
# Lark Cookie settings - only LARK_COOKIE is required; goodbye Lark bot
LARK_COOKIE=""
# Trigger prefix for calling functions (messages starting with FUNCTION_TRIGGER_FLAG will be parsed by the large model; all messages will be recorded in the database regardless of whether they start with this prefix)
FUNCTION_TRIGGER_FLAG="/run"
# Bot message prefix (not yet used)
AI_BOT_PREFIX="Lark AI Bot:"
# OpenAI API configuration; defaults to Tongyi Qianwen (Qwen), but any provider with an OpenAI-compatible API works
OPENAI_API_KEY=""
OPENAI_API_BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1"
OPENAI_API_MODEL="qwen-plus"
```
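As an illustration of how these variables might be read at startup (the `Settings` class below is a hypothetical sketch, not the project's actual `app/config/settings.py`):

```python
import os
from dataclasses import dataclass


@dataclass
class Settings:
    """Sketch of environment-driven configuration with the .env defaults."""
    db_host: str
    db_port: int
    db_user: str
    db_name: str
    function_trigger_flag: str

    @classmethod
    def from_env(cls) -> "Settings":
        # Fall back to the defaults shown in the .env example above.
        return cls(
            db_host=os.getenv("DB_HOST", "localhost"),
            db_port=int(os.getenv("DB_PORT", "3306")),
            db_user=os.getenv("DB_USER", "root"),
            db_name=os.getenv("DB_NAME", "lark_messages"),
            function_trigger_flag=os.getenv("FUNCTION_TRIGGER_FLAG", "/run"),
        )
```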
## Usage
### Run Application
Method 1: Directly run
```bash
python main.py
```
Method 2: Using Docker Compose
```bash
docker-compose up -d
```
The application will:
1. Initialize the MCP server
2. Connect to the Lark API, using your Lark account as the AI assistant
3. Listen for incoming messages
4. Process and execute function calls initiated by the large model through Lark
5. Store messages in the MySQL database
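The record-then-dispatch flow of steps 3-5 can be sketched as follows (the `handle_message`, `store`, and `llm` names are illustrative, not the project's actual API):

```python
import asyncio

FUNCTION_TRIGGER_FLAG = "/run"


async def handle_message(message: dict, store, llm) -> None:
    """Hypothetical per-message pipeline: always record, conditionally dispatch."""
    # 1. Every message is recorded, trigger prefix or not
    await store(message)
    # 2. Only trigger-prefixed messages are handed to the LLM for function calling
    content = message["content"].strip()
    if content.startswith(FUNCTION_TRIGGER_FLAG):
        await llm(content[len(FUNCTION_TRIGGER_FLAG):].strip())
```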
## Database Structure
The application stores messages in the `messages` table with the following structure:
| Column Name | Type | Description |
|----------------|---------------|---------------------------|
| id | INT (PK) | Primary key |
| user_name | VARCHAR(255) | Message sender's name |
| user_id | VARCHAR(255) | Sender's Lark user ID |
| content | TEXT | Message content |
| is_group_chat | BOOLEAN | Whether the message is from a group chat |
| group_name | VARCHAR(255) | Group chat name (if applicable) |
| chat_id | VARCHAR(255) | Chat ID |
| message_time | DATETIME | Message sending time |
| created_at | DATETIME | Record creation time |
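Assuming SQLAlchemy's declarative style, the table above might map to a model like this (a sketch, not the project's actual `app/db/models.py`):

```python
from sqlalchemy import Boolean, Column, DateTime, Integer, String, Text, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Message(Base):
    """ORM model mirroring the `messages` table described above."""
    __tablename__ = "messages"

    id = Column(Integer, primary_key=True)           # primary key
    user_name = Column(String(255))                  # message sender's name
    user_id = Column(String(255))                    # sender's Lark user ID
    content = Column(Text)                           # message content
    is_group_chat = Column(Boolean, default=False)   # group chat or private
    group_name = Column(String(255), nullable=True)  # group name, if applicable
    chat_id = Column(String(255))                    # chat ID
    message_time = Column(DateTime)                  # message sending time
    created_at = Column(DateTime)                    # record creation time
```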
## Contribution
Contributions are welcome! Feel free to submit a Pull Request.
1. Fork this repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing features'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## Issues and Support
If you encounter any issues or have questions, please [submit an issue](https://github.com/cv-cat/LarkAgentX/issues) or visit our [discussion forum](https://github.com/cv-cat/LarkAgentX/discussions).
## Star History
<a href="https://www.star-history.com/#cv-cat/LarkAgentX&Date">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://api.star-history.com/svg?repos=cv-cat/LarkAgentX&type=Date&theme=dark" />
<source media="(prefers-color-scheme: light)" srcset="https://api.star-history.com/svg?repos=cv-cat/LarkAgentX&type=Date" />
<img alt="Star History Chart" src="https://api.star-history.com/svg?repos=cv-cat/LarkAgentX&type=Date" />
</picture>
</a>
## Communication Group
If you're interested in crawlers and AI agents, add the author's WeChat and you'll be invited into the group chat.
PS: Please ask to join group 6; if the invite is full or has expired, open an issue instead.
