# Breadcrumb MCP Server
A Model Context Protocol (MCP) server for storing and retrieving conversation breadcrumbs with semantic search capabilities using LlamaIndex. Perfect for maintaining context across AI assistant conversations.
## Features
- 🔍 **Semantic Search**: Query conversation breadcrumbs using natural language and semantic similarity
- 💾 **Local Storage**: All data persists locally using JSON files with LlamaIndex vector storage
- 📋 **Multiple Query Types**: Recent breadcrumb retrieval, semantic search, and global search
- ⚙️ **User Preferences**: Store and manage user-specific settings across sessions
- 📊 **Analytics**: Get comprehensive statistics about stored breadcrumbs and preferences
- 🚀 **Production Ready**: Built with TypeScript, fully typed, and npm publishable
- 🐰 **Bun Compatible**: Optimized for Bun runtime with minimal dependencies
## Two Versions Available
### Simple Version (Recommended for Bun)
- Text-based search with relevance scoring
- No external AI dependencies
- Fast startup and lightweight
- Perfect for most use cases
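The simple version's text search with relevance scoring could look something like the following term-overlap scorer. This is a sketch for illustration only, not the server's actual algorithm; the `Breadcrumb` shape and function names are assumptions.

```typescript
// Hypothetical sketch: score each breadcrumb by the fraction of query
// terms that appear in its content, then rank and truncate.
interface Breadcrumb {
  content: string;
  timestamp: string; // ISO 8601
}

function scoreBreadcrumb(query: string, crumb: Breadcrumb): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  if (terms.length === 0) return 0;
  const text = crumb.content.toLowerCase();
  const hits = terms.filter((t) => text.includes(t)).length;
  return hits / terms.length;
}

function rankBreadcrumbs(query: string, crumbs: Breadcrumb[], limit = 5): Breadcrumb[] {
  return crumbs
    .map((c) => ({ c, score: scoreBreadcrumb(query, c) }))
    .filter((x) => x.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((x) => x.c);
}
```

Unlike vector embeddings, this only matches literal terms, which is why the advanced version exists for genuinely semantic queries.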
### Advanced Version (Optional)
- Semantic search with vector embeddings using LlamaIndex
- Requires additional dependencies
- Better for complex semantic queries
## Installation
### From npm (when published)
```bash
npm install -g breadcrumb-mcp
```
### Using npx (no installation required)
```bash
npx breadcrumb-mcp
```
### From source
```bash
git clone https://github.com/amantiwari57/breadcrump-mcp
cd breadcrump-mcp
bun install
bun run build
```
## Usage
### Running the Server
#### Simple Version (Recommended)
```bash
# Development
bun run dev
# Production (from source)
bun run start
# Using npx (no installation required)
npx breadcrumb-mcp
# Or if installed globally
breadcrumb-mcp
```
#### Advanced Version (with LlamaIndex)
```bash
# Install the optional dependency first
bun add llamaindex
# Development
bun run dev:advanced
# Build advanced version
bun run build:advanced
```
### Available Tools
The server provides the following MCP tools:
#### `store_context`
Store a conversation breadcrumb with user context, metadata, and timestamp for future retrieval.
```json
{
  "userId": "user123",
  "conversationId": "conv456", // optional
  "content": "User asked about machine learning algorithms",
  "metadata": { "topic": "ML", "sentiment": "curious" } // optional
}
```
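Conceptually, `store_context` appends a timestamped record to the user's breadcrumb list. The sketch below assumes an in-memory map and invented field names (`id`, `timestamp`); the server's real schema may differ.

```typescript
// Hypothetical sketch of storing a breadcrumb: generate an id and
// timestamp, then append to the per-user list.
interface StoredBreadcrumb {
  id: string;
  conversationId?: string;
  content: string;
  metadata?: Record<string, unknown>;
  timestamp: string; // ISO 8601
}

function storeContext(
  store: Map<string, StoredBreadcrumb[]>,
  userId: string,
  content: string,
  options: { conversationId?: string; metadata?: Record<string, unknown> } = {}
): StoredBreadcrumb {
  const crumb: StoredBreadcrumb = {
    id: `${userId}-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`,
    conversationId: options.conversationId,
    content,
    metadata: options.metadata,
    timestamp: new Date().toISOString(),
  };
  const list = store.get(userId) ?? [];
  list.push(crumb);
  store.set(userId, list);
  return crumb;
}
```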
#### `query_context`
Search through user's conversation breadcrumbs using semantic similarity to find relevant past discussions.
```json
{
  "userId": "user123",
  "query": "machine learning questions",
  "limit": 5 // optional, default 5
}
```
#### `get_recent_context`
Retrieve the most recent conversation breadcrumbs for a user in chronological order.
```json
{
  "userId": "user123",
  "limit": 10 // optional, default 10
}
```
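"Most recent, in chronological order" amounts to a timestamp sort plus a limit. A minimal sketch, assuming ISO 8601 timestamps (which sort correctly as strings):

```typescript
// Hypothetical sketch of get_recent_context: newest first, capped at limit.
interface TimedBreadcrumb {
  content: string;
  timestamp: string; // ISO 8601, so string comparison matches time order
}

function getRecentContext(crumbs: TimedBreadcrumb[], limit = 10): TimedBreadcrumb[] {
  return [...crumbs]
    .sort((a, b) => b.timestamp.localeCompare(a.timestamp))
    .slice(0, limit);
}
```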
#### `get_user_preferences`
Retrieve user preferences, settings, and configuration stored across conversation sessions.
```json
{
  "userId": "user123"
}
```
#### `update_user_preferences`
Update or add user preferences and settings that persist across conversation sessions.
```json
{
  "userId": "user123",
  "preferences": {
    "theme": "dark",
    "language": "en",
    "notifications": true
  }
}
```
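"Update or add" suggests merge semantics: new keys are added and existing keys are overwritten, while untouched keys persist. A sketch of a shallow merge (whether the server merges shallowly or deeply is an assumption):

```typescript
// Hypothetical sketch of update_user_preferences: shallow merge of the
// incoming settings over whatever is already stored.
type Preferences = Record<string, unknown>;

function updatePreferences(existing: Preferences, updates: Preferences): Preferences {
  return { ...existing, ...updates };
}
```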
#### `global_search`
Search across all users' conversation breadcrumbs to find relevant discussions (administrative feature).
```json
{
  "query": "machine learning discussions",
  "limit": 10 // optional, default 10
}
```
#### `get_user_stats`
Get comprehensive statistics and analytics about a user's stored conversation breadcrumbs and preferences.
```json
{
  "userId": "user123"
}
```
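The statistics a tool like this typically reports are simple aggregates over the stored data. The field names below are assumptions for illustration; the actual server may compute different metrics.

```typescript
// Hypothetical sketch of get_user_stats-style aggregates: counts plus
// the oldest and newest breadcrumb timestamps.
interface StatCrumb {
  content: string;
  timestamp: string; // ISO 8601
}

function getUserStats(crumbs: StatCrumb[], preferences: Record<string, unknown>) {
  const timestamps = crumbs.map((c) => c.timestamp).sort();
  return {
    breadcrumbCount: crumbs.length,
    preferenceCount: Object.keys(preferences).length,
    oldest: timestamps[0] ?? null,
    newest: timestamps[timestamps.length - 1] ?? null,
  };
}
```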
## Configuration
The server uses local storage in a `data/` directory with the following structure:
```
data/
├── contexts/      # User conversation breadcrumb JSON files
│   ├── user123.json
│   └── user456.json
├── preferences/   # User preference JSON files
│   ├── user123.json
│   └── user456.json
└── vectors/       # LlamaIndex vector storage for semantic search
    ├── docstore.json
    ├── index_store.json
    └── vector_store.json
```
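Since each user's breadcrumbs live in a plain JSON file under `data/contexts/`, the file contents can be typed and validated on load. The `ContextFile` shape below is an assumed schema for illustration, not the server's actual format:

```typescript
// Hypothetical shape of a per-user file under data/contexts/, with a
// basic sanity check on parse. The real schema is the server's.
interface ContextFile {
  userId: string;
  breadcrumbs: {
    content: string;
    timestamp: string;
    metadata?: Record<string, unknown>;
  }[];
}

function parseContextFile(json: string): ContextFile {
  const parsed = JSON.parse(json) as ContextFile;
  if (typeof parsed.userId !== "string" || !Array.isArray(parsed.breadcrumbs)) {
    throw new Error("malformed context file");
  }
  return parsed;
}
```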
## Configuration with Claude Desktop
Add to your Claude Desktop MCP configuration:
```json
{
  "mcpServers": {
    "breadcrumb-mcp": {
      "command": "npx",
      "args": ["breadcrumb-mcp"]
    }
  }
}
```
Or if you have it installed globally:
```json
{
  "mcpServers": {
    "breadcrumb-mcp": {
      "command": "breadcrumb-mcp",
      "args": []
    }
  }
}
```
Or if running from source:
```json
{
  "mcpServers": {
    "breadcrumb-mcp": {
      "command": "bun",
      "args": ["run", "/path/to/breadcrumb-mcp/src/simple-storage.ts"]
    }
  }
}
```
For the advanced version with LlamaIndex:
```json
{
  "mcpServers": {
    "breadcrumb-mcp": {
      "command": "bun",
      "args": ["run", "/path/to/breadcrumb-mcp/src/server.ts"]
    }
  }
}
```
## Development
### Requirements
- Node.js 18+
- Bun (recommended) or npm
### Setup
```bash
bun install
```
### Scripts
```bash
bun run dev # Run in development mode
bun run build # Build for production
bun run start # Run built version
```
### Project Structure
```
src/
├── simple-storage.ts   # Simple text-based MCP server (recommended)
├── index.ts            # Core storage and vector operations (advanced)
└── server.ts           # MCP server with LlamaIndex (advanced)
package.json
README.md
tsconfig.json
```
## Technical Details
### Vector Embeddings
- Uses HuggingFace's `BAAI/bge-small-en-v1.5` model for embeddings
- Supports semantic search across stored conversation breadcrumbs
- Automatic persistence of vector indices
### Storage Format
- Conversation breadcrumbs stored as JSON with metadata
- Vector indices persisted using LlamaIndex
- Graceful handling of concurrent access
### Error Handling
- Comprehensive error handling with detailed error messages
- Graceful degradation when vector index is unavailable
- Automatic retry logic for transient failures
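The retry behavior described above could be implemented as a small helper with exponential backoff. This is a generic sketch of the pattern, not the server's actual retry code; attempt count and delay are illustrative defaults.

```typescript
// Hypothetical retry helper for transient failures: retry up to
// `attempts` times, doubling the delay after each failure.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 100
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Back off: 1x, 2x, 4x, ... the base delay.
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```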
## API Response Format
All tools return JSON responses with the following structure:
```json
{
  "success": true,
  "data": { /* tool-specific data */ },
  "message": "Optional status message"
}
```
Error responses:
```json
{
  "success": false,
  "error": "Error description",
  "tool": "tool_name"
}
```
## License
MIT License
Copyright (c) 2025 Breadcrumb MCP Server
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
## Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
## Support
For issues and questions, please open an issue on the GitHub repository.