# MCP Server with RAG and Multi-Search
A custom MCP (Model Context Protocol) server that provides RAG (Retrieval-Augmented Generation) capabilities using LlamaIndex, plus multiple web search options via Google's Gemini 2.0 API and Linkup.
## Features
- RAG workflow using local documents
- Multiple web search capabilities:
  - Google's Gemini 2.0 for advanced AI-powered search
  - Linkup for traditional web search
- Built with FastMCP
## Setup
### Prerequisites
- Python 3.8 or higher
- Ollama installed locally with DeepSeek models (or modify to use your preferred model)
- Gemini API key (get one at https://ai.google.dev/)
- Linkup API key (optional)
### Installation
1. Clone this repository:
```bash
git clone <repository-url>
cd own-mcp-server
```
2. Install required dependencies:
```bash
pip install -r requirements.txt
```
3. Set up environment variables (create a `.env` file):
```
# Required API keys
GEMINI_API_KEY=your_gemini_api_key_here
LINKUP_API_KEY=your_linkup_api_key_here
# Optional configurations
OLLAMA_HOST=http://localhost:11434
```
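As a rough sketch of how `server.py` might consume these settings (variable names match the `.env` example above; the exact loading code in this repository may differ):

```python
import os

try:
    from dotenv import load_dotenv  # provided by the python-dotenv package
    load_dotenv()                   # copies .env values into os.environ
except ImportError:
    pass  # fall back to whatever is already in the environment

GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")   # required for Gemini search
LINKUP_API_KEY = os.getenv("LINKUP_API_KEY")   # optional
# Optional setting with a sensible default matching the .env example
OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://localhost:11434")
```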
4. Add documents to the `data` directory (created automatically if it doesn't exist)
### Running the Server
Start the server with:
```bash
python server.py
```
## Usage
The server provides the following tools:
1. `web_search`: Uses the best available search method (Gemini 2.0 preferred, falling back to Linkup)
2. `gemini_search`: Search using Google's Gemini 2.0 AI
3. `linkup_search`: Search using Linkup
4. `rag`: Query your local documents using RAG
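The fallback behavior of `web_search` could be sketched like this (the helper functions here are illustrative stand-ins for the real Gemini- and Linkup-backed tools, not the actual implementation):

```python
import os

def gemini_search(query: str) -> str:
    # Stand-in for the real Gemini 2.0 search tool
    return f"[gemini] results for {query!r}"

def linkup_search(query: str) -> str:
    # Stand-in for the real Linkup search tool
    return f"[linkup] results for {query!r}"

def web_search(query: str) -> str:
    """Use the best available search method: Gemini 2.0 if configured,
    otherwise fall back to Linkup."""
    if os.getenv("GEMINI_API_KEY"):
        return gemini_search(query)
    if os.getenv("LINKUP_API_KEY"):
        return linkup_search(query)
    raise RuntimeError(
        "No search API key configured (set GEMINI_API_KEY or LINKUP_API_KEY)"
    )
```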
## Required Libraries
This project uses:
- llama-index - Core RAG functionality
- ollama - Local LLM integration
- Google Generative AI SDK - Gemini 2.0 integration
- Linkup SDK - Web search capabilities
- FastMCP - MCP server implementation
- Python-dotenv - Environment management
- nest-asyncio - Async support
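A `requirements.txt` covering the list above might look like the following (PyPI package names are assumptions; pin versions as needed for your environment):

```
llama-index
ollama
google-generativeai
linkup-sdk
fastmcp
python-dotenv
nest-asyncio
```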
## Troubleshooting
If you encounter issues:
1. Make sure Ollama is properly installed and running
2. Pull the DeepSeek model: `ollama pull deepseek-r1:1.5b`
3. If you encounter Python 3.13 compatibility issues, consider downgrading to Python 3.11 or 3.10
4. Verify your API keys are correct and have the necessary permissions
5. For Gemini 2.0 issues, make sure your API key has access to the latest models