# HistorIQ
HistorIQ is an AI-powered historical knowledge platform built on the Model Context Protocol (MCP) architecture. It integrates Retrieval-Augmented Generation (RAG), an AI agent, and locally deployed Large Language Models (LLMs), allowing users to ask questions about historical events, figures, and cultural topics in natural language. The system generates contextually relevant answers through multi-stage reasoning and knowledge retrieval.
## Key Features
- **AI Multi-turn Interaction**: Supports multi-turn question answering with contextual memory, so follow-up questions build on earlier turns.
- **RAG Model Integration**: Combines database knowledge retrieval with language model generation to enhance accuracy and credibility.
- **MCP Architecture Implementation**: Clearly delineates the MCP Server / Client / Agent / RAG modules into a well-defined structure.
- **Story-like Responses**: Supports chapter generation and extended exploration, suitable for historical education and knowledge navigation.
- **Voice Input and Reading**: Offers a storyteller mode to create an immersive learning experience.
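The "contextual memory" feature above hinges on per-user session state. A minimal sketch of how such session-scoped dialogue history could work (the `Session` class, `add_turn`, and `context_window` names are illustrative, not the project's actual API):

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """Per-user dialogue state keyed by a session ID (hypothetical shape)."""
    session_id: str
    history: list = field(default_factory=list)  # (role, text) turns in order

    def add_turn(self, role: str, text: str) -> None:
        """Record one user or assistant turn."""
        self.history.append((role, text))

    def context_window(self, max_turns: int = 6) -> list:
        """Return the most recent turns to pass to the LLM as context."""
        return self.history[-max_turns:]

# Usage: follow-up questions see the prior exchange.
s = Session("user-42")
s.add_turn("user", "Who was Cleopatra?")
s.add_turn("assistant", "Cleopatra VII was the last ruler of Ptolemaic Egypt.")
s.add_turn("user", "What happened after her death?")
print(len(s.context_window()))  # 3
```

Capping the window (here at six turns) keeps prompts within the LLM's context limit while preserving the most recent exchange.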
## MCP Architecture Modules
This project follows the design principles of the Model Context Protocol (MCP) and implements the following key modules and specifications:
| Module | Description | Implementation Status |
| ------------------- | ----------------------------------------------------- | --------------------- |
| `MCP Server` | Acts as the core intermediary of the context protocol, handling multi-turn dialogue states, agent routing, context management, and task scheduling | ✅ Completed |
| `MCP Client (Web)` | Provides user interaction entry, supporting question input, context display, button interactions, etc. | ✅ Completed |
| `AI Agent` | Receives task delegation from the Server, performing multi-step reasoning and character-based content generation | ✅ Completed |
| `RAG Retriever` | Retrieves external knowledge content (e.g., historical databases) and returns context for AI model use | ✅ Completed |
| `LLM Interface Module` | Supports API calls and response parsing for OpenAI or local LLMs (e.g., LM Studio) | ✅ Completed |
| `Function Menu Control` | Provides interactive buttons after the story ends, such as "Extend Story," "Change Style," etc., with corresponding services dispatched by the MCP Server | ✅ Completed |
| `Context Management Mechanism` | Each user has an independent session ID whose Q&A history is recorded to preserve dialogue context | ✅ Completed |
| `Decoupled Multi-module Architecture` | Each module (RAG, Agent, Server, Client) has clear responsibilities and independent maintenance boundaries | ✅ Completed |
| `Extended Task Chain Design` | Supports subsequent story branches and button-triggered tasks, such as "AI Storyteller," "Generate Summary," etc. | ✅ Completed |
| `MCP Specification Compatibility` | Adheres to MCP's Request/Response and task-layering logic, extensible to more Agents or Tools | ✅ Partially Completed |
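The Server, Agent, RAG Retriever, and task-routing rows above can be sketched as a single request/response dispatch loop. Everything here is a stand-in, assuming hypothetical names (`handle_request`, `AGENT_TASKS`, `retrieve`, `generate`) and a stubbed knowledge base in place of the real historical database and LLM API:

```python
def retrieve(query: str) -> list:
    """RAG step: look up passages in a (stubbed) historical knowledge base."""
    knowledge_base = {
        "cleopatra": "Cleopatra VII ruled Ptolemaic Egypt until 30 BC.",
    }
    return [text for key, text in knowledge_base.items() if key in query.lower()]

def generate(prompt: str, context: list) -> str:
    """LLM step: stand-in for an OpenAI or LM Studio API call."""
    return f"Answer based on {len(context)} retrieved passage(s): {prompt}"

# Task routing: the Server maps each task name to an agent handler,
# mirroring the "Extend Story"-style button tasks described above.
AGENT_TASKS = {
    "ask": lambda q: generate(q, retrieve(q)),
    "extend_story": lambda q: generate(f"Continue the story: {q}", []),
}

def handle_request(request: dict) -> dict:
    """MCP-style Request/Response: route a task to its agent handler."""
    task = request.get("task")
    if task not in AGENT_TASKS:
        return {"status": "error", "detail": f"unknown task '{task}'"}
    result = AGENT_TASKS[task](request.get("query", ""))
    return {"status": "ok", "result": result}

resp = handle_request({"task": "ask", "query": "Tell me about Cleopatra"})
print(resp["status"])  # ok
```

Because handlers are registered in a plain mapping, adding a new button task (e.g. "Generate Summary") only means adding one entry, which is the extensibility the task-chain and decoupling rows describe.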