# TW MCP Local Server - Claude 4.6 with Hybrid Cloud Computing
A comprehensive Python MCP (Model Context Protocol) server implementation with Claude Sonnet 4.6 integration, featuring **hybrid cloud computing** that prioritizes local execution while seamlessly integrating with Azure cloud resources for resource-intensive tasks.
## 🌟 Key Features
### 🔥 Hybrid Cloud Architecture
- **Local-First Computing**: Intelligent resource management prioritizing local execution
- **Azure Cloud Integration**: Seamless fallback to Azure Functions for demanding tasks
- **Windows Desktop Optimization**: Tailored for high-performance Windows workstations
- **Resource-Aware Decisions**: Real-time monitoring and intelligent task placement
### 🧠 Empathetic AI Assistance
- **Vibe Coding**: Thoughtful, empathetic programming companion
- **Deep Reasoning**: Comprehensive explanations with clear "why" behind recommendations
- **Supportive Guidance**: Encouraging, patient assistance that builds confidence
- **Emotional Intelligence**: Recognizes and responds to user emotional states
### 🎯 Comprehensive MCP Modules
#### 🧠 Ideation & Thoughtcraft
- **Brainstorm**: Empathetic idea generation triggered by intent or mood
- **Mindmap**: Recursive concept branching with visual mapping and clarity
- **Perspective Shift**: Reframes questions and challenges defaults with support
- **Creativity Surge**: Breaks creative gridlock through divergent thinking
#### 🎨 Visual & Image Generation
- **Image Seed**: Thematic visual output generation
- **Palette Gen**: Emotion and brand-driven color schemes
- **Render Style**: Diverse rendering techniques (photo, sketch, surreal)
- **Sketch Flow**: Draft-level visual sequences from minimal input
#### 🎬 Animation & Motion Design
- **Motion Branding**: Logo and tagline animation synchronized with brand energy
- **Vibe Fade**: Emotion-driven transitions for ambient visual experiences
- **Loop Craft**: Seamless looping animation generation
- **Tempo Sync**: Animation synchronized with ambient inputs (music, voice)
#### 🧪 Model Interaction & Testing
- **LLM Dictation**: Voice-to-model transcription and dictation workflows
- **API Testbed**: Local sandbox for testing and validating API behavior
- **Query Refine**: Prompt language tuning for optimal clarity and results
- **Agent Weave**: Multi-agent workflow orchestration with coordinated logic
#### ✍️ Writing & Composition
- **Writing Muse**: Storylines, brand copy, and essays from seed concepts
- **Composition Sculpt**: Form, tone, and flow control for writing tasks
- **Edit Pass**: Text rewriting and polishing with stylistic presets
- **Persona Writer**: Character-based writing voice emulation
#### 🎧 Music & Audio Development
- **Tone Builder**: Melodic ideas tied to brand emotion and scene context
- **Beat Vibe**: Loop creation synchronized with animations and triggers
- **Soundscape**: Multi-layered ambient audio design generation
- **Voiceflow**: Vocal input/output processing and styling
#### 🗣️ Voice Recognition & Interaction
- **Voice Capture**: Contextual voice capture with intelligent labeling
- **Intent Echo**: Tone and emotion analysis embedded in speech
- **Speech Craft**: Natural spoken response generation
- **Command Stream**: Voice-activated MCP task execution
## 🏗️ Architecture
### Hybrid Computing Decision Engine
```
┌────────────────────────────────────────────────────────────┐
│                    FastAPI Application                     │
├────────────────────────────────────────────────────────────┤
│                   Hybrid Compute Manager                   │
│  ┌─────────────────────┐   ┌─────────────────────────────┐ │
│  │ Local Execution     │   │ Azure Cloud Functions       │ │
│  │ - CPU Monitoring    │   │ - Flex Consumption Plan     │ │
│  │ - Memory Tracking   │   │ - Serverless Compute        │ │
│  │ - GPU Utilization   │   │ - Auto-scaling              │ │
│  └─────────────────────┘   └─────────────────────────────┘ │
├────────────────────────────────────────────────────────────┤
│                      MCP Plugin System                     │
│ ┌──────────────┐ ┌──────────────┐ ┌──────────────────────┐ │
│ │   Ideation   │ │    Visual    │ │    Voice & Audio     │ │
│ │   Modules    │ │  Generation  │ │       Modules        │ │
│ └──────────────┘ └──────────────┘ └──────────────────────┘ │
├────────────────────────────────────────────────────────────┤
│                        Claude Client                       │
├────────────────────────────────────────────────────────────┤
│                    Configuration Layer                     │
├────────────────────────────────────────────────────────────┤
│            Azure Integration & Local Processing            │
└────────────────────────────────────────────────────────────┘
```
### Resource Management Strategy
- **Local Execution Criteria**: CPU < 80%, Memory < 85%, Task duration < 5 min
- **Azure Fallback Triggers**: Resource constraints, long-running tasks, specialized AI services
- **Cost Optimization**: Minimize cloud costs while maintaining performance guarantees
- **Windows-Specific**: GPU prioritization, service integration, desktop optimization
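The local-versus-cloud decision described above can be sketched as a small routing function. This is a hypothetical illustration only; names such as `decide_placement` and `ResourceSnapshot` are not part of the actual codebase, and the thresholds simply mirror the criteria listed above (CPU < 80%, memory < 85%, duration < 5 minutes):

```python
# Hypothetical sketch of the local-first placement decision.
from dataclasses import dataclass

@dataclass
class ResourceSnapshot:
    cpu_percent: float      # current CPU utilization, 0-100
    memory_percent: float   # current memory utilization, 0-100

def decide_placement(snapshot: ResourceSnapshot,
                     expected_minutes: float,
                     azure_enabled: bool = True) -> str:
    """Return 'local' or 'azure' for a pending task."""
    fits_locally = (
        snapshot.cpu_percent < 80
        and snapshot.memory_percent < 85
        and expected_minutes < 5
    )
    if fits_locally or not azure_enabled:
        return "local"   # local-first: cloud only when constrained
    return "azure"

# Example: a busy machine with a long-running task falls back to Azure
print(decide_placement(ResourceSnapshot(92.0, 60.0), 12.0))  # azure
```

Note that with `AZURE_ENABLED=false` the function always answers "local", matching the optional nature of the cloud integration.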
## 🚀 Quick Start
### System Requirements
#### Minimum System Requirements
- **CPU**: Modern x64 processor (Intel Core i5-8400 or AMD Ryzen 5 2600 equivalent)
- **RAM**: 16GB DDR4
- **Storage**: 256GB SSD (100GB free space minimum)
- **GPU**: Integrated graphics (dedicated GPU recommended)
- **Network**: Broadband internet connection
- **OS**: Windows 10 Pro (version 1903 or later) or Linux (Ubuntu 20.04 LTS+, CentOS 8+, or equivalent)
#### Recommended Windows Desktop Configuration
- **CPU**: AMD Ryzen 7 5800X+, Intel Core i7-12700K+, or Snapdragon X Elite (ARM64)
- **RAM**: 64GB DDR4/DDR5 (32GB minimum)
- **Storage**: 1TB+ NVMe SSD (multiple drives recommended for optimal performance)
- **GPU**: NVIDIA RTX 3070/4060+ or AMD Radeon RX 6700 XT+ (dedicated GPU recommended for AI workloads)
- **Network**: Gigabit Ethernet and/or Wi-Fi 6/6E (stable high-speed connection)
- **OS**: Windows 11 Pro or Pro for Workstations
#### Lightweight Linux Configuration
- **CPU**: Modern x64 processor (Intel Core i3-10100 or AMD Ryzen 3 3100 equivalent)
- **RAM**: 8GB DDR4 (16GB recommended)
- **Storage**: 128GB SSD (50GB free space minimum)
- **GPU**: Integrated graphics (NVIDIA GTX 1650+ for GPU acceleration)
- **Network**: Broadband internet connection
- **OS**: Ubuntu 22.04 LTS, Debian 11+, or Alpine Linux 3.15+
#### Azure Cloud Resources (Optional)
- Azure Functions (Linux Flex Consumption)
- Azure Storage (Blob, Table, Queue)
- Azure AI Services
- Azure Orchestration (Durable Functions)
### 🔑 Getting Your Claude Sonnet 4.6 API Key
Before installation, you'll need an Anthropic API key to access Claude Sonnet 4.6:
1. **Sign up for Anthropic Console**:
   - Visit [console.anthropic.com](https://console.anthropic.com/)
   - Create an account or sign in with your existing account
2. **Verify your account**:
   - Complete email verification if required
   - You may need to provide a phone number for verification
3. **Add billing information**:
   - Navigate to "Billing" in the console
   - Add a payment method (credit card required)
   - Claude Sonnet 4.6 uses pay-per-use pricing
4. **Generate your API key**:
   - Go to "API Keys" in the left sidebar
   - Click "Create Key"
   - Give your key a descriptive name (e.g., "MCP Local Server")
   - Copy the generated key immediately (you won't be able to see it again)
5. **Important notes**:
   - **Keep your API key secure**: Never commit it to version control
   - **Monitor usage**: Check your usage in the Anthropic Console regularly
   - **Rate limits**: Be aware of API rate limits for your account tier
   - **Pricing**: Claude Sonnet 4.6 pricing is based on input/output tokens
6. **Pricing reference** (as of July 2025):
   - Input tokens: ~$3.00 per million tokens
   - Output tokens: ~$15.00 per million tokens
   - Prices may vary; check [Anthropic's pricing page](https://www.anthropic.com/pricing) for current rates
### Installation
Choose your preferred deployment method:
#### 🖥️ Local Windows Installation
1. **Prerequisites**:
   - Python 3.9 or higher
   - Git for Windows
   - PowerShell 5.1 or higher
   - Visual Studio Build Tools (for some Python packages)
2. **Clone and setup**:
```powershell
git clone https://github.com/AplUSAndmINUS/tw-mcp-local-server-claude.git
cd tw-mcp-local-server-claude
```
3. **Create virtual environment**:
```powershell
python -m venv venv
.\venv\Scripts\Activate.ps1
```
4. **Install dependencies**:
```powershell
pip install -e .
pip install -r requirements.txt
```
5. **Configure environment**:
```powershell
copy .env.example .env
# Edit .env with your settings using notepad or your preferred editor
notepad .env
```
6. **Essential Windows Configuration**:
```env
# Claude API
ANTHROPIC_API_KEY=your-api-key-here
# Hybrid Computing
HYBRID_COMPUTING_ENABLED=true
PREFER_LOCAL_EXECUTION=true
# Windows Optimizations
WINDOWS_OPTIMIZATIONS=true
WINDOWS_GPU_PRIORITY=true
# Azure Integration (Optional)
AZURE_ENABLED=false
```
7. **Start the server**:
```powershell
mcp-server run
```
8. **Install as Windows Service** (Optional):
```powershell
python scripts/windows_service.py install
python scripts/windows_service.py start
```
#### 🐧 Local Linux Installation
1. **Prerequisites**:
```bash
# Ubuntu/Debian
sudo apt update
sudo apt install python3 python3-pip python3-venv git build-essential
# CentOS/RHEL
sudo yum install python3 python3-pip git gcc gcc-c++ make
# Alpine Linux
sudo apk add python3 python3-dev py3-pip git build-base
```
2. **Clone and setup**:
```bash
git clone https://github.com/AplUSAndmINUS/tw-mcp-local-server-claude.git
cd tw-mcp-local-server-claude
```
3. **Create virtual environment**:
```bash
python3 -m venv venv
source venv/bin/activate
```
4. **Install dependencies**:
```bash
pip install -e .
pip install -r requirements.txt
```
5. **Configure environment**:
```bash
cp .env.example .env
# Edit .env with your settings
nano .env # or vim, gedit, etc.
```
6. **Essential Linux Configuration**:
```env
# Claude API
ANTHROPIC_API_KEY=your-api-key-here
# Hybrid Computing
HYBRID_COMPUTING_ENABLED=true
PREFER_LOCAL_EXECUTION=true
# Linux Optimizations
WINDOWS_OPTIMIZATIONS=false
WINDOWS_GPU_PRIORITY=false
# Azure Integration (Optional)
AZURE_ENABLED=false
```
7. **Start the server**:
```bash
mcp-server run
```
8. **Install as systemd service** (Optional):
```bash
# Create service file
sudo cp scripts/mcp-server.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable mcp-server
sudo systemctl start mcp-server
```
#### ☁️ Azure Cloud Deployment
1. **Prerequisites**:
   - Azure CLI installed and configured
   - Azure subscription with appropriate permissions
   - Docker (for containerized deployment)
2. **Login to Azure**:
```bash
az login
az account set --subscription "your-subscription-id"
```
3. **Clone and prepare**:
```bash
git clone https://github.com/AplUSAndmINUS/tw-mcp-local-server-claude.git
cd tw-mcp-local-server-claude
```
4. **Option A: Azure Container Instances (Simplest)**:
```bash
# Create resource group
az group create --name mcp-server-rg --location eastus
# Create container instance
az container create \
  --resource-group mcp-server-rg \
  --name mcp-server \
  --image python:3.11-slim \
  --cpu 2 \
  --memory 4 \
  --ports 8000 \
  --environment-variables \
    ANTHROPIC_API_KEY="your-api-key-here" \
    HYBRID_COMPUTING_ENABLED=true \
    AZURE_ENABLED=true \
  --command-line "pip install -e . && mcp-server run --host 0.0.0.0"
```
5. **Option B: Azure App Service (Recommended)**:
```bash
# Create App Service plan
az appservice plan create \
  --name mcp-server-plan \
  --resource-group mcp-server-rg \
  --sku B1 \
  --is-linux

# Create web app
az webapp create \
  --resource-group mcp-server-rg \
  --plan mcp-server-plan \
  --name mcp-server-app \
  --runtime "PYTHON|3.11"

# Configure app settings
az webapp config appsettings set \
  --resource-group mcp-server-rg \
  --name mcp-server-app \
  --settings \
    ANTHROPIC_API_KEY="your-api-key-here" \
    HYBRID_COMPUTING_ENABLED=true \
    AZURE_ENABLED=true \
    WEBSITES_PORT=8000

# Deploy code
az webapp deployment source config \
  --resource-group mcp-server-rg \
  --name mcp-server-app \
  --repo-url https://github.com/AplUSAndmINUS/tw-mcp-local-server-claude \
  --branch master \
  --manual-integration
```
6. **Option C: Azure Functions (Serverless)**:
```bash
# Create Function App
az functionapp create \
  --resource-group mcp-server-rg \
  --consumption-plan-location eastus \
  --runtime python \
  --runtime-version 3.11 \
  --functions-version 4 \
  --name mcp-server-func \
  --storage-account mcpserverstorage

# Deploy using Azure Functions Core Tools
func azure functionapp publish mcp-server-func
```
7. **Configure Azure-specific settings**:
```env
# Azure optimizations
AZURE_ENABLED=true
AZURE_SUBSCRIPTION_ID=your-subscription-id
AZURE_RESOURCE_GROUP=mcp-server-rg
# Hybrid computing for cloud
HYBRID_COMPUTING_ENABLED=true
PREFER_LOCAL_EXECUTION=false
# Security
CORS_ORIGINS=["https://yourdomain.com"]
```
8. **Verify deployment**:
```bash
# Test the deployment
curl https://your-app-name.azurewebsites.net/health
```
#### 🔧 Post-Installation Verification
For all deployment methods, verify your installation:
1. **Check server health**:
```bash
curl http://localhost:8000/health # Local deployments
curl https://your-app.azurewebsites.net/health # Azure deployments
```
2. **Test basic functionality**:
```bash
# Test vibe coding
curl -X POST "http://localhost:8000/vibe-code" \
  -H "Content-Type: application/json" \
  -d '{"request": "Hello, can you help me with Python?", "context": {"mood": "curious"}}'
```
3. **Run integration tests**:
```bash
python tests/test_integration.py
```
4. **Monitor logs**:
```bash
# Local deployments
tail -f logs/mcp-server.log
# Azure deployments
az webapp log tail --resource-group mcp-server-rg --name mcp-server-app
```
## 🎯 Usage Examples
### Empathetic Brainstorming
```bash
# CLI brainstorming session
mcp-server vibe "I need creative ideas for improving team collaboration"
# API request
curl -X POST "http://localhost:8000/brainstorm/session" \
  -H "Content-Type: application/json" \
  -d '{
    "topic": "Sustainable urban transportation",
    "intent": "problem_solving",
    "mood": "focused",
    "duration_minutes": 15
  }'
```
### Mindmap Creation
```python
import httpx
import asyncio
async def create_mindmap():
    async with httpx.AsyncClient() as client:
        response = await client.post(
            "http://localhost:8000/mindmap/create",
            json={
                "central_concept": "Machine Learning Applications",
                "depth": 3,
                "breadth": 5,
                "thinking_style": "analytical"
            }
        )
    mindmap = response.json()
    print(f"Created mindmap with {len(mindmap['nodes'])} nodes")
    return mindmap

asyncio.run(create_mindmap())
```
### Perspective Shifting
```python
import httpx

async def shift_perspective():
    async with httpx.AsyncClient() as client:
        response = await client.post(
            "http://localhost:8000/perspective-shift/shift",
            json={
                "original_question": "How can we increase team productivity?",
                "shift_type": "stakeholder",
                "intensity": "moderate"
            }
        )
    return response.json()
```
### Creativity Surge
```bash
# Break creative gridlock
curl -X POST "http://localhost:8000/creativity-surge/surge" \
  -H "Content-Type: application/json" \
  -d '{
    "challenge": "Design a more engaging user onboarding",
    "technique": "random_stimulation",
    "intensity": "high",
    "preferred_style": "playful"
  }'
```
### System Monitoring
```bash
# Check hybrid computing status
curl http://localhost:8000/system/status
# Monitor resource usage
curl http://localhost:8000/system/resources
# Trigger system optimization
curl -X POST http://localhost:8000/system/optimize
```
## 🎯 Vibe Coding Philosophy
This server implements a unique "vibe coding" approach that prioritizes:
- **Empathy**: Understanding your needs, frustrations, and emotional context
- **Reassurance**: Providing confidence and encouragement, especially during challenges
- **Kindness**: Patient, supportive explanations that never condescend
- **Understanding**: Grasping broader context, goals, and long-term objectives
- **Appreciation**: Recognizing the complexity and creativity in programming
- **Deep-dive reasoning**: Thorough, well-reasoned solutions with clear explanations
- **Strong logical reasoning**: Clear explanations of the "why" behind recommendations
## 🔧 Hybrid Computing Features
### Intelligent Resource Management
- **Real-time Monitoring**: CPU, memory, disk, and GPU utilization tracking
- **Adaptive Thresholds**: Windows-optimized performance thresholds
- **Predictive Analytics**: Task duration and resource requirement estimation
- **Cost Optimization**: Minimize cloud costs while maintaining performance
### Local-First Execution
- **Prioritized Local Processing**: Maximum efficiency with your hardware
- **GPU Acceleration**: Leverage NVIDIA RTX capabilities for visual tasks
- **Windows Service Integration**: Seamless Windows desktop integration
- **Resource-Aware Scheduling**: Intelligent task queuing and prioritization
### Azure Cloud Integration
- **Serverless Functions**: Linux Flex Consumption for cost-effective scaling
- **Storage Integration**: Blob, table, and queue operations
- **AI Services**: Cognitive Services for specialized processing
- **Orchestration**: Durable Functions for complex workflows
### Empathetic MCP Modules
Each module is designed with empathy and support at its core:
#### 🧠 Ideation & Thoughtcraft
- **Brainstorm**: Mood-aware idea generation with encouraging feedback
- **Mindmap**: Visual concept mapping with supportive guidance
- **Perspective Shift**: Gentle reframing with empathetic reasoning
- **Creativity Surge**: Breakthrough techniques with motivational support
#### 🎨 Visual & Creative Modules
- **Image Seed**: Emotion-driven visual concepts
- **Palette Gen**: Brand-aligned color psychology
- **Motion Branding**: Dynamic brand expression
- **Vibe Fade**: Ambient emotional transitions
#### 🗣️ Voice & Communication
- **Voice Capture**: Contextual speech recognition
- **Intent Echo**: Emotional tone analysis
- **Speech Craft**: Natural response generation
- **Command Stream**: Voice-controlled workflows
## 🔧 Usage
### Command Line Interface
```bash
# Start the server
mcp-server run --host localhost --port 8000
# Test API connection
mcp-server test
# Interactive vibe coding session
mcp-server interactive
# Quick vibe coding
mcp-server vibe "Help me optimize this Python function"
# Analyze code files
mcp-server analyze mycode.py --language python --task review
# Check server status
mcp-server status
# List available plugins
mcp-server plugins
```
### API Endpoints
The server provides several REST API endpoints:
- `POST /complete` - Basic text completion
- `POST /vibe-code` - Vibe coding assistance
- `POST /chat` - Multi-turn conversations
- `POST /analyze-code` - Code analysis and improvement
- `GET /health` - Health check
- `GET /plugins` - List available plugins
### Example API Usage
```python
import httpx
import asyncio
async def vibe_code_example():
    async with httpx.AsyncClient() as client:
        response = await client.post(
            "http://localhost:8000/vibe-code",
            json={
                "request": "I'm struggling with async/await in Python. Can you help?",
                "context": {
                    "mood": "supportive",
                    "experience_level": "beginner"
                }
            }
        )
    print(response.json())

asyncio.run(vibe_code_example())
```
## 🔌 Plugin System
The server features a flexible plugin architecture. Create custom plugins by extending the `PluginInterface`:
```python
from mcp_server.plugins import PluginInterface, PluginMetadata
class MyPlugin(PluginInterface):
    def get_metadata(self) -> PluginMetadata:
        return PluginMetadata(
            name="my_plugin",
            version="1.0.0",
            description="My custom plugin",
            author="Your Name"
        )

    async def initialize(self) -> None:
        # Plugin initialization
        pass

    async def shutdown(self) -> None:
        # Plugin cleanup
        pass
```
### Built-in Plugins
- **Vibe Coder**: Empathetic programming assistance
- **Code Analyzer**: Code review and improvement suggestions
- **Documentation Generator**: Automatic documentation generation
## ⚙️ Configuration
Configure the server using environment variables or the `.env` file:
```env
# Server Configuration
HOST=localhost
PORT=8000
DEBUG=false
# Claude API Configuration
ANTHROPIC_API_KEY=your-api-key-here
CLAUDE_MODEL=claude-sonnet-4-6
MAX_TOKENS=4096
TEMPERATURE=0.7
# Plugin Configuration
ENABLED_PLUGINS=["vibe_coder"]
# Security
SECRET_KEY=your-secret-key
CORS_ORIGINS=["http://localhost:3000"]
# Rate Limiting
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_WINDOW=60
```
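If you want to inspect these settings programmatically, a minimal stdlib-only `.env` reader might look like the following. This is a sketch; `load_env` is a hypothetical helper, and the server itself may well use a settings library instead:

```python
# Minimal .env parser sketch (stdlib only, hypothetical helper).
# Skips blank lines and comments; does not handle quoting edge cases.
from pathlib import Path

def load_env(path: str) -> dict[str, str]:
    settings: dict[str, str] = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        settings[key.strip()] = value.strip()
    return settings

# Example:
# cfg = load_env(".env")
# print(cfg.get("PORT", "8000"))
```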
## 🖥️ Windows Service Installation
For Windows users, the server can be installed as a Windows service:
1. **Install NSSM** (Non-Sucking Service Manager):
Download from [nssm.cc](https://nssm.cc/download)
2. **Run the service installer**:
```bash
python scripts/windows_service.py
```
3. **Or use the batch file**:
```cmd
scripts\install_service.bat
```
4. **Manage the service**:
```powershell
.\scripts\manage_service.ps1 start
.\scripts\manage_service.ps1 stop
.\scripts\manage_service.ps1 status
```
## 🏗️ Deployment Options
### Local Windows Desktop
```bash
# Install as Windows service
python scripts/windows_service.py install
python scripts/windows_service.py start
# Or run directly
mcp-server run --host localhost --port 8000
```
### Hybrid Cloud Deployment
```bash
# Deploy Azure Functions
./scripts/deploy_azure_functions.sh
# Configure hybrid settings
export AZURE_ENABLED=true
export AZURE_SUBSCRIPTION_ID=your-subscription-id
# Start with hybrid computing
mcp-server run --hybrid-enabled
```
### Docker Deployment
```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY . .
RUN pip install -e .
EXPOSE 8000
CMD ["mcp-server", "run", "--host", "0.0.0.0"]
```
## 📊 Performance & Monitoring
### Resource Monitoring
- **Real-time Metrics**: CPU, memory, disk, GPU utilization
- **Historical Analysis**: Performance trends and patterns
- **Threshold Alerts**: Proactive resource management
- **Cost Tracking**: Azure usage and cost optimization
### System Health
- **Health Endpoints**: `/health`, `/system/status`
- **Plugin Status**: Individual module health checks
- **Azure Integration**: Service availability and latency
- **Performance Metrics**: Response times and throughput
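A small client-side probe against the `/health` endpoint could look like this. Only the `/health` route comes from this document; `classify_health`, `probe`, and the accepted status strings are hypothetical illustrations:

```python
# Sketch of a health probe for the /health endpoint (hypothetical helper).
import json
import urllib.request

def classify_health(payload: dict) -> bool:
    """Treat the service as healthy if it reports an OK-ish status."""
    return str(payload.get("status", "")).lower() in {"ok", "healthy", "up"}

def probe(url: str = "http://localhost:8000/health",
          timeout: float = 5.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return classify_health(json.load(resp))
    except OSError:
        return False  # unreachable counts as unhealthy
```

A monitoring script could call `probe()` on a timer and alert when it returns `False`.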
## 🔐 Security & Privacy
### Local Security
- **Rate Limiting**: Configurable request throttling
- **CORS Protection**: Cross-origin request management
- **Input Validation**: Comprehensive request sanitization
- **Secure Defaults**: Production-ready security configuration
### Azure Security
- **OAuth2 Integration**: Secure cloud authentication
- **Managed Identity**: Passwordless Azure access
- **Network Security**: VPC and firewall configuration
- **Audit Logging**: Comprehensive security monitoring
## 📚 Documentation
- **[Hybrid Configuration Guide](HYBRID_CONFIGURATION.md)**: Complete setup and configuration
- **[API Documentation](docs/api.md)**: Comprehensive API reference
- **[Plugin Development](docs/plugins.md)**: Creating custom MCP modules
- **[Azure Integration](docs/azure.md)**: Cloud deployment guide
- **[Windows Service](docs/windows.md)**: Desktop service setup
## 🧪 Testing & Validation
### Automated Testing
```bash
# Run all tests
pytest tests/
# Test hybrid computing
python tests/test_integration.py
# Test individual modules
pytest tests/test_brainstorm.py
pytest tests/test_mindmap.py
```
### Manual Testing
```bash
# Test brainstorming
mcp-server vibe "Help me brainstorm ideas for..."
# Test mindmapping
curl -X POST localhost:8000/mindmap/create -d '{"central_concept": "AI"}'
# Test system status
curl localhost:8000/system/status
```
## 🛡️ Security Features
- **Rate Limiting**: Configurable request rate limiting
- **CORS Protection**: Configurable CORS origins
- **Input Validation**: Pydantic-based request validation
- **Error Handling**: Comprehensive error handling and logging
- **API Key Security**: Secure API key management
- **Azure Security**: OAuth2 and managed identity integration
## 📝 Logging & Monitoring
The server uses structured logging with empathetic context:
```python
import structlog
logger = structlog.get_logger()
# Empathetic logging with user context
logger.info("Supporting user through creative challenge",
            user_mood="frustrated",
            assistance_type="brainstorming")
```
## 🌐 Cloud Integration
### Azure Functions
- **Ideation Functions**: Brainstorming and mindmapping
- **Visual Functions**: Image and animation generation
- **Audio Functions**: Music and voice processing
- **Orchestration**: Complex workflow management
### Cost Optimization
- **Local-First**: Zero cloud costs for local execution
- **Intelligent Routing**: Cost-aware task placement
- **Usage Monitoring**: Real-time cost tracking
- **Budget Controls**: Configurable spending limits
## 📚 Examples
The `examples/` directory contains:
- `vibe_coding_example.py`: Complete vibe coding demonstration
- `sample_mcp_config.py`: Plugin configuration examples
- `usage_examples.json`: API usage examples
- `custom_plugin_template.py`: Template for creating custom plugins
Run the vibe coding example:
```bash
cd examples
python vibe_coding_example.py
```
## 🧪 Development
### Setup Development Environment
```bash
# Install development dependencies
pip install -e ".[dev]"
# Install pre-commit hooks
pre-commit install
# Run tests
pytest
# Format code
black src/
isort src/
# Type checking
mypy src/
```
### Creating Plugins
1. Create a new Python file in `src/mcp_server/plugins/`
2. Extend `PluginInterface`
3. Implement required methods
4. Add to `ENABLED_PLUGINS` configuration
5. Restart the server
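The five steps above, sketched end to end. In the repository, `PluginInterface` and `PluginMetadata` come from `mcp_server.plugins`; the stand-in definitions below exist only so this sketch runs standalone, and `EchoPlugin` is a made-up example:

```python
# Standalone sketch of steps 1-3: a plugin file that would live in
# src/mcp_server/plugins/echo_plugin.py. The base classes below are
# stand-ins approximating the repo's PluginInterface/PluginMetadata.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class PluginMetadata:        # stand-in for mcp_server.plugins.PluginMetadata
    name: str
    version: str
    description: str
    author: str

class PluginInterface(ABC):  # stand-in for mcp_server.plugins.PluginInterface
    @abstractmethod
    def get_metadata(self) -> PluginMetadata: ...

class EchoPlugin(PluginInterface):
    def get_metadata(self) -> PluginMetadata:
        return PluginMetadata(
            name="echo_plugin",
            version="0.1.0",
            description="Echoes requests back (illustration only)",
            author="You",
        )

# Step 4 would then add the plugin name to ENABLED_PLUGINS in .env:
# ENABLED_PLUGINS=["vibe_coder", "echo_plugin"]
```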
## 🌐 Cloud Deployment
### Environment Variables for Cloud
```env
HOST=0.0.0.0
PORT=8000
ANTHROPIC_API_KEY=your-api-key
CORS_ORIGINS=["https://yourdomain.com"]
```
## 🤝 Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests
5. Submit a pull request
## 📄 License
This project is licensed under the GNU General Public License v3.0 - see the [LICENSE](LICENSE) file for details.
## 🙏 Acknowledgments
- **Anthropic** for the Claude API
- **FastAPI** for the excellent web framework
- **The Python Community** for amazing libraries and tools
## 📞 Support
For support, please:
1. Check the [documentation](docs/)
2. Review [examples](examples/)
3. Open an issue on GitHub
4. Join our community discussions
---
**Happy Vibe Coding! 🚀**