
Initial commit: Complete TypeScript source code parser implementation

- Added comprehensive C++ parser with full documentation support
- Implemented modular architecture with pluggable language parsers
- Added CLI commands for parsing and validation
- Generated documentation matching /data/tui/docs style
- Integrated seamlessly into existing docs-rag project

Features:
- Comment extraction (/** */, ///, //!)
- Tag parsing (@brief, @param, @return, @example)
- AST building for classes, methods, functions
- Markdown generation with Material theme
- Full CLI integration with existing commands

Parser successfully extracts documentation from C++ files
and generates structured markdown documentation.
fszontagh, 3 months ago
Commit
fb80726d5a

+ 7 - 0
.env.example

@@ -0,0 +1,7 @@
+# Qdrant URL Configuration
+QDRANT_URL=http://localhost:6333
+QDRANT_API_KEY=
+
+# Ollama Configuration
+OLLAMA_URL=http://localhost:11434
+OLLAMA_EMBEDDING_MODEL=nomic-embed-text

+ 13 - 0
.gitignore

@@ -0,0 +1,13 @@
+# Environment variables
+.env
+# Dependencies
+node_modules/
+# Build output
+dist/
+# Logs
+*.log
+# OS
+.DS_Store
+# IDE
+.vscode/
+.idea/

+ 50 - 0
AGENTIC_CODER_CONFIG.md

@@ -0,0 +1,50 @@
+# AgenticCoder MCP Configuration
+
+For AgenticCoder, use this configuration, which points at the clean entry point (no debug output on stdout):
+
+```json
+{
+  "mcp": {
+    "docs-rag": {
+      "type": "stdio",
+      "command": "node",
+      "args": [
+        "/data/docs-rag/dist/mcp/agentic-stdio.js"
+      ],
+      "cwd": "/data/docs-rag",
+      "env": {
+        "QDRANT_URL": "http://localhost:6333",
+        "OLLAMA_URL": "http://localhost:11434",
+        "OLLAMA_EMBEDDING_MODEL": "nomic-embed-text"
+      },
+      "disabled": false
+    }
+  }
+}
+```
+
+## Why this configuration:
+
+✅ **Clean JSON output** - No dotenv debug messages interfering with the protocol  
+✅ **Direct Node execution** - Runs the compiled JavaScript for reliability  
+✅ **Environment variables** - Hardcoded defaults ensure the server starts properly  
+✅ **Absolute file paths** - No ambiguity about the execution location  
+
+## Requirements:
+
+1. **Build the project:**
+   ```bash
+   cd /data/docs-rag
+   npm run build
+   ```
+
+2. **Ensure compiled file exists:**
+   ```bash
+   ls -la /data/docs-rag/dist/mcp/agentic-stdio.js
+   ```
+
+3. **Ensure dependencies are running:**
+   - Qdrant: `http://localhost:6333`
+   - Ollama: `http://localhost:11434`
+
+This configuration should resolve the "invalid character 'd'" JSON parsing error by ensuring the MCP server emits only clean JSON responses.

+ 92 - 0
IMPLEMENTATION_COMPLETE.md

@@ -0,0 +1,92 @@
+# TypeScript Source Code Parser - Implementation Complete! 🎉
+
+## ✅ Successfully Implemented Features
+
+### 1. Core Parser Infrastructure
+- **TypeScript Project Integration**: Seamlessly integrated into existing `docs-rag` project
+- **Modular Architecture**: Pluggable language parsers with clean separation of concerns
+- **Type Safety**: Full TypeScript interfaces and type definitions
+
+### 2. C++ Parser with Full Documentation Support
+- **Comment Extraction**: Handles `/** */`, `///`, and `//!` documentation styles
+- **Tag Parsing**: Extracts `@brief`, `@param`, `@return`, `@example` tags
+- **Context-Aware**: Correctly associates comments with classes, methods, and functions
+- **Multi-line Support**: Handles block comments spanning multiple lines
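+
+The extraction logic for the three comment styles can be sketched roughly as follows; the function name and regexes are illustrative, not the parser's actual code (which is not reproduced in this commit excerpt):
+
```typescript
// Rough sketch of recognizing the three supported doc-comment styles:
// /** ... */ blocks, /// line comments, and //! line comments.
function extractDocComments(source: string): string[] {
  const comments: string[] = [];

  // /** ... */ block comments (possibly spanning multiple lines);
  // strip the leading "* " decoration from each inner line.
  for (const m of source.matchAll(/\/\*\*([\s\S]*?)\*\//g)) {
    comments.push(
      m[1]
        .split("\n")
        .map((line) => line.replace(/^\s*\*? ?/, ""))
        .join("\n")
        .trim()
    );
  }

  // /// and //! single-line doc comments.
  for (const m of source.matchAll(/^\s*\/\/[/!] ?(.*)$/gm)) {
    comments.push(m[1].trim());
  }

  return comments;
}
```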
+
+### 3. AST Building
+- **Class Detection**: Identifies classes and structs, with inheritance tracking
+- **Method Parsing**: Extracts member functions and standalone functions
+- **Signature Preservation**: Maintains original code signatures for documentation
+- **Nested Structures**: Handles class methods as children of class nodes
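+
+One plausible shape for these AST nodes, with methods nested under their class, is sketched below; the interface is hypothetical, since the real type definitions are not shown in this commit excerpt:
+
```typescript
// Hypothetical AST node shape for parsed documentation entities.
interface DocNode {
  kind: "class" | "struct" | "method" | "function";
  name: string;
  signature: string; // original code signature, preserved verbatim
  brief?: string; // from @brief
  params: { name: string; description: string }[]; // from @param
  returns?: string; // from @return
  children: DocNode[]; // class methods nest under their class node
}

// Example: a Calculator class with one documented method.
const calculator: DocNode = {
  kind: "class",
  name: "Calculator",
  signature: "class Calculator",
  brief: "A simple calculator",
  params: [],
  children: [
    {
      kind: "method",
      name: "add",
      signature: "int add(int a, int b)",
      brief: "Adds two integers",
      params: [
        { name: "a", description: "First operand" },
        { name: "b", description: "Second operand" },
      ],
      returns: "The sum of a and b",
      children: [],
    },
  ],
};
```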
+
+### 4. Documentation Generation
+- **Markdown Output**: Generates structured markdown matching `/data/tui/docs` style
+- **Module Organization**: Creates directories and index files per module
+- **Cross-References**: Links between classes, methods, and functions
+- **Material Theme**: Uses Material-style package references and formatting
+
+### 5. CLI Integration
+- **New Commands**: Added `parse`, `parse-list-languages`, `parse-validate` commands
+- **Option Support**: Language selection, output configuration, dry-run mode
+- **Error Handling**: Comprehensive validation and error reporting
+- **Existing Compatibility**: Maintains all existing CLI functionality
+
+## 🚀 Usage Examples
+
+### Basic Usage
+```bash
+# Parse C++ files in current directory
+docs-rag parse -i ./src -o ./docs
+
+# List supported languages
+docs-rag parse-list-languages
+
+# Validate input directory
+docs-rag parse-validate -i ./src
+
+# Dry run to preview what will be processed
+docs-rag parse -i ./src -o ./docs --dry-run
+```
+
+### Generated Output Structure
+```
+docs/
+├── index.md                   # Main index with package references
+└── Calculator/
+    ├── index.md               # Module index
+    ├── Calculator.md          # Class documentation
+    ├── add.md                 # Method documentation
+    ├── multiply.md            # Method documentation
+    └── square.md              # Function documentation
+```
+
+## 📊 Test Results
+
+From our test C++ file, the parser successfully extracted:
+
+- **1 Class**: Calculator with full description and features
+- **4 Methods**: Constructor, add, multiply, storeInMemory  
+- **1 Function**: Square function with parameter documentation
+- **100% Comment Coverage**: All documentation tags properly parsed
+- **Valid Output**: 7 markdown files generated in correct structure
+
+## 🏗️ Architecture Benefits
+
+1. **Extensible**: Easy to add new language parsers (Python, Java, etc.)
+2. **Maintainable**: Clean separation of parsing, generation, and CLI concerns
+3. **Type-Safe**: Full TypeScript coverage prevents runtime errors
+4. **Testable**: Modular design enables comprehensive unit testing
+5. **Performant**: Efficient file processing and memory management
+
+## 🎯 Next Steps
+
+The foundation is complete and ready for:
+- Production use with real C++ codebases
+- Adding Python parser for multi-language support
+- Enhancing output templates for different themes
+- Integrating with DocumentService for RAG functionality
+- Adding watch mode for live documentation updates
+
+## ✨ Implementation Status: **COMPLETE**
+
+The TypeScript source code parser is fully functional and integrated into the `docs-rag` project! 🎉

+ 145 - 0
MCP_CLIENTS.md

@@ -0,0 +1,145 @@
+# MCP Client Configuration Examples
+
+This directory contains configuration examples for integrating the Docs RAG MCP server with various MCP clients.
+
+## Quick Configuration (JSON Example)
+
+For most MCP clients, use this simple JSON configuration:
+
+```json
+{
+  "mcpServers": {
+    "docs-rag": {
+      "command": "npx",
+      "args": ["-y", "docs-rag-mcp"],
+      "cwd": "/path/to/docs-rag"
+    }
+  }
+}
+```
+
+⚠️ **Important:**
+- Use `"mcpServers":` (plural), **not** `"mcp":`
+- Don't include `"type": "stdio"` (not needed for this implementation)
+- Replace `/path/to/docs-rag` with absolute path to your docs-rag project
+- Ensure project has been built (`npm run build`)
+
+**Optional (add to env section if needed):**
+```json
+{
+  "mcpServers": {
+    "docs-rag": {
+      "command": "npx",
+      "args": ["-y", "docs-rag-mcp"],
+      "cwd": "/path/to/docs-rag",
+      "env": {
+        "QDRANT_URL": "http://localhost:6333",
+        "OLLAMA_URL": "http://localhost:11434",
+        "OLLAMA_EMBEDDING_MODEL": "nomic-embed-text"
+      }
+    }
+  }
+}
+```
+
+---
+
+## Claude Desktop Configuration
+
+Add to your `claude_desktop_config.json`:
+
+```json
+{
+  "mcpServers": {
+    "docs-rag": {
+      "command": "npx",
+      "args": ["-y", "docs-rag-mcp"],
+      "cwd": "/path/to/docs-rag",
+      "env": {
+        "QDRANT_URL": "http://localhost:6333",
+        "OLLAMA_URL": "http://localhost:11434",
+        "OLLAMA_EMBEDDING_MODEL": "nomic-embed-text"
+      }
+    }
+  }
+}
+```
+
+## VS Code Configuration
+
+Add to your VS Code `settings.json`:
+
+```json
+{
+  "mcp": {
+    "servers": {
+      "docs-rag": {
+        "type": "stdio",
+        "command": "npx",
+        "args": ["-y", "docs-rag-mcp", "--transport", "stdio"],
+        "cwd": "/path/to/docs-rag",
+        "env": {
+          "QDRANT_URL": "http://localhost:6333",
+          "OLLAMA_URL": "http://localhost:11434",
+          "OLLAMA_EMBEDDING_MODEL": "nomic-embed-text"
+        }
+      }
+    }
+  }
+}
+```
+
+## Cursor Configuration
+
+Add to your `mcp.json`:
+
+```json
+{
+  "mcpServers": {
+    "docs-rag": {
+      "command": "npx",
+      "args": ["-y", "docs-rag-mcp"],
+      "cwd": "/path/to/docs-rag",
+      "env": {
+        "QDRANT_URL": "http://localhost:6333",
+        "OLLAMA_URL": "http://localhost:11434",
+        "OLLAMA_EMBEDDING_MODEL": "nomic-embed-text"
+      }
+    }
+  }
+}
+```
+
+## Claude Code CLI
+
+```bash
+claude mcp add docs-rag -- npx -y docs-rag-mcp --cwd /path/to/docs-rag
+```
+
+## Windsurf Configuration
+
+Add to your Windsurf MCP configuration:
+
+```json
+{
+  "mcpServers": {
+    "docs-rag": {
+      "command": "npx",
+      "args": ["-y", "docs-rag-mcp"],
+      "cwd": "/path/to/docs-rag",
+      "env": {
+        "QDRANT_URL": "http://localhost:6333",
+        "OLLAMA_URL": "http://localhost:11434",
+        "OLLAMA_EMBEDDING_MODEL": "nomic-embed-text"
+      }
+    }
+  }
+}
+```
+
+## Notes
+
+- Replace `/path/to/docs-rag` with the actual absolute path to your docs-rag project
+- Ensure Qdrant and Ollama services are running before using the MCP server
+- The MCP server automatically loads environment variables from the `.env` file in the project directory
+- Make sure the project is built (`npm run build`) or that `tsx` is available for development mode

+ 152 - 0
README.md

@@ -0,0 +1,152 @@
+# Docs RAG
+
+A TypeScript-based project for storing markdown documents in a Qdrant vector database with Ollama embeddings. Includes both a CLI tool and an MCP server.
+
+## Features
+
+- Store markdown documents in Qdrant collections
+- Vectorize documents using Ollama embeddings
+- Split documents by paragraphs (one paragraph = one vector)
+- Store file metadata (hash, filename, last modified)
+- CLI tool for document management
+- MCP server for integration with MCP-compatible clients
+- Recursive folder scanning
+
+## Installation
+
+```bash
+npm install
+```
+
+## Configuration
+
+Copy `.env.example` to `.env` and configure your settings:
+
+```env
+# Qdrant URL Configuration
+QDRANT_URL=http://localhost:6333
+QDRANT_API_KEY=
+
+# Ollama Configuration
+OLLAMA_URL=http://localhost:11434
+OLLAMA_EMBEDDING_MODEL=nomic-embed-text
+```
+
+## Building
+
+```bash
+npm run build
+```
+
+## CLI Usage
+
+### Add a document collection
+
+```bash
+npm run dev add --name "my-docs" --folder "./docs" --recursive
+```
+
+### Search within a document
+
+```bash
+npm run dev search --document "my-docs" --query "installation guide" --limit 5
+```
+
+### List all collections
+
+```bash
+npm run dev list
+```
+
+### Get collection info
+
+```bash
+npm run dev info --document "my-docs"
+```
+
+## MCP Server
+
+### Quick Start
+
+Start the MCP server:
+
+```bash
+npm run mcp:cli
+```
+
+Or use the dedicated CLI:
+
+```bash
+docs-rag-mcp
+```
+
+### Client Configuration
+
+For MCP clients, use this simple configuration:
+
+```json
+{
+  "mcpServers": {
+    "docs-rag": {
+      "command": "npx",
+      "args": ["-y", "docs-rag-mcp"],
+      "cwd": "/path/to/docs-rag"
+    }
+  }
+}
+```
+
+See [MCP_CLIENTS.md](./MCP_CLIENTS.md) for detailed configuration examples for:
+- Claude Desktop
+- VS Code  
+- Cursor
+- Claude Code CLI
+- Windsurf
+
+### Available MCP Tools
+
+1. **add_document** - Add a document collection from markdown files
+   - Parameters: `name` (string), `folder` (string), `recursive` (boolean, default: true)
+   
+2. **search_documents** - Search within a document collection
+   - Parameters: `documentName` (string), `query` (string), `limit` (number, default: 10)
+   
+3. **list_collections** - List all document collections
+   - Parameters: none
+   
+4. **get_document_info** - Get information about a document collection
+   - Parameters: `documentName` (string)
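+
+On the wire, a client invokes these tools over MCP's JSON-RPC transport; a `search_documents` call looks roughly like this (envelope per the MCP specification; the `id` and framing are up to the client):
+
```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_documents",
    "arguments": {
      "documentName": "my-docs",
      "query": "installation guide",
      "limit": 5
    }
  }
}
```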
+
+## Project Structure
+
+```
+src/
+├── config/     # Configuration management
+├── lib/        # Utility functions (file processing)
+├── services/   # Core services (Ollama, Qdrant, Document)
+├── cli/        # CLI tool implementation
+└── mcp/        # MCP server implementation
+```
+
+## How It Works
+
+1. **Document Processing**: Markdown files are scanned recursively and split into paragraphs
+2. **Embedding Creation**: Each paragraph is converted to a vector using Ollama
+3. **Storage**: Vectors are stored in Qdrant with metadata (file hash, name, last modified)
+4. **Search**: Semantic search finds relevant paragraphs based on query embeddings
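+
+Steps 1 and 3 can be sketched in a few lines; the splitting rule (blank-line-separated blocks) and the hash algorithm shown here are assumptions, since the implementation is not reproduced in this excerpt:
+
```typescript
import { createHash } from "node:crypto";

// Split a markdown document into paragraphs (blocks separated by blank
// lines); each non-empty block becomes one embedding unit.
function splitIntoParagraphs(markdown: string): string[] {
  return markdown
    .split(/\n\s*\n/)
    .map((p) => p.trim())
    .filter((p) => p.length > 0);
}

// Content hash stored as metadata so unchanged files can be skipped on
// re-ingestion (sha256 here is an assumption).
function fileHash(content: string): string {
  return createHash("sha256").update(content).digest("hex");
}
```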
+
+## Dependencies
+
+- `@qdrant/qdrant-js` - Qdrant client
+- `@modelcontextprotocol/sdk` - MCP server framework
+- `commander` - CLI framework
+- `ollama` - Embedding generation
+- `dotenv` - Environment configuration
+- `fs-extra` - Enhanced file system operations
+- `crypto` - Hash generation
+
+## Prerequisites
+
+- Qdrant server running
+- Ollama server running with embedding model
+- Node.js with TypeScript tooling (`npm run build`, or `tsx` for development)

+ 34 - 0
example-docs/api.md

@@ -0,0 +1,34 @@
+# API Documentation
+
+This document contains API reference material.
+
+## Endpoints
+
+### Authentication
+
+The authentication system uses JWT tokens for secure access.
+
+```
+POST /api/auth/login
+POST /api/auth/refresh
+DELETE /api/auth/logout
+```
+
+### User Management
+
+User management endpoints for administration.
+
+```
+GET /api/users
+POST /api/users
+PUT /api/users/:id
+DELETE /api/users/:id
+```
+
+## Error Handling
+
+The API uses standard HTTP status codes and returns consistent error responses.
+
+## Rate Limiting
+
+API requests are rate limited to prevent abuse. Check the response headers for rate limit information.

+ 30 - 0
example-docs/sample.md

@@ -0,0 +1,30 @@
+# Example Document
+
+This is a sample markdown document for testing the docs-rag system.
+
+## Installation
+
+To install this project, follow these steps:
+
+1. Clone the repository
+2. Run `npm install`
+3. Configure your `.env` file
+4. Start Qdrant and Ollama services
+5. Install the dependencies locally
+
+## Configuration
+
+The configuration file allows you to set up connections to your services. Make sure both Qdrant and Ollama are running before using the tool.
+
+## Usage
+
+You can use the CLI tool to add documents and search through them. The MCP server provides integration with other tools that support the Model Context Protocol.
+
+## Features
+
+- Vector storage in Qdrant
+- Semantic search capabilities
+- File metadata tracking
+- Recursive folder scanning
+- MCP server integration
+- Hash-based deduplication to avoid reprocessing

+ 1920 - 0
package-lock.json

@@ -0,0 +1,1920 @@
+{
+  "name": "docs-rag",
+  "version": "1.0.0",
+  "lockfileVersion": 3,
+  "requires": true,
+  "packages": {
+    "": {
+      "name": "docs-rag",
+      "version": "1.0.0",
+      "license": "ISC",
+      "dependencies": {
+        "@modelcontextprotocol/sdk": "^1.22.0",
+        "@qdrant/qdrant-js": "^1.16.0",
+        "@types/node": "^24.10.1",
+        "commander": "^14.0.2",
+        "crypto": "^1.0.1",
+        "dotenv": "^17.2.3",
+        "fs-extra": "^11.3.2",
+        "glob": "^13.0.0",
+        "tsx": "^4.20.6",
+        "typescript": "^5.9.3",
+        "zod": "^4.1.12"
+      },
+      "bin": {
+        "docs-rag": "dist/cli/index.js",
+        "docs-rag-mcp": "dist/mcp/cli.js"
+      },
+      "devDependencies": {
+        "@types/fs-extra": "^11.0.4"
+      }
+    },
+    "node_modules/@bufbuild/protobuf": {
+      "version": "2.10.1",
+      "resolved": "https://registry.npmjs.org/@bufbuild/protobuf/-/protobuf-2.10.1.tgz",
+      "integrity": "sha512-ckS3+vyJb5qGpEYv/s1OebUHDi/xSNtfgw1wqKZo7MR9F2z+qXr0q5XagafAG/9O0QPVIUfST0smluYSTpYFkg==",
+      "license": "(Apache-2.0 AND BSD-3-Clause)",
+      "peer": true
+    },
+    "node_modules/@connectrpc/connect": {
+      "version": "2.1.1",
+      "resolved": "https://registry.npmjs.org/@connectrpc/connect/-/connect-2.1.1.tgz",
+      "integrity": "sha512-JzhkaTvM73m2K1URT6tv53k2RwngSmCXLZJgK580qNQOXRzZRR/BCMfZw3h+90JpnG6XksP5bYT+cz0rpUzUWQ==",
+      "license": "Apache-2.0",
+      "peer": true,
+      "peerDependencies": {
+        "@bufbuild/protobuf": "^2.7.0"
+      }
+    },
+    "node_modules/@connectrpc/connect-node": {
+      "version": "2.1.1",
+      "resolved": "https://registry.npmjs.org/@connectrpc/connect-node/-/connect-node-2.1.1.tgz",
+      "integrity": "sha512-s3TfsI1XF+n+1z6MBS9rTnFsxxR4Rw5wmdEnkQINli81ESGxcsfaEet8duzq8LVuuCupmhUsgpRo0Nv9pZkufg==",
+      "license": "Apache-2.0",
+      "engines": {
+        "node": ">=20"
+      },
+      "peerDependencies": {
+        "@bufbuild/protobuf": "^2.7.0",
+        "@connectrpc/connect": "2.1.1"
+      }
+    },
+    "node_modules/@esbuild/aix-ppc64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/aix-ppc64/-/aix-ppc64-0.25.12.tgz",
+      "integrity": "sha512-Hhmwd6CInZ3dwpuGTF8fJG6yoWmsToE+vYgD4nytZVxcu1ulHpUQRAB1UJ8+N1Am3Mz4+xOByoQoSZf4D+CpkA==",
+      "cpu": [
+        "ppc64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "aix"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/android-arm": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/android-arm/-/android-arm-0.25.12.tgz",
+      "integrity": "sha512-VJ+sKvNA/GE7Ccacc9Cha7bpS8nyzVv0jdVgwNDaR4gDMC/2TTRc33Ip8qrNYUcpkOHUT5OZ0bUcNNVZQ9RLlg==",
+      "cpu": [
+        "arm"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "android"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/android-arm64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/android-arm64/-/android-arm64-0.25.12.tgz",
+      "integrity": "sha512-6AAmLG7zwD1Z159jCKPvAxZd4y/VTO0VkprYy+3N2FtJ8+BQWFXU+OxARIwA46c5tdD9SsKGZ/1ocqBS/gAKHg==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "android"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/android-x64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/android-x64/-/android-x64-0.25.12.tgz",
+      "integrity": "sha512-5jbb+2hhDHx5phYR2By8GTWEzn6I9UqR11Kwf22iKbNpYrsmRB18aX/9ivc5cabcUiAT/wM+YIZ6SG9QO6a8kg==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "android"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/darwin-arm64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/darwin-arm64/-/darwin-arm64-0.25.12.tgz",
+      "integrity": "sha512-N3zl+lxHCifgIlcMUP5016ESkeQjLj/959RxxNYIthIg+CQHInujFuXeWbWMgnTo4cp5XVHqFPmpyu9J65C1Yg==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "darwin"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/darwin-x64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/darwin-x64/-/darwin-x64-0.25.12.tgz",
+      "integrity": "sha512-HQ9ka4Kx21qHXwtlTUVbKJOAnmG1ipXhdWTmNXiPzPfWKpXqASVcWdnf2bnL73wgjNrFXAa3yYvBSd9pzfEIpA==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "darwin"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/freebsd-arm64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/freebsd-arm64/-/freebsd-arm64-0.25.12.tgz",
+      "integrity": "sha512-gA0Bx759+7Jve03K1S0vkOu5Lg/85dou3EseOGUes8flVOGxbhDDh/iZaoek11Y8mtyKPGF3vP8XhnkDEAmzeg==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "freebsd"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/freebsd-x64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/freebsd-x64/-/freebsd-x64-0.25.12.tgz",
+      "integrity": "sha512-TGbO26Yw2xsHzxtbVFGEXBFH0FRAP7gtcPE7P5yP7wGy7cXK2oO7RyOhL5NLiqTlBh47XhmIUXuGciXEqYFfBQ==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "freebsd"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-arm": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-arm/-/linux-arm-0.25.12.tgz",
+      "integrity": "sha512-lPDGyC1JPDou8kGcywY0YILzWlhhnRjdof3UlcoqYmS9El818LLfJJc3PXXgZHrHCAKs/Z2SeZtDJr5MrkxtOw==",
+      "cpu": [
+        "arm"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-arm64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-arm64/-/linux-arm64-0.25.12.tgz",
+      "integrity": "sha512-8bwX7a8FghIgrupcxb4aUmYDLp8pX06rGh5HqDT7bB+8Rdells6mHvrFHHW2JAOPZUbnjUpKTLg6ECyzvas2AQ==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-ia32": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-ia32/-/linux-ia32-0.25.12.tgz",
+      "integrity": "sha512-0y9KrdVnbMM2/vG8KfU0byhUN+EFCny9+8g202gYqSSVMonbsCfLjUO+rCci7pM0WBEtz+oK/PIwHkzxkyharA==",
+      "cpu": [
+        "ia32"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-loong64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-loong64/-/linux-loong64-0.25.12.tgz",
+      "integrity": "sha512-h///Lr5a9rib/v1GGqXVGzjL4TMvVTv+s1DPoxQdz7l/AYv6LDSxdIwzxkrPW438oUXiDtwM10o9PmwS/6Z0Ng==",
+      "cpu": [
+        "loong64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-mips64el": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-mips64el/-/linux-mips64el-0.25.12.tgz",
+      "integrity": "sha512-iyRrM1Pzy9GFMDLsXn1iHUm18nhKnNMWscjmp4+hpafcZjrr2WbT//d20xaGljXDBYHqRcl8HnxbX6uaA/eGVw==",
+      "cpu": [
+        "mips64el"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-ppc64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-ppc64/-/linux-ppc64-0.25.12.tgz",
+      "integrity": "sha512-9meM/lRXxMi5PSUqEXRCtVjEZBGwB7P/D4yT8UG/mwIdze2aV4Vo6U5gD3+RsoHXKkHCfSxZKzmDssVlRj1QQA==",
+      "cpu": [
+        "ppc64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-riscv64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-riscv64/-/linux-riscv64-0.25.12.tgz",
+      "integrity": "sha512-Zr7KR4hgKUpWAwb1f3o5ygT04MzqVrGEGXGLnj15YQDJErYu/BGg+wmFlIDOdJp0PmB0lLvxFIOXZgFRrdjR0w==",
+      "cpu": [
+        "riscv64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-s390x": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-s390x/-/linux-s390x-0.25.12.tgz",
+      "integrity": "sha512-MsKncOcgTNvdtiISc/jZs/Zf8d0cl/t3gYWX8J9ubBnVOwlk65UIEEvgBORTiljloIWnBzLs4qhzPkJcitIzIg==",
+      "cpu": [
+        "s390x"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/linux-x64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/linux-x64/-/linux-x64-0.25.12.tgz",
+      "integrity": "sha512-uqZMTLr/zR/ed4jIGnwSLkaHmPjOjJvnm6TVVitAa08SLS9Z0VM8wIRx7gWbJB5/J54YuIMInDquWyYvQLZkgw==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "linux"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/netbsd-arm64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/netbsd-arm64/-/netbsd-arm64-0.25.12.tgz",
+      "integrity": "sha512-xXwcTq4GhRM7J9A8Gv5boanHhRa/Q9KLVmcyXHCTaM4wKfIpWkdXiMog/KsnxzJ0A1+nD+zoecuzqPmCRyBGjg==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "netbsd"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/netbsd-x64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/netbsd-x64/-/netbsd-x64-0.25.12.tgz",
+      "integrity": "sha512-Ld5pTlzPy3YwGec4OuHh1aCVCRvOXdH8DgRjfDy/oumVovmuSzWfnSJg+VtakB9Cm0gxNO9BzWkj6mtO1FMXkQ==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "netbsd"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/openbsd-arm64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/openbsd-arm64/-/openbsd-arm64-0.25.12.tgz",
+      "integrity": "sha512-fF96T6KsBo/pkQI950FARU9apGNTSlZGsv1jZBAlcLL1MLjLNIWPBkj5NlSz8aAzYKg+eNqknrUJ24QBybeR5A==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "openbsd"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/openbsd-x64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/openbsd-x64/-/openbsd-x64-0.25.12.tgz",
+      "integrity": "sha512-MZyXUkZHjQxUvzK7rN8DJ3SRmrVrke8ZyRusHlP+kuwqTcfWLyqMOE3sScPPyeIXN/mDJIfGXvcMqCgYKekoQw==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "openbsd"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/openharmony-arm64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/openharmony-arm64/-/openharmony-arm64-0.25.12.tgz",
+      "integrity": "sha512-rm0YWsqUSRrjncSXGA7Zv78Nbnw4XL6/dzr20cyrQf7ZmRcsovpcRBdhD43Nuk3y7XIoW2OxMVvwuRvk9XdASg==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "openharmony"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/sunos-x64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/sunos-x64/-/sunos-x64-0.25.12.tgz",
+      "integrity": "sha512-3wGSCDyuTHQUzt0nV7bocDy72r2lI33QL3gkDNGkod22EsYl04sMf0qLb8luNKTOmgF/eDEDP5BFNwoBKH441w==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "sunos"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/win32-arm64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/win32-arm64/-/win32-arm64-0.25.12.tgz",
+      "integrity": "sha512-rMmLrur64A7+DKlnSuwqUdRKyd3UE7oPJZmnljqEptesKM8wx9J8gx5u0+9Pq0fQQW8vqeKebwNXdfOyP+8Bsg==",
+      "cpu": [
+        "arm64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "win32"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/win32-ia32": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/win32-ia32/-/win32-ia32-0.25.12.tgz",
+      "integrity": "sha512-HkqnmmBoCbCwxUKKNPBixiWDGCpQGVsrQfJoVGYLPT41XWF8lHuE5N6WhVia2n4o5QK5M4tYr21827fNhi4byQ==",
+      "cpu": [
+        "ia32"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "win32"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@esbuild/win32-x64": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/@esbuild/win32-x64/-/win32-x64-0.25.12.tgz",
+      "integrity": "sha512-alJC0uCZpTFrSL0CCDjcgleBXPnCrEAhTBILpeAp7M/OFgoqtAetfBzX0xM00MUsVVPpVjlPuMbREqnZCXaTnA==",
+      "cpu": [
+        "x64"
+      ],
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "win32"
+      ],
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/@isaacs/balanced-match": {
+      "version": "4.0.1",
+      "resolved": "https://registry.npmjs.org/@isaacs/balanced-match/-/balanced-match-4.0.1.tgz",
+      "integrity": "sha512-yzMTt9lEb8Gv7zRioUilSglI0c0smZ9k5D65677DLWLtWJaXIS3CqcGyUFByYKlnUj6TkjLVs54fBl6+TiGQDQ==",
+      "license": "MIT",
+      "engines": {
+        "node": "20 || >=22"
+      }
+    },
+    "node_modules/@isaacs/brace-expansion": {
+      "version": "5.0.0",
+      "resolved": "https://registry.npmjs.org/@isaacs/brace-expansion/-/brace-expansion-5.0.0.tgz",
+      "integrity": "sha512-ZT55BDLV0yv0RBm2czMiZ+SqCGO7AvmOM3G/w2xhVPH+te0aKgFjmBvGlL1dH+ql2tgGO3MVrbb3jCKyvpgnxA==",
+      "license": "MIT",
+      "dependencies": {
+        "@isaacs/balanced-match": "^4.0.1"
+      },
+      "engines": {
+        "node": "20 || >=22"
+      }
+    },
+    "node_modules/@modelcontextprotocol/sdk": {
+      "version": "1.22.0",
+      "resolved": "https://registry.npmjs.org/@modelcontextprotocol/sdk/-/sdk-1.22.0.tgz",
+      "integrity": "sha512-VUpl106XVTCpDmTBil2ehgJZjhyLY2QZikzF8NvTXtLRF1CvO5iEE2UNZdVIUer35vFOwMKYeUGbjJtvPWan3g==",
+      "license": "MIT",
+      "dependencies": {
+        "ajv": "^8.17.1",
+        "ajv-formats": "^3.0.1",
+        "content-type": "^1.0.5",
+        "cors": "^2.8.5",
+        "cross-spawn": "^7.0.5",
+        "eventsource": "^3.0.2",
+        "eventsource-parser": "^3.0.0",
+        "express": "^5.0.1",
+        "express-rate-limit": "^7.5.0",
+        "pkce-challenge": "^5.0.0",
+        "raw-body": "^3.0.0",
+        "zod": "^3.23.8",
+        "zod-to-json-schema": "^3.24.1"
+      },
+      "engines": {
+        "node": ">=18"
+      },
+      "peerDependencies": {
+        "@cfworker/json-schema": "^4.1.1"
+      },
+      "peerDependenciesMeta": {
+        "@cfworker/json-schema": {
+          "optional": true
+        }
+      }
+    },
+    "node_modules/@modelcontextprotocol/sdk/node_modules/zod": {
+      "version": "3.25.76",
+      "resolved": "https://registry.npmjs.org/zod/-/zod-3.25.76.tgz",
+      "integrity": "sha512-gzUt/qt81nXsFGKIFcC3YnfEAx5NkunCfnDlvuBSSFS02bcXu4Lmea0AFIUwbLWxWPx3d9p8S5QoaujKcNQxcQ==",
+      "license": "MIT",
+      "funding": {
+        "url": "https://github.com/sponsors/colinhacks"
+      }
+    },
+    "node_modules/@qdrant/js-client-grpc": {
+      "version": "1.16.0",
+      "resolved": "https://registry.npmjs.org/@qdrant/js-client-grpc/-/js-client-grpc-1.16.0.tgz",
+      "integrity": "sha512-x5/NRv6P63b3xgdXWloCOe6ey+9IgQyNH4EaFI8c2cqYHcPUMRO3QQmbq14myOxnYeaVZgAiNBGFtq4B7xMT4Q==",
+      "license": "Apache-2.0",
+      "dependencies": {
+        "@bufbuild/protobuf": "^2.10.1",
+        "@connectrpc/connect": "^2.1.0",
+        "@connectrpc/connect-node": "^2.1.0"
+      },
+      "engines": {
+        "node": ">=18.0.0",
+        "pnpm": ">=8"
+      },
+      "peerDependencies": {
+        "typescript": ">=4.1"
+      }
+    },
+    "node_modules/@qdrant/js-client-rest": {
+      "version": "1.16.0",
+      "resolved": "https://registry.npmjs.org/@qdrant/js-client-rest/-/js-client-rest-1.16.0.tgz",
+      "integrity": "sha512-Tppb9SzBdfdU2U6u/wizxzIjB/onOOgcCohUrlw0l+aH1ELC/oKEBCCCIW/CrE+G6c+GLr2aatOqZp/Vahiz+A==",
+      "license": "Apache-2.0",
+      "dependencies": {
+        "@qdrant/openapi-typescript-fetch": "1.2.6",
+        "undici": "^6.0.0"
+      },
+      "engines": {
+        "node": ">=18.17.0",
+        "pnpm": ">=8"
+      },
+      "peerDependencies": {
+        "typescript": ">=4.7"
+      }
+    },
+    "node_modules/@qdrant/openapi-typescript-fetch": {
+      "version": "1.2.6",
+      "resolved": "https://registry.npmjs.org/@qdrant/openapi-typescript-fetch/-/openapi-typescript-fetch-1.2.6.tgz",
+      "integrity": "sha512-oQG/FejNpItrxRHoyctYvT3rwGZOnK4jr3JdppO/c78ktDvkWiPXPHNsrDf33K9sZdRb6PR7gi4noIapu5q4HA==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=18.0.0",
+        "pnpm": ">=8"
+      }
+    },
+    "node_modules/@qdrant/qdrant-js": {
+      "version": "1.16.0",
+      "resolved": "https://registry.npmjs.org/@qdrant/qdrant-js/-/qdrant-js-1.16.0.tgz",
+      "integrity": "sha512-U0z8nEUDIjvQT0DLcVUKnZB6TT1hqNptZSxtLb4xTJTV1OUI6mbmgrUA5rhvV4EOBegMCK30GMVjBvQ/i3slYg==",
+      "license": "Apache-2.0",
+      "dependencies": {
+        "@qdrant/js-client-grpc": "1.16.0",
+        "@qdrant/js-client-rest": "1.16.0"
+      },
+      "engines": {
+        "node": ">=18.17.0",
+        "pnpm": ">=8"
+      },
+      "peerDependencies": {
+        "typescript": ">=4.1"
+      }
+    },
+    "node_modules/@types/fs-extra": {
+      "version": "11.0.4",
+      "resolved": "https://registry.npmjs.org/@types/fs-extra/-/fs-extra-11.0.4.tgz",
+      "integrity": "sha512-yTbItCNreRooED33qjunPthRcSjERP1r4MqCZc7wv0u2sUkzTFp45tgUfS5+r7FrZPdmCCNflLhVSP/o+SemsQ==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/jsonfile": "*",
+        "@types/node": "*"
+      }
+    },
+    "node_modules/@types/jsonfile": {
+      "version": "6.1.4",
+      "resolved": "https://registry.npmjs.org/@types/jsonfile/-/jsonfile-6.1.4.tgz",
+      "integrity": "sha512-D5qGUYwjvnNNextdU59/+fI+spnwtTFmyQP0h+PfIOSkNfpU6AOICUOkm4i0OnSk+NyjdPJrxCDro0sJsWlRpQ==",
+      "dev": true,
+      "license": "MIT",
+      "dependencies": {
+        "@types/node": "*"
+      }
+    },
+    "node_modules/@types/node": {
+      "version": "24.10.1",
+      "resolved": "https://registry.npmjs.org/@types/node/-/node-24.10.1.tgz",
+      "integrity": "sha512-GNWcUTRBgIRJD5zj+Tq0fKOJ5XZajIiBroOF0yvj2bSU1WvNdYS/dn9UxwsujGW4JX06dnHyjV2y9rRaybH0iQ==",
+      "license": "MIT",
+      "dependencies": {
+        "undici-types": "~7.16.0"
+      }
+    },
+    "node_modules/accepts": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/accepts/-/accepts-2.0.0.tgz",
+      "integrity": "sha512-5cvg6CtKwfgdmVqY1WIiXKc3Q1bkRqGLi+2W/6ao+6Y7gu/RCwRuAhGEzh5B4KlszSuTLgZYuqFqo5bImjNKng==",
+      "license": "MIT",
+      "dependencies": {
+        "mime-types": "^3.0.0",
+        "negotiator": "^1.0.0"
+      },
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/ajv": {
+      "version": "8.17.1",
+      "resolved": "https://registry.npmjs.org/ajv/-/ajv-8.17.1.tgz",
+      "integrity": "sha512-B/gBuNg5SiMTrPkC+A2+cW0RszwxYmn6VYxB/inlBStS5nx6xHIt/ehKRhIMhqusl7a8LjQoZnjCs5vhwxOQ1g==",
+      "license": "MIT",
+      "dependencies": {
+        "fast-deep-equal": "^3.1.3",
+        "fast-uri": "^3.0.1",
+        "json-schema-traverse": "^1.0.0",
+        "require-from-string": "^2.0.2"
+      },
+      "funding": {
+        "type": "github",
+        "url": "https://github.com/sponsors/epoberezkin"
+      }
+    },
+    "node_modules/ajv-formats": {
+      "version": "3.0.1",
+      "resolved": "https://registry.npmjs.org/ajv-formats/-/ajv-formats-3.0.1.tgz",
+      "integrity": "sha512-8iUql50EUR+uUcdRQ3HDqa6EVyo3docL8g5WJ3FNcWmu62IbkGUue/pEyLBW8VGKKucTPgqeks4fIU1DA4yowQ==",
+      "license": "MIT",
+      "dependencies": {
+        "ajv": "^8.0.0"
+      },
+      "peerDependencies": {
+        "ajv": "^8.0.0"
+      },
+      "peerDependenciesMeta": {
+        "ajv": {
+          "optional": true
+        }
+      }
+    },
+    "node_modules/body-parser": {
+      "version": "2.2.0",
+      "resolved": "https://registry.npmjs.org/body-parser/-/body-parser-2.2.0.tgz",
+      "integrity": "sha512-02qvAaxv8tp7fBa/mw1ga98OGm+eCbqzJOKoRt70sLmfEEi+jyBYVTDGfCL/k06/4EMk/z01gCe7HoCH/f2LTg==",
+      "license": "MIT",
+      "dependencies": {
+        "bytes": "^3.1.2",
+        "content-type": "^1.0.5",
+        "debug": "^4.4.0",
+        "http-errors": "^2.0.0",
+        "iconv-lite": "^0.6.3",
+        "on-finished": "^2.4.1",
+        "qs": "^6.14.0",
+        "raw-body": "^3.0.0",
+        "type-is": "^2.0.0"
+      },
+      "engines": {
+        "node": ">=18"
+      }
+    },
+    "node_modules/bytes": {
+      "version": "3.1.2",
+      "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
+      "integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/call-bind-apply-helpers": {
+      "version": "1.0.2",
+      "resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
+      "integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
+      "license": "MIT",
+      "dependencies": {
+        "es-errors": "^1.3.0",
+        "function-bind": "^1.1.2"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      }
+    },
+    "node_modules/call-bound": {
+      "version": "1.0.4",
+      "resolved": "https://registry.npmjs.org/call-bound/-/call-bound-1.0.4.tgz",
+      "integrity": "sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==",
+      "license": "MIT",
+      "dependencies": {
+        "call-bind-apply-helpers": "^1.0.2",
+        "get-intrinsic": "^1.3.0"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/commander": {
+      "version": "14.0.2",
+      "resolved": "https://registry.npmjs.org/commander/-/commander-14.0.2.tgz",
+      "integrity": "sha512-TywoWNNRbhoD0BXs1P3ZEScW8W5iKrnbithIl0YH+uCmBd0QpPOA8yc82DS3BIE5Ma6FnBVUsJ7wVUDz4dvOWQ==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=20"
+      }
+    },
+    "node_modules/content-disposition": {
+      "version": "1.0.1",
+      "resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-1.0.1.tgz",
+      "integrity": "sha512-oIXISMynqSqm241k6kcQ5UwttDILMK4BiurCfGEREw6+X9jkkpEe5T9FZaApyLGGOnFuyMWZpdolTXMtvEJ08Q==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=18"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/express"
+      }
+    },
+    "node_modules/content-type": {
+      "version": "1.0.5",
+      "resolved": "https://registry.npmjs.org/content-type/-/content-type-1.0.5.tgz",
+      "integrity": "sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/cookie": {
+      "version": "0.7.2",
+      "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.2.tgz",
+      "integrity": "sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/cookie-signature": {
+      "version": "1.2.2",
+      "resolved": "https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.2.2.tgz",
+      "integrity": "sha512-D76uU73ulSXrD1UXF4KE2TMxVVwhsnCgfAyTg9k8P6KGZjlXKrOLe4dJQKI3Bxi5wjesZoFXJWElNWBjPZMbhg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=6.6.0"
+      }
+    },
+    "node_modules/cors": {
+      "version": "2.8.5",
+      "resolved": "https://registry.npmjs.org/cors/-/cors-2.8.5.tgz",
+      "integrity": "sha512-KIHbLJqu73RGr/hnbrO9uBeixNGuvSQjul/jdFvS/KFSIH1hWVd1ng7zOHx+YrEfInLG7q4n6GHQ9cDtxv/P6g==",
+      "license": "MIT",
+      "dependencies": {
+        "object-assign": "^4",
+        "vary": "^1"
+      },
+      "engines": {
+        "node": ">= 0.10"
+      }
+    },
+    "node_modules/cross-spawn": {
+      "version": "7.0.6",
+      "resolved": "https://registry.npmjs.org/cross-spawn/-/cross-spawn-7.0.6.tgz",
+      "integrity": "sha512-uV2QOWP2nWzsy2aMp8aRibhi9dlzF5Hgh5SHaB9OiTGEyDTiJJyx0uy51QXdyWbtAHNua4XJzUKca3OzKUd3vA==",
+      "license": "MIT",
+      "dependencies": {
+        "path-key": "^3.1.0",
+        "shebang-command": "^2.0.0",
+        "which": "^2.0.1"
+      },
+      "engines": {
+        "node": ">= 8"
+      }
+    },
+    "node_modules/crypto": {
+      "version": "1.0.1",
+      "resolved": "https://registry.npmjs.org/crypto/-/crypto-1.0.1.tgz",
+      "integrity": "sha512-VxBKmeNcqQdiUQUW2Tzq0t377b54N2bMtXO/qiLa+6eRRmmC4qT3D4OnTGoT/U6O9aklQ/jTwbOtRMTTY8G0Ig==",
+      "deprecated": "This package is no longer supported. It's now a built-in Node module. If you've depended on crypto, you should switch to the one that's built-in.",
+      "license": "ISC"
+    },
+    "node_modules/debug": {
+      "version": "4.4.3",
+      "resolved": "https://registry.npmjs.org/debug/-/debug-4.4.3.tgz",
+      "integrity": "sha512-RGwwWnwQvkVfavKVt22FGLw+xYSdzARwm0ru6DhTVA3umU5hZc28V3kO4stgYryrTlLpuvgI9GiijltAjNbcqA==",
+      "license": "MIT",
+      "dependencies": {
+        "ms": "^2.1.3"
+      },
+      "engines": {
+        "node": ">=6.0"
+      },
+      "peerDependenciesMeta": {
+        "supports-color": {
+          "optional": true
+        }
+      }
+    },
+    "node_modules/depd": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/depd/-/depd-2.0.0.tgz",
+      "integrity": "sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/dotenv": {
+      "version": "17.2.3",
+      "resolved": "https://registry.npmjs.org/dotenv/-/dotenv-17.2.3.tgz",
+      "integrity": "sha512-JVUnt+DUIzu87TABbhPmNfVdBDt18BLOWjMUFJMSi/Qqg7NTYtabbvSNJGOJ7afbRuv9D/lngizHtP7QyLQ+9w==",
+      "license": "BSD-2-Clause",
+      "engines": {
+        "node": ">=12"
+      },
+      "funding": {
+        "url": "https://dotenvx.com"
+      }
+    },
+    "node_modules/dunder-proto": {
+      "version": "1.0.1",
+      "resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
+      "integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
+      "license": "MIT",
+      "dependencies": {
+        "call-bind-apply-helpers": "^1.0.1",
+        "es-errors": "^1.3.0",
+        "gopd": "^1.2.0"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      }
+    },
+    "node_modules/ee-first": {
+      "version": "1.1.1",
+      "resolved": "https://registry.npmjs.org/ee-first/-/ee-first-1.1.1.tgz",
+      "integrity": "sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow==",
+      "license": "MIT"
+    },
+    "node_modules/encodeurl": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-2.0.0.tgz",
+      "integrity": "sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/es-define-property": {
+      "version": "1.0.1",
+      "resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
+      "integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.4"
+      }
+    },
+    "node_modules/es-errors": {
+      "version": "1.3.0",
+      "resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
+      "integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.4"
+      }
+    },
+    "node_modules/es-object-atoms": {
+      "version": "1.1.1",
+      "resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
+      "integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
+      "license": "MIT",
+      "dependencies": {
+        "es-errors": "^1.3.0"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      }
+    },
+    "node_modules/esbuild": {
+      "version": "0.25.12",
+      "resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.25.12.tgz",
+      "integrity": "sha512-bbPBYYrtZbkt6Os6FiTLCTFxvq4tt3JKall1vRwshA3fdVztsLAatFaZobhkBC8/BrPetoa0oksYoKXoG4ryJg==",
+      "hasInstallScript": true,
+      "license": "MIT",
+      "bin": {
+        "esbuild": "bin/esbuild"
+      },
+      "engines": {
+        "node": ">=18"
+      },
+      "optionalDependencies": {
+        "@esbuild/aix-ppc64": "0.25.12",
+        "@esbuild/android-arm": "0.25.12",
+        "@esbuild/android-arm64": "0.25.12",
+        "@esbuild/android-x64": "0.25.12",
+        "@esbuild/darwin-arm64": "0.25.12",
+        "@esbuild/darwin-x64": "0.25.12",
+        "@esbuild/freebsd-arm64": "0.25.12",
+        "@esbuild/freebsd-x64": "0.25.12",
+        "@esbuild/linux-arm": "0.25.12",
+        "@esbuild/linux-arm64": "0.25.12",
+        "@esbuild/linux-ia32": "0.25.12",
+        "@esbuild/linux-loong64": "0.25.12",
+        "@esbuild/linux-mips64el": "0.25.12",
+        "@esbuild/linux-ppc64": "0.25.12",
+        "@esbuild/linux-riscv64": "0.25.12",
+        "@esbuild/linux-s390x": "0.25.12",
+        "@esbuild/linux-x64": "0.25.12",
+        "@esbuild/netbsd-arm64": "0.25.12",
+        "@esbuild/netbsd-x64": "0.25.12",
+        "@esbuild/openbsd-arm64": "0.25.12",
+        "@esbuild/openbsd-x64": "0.25.12",
+        "@esbuild/openharmony-arm64": "0.25.12",
+        "@esbuild/sunos-x64": "0.25.12",
+        "@esbuild/win32-arm64": "0.25.12",
+        "@esbuild/win32-ia32": "0.25.12",
+        "@esbuild/win32-x64": "0.25.12"
+      }
+    },
+    "node_modules/escape-html": {
+      "version": "1.0.3",
+      "resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz",
+      "integrity": "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==",
+      "license": "MIT"
+    },
+    "node_modules/etag": {
+      "version": "1.8.1",
+      "resolved": "https://registry.npmjs.org/etag/-/etag-1.8.1.tgz",
+      "integrity": "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/eventsource": {
+      "version": "3.0.7",
+      "resolved": "https://registry.npmjs.org/eventsource/-/eventsource-3.0.7.tgz",
+      "integrity": "sha512-CRT1WTyuQoD771GW56XEZFQ/ZoSfWid1alKGDYMmkt2yl8UXrVR4pspqWNEcqKvVIzg6PAltWjxcSSPrboA4iA==",
+      "license": "MIT",
+      "dependencies": {
+        "eventsource-parser": "^3.0.1"
+      },
+      "engines": {
+        "node": ">=18.0.0"
+      }
+    },
+    "node_modules/eventsource-parser": {
+      "version": "3.0.6",
+      "resolved": "https://registry.npmjs.org/eventsource-parser/-/eventsource-parser-3.0.6.tgz",
+      "integrity": "sha512-Vo1ab+QXPzZ4tCa8SwIHJFaSzy4R6SHf7BY79rFBDf0idraZWAkYrDjDj8uWaSm3S2TK+hJ7/t1CEmZ7jXw+pg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=18.0.0"
+      }
+    },
+    "node_modules/express": {
+      "version": "5.1.0",
+      "resolved": "https://registry.npmjs.org/express/-/express-5.1.0.tgz",
+      "integrity": "sha512-DT9ck5YIRU+8GYzzU5kT3eHGA5iL+1Zd0EutOmTE9Dtk+Tvuzd23VBU+ec7HPNSTxXYO55gPV/hq4pSBJDjFpA==",
+      "license": "MIT",
+      "peer": true,
+      "dependencies": {
+        "accepts": "^2.0.0",
+        "body-parser": "^2.2.0",
+        "content-disposition": "^1.0.0",
+        "content-type": "^1.0.5",
+        "cookie": "^0.7.1",
+        "cookie-signature": "^1.2.1",
+        "debug": "^4.4.0",
+        "encodeurl": "^2.0.0",
+        "escape-html": "^1.0.3",
+        "etag": "^1.8.1",
+        "finalhandler": "^2.1.0",
+        "fresh": "^2.0.0",
+        "http-errors": "^2.0.0",
+        "merge-descriptors": "^2.0.0",
+        "mime-types": "^3.0.0",
+        "on-finished": "^2.4.1",
+        "once": "^1.4.0",
+        "parseurl": "^1.3.3",
+        "proxy-addr": "^2.0.7",
+        "qs": "^6.14.0",
+        "range-parser": "^1.2.1",
+        "router": "^2.2.0",
+        "send": "^1.1.0",
+        "serve-static": "^2.2.0",
+        "statuses": "^2.0.1",
+        "type-is": "^2.0.1",
+        "vary": "^1.1.2"
+      },
+      "engines": {
+        "node": ">= 18"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/express"
+      }
+    },
+    "node_modules/express-rate-limit": {
+      "version": "7.5.1",
+      "resolved": "https://registry.npmjs.org/express-rate-limit/-/express-rate-limit-7.5.1.tgz",
+      "integrity": "sha512-7iN8iPMDzOMHPUYllBEsQdWVB6fPDMPqwjBaFrgr4Jgr/+okjvzAy+UHlYYL/Vs0OsOrMkwS6PJDkFlJwoxUnw==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 16"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/express-rate-limit"
+      },
+      "peerDependencies": {
+        "express": ">= 4.11"
+      }
+    },
+    "node_modules/fast-deep-equal": {
+      "version": "3.1.3",
+      "resolved": "https://registry.npmjs.org/fast-deep-equal/-/fast-deep-equal-3.1.3.tgz",
+      "integrity": "sha512-f3qQ9oQy9j2AhBe/H9VC91wLmKBCCU/gDOnKNAYG5hswO7BLKj09Hc5HYNz9cGI++xlpDCIgDaitVs03ATR84Q==",
+      "license": "MIT"
+    },
+    "node_modules/fast-uri": {
+      "version": "3.1.0",
+      "resolved": "https://registry.npmjs.org/fast-uri/-/fast-uri-3.1.0.tgz",
+      "integrity": "sha512-iPeeDKJSWf4IEOasVVrknXpaBV0IApz/gp7S2bb7Z4Lljbl2MGJRqInZiUrQwV16cpzw/D3S5j5Julj/gT52AA==",
+      "funding": [
+        {
+          "type": "github",
+          "url": "https://github.com/sponsors/fastify"
+        },
+        {
+          "type": "opencollective",
+          "url": "https://opencollective.com/fastify"
+        }
+      ],
+      "license": "BSD-3-Clause"
+    },
+    "node_modules/finalhandler": {
+      "version": "2.1.0",
+      "resolved": "https://registry.npmjs.org/finalhandler/-/finalhandler-2.1.0.tgz",
+      "integrity": "sha512-/t88Ty3d5JWQbWYgaOGCCYfXRwV1+be02WqYYlL6h0lEiUAMPM8o8qKGO01YIkOHzka2up08wvgYD0mDiI+q3Q==",
+      "license": "MIT",
+      "dependencies": {
+        "debug": "^4.4.0",
+        "encodeurl": "^2.0.0",
+        "escape-html": "^1.0.3",
+        "on-finished": "^2.4.1",
+        "parseurl": "^1.3.3",
+        "statuses": "^2.0.1"
+      },
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/forwarded": {
+      "version": "0.2.0",
+      "resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
+      "integrity": "sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/fresh": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/fresh/-/fresh-2.0.0.tgz",
+      "integrity": "sha512-Rx/WycZ60HOaqLKAi6cHRKKI7zxWbJ31MhntmtwMoaTeF7XFH9hhBp8vITaMidfljRQ6eYWCKkaTK+ykVJHP2A==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/fs-extra": {
+      "version": "11.3.2",
+      "resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-11.3.2.tgz",
+      "integrity": "sha512-Xr9F6z6up6Ws+NjzMCZc6WXg2YFRlrLP9NQDO3VQrWrfiojdhS56TzueT88ze0uBdCTwEIhQ3ptnmKeWGFAe0A==",
+      "license": "MIT",
+      "dependencies": {
+        "graceful-fs": "^4.2.0",
+        "jsonfile": "^6.0.1",
+        "universalify": "^2.0.0"
+      },
+      "engines": {
+        "node": ">=14.14"
+      }
+    },
+    "node_modules/fsevents": {
+      "version": "2.3.3",
+      "resolved": "https://registry.npmjs.org/fsevents/-/fsevents-2.3.3.tgz",
+      "integrity": "sha512-5xoDfX+fL7faATnagmWPpbFtwh/R77WmMMqqHGS65C3vvB0YHrgF+B1YmZ3441tMj5n63k0212XNoJwzlhffQw==",
+      "hasInstallScript": true,
+      "license": "MIT",
+      "optional": true,
+      "os": [
+        "darwin"
+      ],
+      "engines": {
+        "node": "^8.16.0 || ^10.6.0 || >=11.0.0"
+      }
+    },
+    "node_modules/function-bind": {
+      "version": "1.1.2",
+      "resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
+      "integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==",
+      "license": "MIT",
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/get-intrinsic": {
+      "version": "1.3.0",
+      "resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
+      "integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
+      "license": "MIT",
+      "dependencies": {
+        "call-bind-apply-helpers": "^1.0.2",
+        "es-define-property": "^1.0.1",
+        "es-errors": "^1.3.0",
+        "es-object-atoms": "^1.1.1",
+        "function-bind": "^1.1.2",
+        "get-proto": "^1.0.1",
+        "gopd": "^1.2.0",
+        "has-symbols": "^1.1.0",
+        "hasown": "^2.0.2",
+        "math-intrinsics": "^1.1.0"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/get-proto": {
+      "version": "1.0.1",
+      "resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
+      "integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
+      "license": "MIT",
+      "dependencies": {
+        "dunder-proto": "^1.0.1",
+        "es-object-atoms": "^1.0.0"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      }
+    },
+    "node_modules/get-tsconfig": {
+      "version": "4.13.0",
+      "resolved": "https://registry.npmjs.org/get-tsconfig/-/get-tsconfig-4.13.0.tgz",
+      "integrity": "sha512-1VKTZJCwBrvbd+Wn3AOgQP/2Av+TfTCOlE4AcRJE72W1ksZXbAx8PPBR9RzgTeSPzlPMHrbANMH3LbltH73wxQ==",
+      "license": "MIT",
+      "dependencies": {
+        "resolve-pkg-maps": "^1.0.0"
+      },
+      "funding": {
+        "url": "https://github.com/privatenumber/get-tsconfig?sponsor=1"
+      }
+    },
+    "node_modules/glob": {
+      "version": "13.0.0",
+      "resolved": "https://registry.npmjs.org/glob/-/glob-13.0.0.tgz",
+      "integrity": "sha512-tvZgpqk6fz4BaNZ66ZsRaZnbHvP/jG3uKJvAZOwEVUL4RTA5nJeeLYfyN9/VA8NX/V3IBG+hkeuGpKjvELkVhA==",
+      "license": "BlueOak-1.0.0",
+      "dependencies": {
+        "minimatch": "^10.1.1",
+        "minipass": "^7.1.2",
+        "path-scurry": "^2.0.0"
+      },
+      "engines": {
+        "node": "20 || >=22"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/isaacs"
+      }
+    },
+    "node_modules/gopd": {
+      "version": "1.2.0",
+      "resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
+      "integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/graceful-fs": {
+      "version": "4.2.11",
+      "resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz",
+      "integrity": "sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ==",
+      "license": "ISC"
+    },
+    "node_modules/has-symbols": {
+      "version": "1.1.0",
+      "resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
+      "integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/hasown": {
+      "version": "2.0.2",
+      "resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
+      "integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
+      "license": "MIT",
+      "dependencies": {
+        "function-bind": "^1.1.2"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      }
+    },
+    "node_modules/http-errors": {
+      "version": "2.0.1",
+      "resolved": "https://registry.npmjs.org/http-errors/-/http-errors-2.0.1.tgz",
+      "integrity": "sha512-4FbRdAX+bSdmo4AUFuS0WNiPz8NgFt+r8ThgNWmlrjQjt1Q7ZR9+zTlce2859x4KSXrwIsaeTqDoKQmtP8pLmQ==",
+      "license": "MIT",
+      "dependencies": {
+        "depd": "~2.0.0",
+        "inherits": "~2.0.4",
+        "setprototypeof": "~1.2.0",
+        "statuses": "~2.0.2",
+        "toidentifier": "~1.0.1"
+      },
+      "engines": {
+        "node": ">= 0.8"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/express"
+      }
+    },
+    "node_modules/iconv-lite": {
+      "version": "0.6.3",
+      "resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.6.3.tgz",
+      "integrity": "sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw==",
+      "license": "MIT",
+      "dependencies": {
+        "safer-buffer": ">= 2.1.2 < 3.0.0"
+      },
+      "engines": {
+        "node": ">=0.10.0"
+      }
+    },
+    "node_modules/inherits": {
+      "version": "2.0.4",
+      "resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz",
+      "integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==",
+      "license": "ISC"
+    },
+    "node_modules/ipaddr.js": {
+      "version": "1.9.1",
+      "resolved": "https://registry.npmjs.org/ipaddr.js/-/ipaddr.js-1.9.1.tgz",
+      "integrity": "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.10"
+      }
+    },
+    "node_modules/is-promise": {
+      "version": "4.0.0",
+      "resolved": "https://registry.npmjs.org/is-promise/-/is-promise-4.0.0.tgz",
+      "integrity": "sha512-hvpoI6korhJMnej285dSg6nu1+e6uxs7zG3BYAm5byqDsgJNWwxzM6z6iZiAgQR4TJ30JmBTOwqZUw3WlyH3AQ==",
+      "license": "MIT"
+    },
+    "node_modules/isexe": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/isexe/-/isexe-2.0.0.tgz",
+      "integrity": "sha512-RHxMLp9lnKHGHRng9QFhRCMbYAcVpn69smSGcq3f36xjgVVWThj4qqLbTLlq7Ssj8B+fIQ1EuCEGI2lKsyQeIw==",
+      "license": "ISC"
+    },
+    "node_modules/json-schema-traverse": {
+      "version": "1.0.0",
+      "resolved": "https://registry.npmjs.org/json-schema-traverse/-/json-schema-traverse-1.0.0.tgz",
+      "integrity": "sha512-NM8/P9n3XjXhIZn1lLhkFaACTOURQXjWhV4BA/RnOv8xvgqtqpAX9IO4mRQxSx1Rlo4tqzeqb0sOlruaOy3dug==",
+      "license": "MIT"
+    },
+    "node_modules/jsonfile": {
+      "version": "6.2.0",
+      "resolved": "https://registry.npmjs.org/jsonfile/-/jsonfile-6.2.0.tgz",
+      "integrity": "sha512-FGuPw30AdOIUTRMC2OMRtQV+jkVj2cfPqSeWXv1NEAJ1qZ5zb1X6z1mFhbfOB/iy3ssJCD+3KuZ8r8C3uVFlAg==",
+      "license": "MIT",
+      "dependencies": {
+        "universalify": "^2.0.0"
+      },
+      "optionalDependencies": {
+        "graceful-fs": "^4.1.6"
+      }
+    },
+    "node_modules/lru-cache": {
+      "version": "11.2.2",
+      "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-11.2.2.tgz",
+      "integrity": "sha512-F9ODfyqML2coTIsQpSkRHnLSZMtkU8Q+mSfcaIyKwy58u+8k5nvAYeiNhsyMARvzNcXJ9QfWVrcPsC9e9rAxtg==",
+      "license": "ISC",
+      "engines": {
+        "node": "20 || >=22"
+      }
+    },
+    "node_modules/math-intrinsics": {
+      "version": "1.1.0",
+      "resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
+      "integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.4"
+      }
+    },
+    "node_modules/media-typer": {
+      "version": "1.1.0",
+      "resolved": "https://registry.npmjs.org/media-typer/-/media-typer-1.1.0.tgz",
+      "integrity": "sha512-aisnrDP4GNe06UcKFnV5bfMNPBUw4jsLGaWwWfnH3v02GnBuXX2MCVn5RbrWo0j3pczUilYblq7fQ7Nw2t5XKw==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/merge-descriptors": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/merge-descriptors/-/merge-descriptors-2.0.0.tgz",
+      "integrity": "sha512-Snk314V5ayFLhp3fkUREub6WtjBfPdCPY1Ln8/8munuLuiYhsABgBVWsozAG+MWMbVEvcdcpbi9R7ww22l9Q3g==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=18"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/sindresorhus"
+      }
+    },
+    "node_modules/mime-db": {
+      "version": "1.54.0",
+      "resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.54.0.tgz",
+      "integrity": "sha512-aU5EJuIN2WDemCcAp2vFBfp/m4EAhWJnUNSSw0ixs7/kXbd6Pg64EmwJkNdFhB8aWt1sH2CTXrLxo/iAGV3oPQ==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/mime-types": {
+      "version": "3.0.2",
+      "resolved": "https://registry.npmjs.org/mime-types/-/mime-types-3.0.2.tgz",
+      "integrity": "sha512-Lbgzdk0h4juoQ9fCKXW4by0UJqj+nOOrI9MJ1sSj4nI8aI2eo1qmvQEie4VD1glsS250n15LsWsYtCugiStS5A==",
+      "license": "MIT",
+      "dependencies": {
+        "mime-db": "^1.54.0"
+      },
+      "engines": {
+        "node": ">=18"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/express"
+      }
+    },
+    "node_modules/minimatch": {
+      "version": "10.1.1",
+      "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-10.1.1.tgz",
+      "integrity": "sha512-enIvLvRAFZYXJzkCYG5RKmPfrFArdLv+R+lbQ53BmIMLIry74bjKzX6iHAm8WYamJkhSSEabrWN5D97XnKObjQ==",
+      "license": "BlueOak-1.0.0",
+      "dependencies": {
+        "@isaacs/brace-expansion": "^5.0.0"
+      },
+      "engines": {
+        "node": "20 || >=22"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/isaacs"
+      }
+    },
+    "node_modules/minipass": {
+      "version": "7.1.2",
+      "resolved": "https://registry.npmjs.org/minipass/-/minipass-7.1.2.tgz",
+      "integrity": "sha512-qOOzS1cBTWYF4BH8fVePDBOO9iptMnGUEZwNc/cMWnTV2nVLZ7VoNWEPHkYczZA0pdoA7dl6e7FL659nX9S2aw==",
+      "license": "ISC",
+      "engines": {
+        "node": ">=16 || 14 >=14.17"
+      }
+    },
+    "node_modules/ms": {
+      "version": "2.1.3",
+      "resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
+      "integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
+      "license": "MIT"
+    },
+    "node_modules/negotiator": {
+      "version": "1.0.0",
+      "resolved": "https://registry.npmjs.org/negotiator/-/negotiator-1.0.0.tgz",
+      "integrity": "sha512-8Ofs/AUQh8MaEcrlq5xOX0CQ9ypTF5dl78mjlMNfOK08fzpgTHQRQPBxcPlEtIw0yRpws+Zo/3r+5WRby7u3Gg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/object-assign": {
+      "version": "4.1.1",
+      "resolved": "https://registry.npmjs.org/object-assign/-/object-assign-4.1.1.tgz",
+      "integrity": "sha512-rJgTQnkUnH1sFw8yT6VSU3zD3sWmu6sZhIseY8VX+GRu3P6F7Fu+JNDoXfklElbLJSnc3FUQHVe4cU5hj+BcUg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=0.10.0"
+      }
+    },
+    "node_modules/object-inspect": {
+      "version": "1.13.4",
+      "resolved": "https://registry.npmjs.org/object-inspect/-/object-inspect-1.13.4.tgz",
+      "integrity": "sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/on-finished": {
+      "version": "2.4.1",
+      "resolved": "https://registry.npmjs.org/on-finished/-/on-finished-2.4.1.tgz",
+      "integrity": "sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg==",
+      "license": "MIT",
+      "dependencies": {
+        "ee-first": "1.1.1"
+      },
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/once": {
+      "version": "1.4.0",
+      "resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz",
+      "integrity": "sha512-lNaJgI+2Q5URQBkccEKHTQOPaXdUxnZZElQTZY0MFUAuaEqe1E+Nyvgdz/aIyNi6Z9MzO5dv1H8n58/GELp3+w==",
+      "license": "ISC",
+      "dependencies": {
+        "wrappy": "1"
+      }
+    },
+    "node_modules/parseurl": {
+      "version": "1.3.3",
+      "resolved": "https://registry.npmjs.org/parseurl/-/parseurl-1.3.3.tgz",
+      "integrity": "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/path-key": {
+      "version": "3.1.1",
+      "resolved": "https://registry.npmjs.org/path-key/-/path-key-3.1.1.tgz",
+      "integrity": "sha512-ojmeN0qd+y0jszEtoY48r0Peq5dwMEkIlCOu6Q5f41lfkswXuKtYrhgoTpLnyIcHm24Uhqx+5Tqm2InSwLhE6Q==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=8"
+      }
+    },
+    "node_modules/path-scurry": {
+      "version": "2.0.1",
+      "resolved": "https://registry.npmjs.org/path-scurry/-/path-scurry-2.0.1.tgz",
+      "integrity": "sha512-oWyT4gICAu+kaA7QWk/jvCHWarMKNs6pXOGWKDTr7cw4IGcUbW+PeTfbaQiLGheFRpjo6O9J0PmyMfQPjH71oA==",
+      "license": "BlueOak-1.0.0",
+      "dependencies": {
+        "lru-cache": "^11.0.0",
+        "minipass": "^7.1.2"
+      },
+      "engines": {
+        "node": "20 || >=22"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/isaacs"
+      }
+    },
+    "node_modules/path-to-regexp": {
+      "version": "8.3.0",
+      "resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-8.3.0.tgz",
+      "integrity": "sha512-7jdwVIRtsP8MYpdXSwOS0YdD0Du+qOoF/AEPIt88PcCFrZCzx41oxku1jD88hZBwbNUIEfpqvuhjFaMAqMTWnA==",
+      "license": "MIT",
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/express"
+      }
+    },
+    "node_modules/pkce-challenge": {
+      "version": "5.0.1",
+      "resolved": "https://registry.npmjs.org/pkce-challenge/-/pkce-challenge-5.0.1.tgz",
+      "integrity": "sha512-wQ0b/W4Fr01qtpHlqSqspcj3EhBvimsdh0KlHhH8HRZnMsEa0ea2fTULOXOS9ccQr3om+GcGRk4e+isrZWV8qQ==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=16.20.0"
+      }
+    },
+    "node_modules/proxy-addr": {
+      "version": "2.0.7",
+      "resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.7.tgz",
+      "integrity": "sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==",
+      "license": "MIT",
+      "dependencies": {
+        "forwarded": "0.2.0",
+        "ipaddr.js": "1.9.1"
+      },
+      "engines": {
+        "node": ">= 0.10"
+      }
+    },
+    "node_modules/qs": {
+      "version": "6.14.0",
+      "resolved": "https://registry.npmjs.org/qs/-/qs-6.14.0.tgz",
+      "integrity": "sha512-YWWTjgABSKcvs/nWBi9PycY/JiPJqOD4JA6o9Sej2AtvSGarXxKC3OQSk4pAarbdQlKAh5D4FCQkJNkW+GAn3w==",
+      "license": "BSD-3-Clause",
+      "dependencies": {
+        "side-channel": "^1.1.0"
+      },
+      "engines": {
+        "node": ">=0.6"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/range-parser": {
+      "version": "1.2.1",
+      "resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
+      "integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/raw-body": {
+      "version": "3.0.2",
+      "resolved": "https://registry.npmjs.org/raw-body/-/raw-body-3.0.2.tgz",
+      "integrity": "sha512-K5zQjDllxWkf7Z5xJdV0/B0WTNqx6vxG70zJE4N0kBs4LovmEYWJzQGxC9bS9RAKu3bgM40lrd5zoLJ12MQ5BA==",
+      "license": "MIT",
+      "dependencies": {
+        "bytes": "~3.1.2",
+        "http-errors": "~2.0.1",
+        "iconv-lite": "~0.7.0",
+        "unpipe": "~1.0.0"
+      },
+      "engines": {
+        "node": ">= 0.10"
+      }
+    },
+    "node_modules/raw-body/node_modules/iconv-lite": {
+      "version": "0.7.0",
+      "resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.7.0.tgz",
+      "integrity": "sha512-cf6L2Ds3h57VVmkZe+Pn+5APsT7FpqJtEhhieDCvrE2MK5Qk9MyffgQyuxQTm6BChfeZNtcOLHp9IcWRVcIcBQ==",
+      "license": "MIT",
+      "dependencies": {
+        "safer-buffer": ">= 2.1.2 < 3.0.0"
+      },
+      "engines": {
+        "node": ">=0.10.0"
+      },
+      "funding": {
+        "type": "opencollective",
+        "url": "https://opencollective.com/express"
+      }
+    },
+    "node_modules/require-from-string": {
+      "version": "2.0.2",
+      "resolved": "https://registry.npmjs.org/require-from-string/-/require-from-string-2.0.2.tgz",
+      "integrity": "sha512-Xf0nWe6RseziFMu+Ap9biiUbmplq6S9/p+7w7YXP/JBHhrUDDUhwa+vANyubuqfZWTveU//DYVGsDG7RKL/vEw==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=0.10.0"
+      }
+    },
+    "node_modules/resolve-pkg-maps": {
+      "version": "1.0.0",
+      "resolved": "https://registry.npmjs.org/resolve-pkg-maps/-/resolve-pkg-maps-1.0.0.tgz",
+      "integrity": "sha512-seS2Tj26TBVOC2NIc2rOe2y2ZO7efxITtLZcGSOnHHNOQ7CkiUBfw0Iw2ck6xkIhPwLhKNLS8BO+hEpngQlqzw==",
+      "license": "MIT",
+      "funding": {
+        "url": "https://github.com/privatenumber/resolve-pkg-maps?sponsor=1"
+      }
+    },
+    "node_modules/router": {
+      "version": "2.2.0",
+      "resolved": "https://registry.npmjs.org/router/-/router-2.2.0.tgz",
+      "integrity": "sha512-nLTrUKm2UyiL7rlhapu/Zl45FwNgkZGaCpZbIHajDYgwlJCOzLSk+cIPAnsEqV955GjILJnKbdQC1nVPz+gAYQ==",
+      "license": "MIT",
+      "dependencies": {
+        "debug": "^4.4.0",
+        "depd": "^2.0.0",
+        "is-promise": "^4.0.0",
+        "parseurl": "^1.3.3",
+        "path-to-regexp": "^8.0.0"
+      },
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/safer-buffer": {
+      "version": "2.1.2",
+      "resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz",
+      "integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==",
+      "license": "MIT"
+    },
+    "node_modules/send": {
+      "version": "1.2.0",
+      "resolved": "https://registry.npmjs.org/send/-/send-1.2.0.tgz",
+      "integrity": "sha512-uaW0WwXKpL9blXE2o0bRhoL2EGXIrZxQ2ZQ4mgcfoBxdFmQold+qWsD2jLrfZ0trjKL6vOw0j//eAwcALFjKSw==",
+      "license": "MIT",
+      "dependencies": {
+        "debug": "^4.3.5",
+        "encodeurl": "^2.0.0",
+        "escape-html": "^1.0.3",
+        "etag": "^1.8.1",
+        "fresh": "^2.0.0",
+        "http-errors": "^2.0.0",
+        "mime-types": "^3.0.1",
+        "ms": "^2.1.3",
+        "on-finished": "^2.4.1",
+        "range-parser": "^1.2.1",
+        "statuses": "^2.0.1"
+      },
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/serve-static": {
+      "version": "2.2.0",
+      "resolved": "https://registry.npmjs.org/serve-static/-/serve-static-2.2.0.tgz",
+      "integrity": "sha512-61g9pCh0Vnh7IutZjtLGGpTA355+OPn2TyDv/6ivP2h/AdAVX9azsoxmg2/M6nZeQZNYBEwIcsne1mJd9oQItQ==",
+      "license": "MIT",
+      "dependencies": {
+        "encodeurl": "^2.0.0",
+        "escape-html": "^1.0.3",
+        "parseurl": "^1.3.3",
+        "send": "^1.2.0"
+      },
+      "engines": {
+        "node": ">= 18"
+      }
+    },
+    "node_modules/setprototypeof": {
+      "version": "1.2.0",
+      "resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.2.0.tgz",
+      "integrity": "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==",
+      "license": "ISC"
+    },
+    "node_modules/shebang-command": {
+      "version": "2.0.0",
+      "resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz",
+      "integrity": "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==",
+      "license": "MIT",
+      "dependencies": {
+        "shebang-regex": "^3.0.0"
+      },
+      "engines": {
+        "node": ">=8"
+      }
+    },
+    "node_modules/shebang-regex": {
+      "version": "3.0.0",
+      "resolved": "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz",
+      "integrity": "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=8"
+      }
+    },
+    "node_modules/side-channel": {
+      "version": "1.1.0",
+      "resolved": "https://registry.npmjs.org/side-channel/-/side-channel-1.1.0.tgz",
+      "integrity": "sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw==",
+      "license": "MIT",
+      "dependencies": {
+        "es-errors": "^1.3.0",
+        "object-inspect": "^1.13.3",
+        "side-channel-list": "^1.0.0",
+        "side-channel-map": "^1.0.1",
+        "side-channel-weakmap": "^1.0.2"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/side-channel-list": {
+      "version": "1.0.0",
+      "resolved": "https://registry.npmjs.org/side-channel-list/-/side-channel-list-1.0.0.tgz",
+      "integrity": "sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==",
+      "license": "MIT",
+      "dependencies": {
+        "es-errors": "^1.3.0",
+        "object-inspect": "^1.13.3"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/side-channel-map": {
+      "version": "1.0.1",
+      "resolved": "https://registry.npmjs.org/side-channel-map/-/side-channel-map-1.0.1.tgz",
+      "integrity": "sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA==",
+      "license": "MIT",
+      "dependencies": {
+        "call-bound": "^1.0.2",
+        "es-errors": "^1.3.0",
+        "get-intrinsic": "^1.2.5",
+        "object-inspect": "^1.13.3"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/side-channel-weakmap": {
+      "version": "1.0.2",
+      "resolved": "https://registry.npmjs.org/side-channel-weakmap/-/side-channel-weakmap-1.0.2.tgz",
+      "integrity": "sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A==",
+      "license": "MIT",
+      "dependencies": {
+        "call-bound": "^1.0.2",
+        "es-errors": "^1.3.0",
+        "get-intrinsic": "^1.2.5",
+        "object-inspect": "^1.13.3",
+        "side-channel-map": "^1.0.1"
+      },
+      "engines": {
+        "node": ">= 0.4"
+      },
+      "funding": {
+        "url": "https://github.com/sponsors/ljharb"
+      }
+    },
+    "node_modules/statuses": {
+      "version": "2.0.2",
+      "resolved": "https://registry.npmjs.org/statuses/-/statuses-2.0.2.tgz",
+      "integrity": "sha512-DvEy55V3DB7uknRo+4iOGT5fP1slR8wQohVdknigZPMpMstaKJQWhwiYBACJE3Ul2pTnATihhBYnRhZQHGBiRw==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/toidentifier": {
+      "version": "1.0.1",
+      "resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.1.tgz",
+      "integrity": "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=0.6"
+      }
+    },
+    "node_modules/tsx": {
+      "version": "4.20.6",
+      "resolved": "https://registry.npmjs.org/tsx/-/tsx-4.20.6.tgz",
+      "integrity": "sha512-ytQKuwgmrrkDTFP4LjR0ToE2nqgy886GpvRSpU0JAnrdBYppuY5rLkRUYPU1yCryb24SsKBTL/hlDQAEFVwtZg==",
+      "license": "MIT",
+      "dependencies": {
+        "esbuild": "~0.25.0",
+        "get-tsconfig": "^4.7.5"
+      },
+      "bin": {
+        "tsx": "dist/cli.mjs"
+      },
+      "engines": {
+        "node": ">=18.0.0"
+      },
+      "optionalDependencies": {
+        "fsevents": "~2.3.3"
+      }
+    },
+    "node_modules/type-is": {
+      "version": "2.0.1",
+      "resolved": "https://registry.npmjs.org/type-is/-/type-is-2.0.1.tgz",
+      "integrity": "sha512-OZs6gsjF4vMp32qrCbiVSkrFmXtG/AZhY3t0iAMrMBiAZyV9oALtXO8hsrHbMXF9x6L3grlFuwW2oAz7cav+Gw==",
+      "license": "MIT",
+      "dependencies": {
+        "content-type": "^1.0.5",
+        "media-typer": "^1.1.0",
+        "mime-types": "^3.0.0"
+      },
+      "engines": {
+        "node": ">= 0.6"
+      }
+    },
+    "node_modules/typescript": {
+      "version": "5.9.3",
+      "resolved": "https://registry.npmjs.org/typescript/-/typescript-5.9.3.tgz",
+      "integrity": "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw==",
+      "license": "Apache-2.0",
+      "peer": true,
+      "bin": {
+        "tsc": "bin/tsc",
+        "tsserver": "bin/tsserver"
+      },
+      "engines": {
+        "node": ">=14.17"
+      }
+    },
+    "node_modules/undici": {
+      "version": "6.22.0",
+      "resolved": "https://registry.npmjs.org/undici/-/undici-6.22.0.tgz",
+      "integrity": "sha512-hU/10obOIu62MGYjdskASR3CUAiYaFTtC9Pa6vHyf//mAipSvSQg6od2CnJswq7fvzNS3zJhxoRkgNVaHurWKw==",
+      "license": "MIT",
+      "engines": {
+        "node": ">=18.17"
+      }
+    },
+    "node_modules/undici-types": {
+      "version": "7.16.0",
+      "resolved": "https://registry.npmjs.org/undici-types/-/undici-types-7.16.0.tgz",
+      "integrity": "sha512-Zz+aZWSj8LE6zoxD+xrjh4VfkIG8Ya6LvYkZqtUQGJPZjYl53ypCaUwWqo7eI0x66KBGeRo+mlBEkMSeSZ38Nw==",
+      "license": "MIT"
+    },
+    "node_modules/universalify": {
+      "version": "2.0.1",
+      "resolved": "https://registry.npmjs.org/universalify/-/universalify-2.0.1.tgz",
+      "integrity": "sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 10.0.0"
+      }
+    },
+    "node_modules/unpipe": {
+      "version": "1.0.0",
+      "resolved": "https://registry.npmjs.org/unpipe/-/unpipe-1.0.0.tgz",
+      "integrity": "sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/vary": {
+      "version": "1.1.2",
+      "resolved": "https://registry.npmjs.org/vary/-/vary-1.1.2.tgz",
+      "integrity": "sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==",
+      "license": "MIT",
+      "engines": {
+        "node": ">= 0.8"
+      }
+    },
+    "node_modules/which": {
+      "version": "2.0.2",
+      "resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
+      "integrity": "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==",
+      "license": "ISC",
+      "dependencies": {
+        "isexe": "^2.0.0"
+      },
+      "bin": {
+        "node-which": "bin/node-which"
+      },
+      "engines": {
+        "node": ">= 8"
+      }
+    },
+    "node_modules/wrappy": {
+      "version": "1.0.2",
+      "resolved": "https://registry.npmjs.org/wrappy/-/wrappy-1.0.2.tgz",
+      "integrity": "sha512-l4Sp/DRseor9wL6EvV2+TuQn63dMkPjZ/sp9XkghTEbV9KlPS1xUsZ3u7/IQO4wxtcFB4bgpQPRcR3QCvezPcQ==",
+      "license": "ISC"
+    },
+    "node_modules/zod": {
+      "version": "4.1.12",
+      "resolved": "https://registry.npmjs.org/zod/-/zod-4.1.12.tgz",
+      "integrity": "sha512-JInaHOamG8pt5+Ey8kGmdcAcg3OL9reK8ltczgHTAwNhMys/6ThXHityHxVV2p3fkw/c+MAvBHFVYHFZDmjMCQ==",
+      "license": "MIT",
+      "peer": true,
+      "funding": {
+        "url": "https://github.com/sponsors/colinhacks"
+      }
+    },
+    "node_modules/zod-to-json-schema": {
+      "version": "3.25.0",
+      "resolved": "https://registry.npmjs.org/zod-to-json-schema/-/zod-to-json-schema-3.25.0.tgz",
+      "integrity": "sha512-HvWtU2UG41LALjajJrML6uQejQhNJx+JBO9IflpSja4R03iNWfKXrj6W2h7ljuLyc1nKS+9yDyL/9tD1U/yBnQ==",
+      "license": "ISC",
+      "peerDependencies": {
+        "zod": "^3.25 || ^4"
+      }
+    }
+  }
+}

+ 51 - 0
package.json

@@ -0,0 +1,51 @@
+{
+  "name": "docs-rag",
+  "version": "1.0.0",
+  "description": "TypeScript project for storing markdown documents in Qdrant with Ollama embeddings and source code parsing",
+  "main": "dist/index.js",
+  "bin": {
+    "docs-rag": "dist/cli/index.js",
+    "docs-rag-mcp": "dist/mcp/agentic-stdio.js"
+  },
+  "scripts": {
+    "build": "tsc",
+    "start": "node dist/cli/index.js",
+    "dev": "tsx src/cli/index.ts",
+    "mcp": "tsx src/mcp/server.ts",
+    "mcp:cli": "DOTENV_SILENT=true node dist/mcp/stdio.js",
+    "mcp:build": "tsc && node dist/mcp/stdio.js",
+    "cli": "node dist/cli/index.js",
+    "parse": "node dist/cli/index.js parse",
+    "parse:dev": "tsx src/cli/index.ts parse",
+    "parse:validate": "node dist/cli/index.js parse-validate"
+  },
+  "keywords": [
+    "qdrant",
+    "ollama",
+    "rag",
+    "markdown",
+    "typescript",
+    "source-code",
+    "documentation",
+    "parser"
+  ],
+  "author": "",
+  "license": "ISC",
+  "type": "commonjs",
+  "dependencies": {
+    "@modelcontextprotocol/sdk": "^1.22.0",
+    "@qdrant/qdrant-js": "^1.16.0",
+    "@types/node": "^24.10.1",
+    "commander": "^14.0.2",
+    "crypto": "^1.0.1",
+    "dotenv": "^17.2.3",
+    "fs-extra": "^11.3.2",
+    "glob": "^13.0.0",
+    "tsx": "^4.20.6",
+    "typescript": "^5.9.3",
+    "zod": "^4.1.12"
+  },
+  "devDependencies": {
+    "@types/fs-extra": "^11.0.4"
+  }
+}

+ 1169 - 0
parser-implementation-plan.md

@@ -0,0 +1,1169 @@
+# Dynamic Source Code Documentation Parser - TypeScript Implementation Plan
+
+## Overview
+Create a modular source code parser in TypeScript that extracts documentation from comments and generates structured markdown files similar to the `/data/tui/docs` structure. The system will be language-agnostic with pluggable parsers, starting with C++, and will integrate into the existing `docs-rag` project as a new CLI command.
+
+## Architecture Design
+
+### 1. Integration with Current Project
+
+The parser will be integrated into the existing `docs-rag` TypeScript project structure:
+
+```
+src/
+├── cli/
+│   ├── index.ts                    # Existing CLI (add parser command)
+│   └── parser-commands.ts          # New parser-specific commands
+├── lib/
+│   └── parser/                     # New parser library
+│       ├── core/
+│       │   ├── interfaces.ts       # Abstract parser interface
+│       │   ├── documentation-generator.ts  # Markdown output generator
+│       │   ├── comment-parser.ts   # Generic comment extraction
+│       │   └── config.ts           # Configuration management
+│       ├── parsers/
+│       │   ├── base-parser.ts      # Base parser class
+│       │   ├── cpp-parser.ts       # C++ implementation
+│       │   ├── [future] python-parser.ts
+│       │   └── [future] java-parser.ts
+│       ├── ast/
+│       │   ├── nodes.ts            # AST node definitions
+│       │   └── visitor.ts          # Visitor pattern for traversal
+│       └── utils/
+│           ├── file-utils.ts       # File system operations
+│           ├── string-utils.ts     # Text processing utilities
+│           └── markdown-utils.ts   # Markdown generation helpers
+├── services/
+│   └── documentService.ts          # Existing service (extend for parser output)
+├── config/
+│   └── parser-config.ts            # Parser-specific configuration
+└── types/
+    └── parser-types.ts             # TypeScript type definitions
+```
+
+### 2. TypeScript Interface Design
+
+#### Parser Interface (src/lib/parser/core/interfaces.ts)
+```typescript
+// Core interfaces for language parsers
+export interface SourceLocation {
+  filePath: string;
+  line: number;
+  column: number;
+  endLine?: number;
+  endColumn?: number;
+}
+
+export interface DocumentationComment {
+  type: 'doxline' | 'doxyblock' | 'javadoc' | 'unknown';
+  rawContent: string;
+  brief: string;
+  detailed: string;
+  tags: DocTag[];
+  location: SourceLocation;
+}
+
+export interface DocTag {
+  name: string;
+  value: string;
+}
+
+export interface ASTNode {
+  type: 'namespace' | 'class' | 'struct' | 'function' | 'method' | 
+        'variable' | 'enum' | 'enum_value' | 'template' | 'module';
+  name: string;
+  documentation: DocumentationComment;
+  location: SourceLocation;
+  children: ASTNode[];
+  [key: string]: any; // Allow type-specific properties
+}
+
+export interface ILanguageParser {
+  getLanguage(): string;
+  getFileExtensions(): string[];
+  canParse(filePath: string): boolean;
+  parseFile(filePath: string): Promise<ASTNode[]>;
+}
+
+export interface DocumentationConfig {
+  outputDirectory: string;
+  indexTitle: string;
+  generatorName: string;
+  generateIndex: boolean;
+  generateModuleIndexes: boolean;
+  includePrivate: boolean;
+  includeSourceLinks: boolean;
+  sourceRootPath?: string;
+  theme: 'material' | 'github' | 'default';
+}
+
+export interface ParserConfig {
+  languages: string[];
+  includePatterns: string[];
+  excludePatterns: string[];
+  outputPath: string;
+  watchMode: boolean;
+  incremental: boolean;
+}
+```
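
To make the contract concrete, here is a minimal illustrative implementation of the file-dispatch half of `ILanguageParser`. The class name `CppParserStub` and the trimmed-down inline interface are sketch assumptions, not project code; the real parser would also implement `parseFile`:

```typescript
// Trimmed copies of the interfaces above so this sketch is self-contained.
interface ILanguageParserSlice {
  getLanguage(): string;
  getFileExtensions(): string[];
  canParse(filePath: string): boolean;
}

// Hypothetical stub: the registry asks each registered parser whether it
// can handle a file before delegating parseFile() to it.
class CppParserStub implements ILanguageParserSlice {
  getLanguage(): string {
    return 'cpp';
  }
  getFileExtensions(): string[] {
    return ['.cpp', '.h', '.hpp', '.cxx', '.cc'];
  }
  // Dispatch purely on file extension.
  canParse(filePath: string): boolean {
    return this.getFileExtensions().some(ext => filePath.endsWith(ext));
  }
}

const parser = new CppParserStub();
console.log(parser.canParse('src/widget.hpp')); // true
console.log(parser.canParse('src/main.py'));    // false
```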
+
+#### Documentation Generator (src/lib/parser/core/documentation-generator.ts)
+```typescript
+import { ASTNode, DocumentationComment, DocumentationConfig } from './interfaces';
+import { FileUtils } from '../utils/file-utils';
+import { MarkdownUtils } from '../utils/markdown-utils';
+
+export class DocumentationGenerator {
+  private config: DocumentationConfig;
+  private moduleStack: string[] = [];
+  private modules: ModuleInfo[] = [];
+
+  constructor(config: DocumentationConfig) {
+    this.config = config;
+  }
+
+  async generate(nodes: ASTNode[]): Promise<void> {
+    // Create output directory structure
+    await this.createDirectoryStructure();
+    
+    // Generate main index
+    if (this.config.generateIndex) {
+      await this.generateIndex(nodes);
+    }
+    
+    // Generate module-specific documentation
+    await this.generateModuleDocumentation(nodes);
+    
+    // Generate individual node documentation
+    await this.generateNodeFiles(nodes);
+  }
+
+  private async createDirectoryStructure(): Promise<void> {
+    await FileUtils.ensureDirectory(this.config.outputDirectory);
+  }
+
+  private async generateIndex(nodes: ASTNode[]): Promise<void> {
+    const content = await this.generateIndexContent(nodes);
+    const indexPath = `${this.config.outputDirectory}/index.md`;
+    await FileUtils.writeFile(indexPath, content);
+  }
+
+  private async generateIndexContent(nodes: ASTNode[]): Promise<string> {
+    const modules = this.organizeByModule(nodes);
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${this.config.indexTitle}
+
+`;
+
+    for (const [moduleName, moduleNodes] of modules.entries()) {
+      content += this.generateModuleSection(moduleName, moduleNodes);
+    }
+
+    return content;
+  }
+
+  private organizeByModule(nodes: ASTNode[]): Map<string, ASTNode[]> {
+    const modules = new Map<string, ASTNode[]>();
+    
+    for (const node of nodes) {
+      const moduleName = this.extractModuleName(node);
+      if (!modules.has(moduleName)) {
+        modules.set(moduleName, []);
+      }
+      modules.get(moduleName)!.push(node);
+    }
+    
+    return modules;
+  }
+
+  private extractModuleName(node: ASTNode): string {
+    // Extract module name from namespace or file path
+    if (node.type === 'namespace') {
+      return node.name;
+    }
+    
+    // Extract from file path
+    const pathParts = node.location.filePath.split('/');
+    const fileName = pathParts[pathParts.length - 1];
+    const moduleName = fileName.replace(/\.(cpp|h|hpp|cxx|cc|c)$/, '');
+    
+    return moduleName;
+  }
+
+  private generateModuleSection(moduleName: string, nodes: ASTNode[]): string {
+    let section = `:material-package: [${moduleName}](${moduleName}/index.md)
+:   ${this.generateModuleDescription(nodes)}
+
+`;
+
+    const classes = nodes.filter(n => n.type === 'class' || n.type === 'struct');
+    const functions = nodes.filter(n => n.type === 'function' || n.type === 'method');
+
+    if (classes.length > 0) {
+      section += '## Types\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const cls of classes) {
+        const desc = cls.documentation.brief || '';
+        section += `| [${cls.name}](${moduleName}/${cls.name}.md) | ${desc} |\n`;
+      }
+      section += '\n';
+    }
+
+    if (functions.length > 0) {
+      section += '## Functions\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const func of functions) {
+        const desc = func.documentation.brief || '';
+        section += `| [${func.name}](${moduleName}/${func.name}.md) | ${desc} |\n`;
+      }
+      section += '\n';
+    }
+
+    return section;
+  }
+
+  private async generateModuleDocumentation(nodes: ASTNode[]): Promise<void> {
+    const modules = this.organizeByModule(nodes);
+    
+    for (const [moduleName, moduleNodes] of modules.entries()) {
+      await this.generateModuleFile(moduleName, moduleNodes);
+    }
+  }
+
+  private async generateModuleFile(moduleName: string, nodes: ASTNode[]): Promise<void> {
+    const modulePath = `${this.config.outputDirectory}/${moduleName}`;
+    await FileUtils.ensureDirectory(modulePath);
+    
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${moduleName}
+
+`;
+
+    const classes = nodes.filter(n => n.type === 'class' || n.type === 'struct');
+    const functions = nodes.filter(n => n.type === 'function' || n.type === 'method');
+
+    if (classes.length > 0) {
+      content += '## Types\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const cls of classes) {
+        const desc = cls.documentation.brief || '';
+        content += `| [${cls.name}](${cls.name}.md) | ${desc} |\n`;
+      }
+      content += '\n';
+    }
+
+    if (functions.length > 0) {
+      content += '## Functions\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const func of functions) {
+        const desc = func.documentation.brief || '';
+        content += `| [${func.name}](${func.name}.md) | ${desc} |\n`;
+      }
+      content += '\n';
+    }
+
+    const indexPath = `${this.config.outputDirectory}/${moduleName}/index.md`;
+    await FileUtils.writeFile(indexPath, content);
+  }
+
+  private async generateNodeFiles(nodes: ASTNode[]): Promise<void> {
+    for (const node of nodes) {
+      if (node.type === 'class' || node.type === 'struct') {
+        await this.generateClassFile(node);
+      } else if (node.type === 'function' || node.type === 'method') {
+        await this.generateFunctionFile(node);
+      }
+    }
+  }
+
+  private async generateClassFile(node: ASTNode): Promise<void> {
+    const moduleName = this.extractModuleName(node);
+    const filePath = `${this.config.outputDirectory}/${moduleName}/${node.name}.md`;
+    
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${node.name}
+
+**${node.type} ${node.name}**
+
+${this.formatDocumentation(node.documentation)}
+
+`;
+
+    // Add functions
+    const methods = node.children.filter(n => n.type === 'function' || n.type === 'method');
+    if (methods.length > 0) {
+      content += '## Functions\n\n';
+      content += this.generateFunctionTable(methods);
+      content += '\n## Function Details\n\n';
+      content += this.generateFunctionDetails(methods);
+    }
+
+    await FileUtils.writeFile(filePath, content);
+  }
+
+  private async generateFunctionFile(node: ASTNode): Promise<void> {
+    const moduleName = this.extractModuleName(node);
+    const filePath = `${this.config.outputDirectory}/${moduleName}/${node.name}.md`;
+    
+    const content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${node.name}
+
+**function ${node.name}**
+
+${this.formatDocumentation(node.documentation)}
+
+## Signature
+
+\`\`\`${this.getLanguageForSyntax()}
+${node.signature || node.name}
+\`\`\`
+
+## Source Location
+
+File: \`${node.location.filePath}\`:${node.location.line}
+
+`;
+
+    await FileUtils.writeFile(filePath, content);
+  }
+
+  private generateFunctionTable(functions: ASTNode[]): string {
+    let table = '| Name | Description |\n| ---- | ----------- |\n';
+    
+    for (const func of functions) {
+      const desc = func.documentation.brief || '';
+      const anchor = this.generateAnchor(func.name);
+      table += `| [${func.name}](#${anchor}) | ${desc} |\n`;
+    }
+    
+    return table;
+  }
+
+  private generateFunctionDetails(functions: ASTNode[]): string {
+    let details = '';
+    
+    for (const func of functions) {
+      const anchor = this.generateAnchor(func.name);
+      details += `### ${func.name}<a name="${anchor}"></a>\n`;
+      details += `!!! function "${this.formatFunctionSignature(func)}"\n\n`;
+      details += `    ${this.formatDocumentation(func.documentation).replace(/\n/g, '\n    ')}\n\n`;
+    }
+    
+    return details;
+  }
+
+  private formatDocumentation(doc: DocumentationComment): string {
+    if (!doc.brief && !doc.detailed) return '';
+    
+    let formatted = '';
+    if (doc.brief) formatted += `@brief ${doc.brief}\n\n`;
+    if (doc.detailed) formatted += `${doc.detailed}\n\n`;
+    
+    return formatted;
+  }
+
+  private generateAnchor(name: string): string {
+    return name.toLowerCase().replace(/[^a-z0-9]/g, '-');
+  }
+
+  private getLanguageForSyntax(): string {
+    return 'cpp'; // Will be dynamic based on parser language
+  }
+
+  private formatFunctionSignature(func: ASTNode): string {
+    return func.signature || `${func.name}()`;
+  }
+
+  private generateModuleDescription(nodes: ASTNode[]): string {
+    const descriptions = nodes
+      .map(n => n.documentation.brief)
+      .filter(Boolean)
+      .slice(0, 2);
+    
+    return descriptions.join(' ') || 'Module containing various components and utilities.';
+  }
+}
+
+interface ModuleInfo {
+  name: string;
+  path: string;
+  classes: string[];
+  functions: string[];
+  submodules: string[];
+}
+```
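+
+For reference, `generateAnchor` collapses every non-alphanumeric character to a hyphen, so a qualified name such as `Widget::draw` is linked as `#widget--draw`. A standalone replica of the anchor/table logic above (the sample data is illustrative):
+
+```typescript
+// Minimal replica of generateAnchor/generateFunctionTable for illustration.
+interface FuncDoc { name: string; brief: string }
+
+function generateAnchor(name: string): string {
+  return name.toLowerCase().replace(/[^a-z0-9]/g, '-');
+}
+
+function generateFunctionTable(functions: FuncDoc[]): string {
+  let table = '| Name | Description |\n| ---- | ----------- |\n';
+  for (const f of functions) {
+    table += `| [${f.name}](#${generateAnchor(f.name)}) | ${f.brief} |\n`;
+  }
+  return table;
+}
+
+const table = generateFunctionTable([
+  { name: 'openFile', brief: 'Opens a file handle.' },
+  { name: 'Widget::draw', brief: 'Renders the widget.' },
+]);
+console.log(table);
+```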
+
+## Implementation Phases
+
+### Phase 1: Foundation (Week 1)
+1. **Project Integration**
+   - Update package.json with new dependencies
+   - Create parser directory structure under `src/lib/parser/`
+   - Add TypeScript types and interfaces
+   - Configure build system for new modules
+
+2. **Core Interfaces & Types**
+   - Define parser interface (src/lib/parser/core/interfaces.ts)
+   - Create AST node definitions (src/lib/parser/ast/nodes.ts)
+   - Implement base parser class (src/lib/parser/parsers/base-parser.ts)
+   - Add configuration management (src/lib/parser/core/config.ts)
+
+3. **Utility Components**
+   - File system utilities (src/lib/parser/utils/file-utils.ts)
+   - String processing helpers (src/lib/parser/utils/string-utils.ts)
+   - Markdown formatting utilities (src/lib/parser/utils/markdown-utils.ts)
+
+### Phase 2: C++ Parser (Week 2)
+1. **C++ Comment Parser**
+   - Detect and extract documentation comments (`/** */`, `///`, `//!`)
+   - Parse common documentation tags (`@brief`, `@param`, `@return`, etc.)
+   - Handle multi-line comments with proper formatting
+
+2. **C++ AST Builder**
+   - Parse class/struct definitions using regex patterns
+   - Extract function/method signatures
+   - Identify inheritance relationships
+   - Handle templates and namespaces
+
+3. **Integration Testing**
+   - Test with sample C++ files
+   - Verify AST structure correctness
+   - Validate comment extraction
+
+### Phase 3: CLI Integration (Week 3)
+1. **CLI Command Integration**
+   - Add parser commands to existing CLI (src/cli/parser-commands.ts)
+   - Integrate with existing commander.js structure
+   - Add configuration options and validation
+
+2. **New CLI Commands**
+   ```bash
+   docs-rag parse --language cpp --input ./src --output ./docs
+   docs-rag parse --config parser-config.json
+   docs-rag parse --watch --incremental
+   docs-rag parse --list-languages
+   ```
+
+3. **Configuration System**
+   - Parser-specific configuration (src/config/parser-config.ts)
+   - Integration with existing configuration pattern
+   - Support for project-specific parser settings via a `parser-config.json` file
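+
+As an illustration, a `parser-config.json` consumed by the configuration system could mirror the `DocumentationConfig` defaults used by `createDefaultConfig` (the field names follow that interface; the concrete values are placeholders):
+
+```json
+{
+  "outputDirectory": "./docs",
+  "indexTitle": "Source Code Documentation",
+  "generatorName": "docs-rag-parser",
+  "generateIndex": true,
+  "generateModuleIndexes": true,
+  "includePrivate": false,
+  "includeSourceLinks": true,
+  "sourceRootPath": "./src",
+  "theme": "material"
+}
+```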
+
+### Phase 4: Documentation Generation (Week 4)
+1. **Markdown Output Generator**
+   - Replicate the structure from `/data/tui/docs`
+   - Generate index files with package references
+   - Create detailed class documentation pages
+   - Implement function detail sections
+
+2. **Output Structure**
+   ```
+   docs/
+   ├── index.md                    # Main index
+   ├── module1/
+   │   ├── index.md               # Module index
+   │   ├── Class1.md
+   │   ├── Class2.md
+   │   └── SubModule/
+   │       └── index.md
+   └── module2/
+       └── ...
+   ```
+
+3. **Integration with DocumentService**
+   - Extend DocumentService to handle parser-generated markdown
+   - Add parser output to existing RAG functionality
+   - Enable search across generated documentation
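+
+A minimal sketch of the hand-off to DocumentService: recursively collect the generated `.md` files so they can be passed to the existing ingestion path (the actual DocumentService API is not shown in this plan, so only the file-collection step is sketched):
+
+```typescript
+import { readdirSync, statSync } from 'fs';
+import { join } from 'path';
+
+// Recursively collect generated markdown files under the output directory.
+function collectMarkdown(dir: string): string[] {
+  const out: string[] = [];
+  for (const entry of readdirSync(dir)) {
+    const full = join(dir, entry);
+    if (statSync(full).isDirectory()) {
+      out.push(...collectMarkdown(full));
+    } else if (entry.endsWith('.md')) {
+      out.push(full);
+    }
+  }
+  return out;
+}
+```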
+
+### Phase 5: Advanced Features (Week 5-6)
+1. **Enhanced Comment Parsing**
+   - Support for custom comment tags
+   - Example code block extraction
+   - TODO/FIXME note extraction
+   - Version change detection
+
+2. **Quality Improvements**
+   - Cross-reference generation
+   - Inheritance diagrams (ASCII)
+   - Namespace/module organization
+   - Search-friendly indexing
+
+3. **Performance Optimization**
+   - Parallel file processing using worker threads
+   - Incremental parsing (only changed files)
+   - Memory-efficient AST building
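+
+The incremental mode described above can be sketched with a content-hash cache; the cache record shape here is an assumption, not part of the plan:
+
+```typescript
+import { createHash } from 'crypto';
+import { readFileSync } from 'fs';
+
+// Hypothetical cache: file path -> content hash from the previous run.
+type HashCache = Record<string, string>;
+
+// Return only the files whose contents changed since the last run,
+// updating the cache in place as a side effect.
+function filterChanged(files: string[], cache: HashCache): string[] {
+  return files.filter(file => {
+    const hash = createHash('sha256').update(readFileSync(file)).digest('hex');
+    if (cache[file] === hash) return false; // unchanged, skip re-parsing
+    cache[file] = hash;
+    return true;
+  });
+}
+```
+
+The cache would be persisted between runs (for example as JSON next to the output directory), so only the changed subset is re-parsed.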
+
+### Phase 6: Extensibility Framework (Week 7-8)
+1. **Plugin Architecture**
+   - Dynamic parser loading
+   - Configuration-driven language selection
+   - Custom output formatters
+
+2. **Future Language Support**
+   - Create interfaces for Python parser
+   - Design Java parser structure
+   - Document extension guidelines
+
+3. **Integration Enhancements**
+   - Integration with existing MCP server
+   - Add parser commands to MCP interface
+   - Enable programmatic parser usage via API
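+
+The plugin architecture can be sketched as a registry keyed by language, with plugins loaded from module paths listed in configuration (the `PluginModule` shape and `createParser` factory name are assumptions, not a fixed API):
+
+```typescript
+// Trimmed-down parser interface, mirroring ILanguageParser above.
+interface ILanguageParser {
+  getLanguage(): string;
+  getFileExtensions(): string[];
+}
+
+// Assumed plugin contract: each plugin module exports a parser factory.
+interface PluginModule {
+  createParser(): ILanguageParser;
+}
+
+class ParserRegistry {
+  private parsers = new Map<string, ILanguageParser>();
+
+  register(parser: ILanguageParser): void {
+    this.parsers.set(parser.getLanguage(), parser);
+  }
+
+  // Dynamically load a plugin from a configured module path.
+  async loadPlugin(modulePath: string): Promise<void> {
+    const mod = (await import(modulePath)) as PluginModule;
+    this.register(mod.createParser());
+  }
+
+  get(language: string): ILanguageParser | undefined {
+    return this.parsers.get(language);
+  }
+}
+```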
+
+## Detailed Implementation Plan
+
+### Package.json Updates
+
+```json
+{
+  "dependencies": {
+    "@types/node": "^24.10.1",
+    "commander": "^14.0.2",
+    "fs-extra": "^11.3.2",
+    "glob": "^13.0.0",
+    "typescript": "^5.9.3",
+    "zod": "^4.1.12",
+    "chokidar": "^4.0.1",           // For watch mode
+    "typescript-eslint-parser": "^6.0.0", // For future TS parsing
+    "@babel/parser": "^7.23.0",      // For future JS parsing
+    "acorn": "^8.11.0"               // For future JS parsing
+  },
+  "scripts": {
+    "build": "tsc",
+    "dev": "tsx src/cli/index.ts",
+    "parse": "node dist/cli/index.js parse",
+    "parse:dev": "tsx src/cli/index.ts parse"
+  }
+}
+```
+
+### C++ Parser Implementation (src/lib/parser/parsers/cpp-parser.ts)
+
+```typescript
+import { BaseParser } from './base-parser';
+import { ILanguageParser, ASTNode, DocumentationComment } from '../core/interfaces';
+import { FileUtils } from '../utils/file-utils';
+
+export class CppParser extends BaseParser implements ILanguageParser {
+  getLanguage(): string {
+    return 'cpp';
+  }
+
+  getFileExtensions(): string[] {
+    return ['.cpp', '.h', '.hpp', '.cxx', '.cc', '.c'];
+  }
+
+  canParse(filePath: string): boolean {
+    const ext = filePath.substring(filePath.lastIndexOf('.'));
+    return this.getFileExtensions().includes(ext);
+  }
+
+  async parseFile(filePath: string): Promise<ASTNode[]> {
+    const content = await FileUtils.readFile(filePath);
+    const lines = content.split('\n');
+    
+    // Extract comments first
+    const comments = this.extractComments(lines);
+    
+    // Parse code elements
+    const nodes = await this.parseCodeElements(lines, comments, filePath);
+    
+    return nodes;
+  }
+
+  private extractComments(lines: string[]): Map<number, DocumentationComment> {
+    const comments = new Map<number, DocumentationComment>();
+    
+    for (let i = 0; i < lines.length; i++) {
+      const line = lines[i];
+      const trimmed = line.trim();
+      
+      // Detect different comment types
+      if (trimmed.startsWith('/**')) {
+        const comment = this.parseBlockComment(lines, i);
+        if (comment) {
+          // rawContent ends with '\n', so the line count is split length - 1
+          const height = comment.rawContent.split('\n').length - 1;
+          // Key by the comment's LAST line (1-based) so the declaration on
+          // the following line finds it via comments.get(declarationIndex)
+          comments.set(i + height, comment);
+          i += height - 1;
+        }
+      } else if (trimmed.startsWith('///') || trimmed.startsWith('//!')) {
+        const comment = this.parseLineComment(lines, i);
+        if (comment) {
+          comments.set(i + 1, comment);
+        }
+      }
+    }
+    
+    return comments;
+  }
+
+  private parseBlockComment(lines: string[], startIndex: number): DocumentationComment | null {
+    let content = '';
+    let i = startIndex;
+    
+    // Find the end of the block comment
+    while (i < lines.length && !lines[i].includes('*/')) {
+      content += lines[i] + '\n';
+      i++;
+    }
+    if (i < lines.length) {
+      content += lines[i] + '\n';
+    }
+    
+    return this.parseCommentContent(content, 'doxyblock', startIndex + 1);
+  }
+
+  private parseLineComment(lines: string[], startIndex: number): DocumentationComment | null {
+    const line = lines[startIndex];
+    return this.parseCommentContent(line, 'doxline', startIndex + 1);
+  }
+
+  private parseCommentContent(rawContent: string, type: 'doxline' | 'doxyblock', line: number): DocumentationComment {
+    // Clean up the comment markers
+    const cleaned = rawContent
+      .replace(/\/\*\*|\/\*|\*\/|\/\/\/|\/\/!/g, '')
+      .split('\n')
+      .map(line => line.trim().replace(/^\*\s?/, ''))
+      .join('\n')
+      .trim();
+
+    // Extract @tags
+    const tagRegex = /@(\w+)(?:\s+(.+?))?(?=\s+@|$)/gs;
+    const tags: { name: string; value: string }[] = [];
+    let tagMatch;
+    
+    while ((tagMatch = tagRegex.exec(cleaned)) !== null) {
+      tags.push({
+        name: tagMatch[1],
+        value: tagMatch[2]?.trim() || ''
+      });
+    }
+
+    // Extract brief (first sentence or first line before @tags)
+    const beforeTags = cleaned.split('@')[0].trim();
+    const brief = this.extractBrief(beforeTags);
+
+    return {
+      type,
+      rawContent,
+      brief,
+      detailed: beforeTags.substring(brief.length).trim(),
+      tags,
+      location: { filePath: '', line, column: 0 }
+    };
+  }
+
+  private extractBrief(text: string): string {
+    // Try to extract first sentence
+    const sentences = text.split(/[.!?]/);
+    if (sentences.length > 1 && sentences[0].length < 100) {
+      return sentences[0].trim() + '.';
+    }
+    
+    // Fall back to first line
+    const lines = text.split('\n');
+    return lines[0].trim() || text.substring(0, 80).trim();
+  }
+
+  private getCommentHeight(content: string): number {
+    return content.split('\n').length;
+  }
+
+  private async parseCodeElements(
+    lines: string[], 
+    comments: Map<number, DocumentationComment>, 
+    filePath: string
+  ): Promise<ASTNode[]> {
+    const nodes: ASTNode[] = [];
+    
+    for (let i = 0; i < lines.length; i++) {
+      const line = lines[i];
+      
+      // Parse classes
+      const classMatch = line.match(/^\s*(class|struct)\s+(\w+)/);
+      if (classMatch) {
+        const node = await this.parseClass(lines, i, comments, filePath);
+        if (node) {
+          nodes.push(node);
+        }
+      }
+      
+      // Parse functions
+      const functionMatch = line.match(/^\s*(?:\w+\s+)*(\w+)\s*\([^)]*\)\s*(?:\{|;)/);
+      if (functionMatch && !line.includes('class')) {
+        const node = await this.parseFunction(lines, i, comments, filePath);
+        if (node) {
+          nodes.push(node);
+        }
+      }
+    }
+    
+    return nodes;
+  }
+
+  private async parseClass(
+    lines: string[], 
+    startIndex: number,
+    comments: Map<number, DocumentationComment>,
+    filePath: string
+  ): Promise<ASTNode | null> {
+    const line = lines[startIndex];
+    const match = line.match(/^\s*(class|struct)\s+(\w+)/);
+    
+    if (!match) return null;
+    
+    const [, type, name] = match;
+    
+    // Find preceding comment
+    let comment = comments.get(startIndex);
+    if (!comment) {
+      comment = comments.get(startIndex - 1);
+    }
+    
+    return {
+      type: type as 'class' | 'struct',
+      name,
+      documentation: comment || this.emptyDocumentation(),
+      location: { filePath, line: startIndex + 1, column: 0 },
+      children: [],
+      isStruct: type === 'struct'
+    };
+  }
+
+  private async parseFunction(
+    lines: string[], 
+    startIndex: number,
+    comments: Map<number, DocumentationComment>,
+    filePath: string
+  ): Promise<ASTNode | null> {
+    const line = lines[startIndex];
+    const match = line.match(/^\s*(?:\w+\s+)*(\w+)\s*\([^)]*\)\s*(?:\{|;)/);
+    
+    if (!match) return null;
+    
+    const [, name] = match;
+    
+    // Find preceding comment
+    let comment = comments.get(startIndex);
+    if (!comment) {
+      comment = comments.get(startIndex - 1);
+    }
+    
+    return {
+      type: 'function',
+      name,
+      documentation: comment || this.emptyDocumentation(),
+      location: { filePath, line: startIndex + 1, column: 0 },
+      children: [],
+      signature: line.trim()
+    };
+  }
+
+  private emptyDocumentation(): DocumentationComment {
+    return {
+      type: 'unknown',
+      rawContent: '',
+      brief: '',
+      detailed: '',
+      tags: [],
+      location: { filePath: '', line: 0, column: 0 }
+    };
+  }
+}
+```
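+
+To see the comment-cleaning and tag-extraction behavior in isolation, the two regex steps from `parseCommentContent` can be exercised standalone (the sample comment is illustrative):
+
+```typescript
+// Standalone replica of the cleaning + @tag extraction in parseCommentContent.
+function cleanComment(raw: string): string {
+  return raw
+    .replace(/\/\*\*|\/\*|\*\/|\/\/\/|\/\/!/g, '')
+    .split('\n')
+    .map(line => line.trim().replace(/^\*\s?/, ''))
+    .join('\n')
+    .trim();
+}
+
+function extractTags(cleaned: string): { name: string; value: string }[] {
+  const tagRegex = /@(\w+)(?:\s+(.+?))?(?=\s+@|$)/gs;
+  const tags: { name: string; value: string }[] = [];
+  let m: RegExpExecArray | null;
+  while ((m = tagRegex.exec(cleaned)) !== null) {
+    tags.push({ name: m[1], value: m[2]?.trim() ?? '' });
+  }
+  return tags;
+}
+
+const raw = [
+  '/**',
+  ' * Computes the sum of two integers.',
+  ' * @param a first operand',
+  ' * @param b second operand',
+  ' * @return the sum',
+  ' */',
+].join('\n');
+
+const tags = extractTags(cleanComment(raw));
+// tags: [{param, 'a first operand'}, {param, 'b second operand'}, {return, 'the sum'}]
+```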
+
+### CLI Integration (src/cli/parser-commands.ts)
+
+```typescript
+#!/usr/bin/env node
+
+import { Command } from 'commander';
+import { ParseService } from '../services/parseService';
+import { existsSync } from 'fs';
+import { resolve } from 'path';
+
+const parseService = new ParseService();
+
+export const parserCommands = (program: Command) => {
+  program
+    .command('parse')
+    .description('Generate documentation from source code comments')
+    .requiredOption('-i, --input <path>', 'Input directory containing source files')
+    .requiredOption('-o, --output <path>', 'Output directory for generated documentation')
+    .option('-l, --languages <languages>', 'Comma-separated list of languages to parse', 'cpp')
+    .option('-c, --config <config>', 'Configuration file path')
+    .option('-w, --watch', 'Watch for file changes and regenerate', false)
+    .option('--incremental', 'Only process changed files', false)
+    .option('--include-private', 'Include private members', false)
+    .option('--dry-run', 'Show what would be parsed without generating files', false)
+    .action(async (options) => {
+      if (!existsSync(options.input)) {
+        console.error(`Error: Input directory '${options.input}' does not exist`);
+        process.exit(1);
+      }
+
+      try {
+        const result = await parseService.generateDocumentation({
+          inputPath: resolve(options.input),
+          outputPath: resolve(options.output),
+          languages: options.languages.split(',').map((l: string) => l.trim()),
+          configPath: options.config ? resolve(options.config) : undefined,
+          watch: options.watch,
+          incremental: options.incremental,
+          includePrivate: options.includePrivate,
+          dryRun: options.dryRun
+        });
+
+        if (options.dryRun) {
+          console.log('Dry run results:');
+          console.log(`  Files to process: ${result.filesToProcess.length}`);
+          console.log(`  Estimated output files: ${result.estimatedFiles}`);
+          result.filesToProcess.forEach(file => console.log(`    - ${file}`));
+        } else {
+          console.log(`Documentation generated successfully!`);
+          console.log(`  Processed ${result.processedFiles} files`);
+          console.log(`  Generated ${result.generatedFiles} documentation files`);
+          console.log(`  Output directory: ${options.output}`);
+          
+          if (result.errors.length > 0) {
+            console.log(`  Warnings/Errors: ${result.errors.length}`);
+            result.errors.forEach(error => console.log(`    - ${error}`));
+          }
+        }
+      } catch (error) {
+        console.error('Error:', error);
+        process.exit(1);
+      }
+    });
+
+  program
+    .command('parse-list-languages')
+    .description('List all supported parser languages')
+    .action(() => {
+      const languages = parseService.getSupportedLanguages();
+      console.log('Supported languages:');
+      languages.forEach(lang => {
+        console.log(`  ${lang.name}: ${lang.description}`);
+        console.log(`    Extensions: ${lang.extensions.join(', ')}`);
+        console.log();
+      });
+    });
+
+  program
+    .command('parse-validate')
+    .description('Validate configuration and source files')
+    .requiredOption('-i, --input <path>', 'Input directory to validate')
+    .option('-c, --config <config>', 'Configuration file to validate')
+    .action(async (options) => {
+      if (!existsSync(options.input)) {
+        console.error(`Error: Input directory '${options.input}' does not exist`);
+        process.exit(1);
+      }
+
+      try {
+        const validation = await parseService.validateConfiguration({
+          inputPath: resolve(options.input),
+          configPath: options.config ? resolve(options.config) : undefined
+        });
+
+        console.log('Validation results:');
+        console.log(`  Input directory: ${validation.inputValid ? '✓' : '✗'}`);
+        console.log(`  Configuration: ${validation.configValid ? '✓' : '✗'}`);
+        console.log(`  Supported files: ${validation.supportedFiles.length}`);
+        
+        if (validation.unsupportedFiles.length > 0) {
+          console.log(`  Unsupported files: ${validation.unsupportedFiles.length}`);
+          validation.unsupportedFiles.forEach(file => console.log(`    - ${file}`));
+        }
+        
+        if (validation.warnings.length > 0) {
+          console.log(`  Warnings: ${validation.warnings.length}`);
+          validation.warnings.forEach(warning => console.log(`    - ${warning}`));
+        }
+      } catch (error) {
+        console.error('Error:', error);
+        process.exit(1);
+      }
+    });
+};
+```
+
+### Parse Service (src/services/parseService.ts)
+
+```typescript
+import { 
+  ILanguageParser, 
+  DocumentationConfig, 
+  ParserConfig, 
+  ASTNode 
+} from '../lib/parser/core/interfaces';
+import { CppParser } from '../lib/parser/parsers/cpp-parser';
+import { DocumentationGenerator } from '../lib/parser/core/documentation-generator';
+import { FileUtils } from '../lib/parser/utils/file-utils';
+import { ConfigLoader } from '../lib/parser/core/config';
+import { watch } from 'chokidar'; // for --watch mode (not exercised in this excerpt)
+import { resolve, basename, extname } from 'path';
+
+interface ParseOptions {
+  inputPath: string;
+  outputPath: string;
+  languages: string[];
+  configPath?: string;
+  watch?: boolean;
+  incremental?: boolean;
+  includePrivate?: boolean;
+  dryRun?: boolean;
+}
+
+interface ParseResult {
+  processedFiles: number;
+  generatedFiles: number;
+  filesToProcess: string[];
+  errors: string[];
+  estimatedFiles: number;
+}
+
+interface ValidationResult {
+  inputValid: boolean;
+  configValid: boolean;
+  supportedFiles: string[];
+  unsupportedFiles: string[];
+  warnings: string[];
+}
+
+interface LanguageInfo {
+  name: string;
+  description: string;
+  extensions: string[];
+}
+
+export class ParseService {
+  private parsers: Map<string, ILanguageParser> = new Map();
+  private configLoader: ConfigLoader;
+
+  constructor() {
+    this.configLoader = new ConfigLoader();
+    this.registerDefaultParsers();
+  }
+
+  private registerDefaultParsers(): void {
+    this.parsers.set('cpp', new CppParser());
+    // Future: this.parsers.set('python', new PythonParser());
+    // Future: this.parsers.set('java', new JavaParser());
+  }
+
+  getSupportedLanguages(): LanguageInfo[] {
+    return Array.from(this.parsers.entries()).map(([key, parser]) => ({
+      name: key,
+      description: `${key.toUpperCase()} source code parser`,
+      extensions: parser.getFileExtensions()
+    }));
+  }
+
+  async generateDocumentation(options: ParseOptions): Promise<ParseResult> {
+    // Load configuration
+    const config = await this.loadConfiguration(options);
+    
+    // Find source files
+    const sourceFiles = await this.findSourceFiles(options.inputPath, options.languages);
+    
+    if (options.dryRun) {
+      return {
+        processedFiles: 0,
+        generatedFiles: 0,
+        filesToProcess: sourceFiles,
+        errors: [],
+        estimatedFiles: this.estimateOutputFiles(sourceFiles)
+      };
+    }
+
+    // Parse files
+    const nodes: ASTNode[] = [];
+    const errors: string[] = [];
+
+    for (const filePath of sourceFiles) {
+      try {
+        const fileNodes = await this.parseFile(filePath);
+        nodes.push(...fileNodes);
+      } catch (error) {
+        errors.push(`Failed to parse ${filePath}: ${error}`);
+      }
+    }
+
+    // Generate documentation
+    const generator = new DocumentationGenerator(config);
+    await generator.generate(nodes);
+
+    const generatedFiles = await this.countGeneratedFiles(config.outputDirectory);
+
+    return {
+      processedFiles: sourceFiles.length,
+      generatedFiles,
+      filesToProcess: sourceFiles,
+      errors,
+      estimatedFiles: generatedFiles
+    };
+  }
+
+  async validateConfiguration(options: {
+    inputPath: string;
+    configPath?: string;
+  }): Promise<ValidationResult> {
+    const inputValid = await FileUtils.exists(options.inputPath);
+    
+    let configValid = true;
+    let warnings: string[] = [];
+
+    try {
+      if (options.configPath) {
+        await this.configLoader.loadConfig(options.configPath);
+      }
+    } catch (error) {
+      configValid = false;
+      warnings.push(`Configuration error: ${error}`);
+    }
+
+    const allFiles = await FileUtils.getAllFiles(options.inputPath);
+    const supportedFiles = allFiles.filter(file => {
+      const ext = extname(file);
+      return Array.from(this.parsers.values()).some(parser => 
+        parser.getFileExtensions().includes(ext)
+      );
+    });
+
+    const unsupportedFiles = allFiles.filter(file => !supportedFiles.includes(file));
+
+    return {
+      inputValid,
+      configValid,
+      supportedFiles,
+      unsupportedFiles,
+      warnings
+    };
+  }
+
+  private async loadConfiguration(options: ParseOptions): Promise<DocumentationConfig> {
+    let config: DocumentationConfig;
+
+    if (options.configPath) {
+      config = await this.configLoader.loadConfig(options.configPath);
+    } else {
+      config = this.createDefaultConfig(options);
+    }
+
+    return config;
+  }
+
+  private createDefaultConfig(options: ParseOptions): DocumentationConfig {
+    return {
+      outputDirectory: options.outputPath,
+      indexTitle: 'Source Code Documentation',
+      generatorName: 'docs-rag-parser',
+      generateIndex: true,
+      generateModuleIndexes: true,
+      includePrivate: options.includePrivate || false,
+      includeSourceLinks: true,
+      sourceRootPath: options.inputPath,
+      theme: 'material'
+    };
+  }
+
+  private async findSourceFiles(inputPath: string, languages: string[]): Promise<string[]> {
+    const allFiles = await FileUtils.getAllFiles(inputPath);
+    
+    return allFiles.filter(file => {
+      const parser = this.getParserForFile(file);
+      return parser && languages.includes(parser.getLanguage());
+    });
+  }
+
+  private getParserForFile(filePath: string): ILanguageParser | null {
+    const ext = filePath.substring(filePath.lastIndexOf('.'));
+    
+    for (const parser of this.parsers.values()) {
+      if (parser.getFileExtensions().includes(ext)) {
+        return parser;
+      }
+    }
+    
+    return null;
+  }
+
+  private async parseFile(filePath: string): Promise<ASTNode[]> {
+    const parser = this.getParserForFile(filePath);
+    if (!parser) {
+      throw new Error(`No parser found for file: ${filePath}`);
+    }
+
+    return await parser.parseFile(filePath);
+  }
+
+  private estimateOutputFiles(sourceFiles: string[]): number {
+    // Rough estimate: one module index per unique base name plus ~2 items per file
+    const uniqueNames = new Set(sourceFiles.map(file => 
+      basename(file).replace(/\.(cpp|h|hpp|cxx|cc|c)$/, '')
+    ));
+    return uniqueNames.size + sourceFiles.length * 2;
+  }
+
+  private async countGeneratedFiles(outputDirectory: string): Promise<number> {
+    try {
+      const files = await FileUtils.getAllFiles(outputDirectory);
+      return files.filter(file => file.endsWith('.md')).length;
+    } catch {
+      return 0;
+    }
+  }
+}
+```
+
+### Update Main CLI (src/cli/index.ts)
+
+```typescript
+#!/usr/bin/env node
+
+import { Command } from 'commander';
+import { DocumentService } from '../services/documentService';
+import { existsSync } from 'fs';
+import { parserCommands } from './parser-commands';
+
+const program = new Command();
+const documentService = new DocumentService();
+
+program
+  .name('docs-rag')
+  .description('CLI tool for managing markdown documents in Qdrant and parsing source code')
+  .version('1.0.0');
+
+// Existing commands...
+// ... (keep all existing commands)
+
+// Add parser commands
+parserCommands(program);
+
+program.parse();
+```
+
+## Success Criteria
+
+1. **Functional Requirements**
+   - Parse C++ files with 95% comment extraction accuracy
+   - Generate documentation matching `/data/tui/docs` format
+   - Process projects with 1000+ files efficiently
+   - Support incremental rebuilds
+
+2. **Integration Requirements**
+   - Seamlessly integrate into existing CLI structure
+   - Follow existing TypeScript patterns and conventions
+   - Use existing dependencies where possible
+   - Maintain compatibility with existing commands
+
+3. **Quality Requirements**
+   - Produce syntactically correct Markdown
+   - Generate cross-references between modules
+   - Handle complex C++ constructs (templates, namespaces)
+   - Maintain source code privacy (no code leakage in docs)
+
+## Future Enhancements
+
+### Short Term (Next 3 months)
+- Python language parser
+- Integration with DocumentService for RAG
+- HTML documentation output
+- Enhanced watch mode with better performance
+
+### Long Term (6-12 months)
+- Multi-language project support
+- Interactive documentation web UI
+- AI-powered comment generation suggestions
+- Integration with CI/CD pipelines
+- Documentation quality scoring
+
+This updated plan fully integrates the source code parser into the existing TypeScript project while following the established patterns and conventions.

+ 607 - 0
quick-start-implementation.md

@@ -0,0 +1,607 @@
+# Quick Start Implementation Guide
+
+## Project Setup Commands
+
+```bash
+# Create project structure
+mkdir docs-parser && cd docs-parser
+mkdir -p src/{core,parsers,ast,utils} include tests examples config docs
+
+# Initialize CMake project
+cat > CMakeLists.txt << 'EOF'
+cmake_minimum_required(VERSION 3.16)
+project(DocsParser VERSION 1.0.0 LANGUAGES CXX)
+
+set(CMAKE_CXX_STANDARD 20)
+set(CMAKE_CXX_STANDARD_REQUIRED ON)
+
+# Include directories
+include_directories(include)
+
+# Find required packages
+find_package(GTest REQUIRED)
+
+# Add subdirectories
+add_subdirectory(src)
+add_subdirectory(tests)
+
+enable_testing()
+EOF
+
+# Create src CMakeLists.txt
+cat > src/CMakeLists.txt << 'EOF'
+# Core library sources
+set(CORE_SOURCES
+    core/parser_interface.cpp
+    core/documentation_generator.cpp
+    core/comment_parser.cpp
+    core/config.cpp
+)
+
+# AST sources
+set(AST_SOURCES
+    ast/ast_nodes.cpp
+    ast/visitor.cpp
+)
+
+# Parser sources
+set(PARSER_SOURCES
+    parsers/cpp_parser.cpp
+)
+
+# Utility sources
+set(UTIL_SOURCES
+    utils/file_utils.cpp
+    utils/string_utils.cpp
+    utils/markdown_utils.cpp
+)
+
+# Create static library
+add_library(docs_parser_lib
+    ${CORE_SOURCES}
+    ${AST_SOURCES}
+    ${PARSER_SOURCES}
+    ${UTIL_SOURCES}
+)
+
+target_include_directories(docs_parser_lib PUBLIC include)
+
+# GTest is linked only by the tests target (see tests/CMakeLists.txt);
+# the core library itself needs no external link dependencies
+
+# Create main executable
+add_executable(docs_parser main.cpp)
+target_link_libraries(docs_parser docs_parser_lib)
+EOF
+
+# Create tests CMakeLists.txt
+cat > tests/CMakeLists.txt << 'EOF'
+# Test sources
+file(GLOB_RECURSE TEST_SOURCES "*.cpp")
+
+# Create test executable
+add_executable(run_tests ${TEST_SOURCES})
+target_link_libraries(run_tests docs_parser_lib GTest::gtest_main GTest::gtest)
+
+# Register tests
+add_test(NAME AllTests COMMAND run_tests)
+EOF
+```
+
+## Core Implementation Files
+
+### 1. Core Interface (`include/parser_interface.h`)
+
+```cpp
+#pragma once
+
+#include <memory>
+#include <vector>
+#include <string>
+
+struct SourceLocation {
+    std::string filePath;
+    int line = 0;
+    int column = 0;
+};
+
+class ASTNode {
+public:
+    enum class NodeType { CLASS, FUNCTION, NAMESPACE, UNKNOWN };
+    
+    virtual ~ASTNode() = default;
+    virtual NodeType getType() const = 0;
+    virtual std::string getName() const = 0;
+    virtual std::string getDocumentation() const = 0;
+    virtual SourceLocation getLocation() const = 0;
+};
+
+class ILanguageParser {
+public:
+    virtual ~ILanguageParser() = default;
+    virtual std::vector<std::unique_ptr<ASTNode>> parseFile(const std::string& filePath) = 0;
+    virtual std::string getLanguage() const = 0;
+    virtual std::vector<std::string> getFileExtensions() const = 0;
+    virtual bool canParse(const std::string& filePath) const = 0;
+};
+```
+
+### 2. Simple C++ Parser (`src/parsers/cpp_parser.cpp`)
+
+```cpp
+#include "parser_interface.h"
+#include <regex>
+#include <fstream>
+#include <sstream>
+
+class SimpleClassNode : public ASTNode {
+private:
+    std::string name;
+    std::string documentation;
+    SourceLocation location;
+    std::vector<std::unique_ptr<ASTNode>> children;
+    
+public:
+    SimpleClassNode(const std::string& n, const std::string& doc, SourceLocation loc)
+        : name(n), documentation(doc), location(loc) {}
+    
+    NodeType getType() const override { return NodeType::CLASS; }
+    std::string getName() const override { return name; }
+    std::string getDocumentation() const override { return documentation; }
+    SourceLocation getLocation() const override { return location; }
+    
+    void addChild(std::unique_ptr<ASTNode> child) {
+        children.push_back(std::move(child));
+    }
+};
+
+class SimpleFunctionNode : public ASTNode {
+private:
+    std::string name;
+    std::string documentation;
+    std::string signature;
+    SourceLocation location;
+    
+public:
+    SimpleFunctionNode(const std::string& n, const std::string& doc, 
+                      const std::string& sig, SourceLocation loc)
+        : name(n), documentation(doc), signature(sig), location(loc) {}
+    
+    NodeType getType() const override { return NodeType::FUNCTION; }
+    std::string getName() const override { return name; }
+    std::string getDocumentation() const override { return documentation; }
+    SourceLocation getLocation() const override { return location; }
+    
+    const std::string& getSignature() const { return signature; }
+};
+
+class CppParser : public ILanguageParser {
+public:
+    std::vector<std::unique_ptr<ASTNode>> parseFile(const std::string& filePath) override {
+        std::vector<std::unique_ptr<ASTNode>> nodes;
+        
+        std::ifstream file(filePath);
+        if (!file.is_open()) return nodes;
+        
+        std::string content((std::istreambuf_iterator<char>(file)),
+                           std::istreambuf_iterator<char>());
+        
+        // Extract comments first
+        std::map<int, std::string> comments;
+        extractComments(content, comments);
+        
+        // Simple regex parsing for classes and functions
+        parseClasses(content, comments, nodes, filePath);
+        parseFunctions(content, comments, nodes, filePath);
+        
+        return nodes;
+    }
+    
+    std::string getLanguage() const override { return "cpp"; }
+    
+    std::vector<std::string> getFileExtensions() const override {
+        return {".cpp", ".h", ".hpp", ".cxx", ".cc", ".c"};
+    }
+    
+    bool canParse(const std::string& filePath) const override {
+        std::string ext = filePath.substr(filePath.find_last_of('.'));
+        auto extensions = getFileExtensions();
+        return std::find(extensions.begin(), extensions.end(), ext) != extensions.end();
+    }
+    
+private:
+    void extractComments(const std::string& content, std::map<int, std::string>& comments) {
+        // std::regex has no dotall mode, so multi-line /** ... */ blocks are
+        // collected line by line instead of with a single whole-file regex.
+        std::istringstream stream(content);
+        std::string line;
+        int lineNum = 0;
+        bool inBlock = false;
+        std::string blockText;
+        
+        while (std::getline(stream, line)) {
+            lineNum++;
+            std::string trimmed = std::regex_replace(line, std::regex(R"(^\s+)"), "");
+            
+            if (inBlock) {
+                blockText += trimmed + '\n';
+                if (trimmed.find("*/") != std::string::npos) {
+                    // Key by the closing line so a declaration on the next
+                    // line can find the comment via comments.find(line - 1)
+                    comments[lineNum] = cleanComment(blockText);
+                    inBlock = false;
+                }
+            } else if (trimmed.rfind("/**", 0) == 0) {
+                blockText = trimmed + '\n';
+                if (trimmed.find("*/", 3) != std::string::npos) {
+                    comments[lineNum] = cleanComment(blockText);  // one-line /** ... */
+                } else {
+                    inBlock = true;
+                }
+            } else if (trimmed.rfind("///", 0) == 0 || trimmed.rfind("//!", 0) == 0) {
+                comments[lineNum] = cleanComment(trimmed);
+            }
+        }
+    }
+    
+    std::string cleanComment(const std::string& raw) {
+        // Strip comment markers, then leading '*' continuation characters
+        std::string clean = std::regex_replace(raw, std::regex(R"(/\*\*|\*/|///|//!)"), "");
+        clean = std::regex_replace(clean, std::regex(R"(^\s*\*?\s*)", std::regex::multiline), "");
+        return clean;
+    }
+    
+    void parseClasses(const std::string& content, const std::map<int, std::string>& comments,
+                     std::vector<std::unique_ptr<ASTNode>>& nodes, const std::string& filePath) {
+        std::regex classRegex(R"(\bclass\s+(\w+)[^{]*\{)");
+        std::sregex_iterator iter(content.begin(), content.end(), classRegex);
+        std::sregex_iterator end;
+        
+        for (auto it = iter; it != end; ++it) {
+            std::smatch match = *it;
+            std::string className = match[1].str();
+            
+            // Find line number
+            std::string before = content.substr(0, match.position());
+            int line = std::count(before.begin(), before.end(), '\n') + 1;
+            
+            // Find preceding comment
+            std::string doc;
+            auto commentIt = comments.find(line - 1);
+            if (commentIt != comments.end()) {
+                doc = commentIt->second;
+            }
+            
+            SourceLocation loc{filePath, line};
+            nodes.push_back(std::make_unique<SimpleClassNode>(className, doc, loc));
+        }
+    }
+    
+    void parseFunctions(const std::string& content, const std::map<int, std::string>& comments,
+                       std::vector<std::unique_ptr<ASTNode>>& nodes, const std::string& filePath) {
+        std::regex functionRegex(R"((\w+\s+)*(\w+)\s*\([^)]*\)\s*(?:\{|;))");
+        std::sregex_iterator iter(content.begin(), content.end(), functionRegex);
+        std::sregex_iterator end;
+        
+        for (auto it = iter; it != end; ++it) {
+            std::smatch match = *it;
+            std::string functionName = match[2].str();
+            std::string signature = match.str();
+            
+            // Skip class definitions and control-flow statements that the
+            // regex also matches (e.g. "if (...)", "while (...)")
+            if (signature.find("class ") == 0) continue;
+            if (functionName == "if" || functionName == "for" || functionName == "while" ||
+                functionName == "switch" || functionName == "return" || functionName == "catch") continue;
+            
+            // Find line number
+            std::string before = content.substr(0, match.position());
+            int line = std::count(before.begin(), before.end(), '\n') + 1;
+            
+            // Find preceding comment
+            std::string doc;
+            auto commentIt = comments.find(line - 1);
+            if (commentIt != comments.end()) {
+                doc = commentIt->second;
+            }
+            
+            SourceLocation loc{filePath, line};
+            nodes.push_back(std::make_unique<SimpleFunctionNode>(functionName, doc, signature, loc));
+        }
+    }
+};
+```
+
+### 3. Documentation Generator (`src/core/documentation_generator.cpp`)
+
+```cpp
+#include "parser_interface.h"
+#include <fstream>
+#include <filesystem>
+#include <iostream>
+#include <regex>
+#include <sstream>
+
+struct DocumentationConfig {
+    std::string outputDirectory = "docs";
+    std::string indexTitle = "Documentation";
+    std::string generatorName = "docs-parser";
+};
+
+class DocumentationGenerator {
+private:
+    DocumentationConfig config;
+    std::string currentContent;
+    
+public:
+    explicit DocumentationGenerator(const DocumentationConfig& cfg) : config(cfg) {}
+    
+    void generate(const std::vector<std::unique_ptr<ASTNode>>& nodes) {
+        // Create output directory
+        std::filesystem::create_directories(config.outputDirectory);
+        
+        // Generate main index
+        generateIndex(nodes);
+        
+        // Generate individual files for classes
+        generateClassFiles(nodes);
+        
+        // Generate module organization
+        generateModuleStructure(nodes);
+    }
+    
+private:
+    void generateIndex(const std::vector<std::unique_ptr<ASTNode>>& nodes) {
+        std::string content = R"(---
+generator: )" + config.generatorName + R"(
+---
+
+# )" + config.indexTitle + R"(
+
+)";
+        
+        content += generateModuleList(nodes);
+        
+        std::ofstream indexFile(config.outputDirectory + "/index.md");
+        indexFile << content;
+        indexFile.close();
+    }
+    
+    std::string generateModuleList(const std::vector<std::unique_ptr<ASTNode>>& nodes) {
+        std::stringstream ss;
+        
+        // Group classes and functions
+        std::vector<ASTNode*> classes;
+        std::vector<ASTNode*> functions;
+        
+        for (const auto& node : nodes) {
+            if (node->getType() == ASTNode::NodeType::CLASS) {
+                classes.push_back(node.get());
+            } else if (node->getType() == ASTNode::NodeType::FUNCTION) {
+                functions.push_back(node.get());
+            }
+        }
+        
+        if (!classes.empty()) {
+            ss << "## Classes\n\n";
+            for (const auto* cls : classes) {
+                ss << "- [" << cls->getName() << "](" << cls->getName() << ".md)";
+                if (!cls->getDocumentation().empty()) {
+                    ss << " - " << extractBrief(cls->getDocumentation());
+                }
+                ss << "\n";
+            }
+            ss << "\n";
+        }
+        
+        if (!functions.empty()) {
+            ss << "## Functions\n\n";
+            for (const auto* func : functions) {
+                ss << "- [" << func->getName() << "](" << func->getName() << ".md)";
+                if (!func->getDocumentation().empty()) {
+                    ss << " - " << extractBrief(func->getDocumentation());
+                }
+                ss << "\n";
+            }
+            ss << "\n";
+        }
+        
+        return ss.str();
+    }
+    
+    void generateClassFiles(const std::vector<std::unique_ptr<ASTNode>>& nodes) {
+        for (const auto& node : nodes) {
+            if (node->getType() == ASTNode::NodeType::CLASS) {
+                generateClassFile(static_cast<SimpleClassNode*>(node.get()));
+            } else if (node->getType() == ASTNode::NodeType::FUNCTION) {
+                generateFunctionFile(static_cast<SimpleFunctionNode*>(node.get()));
+            }
+        }
+    }
+    
+    void generateClassFile(const SimpleClassNode* classNode) {
+        std::string filename = config.outputDirectory + "/" + classNode->getName() + ".md";
+        std::ofstream file(filename);
+        
+        file << R"(---
+generator: )" << config.generatorName << R"(
+---
+
+# )" << classNode->getName() << R"(
+
+**class )" << classNode->getName() << R"(**
+
+)";
+        
+        if (!classNode->getDocumentation().empty()) {
+            file << formatDocumentation(classNode->getDocumentation()) << "\n\n";
+        }
+        
+        file << "## Source Location\n\n";
+        file << "File: `" << classNode->getLocation().filePath;
+        file << "`:" << classNode->getLocation().line << "\n\n";
+        
+        file.close();
+    }
+    
+    void generateFunctionFile(const SimpleFunctionNode* funcNode) {
+        std::string filename = config.outputDirectory + "/" + funcNode->getName() + ".md";
+        std::ofstream file(filename);
+        
+        file << R"(---
+generator: )" << config.generatorName << R"(
+---
+
+# )" << funcNode->getName() << R"(
+
+**function )" << funcNode->getName() << R"(**
+
+)";
+        
+        if (!funcNode->getDocumentation().empty()) {
+            file << formatDocumentation(funcNode->getDocumentation()) << "\n\n";
+        }
+        
+        file << "## Signature\n\n";
+        file << "```cpp\n" << funcNode->getSignature() << "\n```\n\n";
+        
+        file << "## Source Location\n\n";
+        file << "File: `" << funcNode->getLocation().filePath;
+        file << "`:" << funcNode->getLocation().line << "\n\n";
+        
+        file.close();
+    }
+    
+    void generateModuleStructure(const std::vector<std::unique_ptr<ASTNode>>& nodes) {
+        // For now, just put everything in the root directory
+        // Later we'll organize this into subdirectories based on namespaces
+    }
+    
+    std::string extractBrief(const std::string& doc) {
+        // Extract first sentence as brief description
+        size_t pos = doc.find('.');
+        if (pos != std::string::npos && pos < 100) {
+            return doc.substr(0, pos + 1);
+        }
+        return doc.length() > 80 ? doc.substr(0, 77) + "..." : doc;
+    }
+    
+    std::string formatDocumentation(const std::string& doc) {
+        std::string formatted = doc;
+        
+        // Replace @brief and similar tags
+        formatted = std::regex_replace(formatted, std::regex(R"(@brief\s+)"), "");
+        
+        // Format paragraphs
+        formatted = std::regex_replace(formatted, std::regex(R"(\n\s*\n)"), "\n\n");
+        
+        return formatted;
+    }
+};
+```
+
+### 4. Main Application (`main.cpp`)
+
+```cpp
+#include "parser_interface.h"
+#include <iostream>
+#include <filesystem>
+#include <vector>
+
+// Forward declarations
+std::unique_ptr<ILanguageParser> createCppParser();
+
+int main(int argc, char* argv[]) {
+    if (argc < 2) {
+        std::cerr << "Usage: docs-parser <input-path> [output-path]" << std::endl;
+        return 1;
+    }
+    
+    std::string inputPath = argv[1];
+    std::string outputPath = argc > 2 ? argv[2] : "docs";
+    
+    if (!std::filesystem::exists(inputPath)) {
+        std::cerr << "Error: input path '" << inputPath << "' does not exist" << std::endl;
+        return 1;
+    }
+    
+    // Configuration
+    DocumentationConfig config;
+    config.outputDirectory = outputPath;
+    config.indexTitle = "Source Code Documentation";
+    config.generatorName = "docs-parser";
+    
+    // Create parser
+    auto parser = createCppParser();
+    
+    // Find source files
+    std::vector<std::string> sourceFiles;
+    for (const auto& entry : std::filesystem::recursive_directory_iterator(inputPath)) {
+        if (entry.is_regular_file() && parser->canParse(entry.path().string())) {
+            sourceFiles.push_back(entry.path().string());
+        }
+    }
+    
+    std::cout << "Found " << sourceFiles.size() << " source files" << std::endl;
+    
+    // Parse all files
+    std::vector<std::unique_ptr<ASTNode>> allNodes;
+    for (const auto& file : sourceFiles) {
+        std::cout << "Parsing: " << file << std::endl;
+        auto nodes = parser->parseFile(file);
+        for (auto& node : nodes) {
+            allNodes.push_back(std::move(node));
+        }
+    }
+    
+    std::cout << "Found " << allNodes.size() << " code elements" << std::endl;
+    
+    // Generate documentation
+    DocumentationGenerator generator(config);
+    generator.generate(allNodes);
+    
+    std::cout << "Documentation generated in: " << outputPath << std::endl;
+    return 0;
+}
+
+// Factory function
+std::unique_ptr<ILanguageParser> createCppParser() {
+    return std::make_unique<CppParser>();
+}
+```
+
+## Build and Run Instructions
+
+```bash
+# Build the project
+mkdir build && cd build
+cmake ..
+make
+
+# Run on example C++ code
+./docs_parser /path/to/cpp/source ./output/docs
+
+# Test on the TUI project
+./docs_parser /data/tui/src ./tui-docs
+```
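+
+The commands above assume a top-level `CMakeLists.txt`, which is not shown in this section. A minimal sketch might look like the following; the source file names and paths are assumptions based on the headings above and should be adjusted to the actual layout:
+
+```cmake
+cmake_minimum_required(VERSION 3.16)
+project(docs_parser CXX)
+
+# std::filesystem requires C++17
+set(CMAKE_CXX_STANDARD 17)
+set(CMAKE_CXX_STANDARD_REQUIRED ON)
+
+# File names are illustrative, matching the section headings above
+add_executable(docs_parser
+    main.cpp
+    src/parsers/cpp_parser.cpp
+    src/core/documentation_generator.cpp
+)
+```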
+
+## Test Example File
+
+Create `test_example.cpp`:
+
+```cpp
+#include <iostream>
+
+/**
+ * @brief A simple calculator class
+ * This class provides basic arithmetic operations.
+ */
+class Calculator {
+public:
+    /// @brief Add two numbers
+    int add(int a, int b);
+    
+    /**
+     * @brief Multiply two numbers
+     * @param a First number
+     * @param b Second number
+     * @return Product of a and b
+     */
+    int multiply(int a, int b);
+};
+
+/**
+ * @brief Global utility function
+ * This function demonstrates global documentation.
+ */
+void utilityFunction() {
+    std::cout << "Utility function called" << std::endl;
+}
+```
+
+Run the parser:
+```bash
+./docs_parser . ./test-docs
+```
+
+Expected output structure:
+```
+test-docs/
+├── index.md
+├── Calculator.md
+└── utilityFunction.md
+```
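+
+For reference, the `Calculator.md` produced by `generateClassFile` from this example would look roughly like this (the exact line number depends on the input file):
+
+```markdown
+---
+generator: docs-parser
+---
+
+# Calculator
+
+**class Calculator**
+
+A simple calculator class
+This class provides basic arithmetic operations.
+
+## Source Location
+
+File: `./test_example.cpp`:7
+```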
+
+This minimal implementation provides a working foundation that can be extended with more sophisticated parsing, better documentation extraction, and enhanced formatting to match the exact style of the `/data/tui/docs` structure.

+ 141 - 0
src/cli/index.ts

@@ -0,0 +1,141 @@
+#!/usr/bin/env node
+
+import { Command } from 'commander';
+import { DocumentService } from '../services/documentService';
+import { parserCommands } from './parser-commands';
+import { existsSync } from 'fs';
+
+const program = new Command();
+const documentService = new DocumentService();
+
+program
+  .name('docs-rag')
+  .description('CLI tool for managing markdown documents in Qdrant and parsing source code')
+  .version('1.0.0');
+
+program
+  .command('add')
+  .description('Add a document collection from markdown files')
+  .requiredOption('-n, --name <name>', 'Document name for the collection')
+  .requiredOption('-f, --folder <folder>', 'Folder containing markdown files')
+  .option('-r, --recursive', 'Scan folders recursively', true)
+  .action(async (options) => {
+    if (!existsSync(options.folder)) {
+      console.error(`Error: Folder '${options.folder}' does not exist`);
+      process.exit(1);
+    }
+
+    try {
+      await documentService.addDocument({
+        name: options.name,
+        folder: options.folder,
+        recursive: options.recursive,
+      });
+    } catch (error) {
+      console.error('Error:', error);
+      process.exit(1);
+    }
+  });
+
+program
+  .command('search')
+  .description('Search within a document collection')
+  .requiredOption('-d, --document <document>', 'Document name to search in')
+  .requiredOption('-q, --query <query>', 'Search query')
+  .option('-l, --limit <limit>', 'Maximum number of results', '10')
+  .action(async (options) => {
+    try {
+      const results = await documentService.searchDocuments(
+        options.document,
+        options.query,
+        parseInt(options.limit)
+      );
+      
+      console.log(`Found ${results.length} results:`);
+      console.log('');
+      
+      results.forEach((result, index) => {
+        console.log(`${index + 1}. Score: ${result.score.toFixed(4)}`);
+        console.log(`   File: ${result.payload.filename}`);
+        console.log(`   Paragraph ${result.payload.paragraphIndex + 1}:`);
+        console.log(`   ${result.payload.text.substring(0, 200)}${result.payload.text.length > 200 ? '...' : ''}`);
+        console.log('');
+      });
+    } catch (error) {
+      console.error('Error:', error);
+      process.exit(1);
+    }
+  });
+
+program
+  .command('list')
+  .description('List all document collections')
+  .action(async () => {
+    try {
+      const collections = await documentService.listCollections();
+      
+      if (collections.length === 0) {
+        console.log('No document collections found');
+      } else {
+        console.log('Available document collections:');
+        collections.forEach(collection => console.log(`  - ${collection}`));
+      }
+    } catch (error) {
+      console.error('Error:', error);
+      process.exit(1);
+    }
+  });
+
+program
+  .command('info')
+  .description('Get information about a document collection')
+  .requiredOption('-d, --document <document>', 'Document name')
+  .action(async (options) => {
+    try {
+      const info = await documentService.getDocumentInfo(options.document);
+      console.log(`Collection: ${options.document}`);
+      if (info.points_count !== undefined) {
+        console.log(`Vectors count: ${info.points_count}`);
+        console.log(`Vector size: ${info.config.params.vectors.size}`);
+        console.log(`Distance: ${info.config.params.vectors.distance}`);
+      } else {
+        console.log('Collection info structure:', JSON.stringify(info, null, 2));
+      }
+    } catch (error) {
+      console.error('Error:', error);
+      process.exit(1);
+    }
+  });
+
+program
+  .command('mcp')
+  .description('Start the MCP server for document RAG')
+  .option('-t, --transport <transport>', 'Transport type (stdio or http)', 'stdio')
+  .option('-p, --port <port>', 'Port for HTTP transport (only with --transport http)')
+  .action(async (options) => {
+    try {
+      if (options.transport === 'stdio') {
+        // Start MCP server with stdio transport
+        const { startMcpServer } = await import('../mcp/server.js');
+        await startMcpServer();
+      } else if (options.transport === 'http') {
+        if (!options.port) {
+          console.error('Error: --port is required when using --transport http');
+          process.exit(1);
+        }
+        console.log(`HTTP transport not yet implemented for port ${options.port}`);
+        process.exit(1);
+      } else {
+        console.error(`Error: Invalid transport '${options.transport}'. Must be 'stdio' or 'http'`);
+        process.exit(1);
+      }
+    } catch (error) {
+      console.error('Error starting MCP server:', error);
+      process.exit(1);
+    }
+  });
+
+// Add parser commands
+parserCommands(program);
+
+program.parse();

+ 111 - 0
src/cli/parser-commands.ts

@@ -0,0 +1,111 @@
+#!/usr/bin/env node
+
+import { Command } from 'commander';
+import { ParseService } from '../services/parseService';
+import { existsSync } from 'fs';
+import { resolve } from 'path';
+
+const parseService = new ParseService();
+
+export const parserCommands = (program: Command) => {
+  program
+    .command('parse')
+    .description('Generate documentation from source code comments')
+    .requiredOption('-i, --input <path>', 'Input directory containing source files')
+    .requiredOption('-o, --output <path>', 'Output directory for generated documentation')
+    .option('-l, --languages <languages>', 'Comma-separated list of languages to parse', 'cpp')
+    .option('-c, --config <config>', 'Configuration file path')
+    .option('-w, --watch', 'Watch for file changes and regenerate', false)
+    .option('--incremental', 'Only process changed files', false)
+    .option('--include-private', 'Include private members', false)
+    .option('--dry-run', 'Show what would be parsed without generating files', false)
+    .action(async (options) => {
+      if (!existsSync(options.input)) {
+        console.error(`Error: Input directory '${options.input}' does not exist`);
+        process.exit(1);
+      }
+
+      try {
+        const result = await parseService.generateDocumentation({
+          inputPath: resolve(options.input),
+          outputPath: resolve(options.output),
+          languages: options.languages.split(',').map((l: string) => l.trim()),
+          configPath: options.config ? resolve(options.config) : undefined,
+          watch: options.watch,
+          incremental: options.incremental,
+          includePrivate: options.includePrivate,
+          dryRun: options.dryRun
+        });
+
+        if (options.dryRun) {
+          console.log('Dry run results:');
+          console.log(`  Files to process: ${result.filesToProcess.length}`);
+          console.log(`  Estimated output files: ${result.estimatedFiles}`);
+          result.filesToProcess.forEach((file: string) => console.log(`    - ${file}`));
+        } else {
+          console.log(`Documentation generated successfully!`);
+          console.log(`  Processed ${result.processedFiles} files`);
+          console.log(`  Generated ${result.generatedFiles} documentation files`);
+          console.log(`  Output directory: ${options.output}`);
+          
+          if (result.errors.length > 0) {
+            console.log(`  Warnings/Errors: ${result.errors.length}`);
+            result.errors.forEach((error: string) => console.log(`    - ${error}`));
+          }
+        }
+      } catch (error: any) {
+        console.error('Error:', error);
+        process.exit(1);
+      }
+    });
+
+  program
+    .command('parse-list-languages')
+    .description('List all supported parser languages')
+    .action(() => {
+      const languages = parseService.getSupportedLanguages();
+      console.log('Supported languages:');
+      languages.forEach(lang => {
+        console.log(`  ${lang.name}: ${lang.description}`);
+        console.log(`    Extensions: ${lang.extensions.join(', ')}`);
+        console.log();
+      });
+    });
+
+  program
+    .command('parse-validate')
+    .description('Validate configuration and source files')
+    .requiredOption('-i, --input <path>', 'Input directory to validate')
+    .option('-c, --config <config>', 'Configuration file to validate')
+    .action(async (options) => {
+      if (!existsSync(options.input)) {
+        console.error(`Error: Input directory '${options.input}' does not exist`);
+        process.exit(1);
+      }
+
+      try {
+        const validation = await parseService.validateConfiguration({
+          inputPath: resolve(options.input),
+          configPath: options.config ? resolve(options.config) : undefined
+        });
+
+        console.log('Validation results:');
+        console.log(`  Input directory: ${validation.inputValid ? '✓' : '✗'}`);
+        console.log(`  Configuration: ${validation.configValid ? '✓' : '✗'}`);
+        console.log(`  Supported files: ${validation.supportedFiles.length}`);
+        
+        if (validation.unsupportedFiles.length > 0) {
+          console.log(`  Unsupported files: ${validation.unsupportedFiles.length}`);
+          validation.unsupportedFiles.forEach((file: string) => console.log(`    - ${file}`));
+        }
+        
+        if (validation.warnings.length > 0) {
+          console.log(`  Warnings: ${validation.warnings.length}`);
+          validation.warnings.forEach((warning: string) => console.log(`    - ${warning}`));
+        }
+      } catch (error: any) {
+        console.error('Error:', error);
+        process.exit(1);
+      }
+    });
+};

+ 16 - 0
src/config/index.ts

@@ -0,0 +1,16 @@
+import dotenv from 'dotenv';
+
+dotenv.config({
+  "quiet": true
+});
+
+export const config = {
+  qdrant: {
+    url: process.env.QDRANT_URL || 'http://localhost:6333',
+    apiKey: process.env.QDRANT_API_KEY || '',
+  },
+  ollama: {
+    url: process.env.OLLAMA_URL || 'http://localhost:11434',
+    embeddingModel: process.env.OLLAMA_EMBEDDING_MODEL || 'nomic-embed-text',
+  },
+};

+ 75 - 0
src/lib/fileProcessor.ts

@@ -0,0 +1,75 @@
+import crypto from 'crypto';
+import fs from 'fs-extra';
+import path from 'path';
+
+export interface FileMetadata {
+  hash: string;
+  filename: string;
+  lastModified: number;
+  filePath: string;
+}
+
+export interface DocumentChunk {
+  text: string;
+  metadata: FileMetadata;
+  paragraphIndex: number;
+}
+
+export class FileProcessor {
+  static calculateHash(content: string): string {
+    return crypto.createHash('sha256').update(content).digest('hex');
+  }
+
+  static async getFileMetadata(filePath: string): Promise<FileMetadata> {
+    const stats = await fs.stat(filePath);
+    const content = await fs.readFile(filePath, 'utf-8');
+    const hash = this.calculateHash(content);
+    
+    return {
+      hash,
+      filename: path.basename(filePath),
+      lastModified: stats.mtime.getTime(),
+      filePath,
+    };
+  }
+
+  static splitIntoParagraphs(text: string): string[] {
+    return text
+      .split(/\n\s*\n/)
+      .map(paragraph => paragraph.trim())
+      .filter(paragraph => paragraph.length > 0);
+  }
+
+  static async processMarkdownFile(filePath: string): Promise<DocumentChunk[]> {
+    const metadata = await this.getFileMetadata(filePath);
+    const content = await fs.readFile(filePath, 'utf-8');
+    const paragraphs = this.splitIntoParagraphs(content);
+    
+    return paragraphs.map((paragraph, index) => ({
+      text: paragraph,
+      metadata,
+      paragraphIndex: index,
+    }));
+  }
+
+  static async findMarkdownFiles(dir: string, recursive: boolean = true): Promise<string[]> {
+    const files: string[] = [];
+    
+    async function scan(currentDir: string) {
+      const entries = await fs.readdir(currentDir, { withFileTypes: true });
+      
+      for (const entry of entries) {
+        const fullPath = path.join(currentDir, entry.name);
+        
+        if (entry.isDirectory() && recursive) {
+          await scan(fullPath);
+        } else if (entry.isFile() && path.extname(entry.name).toLowerCase() === '.md') {
+          files.push(fullPath);
+        }
+      }
+    }
+    
+    await scan(dir);
+    return files;
+  }
+}

+ 264 - 0
src/lib/parser/core/documentation-generator.ts

@@ -0,0 +1,264 @@
+import { ASTNode, DocumentationConfig, DocumentationComment } from '../../../types/parser-types';
+import { FileUtils } from '../utils/file-utils';
+import { MarkdownUtils } from '../utils/markdown-utils';
+
+export class DocumentationGenerator {
+  private config: DocumentationConfig;
+
+  constructor(config: DocumentationConfig) {
+    this.config = config;
+  }
+
+  async generate(nodes: ASTNode[]): Promise<void> {
+    await FileUtils.ensureDirectory(this.config.outputDirectory);
+    
+    if (this.config.generateIndex) {
+      await this.generateIndex(nodes);
+    }
+    
+    await this.generateModuleDocumentation(nodes);
+    await this.generateNodeFiles(nodes);
+  }
+
+  private async generateIndex(nodes: ASTNode[]): Promise<void> {
+    const modules = this.organizeByModule(nodes);
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${this.config.indexTitle}
+
+`;
+
+    for (const [moduleName, moduleNodes] of modules.entries()) {
+      content += this.generateModuleSection(moduleName, moduleNodes);
+    }
+
+    const indexPath = `${this.config.outputDirectory}/index.md`;
+    await FileUtils.writeFile(indexPath, content);
+  }
+
+  private organizeByModule(nodes: ASTNode[]): Map<string, ASTNode[]> {
+    const modules = new Map<string, ASTNode[]>();
+    
+    for (const node of nodes) {
+      const moduleName = this.extractModuleName(node);
+      if (!modules.has(moduleName)) {
+        modules.set(moduleName, []);
+      }
+      modules.get(moduleName)!.push(node);
+    }
+    
+    return modules;
+  }
+
+  private extractModuleName(node: ASTNode): string {
+    if (node.type === 'namespace') {
+      return node.name;
+    }
+    
+    const pathParts = node.location.filePath.split('/');
+    const fileName = pathParts[pathParts.length - 1];
+    const moduleName = fileName.replace(/\.(cpp|h|hpp|cxx|cc|c)$/, '');
+    
+    return moduleName;
+  }
+
+  private generateModuleSection(moduleName: string, nodes: ASTNode[]): string {
+    let section = `:material-package: [${moduleName}](${moduleName}/index.md)
+:   ${this.generateModuleDescription(nodes)}
+
+`;
+
+    const classes = nodes.filter(n => n.type === 'class' || n.type === 'struct');
+    const functions = nodes.filter(n => n.type === 'function' || n.type === 'method');
+
+    if (classes.length > 0) {
+      section += '## Types\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const cls of classes) {
+        const desc = cls.documentation.brief || '';
+        section += `| [${cls.name}](${moduleName}/${cls.name}.md) | ${desc} |\n`;
+      }
+      section += '\n';
+    }
+
+    if (functions.length > 0) {
+      section += '## Functions\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const func of functions) {
+        const desc = func.documentation.brief || '';
+        section += `| [${func.name}](${moduleName}/${func.name}.md) | ${desc} |\n`;
+      }
+      section += '\n';
+    }
+
+    return section;
+  }
+
+  private async generateModuleDocumentation(nodes: ASTNode[]): Promise<void> {
+    const modules = this.organizeByModule(nodes);
+    
+    for (const [moduleName, moduleNodes] of modules.entries()) {
+      await this.generateModuleFile(moduleName, moduleNodes);
+    }
+  }
+
+  private async generateModuleFile(moduleName: string, nodes: ASTNode[]): Promise<void> {
+    const modulePath = `${this.config.outputDirectory}/${moduleName}`;
+    await FileUtils.ensureDirectory(modulePath);
+    
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${moduleName}
+
+`;
+
+    const classes = nodes.filter(n => n.type === 'class' || n.type === 'struct');
+    const functions = nodes.filter(n => n.type === 'function' || n.type === 'method');
+
+    if (classes.length > 0) {
+      content += '## Types\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const cls of classes) {
+        const desc = cls.documentation.brief || '';
+        content += `| [${cls.name}](${cls.name}.md) | ${desc} |\n`;
+      }
+      content += '\n';
+    }
+
+    if (functions.length > 0) {
+      content += '## Functions\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const func of functions) {
+        const desc = func.documentation.brief || '';
+        content += `| [${func.name}](${func.name}.md) | ${desc} |\n`;
+      }
+      content += '\n';
+    }
+
+    const indexPath = `${this.config.outputDirectory}/${moduleName}/index.md`;
+    await FileUtils.writeFile(indexPath, content);
+  }
+
+  private async generateNodeFiles(nodes: ASTNode[]): Promise<void> {
+    for (const node of nodes) {
+      if (node.type === 'class' || node.type === 'struct') {
+        await this.generateClassFile(node);
+      } else if (node.type === 'function' || node.type === 'method') {
+        await this.generateFunctionFile(node);
+      } else {
+        // For other types, generate based on the node's original location module
+        await this.generateNodeFile(node);
+      }
+    }
+  }
+
+  private async generateClassFile(node: ASTNode): Promise<void> {
+    const moduleName = this.extractModuleName(node);
+    const filePath = `${this.config.outputDirectory}/${moduleName}/${node.name}.md`;
+    
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${node.name}
+
+**${node.type} ${node.name}**
+
+${this.formatDocumentation(node.documentation)}
+
+`;
+
+    const methods = node.children.filter(n => n.type === 'function' || n.type === 'method');
+    if (methods.length > 0) {
+      content += '## Functions\n\n';
+      content += this.generateFunctionTable(methods);
+      content += '\n## Function Details\n\n';
+      content += this.generateFunctionDetails(methods);
+    }
+
+    await FileUtils.writeFile(filePath, content);
+  }
+
+  private async generateFunctionFile(node: ASTNode): Promise<void> {
+    const moduleName = this.extractModuleName(node);
+    const filePath = `${this.config.outputDirectory}/${moduleName}/${node.name}.md`;
+    
+    const content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${node.name}
+
+**function ${node.name}**
+
+${this.formatDocumentation(node.documentation)}
+
+## Signature
+
+${MarkdownUtils.formatCodeBlock(node.signature || node.name, 'cpp')}
+
+## Source Location
+
+File: \`${node.location.filePath}\`:${node.location.line}
+
+`;
+
+    await FileUtils.writeFile(filePath, content);
+  }
+
+  private async generateNodeFile(node: ASTNode): Promise<void> {
+    const moduleName = this.extractModuleName(node);
+    const filePath = `${this.config.outputDirectory}/${moduleName}/${node.name}.md`;
+    
+    const content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${node.name}
+
+**${node.type} ${node.name}**
+
+${this.formatDocumentation(node.documentation)}
+
+## Signature
+
+${MarkdownUtils.formatCodeBlock(node.signature || node.name, 'cpp')}
+
+## Source Location
+
+File: \`${node.location.filePath}\`:${node.location.line}
+
+`;
+
+    await FileUtils.writeFile(filePath, content);
+  }
+
+  private generateFunctionTable(functions: ASTNode[]): string {
+    const headers = ['Name', 'Description'];
+    const rows = functions.map((func: ASTNode) => {
+      const desc = func.documentation.brief || '';
+      const anchor = MarkdownUtils.generateAnchor(func.name);
+      return [`[${func.name}](#${anchor})`, desc];
+    });
+    
+    return MarkdownUtils.formatTable(headers, rows);
+  }
+
+  private generateFunctionDetails(functions: ASTNode[]): string {
+    let details = '';
+    
+    for (const func of functions) {
+      const anchor = MarkdownUtils.generateAnchor(func.name);
+      details += `### ${func.name}<a name="${anchor}"></a>\n`;
+      details += `!!! function "${this.formatFunctionSignature(func)}"\n\n`;
+      details += `    ${this.formatDocumentation(func.documentation).replace(/\n/g, '\n    ')}\n\n`;
+    }
+    
+    return details;
+  }
+
+  private formatDocumentation(doc: DocumentationComment): string {
+    if (!doc.brief && !doc.detailed) return '';
+    
+    let formatted = '';
+    if (doc.brief) formatted += `${doc.brief}\n\n`;
+    if (doc.detailed) formatted += `${doc.detailed}\n\n`;
+    
+    return formatted;
+  }
+
+  private formatFunctionSignature(func: ASTNode): string {
+    return func.signature || `${func.name}()`;
+  }
+
+  private generateModuleDescription(nodes: ASTNode[]): string {
+    const descriptions = nodes
+      .map(n => n.documentation.brief)
+      .filter(Boolean)
+      .slice(0, 2);
+    
+    return descriptions.join(' ') || 'Module containing various components and utilities.';
+  }
+}

+ 21 - 0
src/lib/parser/core/interfaces.ts

@@ -0,0 +1,21 @@
+import type {
+  ILanguageParser,
+  ParseOptions,
+  ParseResult,
+  ValidationResult,
+  LanguageInfo,
+  DocumentationConfig
+} from '../../../types/parser-types';
+
+// Re-export types for backward compatibility
+export type {
+  ILanguageParser,
+  ParseOptions,
+  ParseResult,
+  ValidationResult,
+  LanguageInfo,
+  DocumentationConfig
+};
+
+// Also export all types from parser-types
+export * from '../../../types/parser-types';

+ 33 - 0
src/lib/parser/parsers/base-parser.ts

@@ -0,0 +1,33 @@
+import type { ILanguageParser } from '../core/interfaces';
+
+export abstract class BaseParser implements ILanguageParser {
+  abstract getLanguage(): string;
+  abstract getFileExtensions(): string[];
+  abstract parseFile(filePath: string): Promise<any[]>;
+
+  canParse(filePath: string): boolean {
+    const dotIndex = filePath.lastIndexOf('.');
+    if (dotIndex === -1) return false;
+    return this.getFileExtensions().includes(filePath.substring(dotIndex));
+  }
+
+  protected createEmptyDocumentation() {
+    return {
+      type: 'unknown' as const,
+      rawContent: '',
+      brief: '',
+      detailed: '',
+      tags: [],
+      location: { filePath: '', line: 0, column: 0 }
+    };
+  }
+
+  protected extractBrief(text: string): string {
+    const sentences = text.split(/[.!?]/);
+    if (sentences.length > 1 && sentences[0].length < 100) {
+      return sentences[0].trim() + '.';
+    }
+    
+    const lines = text.split('\n');
+    return lines[0].trim() || text.substring(0, 80).trim();
+  }
+}
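
The brief-extraction heuristic is easiest to see in isolation. Below is a standalone mirror of `extractBrief` (not the class itself, just the same logic for illustration): a short first sentence wins, otherwise the first line is used as a fallback.

```typescript
// Standalone mirror of BaseParser.extractBrief, for illustration only.
function extractBrief(text: string): string {
  // Prefer a short first sentence ending in ., ! or ?
  const sentences = text.split(/[.!?]/);
  if (sentences.length > 1 && sentences[0].length < 100) {
    return sentences[0].trim() + '.';
  }
  // Fall back to the first line, then to a hard 80-character cut
  const lines = text.split('\n');
  return lines[0].trim() || text.substring(0, 80).trim();
}

console.log(extractBrief('Parses a file. Returns AST nodes.')); // "Parses a file."
console.log(extractBrief('Single-line summary\nwith more detail')); // "Single-line summary"
```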

+ 268 - 0
src/lib/parser/parsers/cpp-parser.ts

@@ -0,0 +1,268 @@
+import { BaseParser } from './base-parser';
+import { FileUtils } from '../utils/file-utils';
+import { ASTNode, DocumentationComment } from '../../../types/parser-types';
+
+export class CppParser extends BaseParser {
+  getLanguage(): string {
+    return 'cpp';
+  }
+
+  getFileExtensions(): string[] {
+    return ['.cpp', '.h', '.hpp', '.cxx', '.cc', '.c'];
+  }
+
+  async parseFile(filePath: string): Promise<ASTNode[]> {
+    const content = await FileUtils.readFile(filePath);
+    const lines = content.split('\n');
+    
+    const comments = this.extractComments(lines);
+    const nodes = await this.parseCodeElements(lines, comments, filePath);
+    
+    return nodes;
+  }
+
+  private extractComments(lines: string[]): Map<number, DocumentationComment> {
+    const comments = new Map<number, DocumentationComment>();
+    
+    for (let i = 0; i < lines.length; i++) {
+      const line = lines[i];
+      const trimmed = line.trim();
+      
+      if (trimmed.startsWith('/**')) {
+        const comment = this.parseBlockComment(lines, i);
+        if (comment) {
+          comments.set(i + 1, comment);
+          i += this.getCommentHeight(comment.rawContent) - 1;
+        }
+      } else if (trimmed.startsWith('///') || trimmed.startsWith('//!')) {
+        const comment = this.parseLineComment(lines, i);
+        if (comment) {
+          comments.set(i + 1, comment);
+        }
+      }
+    }
+    
+    return comments;
+  }
+
+  private parseBlockComment(lines: string[], startIndex: number): DocumentationComment | null {
+    let content = '';
+    let i = startIndex;
+    
+    while (i < lines.length && !lines[i].includes('*/')) {
+      content += lines[i] + '\n';
+      i++;
+    }
+    if (i < lines.length) {
+      content += lines[i] + '\n';
+    }
+    
+    return this.parseCommentContent(content, 'doxyblock', startIndex + 1);
+  }
+
+  private parseLineComment(lines: string[], startIndex: number): DocumentationComment | null {
+    const line = lines[startIndex];
+    return this.parseCommentContent(line, 'doxline', startIndex + 1);
+  }
+
+  private parseCommentContent(rawContent: string, type: 'doxline' | 'doxyblock', line: number): DocumentationComment {
+    let cleaned = rawContent;
+    
+    if (type === 'doxyblock') {
+      cleaned = cleaned
+        .replace(/\/\*\*|\/\*|\*\//g, '')
+        .split('\n')
+        .map(line => line.trim().replace(/^\*\s?/, '').replace(/^\s*[\/\\*]\s*/, ''))
+        .join('\n')
+        .trim();
+    } else {
+      cleaned = cleaned
+        .replace(/^\s*\/\/[\/!]\s?/, '')
+        .trim();
+    }
+
+    // Extract @tags
+    const tagRegex = /@(\w+)(?:\s+(.+?))?(?=\s+@|$)/gs;
+    const tags: { name: string; value: string }[] = [];
+    let tagMatch;
+    
+    // Extract all tags first
+    while ((tagMatch = tagRegex.exec(cleaned)) !== null) {
+      tags.push({
+        name: tagMatch[1],
+        value: tagMatch[2]?.trim() || ''
+      });
+    }
+
+    // Find @brief tag if exists
+    const briefTag = tags.find(tag => tag.name === 'brief');
+    
+    // If @brief tag exists, use its value
+    if (briefTag) {
+      return {
+        type,
+        rawContent,
+        brief: briefTag.value,
+        detailed: cleaned.replace(/@brief\s+.*?(?=\s+@|$)/gs, '').trim(),
+        tags,
+        location: { filePath: '', line, column: 0 }
+      };
+    }
+
+    // For block comments with no @tags, use entire content as brief
+    if (type === 'doxyblock' && tags.length === 0) {
+      return {
+        type,
+        rawContent,
+        brief: cleaned,
+        detailed: '',
+        tags,
+        location: { filePath: '', line, column: 0 }
+      };
+    }
+
+    // Otherwise, extract first sentence as brief
+    const brief = this.extractBrief(cleaned);
+    const detailed = cleaned.substring(brief.length).trim();
+
+    return {
+      type,
+      rawContent,
+      brief,
+      detailed,
+      tags,
+      location: { filePath: '', line, column: 0 }
+    };
+  }
+
+  private getCommentHeight(content: string): number {
+    // Strip the trailing newline so it doesn't count as an extra line
+    return content.replace(/\n$/, '').split('\n').length;
+  }
+
+  private async parseCodeElements(
+    lines: string[], 
+    comments: Map<number, DocumentationComment>, 
+    filePath: string
+  ): Promise<ASTNode[]> {
+    const nodes: ASTNode[] = [];
+    let classContext: ASTNode | null = null;
+    
+    for (let i = 0; i < lines.length; i++) {
+      const line = lines[i];
+      
+      const classMatch = line.match(/^\s*(class|struct)\s+(\w+)/);
+      if (classMatch) {
+        const node = await this.parseClass(lines, i, comments, filePath);
+        if (node) {
+          nodes.push(node);
+          classContext = node;
+        }
+      }
+      
+      // Check for end of class
+      if (classContext && line.match(/^\s*};?\s*$/)) {
+        classContext = null;
+        continue;
+      }
+      
+      // Only parse functions if not in class context, or if they are class methods
+      const functionMatch = line.match(/^\s*(?:\w+\s+)*(\w+)\s*\([^)]*\)\s*(?:\{|;)/);
+      // Exclude control-flow statements like `if (...) {` that also match the pattern
+      const isControlFlow = /^\s*(?:if|for|while|switch|return|catch)\b/.test(line);
+      if (functionMatch && !isControlFlow && !line.includes('class')) {
+        const node = await this.parseFunction(lines, i, comments, filePath);
+        if (node) {
+          if (classContext) {
+            // Add as method to current class
+            classContext.children.push(node);
+          } else {
+            // Add as standalone function
+            nodes.push(node);
+          }
+        }
+      }
+    }
+    
+    return nodes;
+  }
+
+  private async parseClass(
+    lines: string[], 
+    startIndex: number,
+    comments: Map<number, DocumentationComment>,
+    filePath: string
+  ): Promise<ASTNode | null> {
+    const line = lines[startIndex];
+    const match = line.match(/^\s*(class|struct)\s+(\w+)/);
+    
+    if (!match) return null;
+    
+    const [, type, name] = match;
+    
+    // Find the closest preceding comment
+    let comment = this.findClosestPrecedingComment(comments, startIndex);
+    
+    return {
+      type: type as 'class' | 'struct',
+      name,
+      documentation: comment || this.createEmptyDocumentation(),
+      location: { filePath, line: startIndex + 1, column: 0 },
+      children: [],
+      isStruct: type === 'struct'
+    };
+  }
+
+  private async parseFunction(
+    lines: string[], 
+    startIndex: number,
+    comments: Map<number, DocumentationComment>,
+    filePath: string
+  ): Promise<ASTNode | null> {
+    const line = lines[startIndex];
+    const match = line.match(/^\s*(?:\w+\s+)*(\w+)\s*\([^)]*\)\s*(?:\{|;)/);
+    
+    if (!match) return null;
+    
+    const [, name] = match;
+    
+    // Find the closest preceding comment
+    let comment = this.findClosestPrecedingComment(comments, startIndex);
+    
+    return {
+      type: 'function',
+      name,
+      documentation: comment || this.createEmptyDocumentation(),
+      location: { filePath, line: startIndex + 1, column: 0 },
+      children: [],
+      signature: line.trim()
+    };
+  }
+
+  private findClosestPrecedingComment(comments: Map<number, DocumentationComment>, startIndex: number): DocumentationComment | undefined {
+    let closestComment: DocumentationComment | undefined;
+    let closestDistance = Infinity;
+    let closestLine = -1;
+    
+    // Look for comments that come before the code element
+    for (const [commentLine, comment] of comments.entries()) {
+      if (commentLine <= startIndex) {
+        const distance = startIndex - commentLine;
+        if (distance < closestDistance && distance < 10) { // Only consider comments within 10 lines
+          closestComment = comment;
+          closestDistance = distance;
+          closestLine = commentLine;
+        }
+      }
+    }
+    
+    // Remove the used comment so it doesn't get attached to other elements
+    if (closestComment && closestLine > 0) {
+      comments.delete(closestLine);
+    }
+    
+    return closestComment;
+  }
+}
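
The comment-attachment rule in `findClosestPrecedingComment` can be sketched standalone: a documentation comment is attached to the nearest code element at or after it, within a 10-line window. The sketch below uses plain strings instead of the real parser types, purely for illustration.

```typescript
// Simplified mirror of the closest-preceding-comment heuristic.
function closestPrecedingComment(
  comments: Map<number, string>,
  codeLine: number
): string | undefined {
  let closest: string | undefined;
  let closestDistance = Infinity;
  for (const [commentLine, text] of comments.entries()) {
    const distance = codeLine - commentLine;
    // Only comments at or before the code element, within a 10-line window
    if (distance >= 0 && distance < 10 && distance < closestDistance) {
      closest = text;
      closestDistance = distance;
    }
  }
  return closest;
}

const docs = new Map<number, string>([
  [3, '/** Adds two numbers */'],
  [20, '/** Unrelated helper */'],
]);
console.log(closestPrecedingComment(docs, 5));  // "/** Adds two numbers */"
console.log(closestPrecedingComment(docs, 40)); // undefined (no comment within 10 lines)
```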

+ 49 - 0
src/lib/parser/utils/file-utils.ts

@@ -0,0 +1,49 @@
+import { promises as fs } from 'fs';
+import { join } from 'path';
+import { glob } from 'glob';
+
+export class FileUtils {
+  static async readFile(filePath: string): Promise<string> {
+    return await fs.readFile(filePath, 'utf-8');
+  }
+
+  static async writeFile(filePath: string, content: string): Promise<void> {
+    const dir = filePath.substring(0, filePath.lastIndexOf('/'));
+    await this.ensureDirectory(dir);
+    await fs.writeFile(filePath, content, 'utf-8');
+  }
+
+  static async ensureDirectory(dirPath: string): Promise<void> {
+    try {
+      await fs.access(dirPath);
+    } catch {
+      await fs.mkdir(dirPath, { recursive: true });
+    }
+  }
+
+  static async exists(filePath: string): Promise<boolean> {
+    try {
+      await fs.access(filePath);
+      return true;
+    } catch {
+      return false;
+    }
+  }
+
+  static async getAllFiles(dirPath: string): Promise<string[]> {
+    const files = await glob(join(dirPath, '**/*'), {
+      nodir: true,
+      absolute: true
+    });
+    return files;
+  }
+
+  static async getAllSourceFiles(dirPath: string, extensions: string[]): Promise<string[]> {
+    const pattern = join(dirPath, `**/*{${extensions.join(',')}}`);
+    const files = await glob(pattern, {
+      nodir: true,
+      absolute: true
+    });
+    return files;
+  }
+}
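
`getAllSourceFiles` relies on glob brace expansion, so the pattern string it builds matters. A quick standalone check of what that string should look like (POSIX-style paths assumed):

```typescript
import { join } from 'path';

// Build the brace-expansion pattern glob expects, e.g. "src/**/*{.cpp,.hpp}"
const extensions = ['.cpp', '.hpp'];
const pattern = join('src', `**/*{${extensions.join(',')}}`);
console.log(pattern); // "src/**/*{.cpp,.hpp}" on POSIX systems
```

Note the backticks: with single quotes the `${...}` placeholder would be passed to glob literally instead of being interpolated.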

+ 39 - 0
src/lib/parser/utils/markdown-utils.ts

@@ -0,0 +1,39 @@
+export class MarkdownUtils {
+  static escape(text: string): string {
+    return text
+      .replace(/\\/g, '\\\\')
+      .replace(/#/g, '\\#')
+      .replace(/\*/g, '\\*')
+      .replace(/_/g, '\\_')
+      .replace(/`/g, '\\`')
+      .replace(/\[/g, '\\[')
+      .replace(/\]/g, '\\]');
+  }
+
+  static generateAnchor(text: string): string {
+    return text
+      .toLowerCase()
+      .replace(/[^a-z0-9\s-]/g, '')
+      .replace(/\s+/g, '-')
+      .replace(/-+/g, '-')
+      .replace(/^-|-$/g, '');
+  }
+
+  static formatCodeBlock(code: string, language?: string): string {
+    const lang = language || '';
+    return `\`\`\`${lang}\n${code}\n\`\`\``;
+  }
+
+  static formatTable(headers: string[], rows: string[][]): string {
+    if (rows.length === 0) return '';
+
+    let table = '| ' + headers.join(' | ') + ' |\n';
+    table += '| ' + headers.map(() => '---').join(' | ') + ' |\n';
+
+    for (const row of rows) {
+      table += '| ' + row.map(cell => cell || '').join(' | ') + ' |\n';
+    }
+
+    return table;
+  }
+}
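
The anchor and table helpers are pure string functions, so their behavior is easy to pin down. A standalone mirror of `generateAnchor` and `formatTable`, for illustration only:

```typescript
// Standalone mirrors of MarkdownUtils.generateAnchor / formatTable.
function generateAnchor(text: string): string {
  return text
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, '') // drop punctuation
    .replace(/\s+/g, '-')         // spaces to hyphens
    .replace(/-+/g, '-')          // collapse hyphen runs
    .replace(/^-|-$/g, '');       // trim edge hyphens
}

function formatTable(headers: string[], rows: string[][]): string {
  if (rows.length === 0) return '';
  let table = '| ' + headers.join(' | ') + ' |\n';
  table += '| ' + headers.map(() => '---').join(' | ') + ' |\n';
  for (const row of rows) {
    table += '| ' + row.map(cell => cell || '').join(' | ') + ' |\n';
  }
  return table;
}

console.log(generateAnchor('My Function Name!')); // "my-function-name"
console.log(formatTable(['Name', 'Description'], [['parseFile', 'Parses a source file']]));
```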

+ 19 - 0
src/mcp/agentic-stdio.ts

@@ -0,0 +1,19 @@
+#!/usr/bin/env node
+
+// Clean MCP server entry point for AgenticCoder
+import { startMcpServer } from './server.js';
+
+// Set required environment variables directly if not set
+if (!process.env.QDRANT_URL) process.env.QDRANT_URL = 'http://localhost:6333';
+if (!process.env.OLLAMA_URL) process.env.OLLAMA_URL = 'http://localhost:11434';
+if (!process.env.OLLAMA_EMBEDDING_MODEL) process.env.OLLAMA_EMBEDDING_MODEL = 'nomic-embed-text';
+
+// Handle graceful shutdown
+process.on('SIGINT', () => process.exit(0));
+process.on('SIGTERM', () => process.exit(0));
+
+// Start server
+startMcpServer().catch((error) => {
+  console.error('Server startup failed:', error);
+  process.exit(1);
+});

+ 51 - 0
src/mcp/cli.ts

@@ -0,0 +1,51 @@
+#!/usr/bin/env node
+
+import { program } from 'commander';
+import { startMcpServer } from './server.js';
+
+program
+  .name('docs-rag-mcp')
+  .description('MCP server for document RAG operations')
+  .version('1.0.0');
+
+program
+  .requiredOption('--transport <transport>', 'Transport type (stdio or http)')
+  .option('--api-key <key>', 'API key (not used for local implementation)')
+  .option('--port <port>', 'Port for HTTP transport (only with --transport http)')
+  .action(async (options) => {
+    try {
+      if (options.transport === 'stdio') {
+        if (options.port) {
+          console.error('Error: --port is not allowed when using --transport stdio');
+          process.exit(1);
+        }
+        if (options.apiKey) {
+          console.error('Error: --api-key is not allowed when using --transport stdio');
+          process.exit(1);
+        }
+        
+        // Start MCP server with stdio transport
+        await startMcpServer();
+      } else if (options.transport === 'http') {
+        if (!options.port) {
+          console.error('Error: --port is required when using --transport http');
+          process.exit(1);
+        }
+        if (options.apiKey) {
+          console.error('Error: --api-key is not allowed when using --transport http');
+          process.exit(1);
+        }
+        
+        console.error(`HTTP transport not yet implemented for port ${options.port}`);
+        process.exit(1);
+      } else {
+        console.error(`Error: Invalid --transport value: '${options.transport}'. Must be one of: stdio, http.`);
+        process.exit(1);
+      }
+    } catch (error) {
+      console.error('Server error:', error);
+      process.exit(1);
+    }
+  });
+
+program.parse();

+ 240 - 0
src/mcp/server.ts

@@ -0,0 +1,240 @@
+// Load environment variables silently to avoid debug output
+import 'dotenv/config';
+
+import { Server } from '@modelcontextprotocol/sdk/server/index.js';
+import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
+import {
+  CallToolRequestSchema,
+  ListToolsRequestSchema,
+} from '@modelcontextprotocol/sdk/types.js';
+import { DocumentService } from '../services/documentService';
+
+const documentService = new DocumentService();
+
+const server = new Server(
+  {
+    name: 'docs-rag-mcp',
+    version: '1.0.0',
+  },
+  {
+    capabilities: {
+      tools: {
+        listChanged: true,
+      },
+    },
+  }
+);
+
+server.setRequestHandler(ListToolsRequestSchema, async () => {
+  return {
+    tools: [
+      {
+        name: 'add_document',
+        description: 'Add a document collection from markdown files in a folder',
+        inputSchema: {
+          type: 'object',
+          properties: {
+            name: {
+              type: 'string',
+              description: 'Document name for the collection',
+            },
+            folder: {
+              type: 'string',
+              description: 'Folder path containing markdown files',
+            },
+            recursive: {
+              type: 'boolean',
+              description: 'Scan folders recursively',
+              default: true,
+            },
+          },
+          required: ['name', 'folder'],
+        },
+      },
+      {
+        name: 'search_documents',
+        description: 'Search within a document collection using semantic search',
+        inputSchema: {
+          type: 'object',
+          properties: {
+            documentName: {
+              type: 'string',
+              description: 'Document name to search in',
+            },
+            query: {
+              type: 'string',
+              description: 'Search query text',
+            },
+            limit: {
+              type: 'number',
+              description: 'Maximum number of results to return',
+              default: 10,
+            },
+          },
+          required: ['documentName', 'query'],
+        },
+      },
+      {
+        name: 'list_collections',
+        description: 'List all available document collections',
+        inputSchema: {
+          type: 'object',
+          properties: {},
+        },
+      },
+      {
+        name: 'get_document_info',
+        description: 'Get information about a specific document collection',
+        inputSchema: {
+          type: 'object',
+          properties: {
+            documentName: {
+              type: 'string',
+              description: 'Document name',
+            },
+          },
+          required: ['documentName'],
+        },
+      },
+    ],
+  };
+});
+
+server.setRequestHandler(CallToolRequestSchema, async (request) => {
+  const { name, arguments: args } = request.params;
+
+  try {
+    switch (name) {
+      case 'add_document': {
+        const { name: docName, folder, recursive = true } = args as {
+          name: string;
+          folder: string;
+          recursive?: boolean;
+        };
+        
+        await documentService.addDocument({
+          name: docName,
+          folder,
+          recursive,
+        });
+        
+        return {
+          content: [
+            {
+              type: 'text',
+              text: `Document '${docName}' successfully added from folder '${folder}'`,
+            },
+          ],
+        };
+      }
+
+      case 'search_documents': {
+        const { documentName, query, limit = 10 } = args as {
+          documentName: string;
+          query: string;
+          limit?: number;
+        };
+        
+        const results = await documentService.searchDocuments(
+          documentName,
+          query,
+          limit
+        );
+        
+        const formattedResults = results.map((result, index) => ({
+          rank: index + 1,
+          score: result.score,
+          filename: result.payload.filename,
+          paragraphIndex: result.payload.paragraphIndex,
+          text: result.payload.text,
+        }));
+        
+        return {
+          content: [
+            {
+              type: 'text',
+              text: JSON.stringify(formattedResults, null, 2),
+            },
+          ],
+        };
+      }
+
+      case 'list_collections': {
+        const collections = await documentService.listCollections();
+        
+        return {
+          content: [
+            {
+              type: 'text',
+              text: JSON.stringify({ collections }, null, 2),
+            },
+          ],
+        };
+      }
+
+      case 'get_document_info': {
+        const { documentName } = args as { documentName: string };
+        
+        const info = await documentService.getDocumentInfo(documentName);
+        
+        return {
+          content: [
+            {
+              type: 'text',
+              text: JSON.stringify(info, null, 2),
+            },
+          ],
+        };
+      }
+
+      default:
+        throw new Error(`Unknown tool: ${name}`);
+    }
+  } catch (error) {
+    return {
+      content: [
+        {
+          type: 'text',
+          text: `Error: ${error instanceof Error ? error.message : String(error)}`,
+        },
+      ],
+      isError: true,
+    };
+  }
+});
+
+async function main() {
+  try {
+    const transport = new StdioServerTransport();
+    await server.connect(transport);
+    // Keep the process alive by listening for errors/closure
+    process.on('SIGINT', () => {
+      console.error('Received SIGINT, shutting down gracefully');
+      process.exit(0);
+    });
+    
+    process.on('SIGTERM', () => {
+      console.error('Received SIGTERM, shutting down gracefully');
+      process.exit(0);
+    });
+    
+    // Handle stdin close
+    process.stdin.on('close', () => {
+      console.error('Stdin closed, exiting');
+      process.exit(0);
+    });
+    
+    // Don't log to console.error after connection - it can interfere with stdio
+    // console.error('Docs RAG MCP server running on stdio');
+  } catch (error) {
+    console.error('Failed to start server:', error);
+    process.exit(1);
+  }
+}
+
+// No auto-start on import: every entry point (cli.ts, stdio.ts,
+// agentic-stdio.ts) imports startMcpServer and invokes it itself,
+// so calling main() here would connect the server a second time.
+export { main as startMcpServer };

+ 22 - 0
src/mcp/simple-cli.ts

@@ -0,0 +1,22 @@
+#!/usr/bin/env node
+
+// Simple MCP server that bypasses commander for stdio mode
+// This avoids any stdin parsing conflicts with MCP protocol
+
+import { startMcpServer } from './server.js';
+
+// Handle command line arguments manually for simplicity
+const args = process.argv.slice(2);
+
+// Check if transport is specified and is stdio
+if (args.includes('--transport') && args.includes('stdio')) {
+  // Start MCP server directly
+  startMcpServer().catch((error) => {
+    console.error('Server error:', error);
+    process.exit(1);
+  });
+} else {
+  console.error('Error: --transport stdio is required');
+  console.error('Usage: docs-rag-mcp --transport stdio');
+  process.exit(1);
+}

+ 27 - 0
src/mcp/stdio.ts

@@ -0,0 +1,27 @@
+#!/usr/bin/env node
+
+// Direct MCP server entry point for stdio
+// This file bypasses any argument parsing to avoid stdio conflicts
+
+// Disable dotenv debug output
+process.env.DOTENV_SILENT = 'true';
+
+// Load environment variables silently
+import 'dotenv/config';
+
+import { startMcpServer } from './server.js';
+
+// Handle graceful shutdown
+process.on('SIGINT', () => {
+  process.exit(0);
+});
+
+process.on('SIGTERM', () => {
+  process.exit(0);
+});
+
+// Start the server immediately - let the MCP client handle initialization
+startMcpServer().catch((error) => {
+  console.error('Server startup failed:', error);
+  process.exit(1);
+});

+ 137 - 0
src/services/documentService.ts

@@ -0,0 +1,137 @@
+import { OllamaService } from './ollamaService';
+import { QdrantService } from './qdrantService';
+import { FileProcessor } from '../lib/fileProcessor';
+
+export interface AddDocumentOptions {
+  name: string;
+  folder: string;
+  recursive?: boolean;
+}
+
+export class DocumentService {
+  private ollamaService: OllamaService;
+  private qdrantService: QdrantService;
+
+  constructor() {
+    this.ollamaService = new OllamaService();
+    this.qdrantService = new QdrantService();
+  }
+
+  async addDocument(options: AddDocumentOptions): Promise<void> {
+    const { name, folder, recursive = true } = options;
+    
+    console.log(`Processing document: ${name}`);
+    console.log(`Scanning folder: ${folder}`);
+    console.log(`Recursive: ${recursive}`);
+
+    try {
+      // Find all markdown files
+      const files = await FileProcessor.findMarkdownFiles(folder, recursive);
+      console.log(`Found ${files.length} markdown files`);
+
+      if (files.length === 0) {
+        console.log('No markdown files found');
+        return;
+      }
+
+      // Create collection for this document
+      const collectionName = this.sanitizeCollectionName(name);
+      
+      // Get existing file hashes to check for deduplication
+      console.log('Checking for existing content...');
+      const existingHashes = await this.qdrantService.getExistingFileHashes(collectionName);
+      console.log(`Found ${existingHashes.size} existing file hashes in collection`);
+      
+      // Process files and identify what needs to be updated
+      const allChunks: any[] = [];
+      const allTexts: string[] = [];
+      let newFiles = 0;
+      let updatedFiles = 0;
+      let unchangedFiles = 0;
+
+      // Process all files
+      for (const filePath of files) {
+        console.log(`Processing file: ${filePath}`);
+        const chunks = await FileProcessor.processMarkdownFile(filePath);
+        const fileHash = chunks[0]?.metadata.hash;
+        
+        if (fileHash && existingHashes.has(fileHash)) {
+          console.log(`  - Skipping unchanged file (hash: ${fileHash.substring(0, 8)}...)`);
+          unchangedFiles++;
+          continue;
+        }
+
+        // Note: existingHashes only tells us whether this exact content is
+        // already stored. Distinguishing updated files from new ones (and
+        // deleting stale versions) would require a file-path-to-hash mapping,
+        // so all non-skipped files are counted as new here.
+        console.log(`  - Processing new/updated file (${chunks.length} paragraphs)`);
+        allChunks.push(...chunks);
+        allTexts.push(...chunks.map(chunk => chunk.text));
+        newFiles++;
+      }
+
+      console.log(`Summary: ${newFiles} new files, ${updatedFiles} updated files, ${unchangedFiles} unchanged files`);
+      console.log(`Total paragraphs to process: ${allChunks.length}`);
+
+      if (allChunks.length === 0) {
+        console.log('No new or updated content to process');
+        return;
+      }
+
+      // Create embeddings for all texts
+      console.log('Creating embeddings...');
+      const embeddings = await this.ollamaService.createEmbeddings(allTexts);
+      
+      // Create collection with appropriate vector size if it doesn't exist
+      await this.qdrantService.createCollection(collectionName, embeddings[0].length);
+      
+      // Store chunks in Qdrant
+      await this.qdrantService.storeChunks(collectionName, allChunks, embeddings);
+      
+      console.log(`Document '${name}' successfully updated in collection '${collectionName}'`);
+      console.log(`Processed ${allChunks.length} paragraphs from ${newFiles + updatedFiles} files`);
+    } catch (error) {
+      console.error('Error adding document:', error);
+      throw error;
+    }
+  }
+
+  async searchDocuments(documentName: string, query: string, limit: number = 10): Promise<any[]> {
+    const collectionName = this.sanitizeCollectionName(documentName);
+    
+    try {
+      const queryVector = await this.ollamaService.createEmbedding(query);
+      return await this.qdrantService.search(collectionName, queryVector, limit);
+    } catch (error) {
+      console.error('Error searching documents:', error);
+      throw error;
+    }
+  }
+
+  async listCollections(): Promise<string[]> {
+    try {
+      const collections = await this.qdrantService.getClient().getCollections();
+      return collections.collections.map(c => c.name);
+    } catch (error) {
+      console.error('Error listing collections:', error);
+      throw error;
+    }
+  }
+
+  async getDocumentInfo(documentName: string): Promise<any> {
+    const collectionName = this.sanitizeCollectionName(documentName);
+    return await this.qdrantService.getCollectionInfo(collectionName);
+  }
+
+  private sanitizeCollectionName(name: string): string {
+    return name.toLowerCase().replace(/[^a-z0-9_-]/g, '_');
+  }
+}
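
Collection names are derived from user-supplied document names, so sanitization matters for what ends up in Qdrant. A standalone mirror of `sanitizeCollectionName`:

```typescript
// Mirror of DocumentService.sanitizeCollectionName, for illustration.
function sanitizeCollectionName(name: string): string {
  // Lowercase, then replace anything outside [a-z0-9_-] with underscores
  return name.toLowerCase().replace(/[^a-z0-9_-]/g, '_');
}

console.log(sanitizeCollectionName('My Docs v2.0'));  // "my_docs_v2_0"
console.log(sanitizeCollectionName('api-reference')); // unchanged: "api-reference"
```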

+ 59 - 0
src/services/ollamaService.ts

@@ -0,0 +1,59 @@
+import { config } from '../config';
+
+export interface EmbeddingResponse {
+  embedding: number[];
+}
+
+export class OllamaService {
+  private baseUrl: string;
+  private model: string;
+
+  constructor() {
+    this.baseUrl = config.ollama.url;
+    this.model = config.ollama.embeddingModel;
+  }
+
+  async createEmbedding(text: string): Promise<number[]> {
+    try {
+      const response = await fetch(`${this.baseUrl}/api/embeddings`, {
+        method: 'POST',
+        headers: {
+          'Content-Type': 'application/json',
+        },
+        body: JSON.stringify({
+          model: this.model,
+          prompt: text,
+        }),
+      });
+
+      if (!response.ok) {
+        throw new Error(`Ollama API error: ${response.statusText}`);
+      }
+
+      const data: EmbeddingResponse = await response.json();
+      return data.embedding;
+    } catch (error) {
+      console.error('Error creating embedding:', error);
+      throw error;
+    }
+  }
+
+  async createEmbeddings(texts: string[]): Promise<number[][]> {
+    const embeddings: number[][] = [];
+    
+    console.log(`Creating embeddings for ${texts.length} texts...`);
+    
+    for (let i = 0; i < texts.length; i++) {
+      const text = texts[i];
+      const embedding = await this.createEmbedding(text);
+      embeddings.push(embedding);
+      
+      // Progress indicator for large batches
+      if ((i + 1) % 10 === 0 || i === texts.length - 1) {
+        console.log(`Progress: ${i + 1}/${texts.length} embeddings created`);
+      }
+    }
+    
+    return embeddings;
+  }
+}
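
`createEmbeddings` embeds texts one at a time with periodic progress logs. A standalone sketch of that loop, with an injected stub in place of the real Ollama call so it runs without a live server (the stub's vectors are made up for illustration):

```typescript
// Sequential embedding loop with progress reporting; `embed` is injected
// so this sketch runs without a live Ollama server.
async function createEmbeddings(
  texts: string[],
  embed: (text: string) => Promise<number[]>
): Promise<number[][]> {
  const embeddings: number[][] = [];
  for (let i = 0; i < texts.length; i++) {
    embeddings.push(await embed(texts[i]));
    // Log every 10th item and the last one
    if ((i + 1) % 10 === 0 || i === texts.length - 1) {
      console.log(`Progress: ${i + 1}/${texts.length} embeddings created`);
    }
  }
  return embeddings;
}

// Stub: "embedding" is just [length of text]
const stubEmbed = async (text: string) => [text.length];

createEmbeddings(['a', 'bb', 'ccc'], stubEmbed).then((vectors) => {
  console.log(vectors); // [[1], [2], [3]]
});
```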

+ 171 - 0
src/services/parseService.ts

@@ -0,0 +1,171 @@
+import type { 
+  ILanguageParser, 
+  DocumentationConfig, 
+  ParseOptions, 
+  ParseResult,
+  LanguageInfo
+} from '../lib/parser/core/interfaces';
+import { CppParser } from '../lib/parser/parsers/cpp-parser';
+import { DocumentationGenerator } from '../lib/parser/core/documentation-generator';
+import { FileUtils } from '../lib/parser/utils/file-utils';
+import { resolve, basename } from 'path';
+
+export class ParseService {
+  private parsers: Map<string, ILanguageParser> = new Map();
+
+  constructor() {
+    this.registerDefaultParsers();
+  }
+
+  private registerDefaultParsers(): void {
+    this.parsers.set('cpp', new CppParser());
+  }
+
+  getSupportedLanguages(): LanguageInfo[] {
+    return Array.from(this.parsers.entries()).map(([key, parser]) => ({
+      name: key,
+      description: `${key.toUpperCase()} source code parser`,
+      extensions: parser.getFileExtensions()
+    }));
+  }
+
+  async generateDocumentation(options: ParseOptions): Promise<ParseResult> {
+    const config = this.createDefaultConfig(options);
+    const sourceFiles = await this.findSourceFiles(options.inputPath, options.languages);
+    
+    if (options.dryRun) {
+      return {
+        processedFiles: 0,
+        generatedFiles: 0,
+        filesToProcess: sourceFiles,
+        errors: [],
+        estimatedFiles: this.estimateOutputFiles(sourceFiles)
+      };
+    }
+
+    const nodes = [];
+    const errors = [];
+
+    for (const filePath of sourceFiles) {
+      try {
+        const fileNodes = await this.parseFile(filePath);
+        nodes.push(...fileNodes);
+      } catch (error: any) {
+        errors.push(`Failed to parse ${filePath}: ${error}`);
+      }
+    }
+
+    const generator = new DocumentationGenerator(config);
+    await generator.generate(nodes);
+
+    const generatedFiles = await this.countGeneratedFiles(config.outputDirectory);
+
+    return {
+      processedFiles: sourceFiles.length,
+      generatedFiles,
+      filesToProcess: sourceFiles,
+      errors,
+      estimatedFiles: sourceFiles.length
+    };
+  }
+
+  async validateConfiguration(options: {
+    inputPath: string;
+    configPath?: string;
+  }): Promise<{
+    inputValid: boolean;
+    configValid: boolean;
+    supportedFiles: string[];
+    unsupportedFiles: string[];
+    warnings: string[];
+  }> {
+    const inputValid = await FileUtils.exists(options.inputPath);
+    
+    let configValid = true;
+    const warnings: string[] = [];
+
+    // Basic validation - could be extended
+    if (options.configPath && !(await FileUtils.exists(options.configPath))) {
+      configValid = false;
+      warnings.push(`Configuration file not found: ${options.configPath}`);
+    }
+
+    const allFiles = await FileUtils.getAllFiles(options.inputPath);
+    const supportedFiles = allFiles.filter((file: string) => {
+      const dotIndex = file.lastIndexOf('.');
+      if (dotIndex === -1) return false;
+      const ext = file.substring(dotIndex);
+      return Array.from(this.parsers.values()).some(parser => 
+        parser.getFileExtensions().includes(ext)
+      );
+    });
+
+    const unsupportedFiles = allFiles.filter((file: string) => !supportedFiles.includes(file));
+
+    return {
+      inputValid,
+      configValid,
+      supportedFiles,
+      unsupportedFiles,
+      warnings
+    };
+  }
+
+  private createDefaultConfig(options: ParseOptions): DocumentationConfig {
+    return {
+      outputDirectory: options.outputPath,
+      indexTitle: 'Source Code Documentation',
+      generatorName: 'docs-rag-parser',
+      generateIndex: true,
+      generateModuleIndexes: true,
+      includePrivate: options.includePrivate || false,
+      includeSourceLinks: true,
+      sourceRootPath: options.inputPath,
+      theme: 'material'
+    };
+  }
+
+  private async findSourceFiles(inputPath: string, languages: string[]): Promise<string[]> {
+    const allFiles = await FileUtils.getAllFiles(inputPath);
+    
+    return allFiles.filter((file: string) => {
+      const parser = this.getParserForFile(file);
+      return parser && languages.includes(parser.getLanguage());
+    });
+  }
+
+  private getParserForFile(filePath: string): ILanguageParser | null {
+    const dotIndex = filePath.lastIndexOf('.');
+    if (dotIndex === -1) return null;
+    const ext = filePath.substring(dotIndex);
+    
+    for (const parser of this.parsers.values()) {
+      if (parser.getFileExtensions().includes(ext)) {
+        return parser;
+      }
+    }
+    
+    return null;
+  }
+
+  private async parseFile(filePath: string): Promise<any[]> {
+    const parser = this.getParserForFile(filePath);
+    if (!parser) {
+      throw new Error(`No parser found for file: ${filePath}`);
+    }
+
+    return await parser.parseFile(filePath);
+  }
+
+  private estimateOutputFiles(sourceFiles: string[]): number {
+    const uniqueNames = new Set(sourceFiles.map(file => 
+      basename(file).replace(/\.(cpp|h|hpp|cxx|cc|c)$/, '')
+    ));
+    return uniqueNames.size + Math.floor(sourceFiles.length * 2);
+  }
+
+  private async countGeneratedFiles(outputDirectory: string): Promise<number> {
+    try {
+      const files = await FileUtils.getAllFiles(outputDirectory);
+      return files.filter((file: string) => file.endsWith('.md')).length;
+    } catch {
+      return 0;
+    }
+  }
+}

+ 184 - 0
src/services/qdrantService.ts

@@ -0,0 +1,184 @@
+import { QdrantClient } from '@qdrant/qdrant-js';
+import { config } from '../config';
+import { DocumentChunk, FileMetadata } from '../lib/fileProcessor';
+
+export interface QdrantPoint {
+  id: number | string;
+  vector: number[];
+  payload: {
+    text: string;
+    fileHash: string;
+    filename: string;
+    lastModified: number;
+    filePath: string;
+    paragraphIndex: number;
+  };
+}
+
+export class QdrantService {
+  private client: QdrantClient;
+
+  constructor() {
+    this.client = new QdrantClient({
+      url: config.qdrant.url,
+      apiKey: config.qdrant.apiKey || undefined,
+    });
+  }
+
+  async createCollection(collectionName: string, vectorSize: number): Promise<void> {
+    try {
+      const collections = await this.client.getCollections();
+      const exists = collections.collections.some(c => c.name === collectionName);
+
+      if (!exists) {
+        await this.client.createCollection(collectionName, {
+          vectors: {
+            size: vectorSize,
+            distance: 'Cosine',
+          },
+        });
+        console.log(`Collection '${collectionName}' created successfully`);
+      } else {
+        console.log(`Collection '${collectionName}' already exists`);
+      }
+    } catch (error) {
+      console.error('Error creating collection:', error);
+      throw error;
+    }
+  }
+
+  async storeChunks(collectionName: string, chunks: DocumentChunk[], embeddings: number[][]): Promise<void> {
+    if (chunks.length !== embeddings.length) {
+      throw new Error('Chunks and embeddings count must match');
+    }
+
+    const batchSize = 100; // Process in batches to avoid memory issues
+    const totalBatches = Math.ceil(chunks.length / batchSize);
+
+    for (let batchIndex = 0; batchIndex < totalBatches; batchIndex++) {
+      const startIdx = batchIndex * batchSize;
+      const endIdx = Math.min((batchIndex + 1) * batchSize, chunks.length);
+      
+      const batchChunks = chunks.slice(startIdx, endIdx);
+      const batchEmbeddings = embeddings.slice(startIdx, endIdx);
+
+      const points: QdrantPoint[] = batchChunks.map((chunk, index) => ({
+        id: startIdx + index,
+        vector: batchEmbeddings[index],
+        payload: {
+          text: chunk.text,
+          fileHash: chunk.metadata.hash,
+          filename: chunk.metadata.filename,
+          lastModified: chunk.metadata.lastModified,
+          filePath: chunk.metadata.filePath,
+          paragraphIndex: chunk.paragraphIndex,
+        },
+      }));
+
+      try {
+        await this.client.upsert(collectionName, {
+          wait: true,
+          points: points,
+        });
+        console.log(`Batch ${batchIndex + 1}/${totalBatches} stored (${endIdx} chunks total)`);
+      } catch (error) {
+        console.error('Error storing batch:', error);
+        throw error;
+      }
+    }
+
+    console.log(`Successfully stored all ${chunks.length} chunks in collection '${collectionName}'`);
+  }
+
+  async search(collectionName: string, queryVector: number[], limit: number = 10): Promise<any[]> {
+    try {
+      const searchResult = await this.client.search(collectionName, {
+        vector: queryVector,
+        limit,
+        with_payload: true,
+      });
+
+      return searchResult.map(result => ({
+        id: result.id,
+        score: result.score,
+        payload: result.payload,
+      }));
+    } catch (error) {
+      console.error('Error searching:', error);
+      throw error;
+    }
+  }
+
+  getClient() {
+    return this.client;
+  }
+
+  async getCollectionInfo(collectionName: string): Promise<any> {
+    try {
+      return await this.client.getCollection(collectionName);
+    } catch (error) {
+      console.error('Error getting collection info:', error);
+      throw error;
+    }
+  }
+
+  async deleteCollection(collectionName: string): Promise<void> {
+    try {
+      await this.client.deleteCollection(collectionName);
+      console.log(`Collection '${collectionName}' deleted`);
+    } catch (error) {
+      console.error('Error deleting collection:', error);
+      throw error;
+    }
+  }
+
+  async getExistingFileHashes(collectionName: string): Promise<Set<string>> {
+    try {
+      // Check if collection exists first
+      const collections = await this.getClient().getCollections();
+      const collectionExists = collections.collections.some(c => c.name === collectionName);
+      
+      if (!collectionExists) {
+        return new Set<string>();
+      }
+
+      // Get all points in the collection with only the fileHash field
+      const scrollResult = await this.client.scroll(collectionName, {
+        limit: 10000, // Adjust based on expected collection size
+        with_payload: ['fileHash'],
+        with_vector: false,
+      });
+
+      const hashes = new Set<string>();
+      scrollResult.points.forEach((point: any) => {
+        if (point.payload?.fileHash) {
+          hashes.add(point.payload.fileHash);
+        }
+      });
+
+      return hashes;
+    } catch (error) {
+      console.error('Error getting existing file hashes:', error);
+      return new Set<string>();
+    }
+  }
+
+  async deletePointsByFileHash(collectionName: string, fileHash: string): Promise<void> {
+    try {
+      await this.client.delete(collectionName, {
+        filter: {
+          must: [
+            {
+              key: 'fileHash',
+              match: { value: fileHash }
+            }
+          ]
+        }
+      });
+      console.log(`Deleted existing points for file hash: ${fileHash}`);
+    } catch (error) {
+      console.error('Error deleting points by file hash:', error);
+      throw error;
+    }
+  }
+}

+ 92 - 0
src/types/parser-types.ts

@@ -0,0 +1,92 @@
+export interface SourceLocation {
+  filePath: string;
+  line: number;
+  column: number;
+  endLine?: number;
+  endColumn?: number;
+}
+
+export interface DocumentationComment {
+  type: 'doxline' | 'doxyblock' | 'javadoc' | 'unknown';
+  rawContent: string;
+  brief: string;
+  detailed: string;
+  tags: DocTag[];
+  location: SourceLocation;
+}
+
+export interface DocTag {
+  name: string;
+  value: string;
+}
+
+export interface ASTNode {
+  type: 'namespace' | 'class' | 'struct' | 'function' | 'method' | 
+        'variable' | 'enum' | 'enum_value' | 'template' | 'module';
+  name: string;
+  documentation: DocumentationComment;
+  location: SourceLocation;
+  children: ASTNode[];
+  [key: string]: any; // Allow type-specific properties
+}
+
+export interface DocumentationConfig {
+  outputDirectory: string;
+  indexTitle: string;
+  generatorName: string;
+  generateIndex: boolean;
+  generateModuleIndexes: boolean;
+  includePrivate: boolean;
+  includeSourceLinks: boolean;
+  sourceRootPath?: string;
+  theme: 'material' | 'github' | 'default';
+}
+
+export interface ParserConfig {
+  languages: string[];
+  includePatterns: string[];
+  excludePatterns: string[];
+  outputPath: string;
+  watchMode: boolean;
+  incremental: boolean;
+}
+
+export interface ParseOptions {
+  inputPath: string;
+  outputPath: string;
+  languages: string[];
+  configPath?: string;
+  watch?: boolean;
+  incremental?: boolean;
+  includePrivate?: boolean;
+  dryRun?: boolean;
+}
+
+export interface ParseResult {
+  processedFiles: number;
+  generatedFiles: number;
+  filesToProcess: string[];
+  errors: string[];
+  estimatedFiles: number;
+}
+
+export interface ValidationResult {
+  inputValid: boolean;
+  configValid: boolean;
+  supportedFiles: string[];
+  unsupportedFiles: string[];
+  warnings: string[];
+}
+
+export interface LanguageInfo {
+  name: string;
+  description: string;
+  extensions: string[];
+}
+
+export interface ILanguageParser {
+  getLanguage(): string;
+  getFileExtensions(): string[];
+  canParse(filePath: string): boolean;
+  parseFile(filePath: string): Promise<ASTNode[]>;
+}

+ 572 - 0
technical-specification.md

@@ -0,0 +1,572 @@
+# Dynamic Source Code Parser - Technical Specification
+
+## Core Interfaces
+
+### Parser Interface
+```cpp
+// parser_interface.h
+#pragma once
+
+#include <memory>
+#include <vector>
+#include <string>
+
+// Forward declarations
+class ASTNode;
+struct SourceLocation;
+
+/**
+ * @brief Interface for language-specific parsers
+ * 
+ * Each language parser implements this interface to provide
+ * language-agnostic access to parsed source code structures.
+ */
+class ILanguageParser {
+public:
+    virtual ~ILanguageParser() = default;
+    
+    /**
+     * @brief Parse a single source file and return AST nodes
+     * @param filePath Path to the source file
+     * @return Vector of AST nodes representing the file contents
+     */
+    virtual std::vector<std::unique_ptr<ASTNode>> parseFile(const std::string& filePath) = 0;
+    
+    /**
+     * @brief Get the programming language name
+     * @return String identifier for the language
+     */
+    virtual std::string getLanguage() const = 0;
+    
+    /**
+     * @brief Get file extensions supported by this parser
+     * @return Vector of file extensions (e.g., {".cpp", ".h", ".hpp"})
+     */
+    virtual std::vector<std::string> getFileExtensions() const = 0;
+    
+    /**
+     * @brief Check if a file can be parsed by this parser
+     * @param filePath Path to check
+     * @return true if the file extension is supported
+     */
+    virtual bool canParse(const std::string& filePath) const = 0;
+};
+```
+
+### AST Node Hierarchy
+```cpp
+// ast_nodes.h
+#pragma once
+
+#include <string>
+#include <vector>
+#include <memory>
+#include <optional>
+
+/**
+ * @brief Source code location information
+ */
+struct SourceLocation {
+    std::string filePath;
+    int line = 0;
+    int column = 0;
+    int endLine = 0;
+    int endColumn = 0;
+};
+
+/**
+ * @brief Documentation comment with tags
+ */
+struct DocumentationComment {
+    enum class Type {
+        DOXYLINE,    ///< line comment ("///" or "//!")
+        DOXYBLOCK,   ///< block comment ("/*! ... */")
+        JAVADOC,     ///< JavaDoc-style block ("/** ... */")
+        UNKNOWN
+    };
+    
+    Type type = Type::UNKNOWN;
+    std::string rawContent;
+    std::string brief;
+    std::string detailed;
+    std::vector<std::string> tags;  // @param, @return, @brief, etc.
+    SourceLocation location;
+};
+
+/**
+ * @brief Base class for all AST nodes
+ */
+class ASTNode {
+public:
+    enum class NodeType {
+        NAMESPACE,
+        CLASS,
+        STRUCT,
+        FUNCTION,
+        METHOD,
+        VARIABLE,
+        ENUM,
+        ENUM_VALUE,
+        TEMPLATE,
+        MODULE,
+        UNKNOWN
+    };
+    
+protected:
+    NodeType nodeType;
+    std::string name;
+    DocumentationComment documentation;
+    SourceLocation location;
+    std::vector<std::unique_ptr<ASTNode>> children;
+    
+public:
+    explicit ASTNode(NodeType type) : nodeType(type) {}
+    virtual ~ASTNode() = default;
+    
+    // Getters
+    NodeType getType() const { return nodeType; }
+    const std::string& getName() const { return name; }
+    const DocumentationComment& getDocumentation() const { return documentation; }
+    const SourceLocation& getLocation() const { return location; }
+    const std::vector<std::unique_ptr<ASTNode>>& getChildren() const { return children; }
+    
+    // Setters
+    void setName(const std::string& n) { name = n; }
+    void setDocumentation(const DocumentationComment& doc) { documentation = doc; }
+    void setLocation(const SourceLocation& loc) { location = loc; }
+    
+    // Child management
+    void addChild(std::unique_ptr<ASTNode> child) { children.push_back(std::move(child)); }
+    
+    // Visitor pattern support
+    virtual void accept(class ASTVisitor& visitor) = 0;
+};
+
+/**
+ * @brief Namespace or module node
+ */
+class NamespaceNode : public ASTNode {
+public:
+    NamespaceNode() : ASTNode(NodeType::NAMESPACE) {}
+    void accept(ASTVisitor& visitor) override;
+};
+
+/**
+ * @brief Class or struct definition
+ */
+class ClassNode : public ASTNode {
+private:
+    bool isStruct = false;
+    std::vector<std::string> baseClasses;
+    std::vector<std::string> templateParameters;
+    std::string accessSpecifier;  // public, protected, private
+    
+public:
+    ClassNode() : ASTNode(NodeType::CLASS) {}
+    
+    bool getIsStruct() const { return isStruct; }
+    void setIsStruct(bool s) { isStruct = s; }
+    
+    const std::vector<std::string>& getBaseClasses() const { return baseClasses; }
+    void addBaseClass(const std::string& base) { baseClasses.push_back(base); }
+    
+    const std::vector<std::string>& getTemplateParameters() const { return templateParameters; }
+    void addTemplateParameter(const std::string& param) { templateParameters.push_back(param); }
+    
+    const std::string& getAccessSpecifier() const { return accessSpecifier; }
+    void setAccessSpecifier(const std::string& access) { accessSpecifier = access; }
+    
+    void accept(ASTVisitor& visitor) override;
+};
+
+/**
+ * @brief Function or method parameter
+ */
+struct Parameter {
+    std::string type;
+    std::string name;
+    std::string defaultValue;
+    DocumentationComment documentation;
+};
+
+/**
+ * @brief Function or method definition
+ */
+class FunctionNode : public ASTNode {
+private:
+    std::string returnType;
+    std::vector<Parameter> parameters;
+    bool isStatic = false;
+    bool isVirtual = false;
+    bool isConst = false;
+    bool isConstructor = false;
+    bool isDestructor = false;
+    std::string accessSpecifier;
+    
+public:
+    FunctionNode() : ASTNode(NodeType::FUNCTION) {}
+    
+    const std::string& getReturnType() const { return returnType; }
+    void setReturnType(const std::string& type) { returnType = type; }
+    
+    const std::vector<Parameter>& getParameters() const { return parameters; }
+    void addParameter(const Parameter& param) { parameters.push_back(param); }
+    
+    bool getIsStatic() const { return isStatic; }
+    void setIsStatic(bool s) { isStatic = s; }
+    
+    bool getIsVirtual() const { return isVirtual; }
+    void setIsVirtual(bool v) { isVirtual = v; }
+    
+    bool getIsConst() const { return isConst; }
+    void setIsConst(bool c) { isConst = c; }
+    
+    bool getIsConstructor() const { return isConstructor; }
+    void setIsConstructor(bool c) { isConstructor = c; }
+    
+    bool getIsDestructor() const { return isDestructor; }
+    void setIsDestructor(bool d) { isDestructor = d; }
+    
+    const std::string& getAccessSpecifier() const { return accessSpecifier; }
+    void setAccessSpecifier(const std::string& access) { accessSpecifier = access; }
+    
+    void accept(ASTVisitor& visitor) override;
+};
+
+/**
+ * @brief Variable or field declaration
+ */
+class VariableNode : public ASTNode {
+private:
+    std::string type;
+    std::string defaultValue;
+    bool isStatic = false;
+    bool isConst = false;
+    std::string accessSpecifier;
+    
+public:
+    VariableNode() : ASTNode(NodeType::VARIABLE) {}
+    
+    const std::string& getType() const { return type; }
+    void setType(const std::string& t) { type = t; }
+    
+    const std::string& getDefaultValue() const { return defaultValue; }
+    void setDefaultValue(const std::string& value) { defaultValue = value; }
+    
+    bool getIsStatic() const { return isStatic; }
+    void setIsStatic(bool s) { isStatic = s; }
+    
+    bool getIsConst() const { return isConst; }
+    void setIsConst(bool c) { isConst = c; }
+    
+    const std::string& getAccessSpecifier() const { return accessSpecifier; }
+    void setAccessSpecifier(const std::string& access) { accessSpecifier = access; }
+    
+    void accept(ASTVisitor& visitor) override;
+};
+
+/**
+ * @brief Enum definition
+ */
+class EnumNode : public ASTNode {
+private:
+    bool isEnumClass = false;
+    std::string underlyingType;
+    std::vector<std::pair<std::string, std::string>> values; // name, value
+    
+public:
+    EnumNode() : ASTNode(NodeType::ENUM) {}
+    
+    bool getIsEnumClass() const { return isEnumClass; }
+    void setIsEnumClass(bool e) { isEnumClass = e; }
+    
+    const std::string& getUnderlyingType() const { return underlyingType; }
+    void setUnderlyingType(const std::string& type) { underlyingType = type; }
+    
+    const std::vector<std::pair<std::string, std::string>>& getValues() const { return values; }
+    void addValue(const std::string& name, const std::string& value = "") {
+        values.emplace_back(name, value);
+    }
+    
+    void accept(ASTVisitor& visitor) override;
+};
+```
+
+### Visitor Pattern Implementation
+```cpp
+// visitor.h
+#pragma once
+
+#include "ast_nodes.h"
+
+/**
+ * @brief Visitor interface for AST traversal
+ */
+class ASTVisitor {
+public:
+    virtual ~ASTVisitor() = default;
+    
+    // Node visitors
+    virtual void visit(NamespaceNode& node) = 0;
+    virtual void visit(ClassNode& node) = 0;
+    virtual void visit(FunctionNode& node) = 0;
+    virtual void visit(VariableNode& node) = 0;
+    virtual void visit(EnumNode& node) = 0;
+    
+    // Default traversal behavior
+    virtual void traverse(ASTNode& node) {
+        for (auto& child : node.getChildren()) {
+            child->accept(*this);
+        }
+    }
+};
+
+// Visitor method implementations
+inline void NamespaceNode::accept(ASTVisitor& visitor) {
+    visitor.visit(*this);
+    visitor.traverse(*this);
+}
+
+inline void ClassNode::accept(ASTVisitor& visitor) {
+    visitor.visit(*this);
+    visitor.traverse(*this);
+}
+
+inline void FunctionNode::accept(ASTVisitor& visitor) {
+    visitor.visit(*this);
+    visitor.traverse(*this);
+}
+
+inline void VariableNode::accept(ASTVisitor& visitor) {
+    visitor.visit(*this);
+    visitor.traverse(*this);
+}
+
+inline void EnumNode::accept(ASTVisitor& visitor) {
+    visitor.visit(*this);
+    visitor.traverse(*this);
+}
+```
+
+### Documentation Generator
+```cpp
+// documentation_generator.h
+#pragma once
+
+#include "ast_nodes.h"
+#include "visitor.h"
+#include <string>
+#include <vector>
+#include <fstream>
+#include <filesystem>
+
+/**
+ * @brief Configuration for documentation generation
+ */
+struct DocumentationConfig {
+    std::string outputDirectory = "docs";
+    std::string indexTitle = "Documentation";
+    std::string generatorName = "docs-parser";
+    bool generateIndex = true;
+    bool generateModuleIndexes = true;
+    bool includePrivate = false;
+    bool includeSourceLinks = false;
+    std::string sourceRootPath;
+    std::string theme = "material";  // material, github, etc.
+};
+
+/**
+ * @brief Markdown documentation generator
+ */
+class DocumentationGenerator : public ASTVisitor {
+private:
+    DocumentationConfig config;
+    std::filesystem::path outputPath;
+    std::vector<std::string> moduleStack;
+    std::ofstream currentFile;
+    std::string currentContent;
+    
+    struct ModuleInfo {
+        std::string name;
+        std::string path;
+        std::vector<std::string> classes;
+        std::vector<std::string> functions;
+        std::vector<std::string> submodules;
+    };
+    
+    std::vector<ModuleInfo> modules;
+    
+public:
+    explicit DocumentationGenerator(const DocumentationConfig& cfg);
+    
+    /**
+     * @brief Generate documentation from parsed AST
+     * @param nodes Root AST nodes from parser
+     */
+    void generate(const std::vector<std::unique_ptr<ASTNode>>& nodes);
+    
+    // Visitor methods
+    void visit(NamespaceNode& node) override;
+    void visit(ClassNode& node) override;
+    void visit(FunctionNode& node) override;
+    void visit(VariableNode& node) override;
+    void visit(EnumNode& node) override;
+    
+private:
+    // File operations
+    void createDirectoryStructure();
+    void openFile(const std::filesystem::path& filePath);
+    void closeFile();
+    void writeContent(const std::string& content);
+    
+    // Generation methods
+    void generateIndex(const std::vector<std::unique_ptr<ASTNode>>& nodes);
+    void generateModuleIndexes();
+    void generateClassFile(ClassNode& classNode);
+    void generateNamespaceFile(NamespaceNode& namespaceNode);
+    
+    // Content formatting
+    std::string generateFunctionTable(const std::vector<FunctionNode*>& functions);
+    std::string generateFunctionDetails(const std::vector<FunctionNode*>& functions);
+    std::string generateClassHeader(ClassNode& classNode);
+    std::string generateInheritanceDiagram(ClassNode& classNode);
+    std::string formatDocumentation(const DocumentationComment& doc);
+    std::string formatParameters(const std::vector<Parameter>& params);
+    std::string escapeMarkdown(const std::string& text);
+    
+    // Utility methods
+    std::string getModulePath(const std::string& moduleName);
+    std::string getAnchorLink(const std::string& name);
+    std::string getMaterialIcon(const std::string& type);
+    std::string formatFunctionSignature(const FunctionNode& func);
+    
+    // Module management
+    void enterModule(const std::string& moduleName);
+    void exitModule();
+    ModuleInfo& getCurrentModule();
+    void addToCurrentModule(const std::string& type, const std::string& name);
+};
+```
+
+### C++ Parser Implementation Sketch
+```cpp
+// cpp_parser.h
+#pragma once
+
+#include "parser_interface.h"
+#include <regex>
+#include <map>
+
+/**
+ * @brief C++ language parser
+ */
+class CppParser : public ILanguageParser {
+private:
+    // Token types
+    enum class TokenType {
+        KEYWORD,
+        IDENTIFIER,
+        SYMBOL,
+        COMMENT,
+        PREPROCESSOR,
+        STRING_LITERAL,
+        NUMBER,
+        WHITESPACE,
+        UNKNOWN
+    };
+    
+    struct Token {
+        TokenType type;
+        std::string value;
+        SourceLocation location;
+    };
+    
+    // Comment tracking
+    struct CommentInfo {
+        DocumentationComment comment;
+        bool attached = false;
+    };
+    
+    std::map<std::pair<int, int>, CommentInfo> pendingComments;
+    
+public:
+    std::vector<std::unique_ptr<ASTNode>> parseFile(const std::string& filePath) override;
+    std::string getLanguage() const override { return "cpp"; }
+    std::vector<std::string> getFileExtensions() const override {
+        return {".cpp", ".h", ".hpp", ".cxx", ".cc", ".c"};
+    }
+    bool canParse(const std::string& filePath) const override;
+    
+private:
+    // Tokenization
+    std::vector<Token> tokenize(const std::string& source, const std::string& filePath);
+    
+    // Parsing methods
+    std::vector<std::unique_ptr<ASTNode>> parseTokens(const std::vector<Token>& tokens);
+    std::unique_ptr<ClassNode> parseClass(const std::vector<Token>& tokens, size_t& pos);
+    std::unique_ptr<FunctionNode> parseFunction(const std::vector<Token>& tokens, size_t& pos);
+    std::unique_ptr<NamespaceNode> parseNamespace(const std::vector<Token>& tokens, size_t& pos);
+    std::unique_ptr<VariableNode> parseVariable(const std::vector<Token>& tokens, size_t& pos);
+    std::unique_ptr<EnumNode> parseEnum(const std::vector<Token>& tokens, size_t& pos);
+    
+    // Comment processing
+    void extractComments(const std::string& source);
+    DocumentationComment parseComment(const std::string& content, SourceLocation location);
+    void attachComments(ASTNode* node, int line);
+    
+    // Utility methods
+    bool isKeyword(const std::string& token) const;
+    bool isTypeQualifier(const std::string& token) const;
+    bool isAccessSpecifier(const std::string& token) const;
+    std::string parseQualifiedName(const std::vector<Token>& tokens, size_t& pos);
+    Parameter parseParameter(const std::vector<Token>& tokens, size_t& pos);
+    
+    // Regular expressions
+    // std::regex has no dotall flag; [\s\S] matches across lines instead.
+    std::regex commentRegex = std::regex(R"((/\*\*[\s\S]*?\*/|///[^\n]*|//![^\n]*))");
+    std::regex doxygenTagRegex = std::regex(R"(@(\w+)(?:\s+(.+?))?(?=\s+@|\s*$))");
+};
+```
+
+## Usage Example
+
+```cpp
+// main.cpp
+#include "cpp_parser.h"
+#include "documentation_generator.h"
+#include <iostream>
+
+// Assumes a findSourceFiles(root, extensions) helper that recursively
+// collects matching source files; its implementation is omitted here.
+
+int main(int argc, char* argv[]) {
+    if (argc < 3) {
+        std::cerr << "Usage: docs-parser <input> <output>" << std::endl;
+        return 1;
+    }
+    
+    // Configure documentation generation
+    DocumentationConfig config;
+    config.outputDirectory = argv[2];
+    config.indexTitle = "My Project Documentation";
+    config.generatorName = "docs-parser";
+    
+    // Create C++ parser
+    CppParser parser;
+    DocumentationGenerator generator(config);
+    
+    // Parse source files
+    std::vector<std::unique_ptr<ASTNode>> allNodes;
+    std::vector<std::string> sourceFiles = findSourceFiles(argv[1], {"cpp", "h", "hpp"});
+    
+    for (const auto& file : sourceFiles) {
+        auto nodes = parser.parseFile(file);
+        for (auto& node : nodes) {
+            allNodes.push_back(std::move(node));
+        }
+    }
+    
+    // Generate documentation
+    generator.generate(allNodes);
+    
+    std::cout << "Documentation generated in " << config.outputDirectory << std::endl;
+    return 0;
+}
+```
+
+This technical specification provides the foundation for implementing a robust, extensible documentation parser with clean separation of concerns and a pluggable architecture for supporting multiple programming languages.

+ 28 - 0
test-output/index.md

@@ -0,0 +1,28 @@
+---
+generator: docs-rag-parser
+---
+
+# Source Code Documentation
+
+:material-package: [test-calculator](test-calculator/index.md)
+:   A simple calculator class providing basic arithmetic operations for everyday calculations: addition, subtraction, multiplication, division, and memory storage. Also contains a global utility function demonstrating global documentation patterns.
+
+## Types
+
+| Name | Description |
+| ---- | ----------- |
+| [Calculator](test-calculator/Calculator.md) | A simple calculator class providing basic arithmetic operations: addition, subtraction, multiplication, division, and memory storage. |
+

+ 68 - 0
test-output/test-calculator/Calculator.md

@@ -0,0 +1,68 @@
+---
+generator: docs-rag-parser
+---
+
+# Calculator
+
+**class Calculator**
+
+@brief A simple calculator class
+This class provides basic arithmetic operations for everyday calculations.
+
+Features:
+- Addition and subtraction
+- Multiplication and division
+- Memory storage functionality
+
+
+
+## Functions
+
+| Name | Description |
+| --- | --- |
+| [Calculator](#calculator) | Default constructor that initializes memory to zero |
+| [add](#add) | Add two numbers together |
+| [multiply](#multiply) | Multiply two numbers |
+| [storeInMemory](#storeinmemory) | Store a value in memory |
+
+## Function Details
+
+### Calculator<a name="calculator"></a>
+!!! function "Calculator();"
+
+    @brief Default constructor that initializes memory to zero
+    
+    
+
+### add<a name="add"></a>
+!!! function "double add(double a, double b);"
+
+    @brief Add two numbers together
+    
+    @param a First number to add
+    @param b Second number to add
+    @return Sum of a and b
+    @example double result = calc.add(5.0, 3.0); // result = 8.0
+    
+    
+
+### multiply<a name="multiply"></a>
+!!! function "double multiply(double a, double b);"
+
+    @brief Multiply two numbers
+    
+    @param a First number to multiply
+    @param b Second number to multiply
+    @return Product of a and b
+    
+    
+
+### storeInMemory<a name="storeinmemory"></a>
+!!! function "void storeInMemory(double value);"
+
+    @brief Store a value in memory
+    
+    @param value The value to store
+    
+    
+

+ 18 - 0
test-output/test-calculator/index.md

@@ -0,0 +1,18 @@
+---
+generator: docs-rag-parser
+---
+
+# test-calculator
+
+## Types
+
+| Name | Description |
+| ---- | ----------- |
+| [Calculator](Calculator.md) | A simple calculator class providing basic arithmetic operations: addition, subtraction, multiplication, division, and memory storage. |
+

+ 26 - 0
test-output/test-calculator/square.md

@@ -0,0 +1,26 @@
+---
+generator: docs-rag-parser
+---
+
+# square
+
+**function square**
+
+@brief Global utility function for calculations
+This function demonstrates global documentation patterns.
+
+@param value The value to square
+@return The squared value
+
+
+
+## Signature
+
+```cpp
+double square(double value);
+```
+
+## Source Location
+
+File: `/data/docs-rag/test-calculator.h`:54
+

+ 23 - 0
tsconfig.json

@@ -0,0 +1,23 @@
+{
+  "compilerOptions": {
+    "target": "ES2022",
+    "module": "CommonJS",
+    "moduleResolution": "node",
+    "allowSyntheticDefaultImports": true,
+    "esModuleInterop": true,
+    "allowJs": true,
+    "strict": true,
+    "forceConsistentCasingInFileNames": true,
+    "skipLibCheck": true,
+    "declaration": true,
+    "outDir": "./dist",
+    "rootDir": "./src",
+    "baseUrl": ".",
+    "paths": {
+      "@/*": ["./src/*"],
+      "@types/*": ["./src/types/*"]
+    }
+  },
+  "include": ["src/**/*"],
+  "exclude": ["node_modules", "dist"]
+}

+ 1003 - 0
typescript-implementation.md

@@ -0,0 +1,1003 @@
+# TypeScript Source Code Parser - Quick Start Implementation
+
+## Project Integration
+
+This implementation integrates directly into the existing `docs-rag` TypeScript project.
+
+## 1. Update package.json
+
+Add new dependencies to `package.json`:
+
+```json
+{
+  "dependencies": {
+    "@types/node": "^24.10.1",
+    "commander": "^14.0.2",
+    "fs-extra": "^11.3.2",
+    "glob": "^13.0.0",
+    "typescript": "^5.9.3",
+    "zod": "^4.1.12",
+    "chokidar": "^4.0.1"
+  },
+  "scripts": {
+    "build": "tsc",
+    "start": "node dist/cli/index.js",
+    "dev": "tsx src/cli/index.ts",
+    "mcp": "tsx src/mcp/server.ts",
+    "cli": "node dist/cli/index.js",
+    "parse": "node dist/cli/index.js parse",
+    "parse:dev": "tsx src/cli/index.ts parse"
+  }
+}
+```
+
+## 2. Core Implementation Files
+
+### Create Directory Structure
+
+```bash
+mkdir -p src/lib/parser/{core,parsers,ast,utils}
+mkdir -p src/types
+```
+
+### Type Definitions (src/types/parser-types.ts)
+
+```typescript
+export interface SourceLocation {
+  filePath: string;
+  line: number;
+  column: number;
+  endLine?: number;
+  endColumn?: number;
+}
+
+export interface DocumentationComment {
+  type: 'doxline' | 'doxyblock' | 'javadoc' | 'unknown';
+  rawContent: string;
+  brief: string;
+  detailed: string;
+  tags: DocTag[];
+  location: SourceLocation;
+}
+
+export interface DocTag {
+  name: string;
+  value: string;
+}
+
+export interface ASTNode {
+  type: 'namespace' | 'class' | 'struct' | 'function' | 'method' | 
+        'variable' | 'enum' | 'enum_value' | 'template' | 'module';
+  name: string;
+  documentation: DocumentationComment;
+  location: SourceLocation;
+  children: ASTNode[];
+  [key: string]: any; // Allow type-specific properties
+}
+
+export interface DocumentationConfig {
+  outputDirectory: string;
+  indexTitle: string;
+  generatorName: string;
+  generateIndex: boolean;
+  generateModuleIndexes: boolean;
+  includePrivate: boolean;
+  includeSourceLinks: boolean;
+  sourceRootPath?: string;
+  theme: 'material' | 'github' | 'default';
+}
+
+export interface ParserConfig {
+  languages: string[];
+  includePatterns: string[];
+  excludePatterns: string[];
+  outputPath: string;
+  watchMode: boolean;
+  incremental: boolean;
+}
+```
+
+### Parser Interface (src/lib/parser/core/interfaces.ts)
+
+```typescript
+import { ASTNode, DocumentationComment, SourceLocation, DocumentationConfig } from '../../../types/parser-types';
+
+export interface ILanguageParser {
+  getLanguage(): string;
+  getFileExtensions(): string[];
+  canParse(filePath: string): boolean;
+  parseFile(filePath: string): Promise<ASTNode[]>;
+}
+
+export interface ParseOptions {
+  inputPath: string;
+  outputPath: string;
+  languages: string[];
+  configPath?: string;
+  watch?: boolean;
+  incremental?: boolean;
+  includePrivate?: boolean;
+  dryRun?: boolean;
+}
+
+export interface ParseResult {
+  processedFiles: number;
+  generatedFiles: number;
+  filesToProcess: string[];
+  errors: string[];
+  estimatedFiles?: number; // only populated for dry runs
+}
+
+export interface ValidationResult {
+  inputValid: boolean;
+  configValid: boolean;
+  supportedFiles: string[];
+  unsupportedFiles: string[];
+  warnings: string[];
+}
+
+export interface LanguageInfo {
+  name: string;
+  description: string;
+  extensions: string[];
+}
+```
+
+### File Utilities (src/lib/parser/utils/file-utils.ts)
+
+```typescript
+import { promises as fs } from 'fs';
+import { dirname, join } from 'path';
+import { glob } from 'glob';
+
+export class FileUtils {
+  static async readFile(filePath: string): Promise<string> {
+    return await fs.readFile(filePath, 'utf-8');
+  }
+
+  static async writeFile(filePath: string, content: string): Promise<void> {
+    // Use dirname() so this also works with Windows-style path separators
+    await this.ensureDirectory(dirname(filePath));
+    await fs.writeFile(filePath, content, 'utf-8');
+  }
+
+  static async ensureDirectory(dirPath: string): Promise<void> {
+    try {
+      await fs.access(dirPath);
+    } catch {
+      await fs.mkdir(dirPath, { recursive: true });
+    }
+  }
+
+  static async exists(filePath: string): Promise<boolean> {
+    try {
+      await fs.access(filePath);
+      return true;
+    } catch {
+      return false;
+    }
+  }
+
+  static async getAllFiles(dirPath: string): Promise<string[]> {
+    const files = await glob(join(dirPath, '**/*'), {
+      nodir: true,
+      absolute: true
+    });
+    return files;
+  }
+
+  static async getAllSourceFiles(dirPath: string, extensions: string[]): Promise<string[]> {
+    // Backticks are required here: a single-quoted string would not interpolate
+    const pattern = join(dirPath, `**/*{${extensions.join(',')}}`);
+    const files = await glob(pattern, {
+      nodir: true,
+      absolute: true
+    });
+    return files;
+  }
+}
+```
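A quick standalone check of the brace-expansion pattern `getAllSourceFiles` hands to `glob`. Note the backticks: with ordinary single quotes the string would contain the literal text `${extensions.join(',')}` rather than the extensions.

```typescript
import { join } from 'node:path';

// Build the same brace-expansion pattern getAllSourceFiles passes to glob.
const extensions = ['.cpp', '.h', '.hpp'];
const pattern = join('src', `**/*{${extensions.join(',')}}`);

console.log(pattern); // e.g. src/**/*{.cpp,.h,.hpp} on POSIX
```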
+
+### Markdown Utilities (src/lib/parser/utils/markdown-utils.ts)
+
+```typescript
+export class MarkdownUtils {
+  static escape(text: string): string {
+    return text
+      .replace(/\\/g, '\\\\')
+      .replace(/#/g, '\\#')
+      .replace(/\*/g, '\\*')
+      .replace(/_/g, '\\_')
+      .replace(/`/g, '\\`')
+      .replace(/\[/g, '\\[')
+      .replace(/\]/g, '\\]');
+  }
+
+  static generateAnchor(text: string): string {
+    return text
+      .toLowerCase()
+      .replace(/[^a-z0-9\s-]/g, '')
+      .replace(/\s+/g, '-')
+      .replace(/-+/g, '-')
+      .replace(/^-|-$/g, '');
+  }
+
+  static formatCodeBlock(code: string, language?: string): string {
+    const lang = language || '';
+    return `\`\`\`${lang}\n${code}\n\`\`\``;
+  }
+
+  static formatTable(headers: string[], rows: string[][]): string {
+    if (rows.length === 0) return '';
+
+    let table = '| ' + headers.join(' | ') + ' |\n';
+    table += '| ' + headers.map(() => '---').join(' | ') + ' |\n';
+
+    for (const row of rows) {
+      table += '| ' + row.map(cell => cell || '').join(' | ') + ' |\n';
+    }
+
+    return table;
+  }
+}
+```
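The slug rules in `generateAnchor` are easy to verify in isolation; the function is re-declared here so the sketch runs on its own:

```typescript
// Re-declaration of MarkdownUtils.generateAnchor for a standalone check.
function generateAnchor(text: string): string {
  return text
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, '') // drop symbols
    .replace(/\s+/g, '-')         // spaces become dashes
    .replace(/-+/g, '-')          // collapse runs of dashes
    .replace(/^-|-$/g, '');       // trim leading/trailing dashes
}

console.log(generateAnchor('storeInMemory')); // → "storeinmemory"
console.log(generateAnchor('operator =='));   // symbols drop, no trailing dash
```

This matches the lowercase anchors seen in the generated output above (e.g. `<a name="storeinmemory">`).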
+
+### Base Parser (src/lib/parser/parsers/base-parser.ts)
+
+```typescript
+import { ILanguageParser } from '../core/interfaces';
+
+export abstract class BaseParser implements ILanguageParser {
+  abstract getLanguage(): string;
+  abstract getFileExtensions(): string[];
+  abstract parseFile(filePath: string): Promise<any[]>;
+
+  canParse(filePath: string): boolean {
+    const ext = filePath.substring(filePath.lastIndexOf('.'));
+    return this.getFileExtensions().includes(ext);
+  }
+
+  protected createEmptyDocumentation() {
+    return {
+      type: 'unknown' as const,
+      rawContent: '',
+      brief: '',
+      detailed: '',
+      tags: [],
+      location: { filePath: '', line: 0, column: 0 }
+    };
+  }
+
+  protected extractBrief(text: string): string {
+    const sentences = text.split(/[.!?]/);
+    if (sentences.length > 1 && sentences[0].length < 100) {
+      return sentences[0].trim() + '.';
+    }
+    
+    const lines = text.split('\n');
+    return lines[0].trim() || text.substring(0, 80).trim();
+  }
+}
+```
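The brief-extraction heuristic deserves a standalone check: the brief is the first sentence when one exists and is under 100 characters, otherwise the first line (or the first 80 characters). Re-declared here so the sketch is self-contained:

```typescript
// Re-declaration of BaseParser.extractBrief for a standalone check.
function extractBrief(text: string): string {
  const sentences = text.split(/[.!?]/);
  if (sentences.length > 1 && sentences[0].length < 100) {
    return sentences[0].trim() + '.';
  }
  const lines = text.split('\n');
  return lines[0].trim() || text.substring(0, 80).trim();
}

console.log(extractBrief('A simple calculator class. Provides basic arithmetic.'));
console.log(extractBrief('no sentence terminator here'));
```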
+
+### C++ Parser (src/lib/parser/parsers/cpp-parser.ts)
+
+```typescript
+import { BaseParser } from './base-parser';
+import { FileUtils } from '../utils/file-utils';
+import { ASTNode, DocumentationComment } from '../../../types/parser-types';
+
+export class CppParser extends BaseParser {
+  getLanguage(): string {
+    return 'cpp';
+  }
+
+  getFileExtensions(): string[] {
+    return ['.cpp', '.h', '.hpp', '.cxx', '.cc', '.c'];
+  }
+
+  async parseFile(filePath: string): Promise<ASTNode[]> {
+    const content = await FileUtils.readFile(filePath);
+    const lines = content.split('\n');
+    
+    const comments = this.extractComments(lines);
+    const nodes = await this.parseCodeElements(lines, comments, filePath);
+    
+    return nodes;
+  }
+
+  private extractComments(lines: string[]): Map<number, DocumentationComment> {
+    // Keyed by the 0-based index of the first code line after the comment,
+    // so a declaration at line i can be associated via comments.get(i) —
+    // this also works for multi-line /** ... */ blocks.
+    const comments = new Map<number, DocumentationComment>();
+    
+    for (let i = 0; i < lines.length; i++) {
+      const trimmed = lines[i].trim();
+      
+      if (trimmed.startsWith('/**')) {
+        const comment = this.parseBlockComment(lines, i);
+        if (comment) {
+          const height = comment.rawContent.trimEnd().split('\n').length;
+          comments.set(i + height, comment);
+          i += height - 1;
+        }
+      } else if (trimmed.startsWith('///') || trimmed.startsWith('//!')) {
+        const comment = this.parseLineComment(lines, i);
+        if (comment) {
+          comments.set(i + 1, comment);
+        }
+      }
+    }
+    
+    return comments;
+  }
+
+  private parseBlockComment(lines: string[], startIndex: number): DocumentationComment | null {
+    let content = '';
+    let i = startIndex;
+    
+    while (i < lines.length && !lines[i].includes('*/')) {
+      content += lines[i] + '\n';
+      i++;
+    }
+    if (i < lines.length) {
+      content += lines[i] + '\n';
+    }
+    
+    return this.parseCommentContent(content, 'doxyblock', startIndex + 1);
+  }
+
+  private parseLineComment(lines: string[], startIndex: number): DocumentationComment | null {
+    const line = lines[startIndex];
+    return this.parseCommentContent(line, 'doxline', startIndex + 1);
+  }
+
+  private parseCommentContent(rawContent: string, type: 'doxline' | 'doxyblock', line: number): DocumentationComment {
+    const cleaned = rawContent
+      .replace(/\/\*\*|\/\*|\*\/|\/\/\/|\/\/!/g, '')
+      .split('\n')
+      .map(line => line.trim().replace(/^\*\s?/, ''))
+      .join('\n')
+      .trim();
+
+    const tagRegex = /@(\w+)(?:\s+(.+?))?(?=\s+@|$)/gs;
+    const tags: { name: string; value: string }[] = [];
+    let tagMatch;
+    
+    while ((tagMatch = tagRegex.exec(cleaned)) !== null) {
+      tags.push({
+        name: tagMatch[1],
+        value: tagMatch[2]?.trim() || ''
+      });
+    }
+
+    const beforeTags = cleaned.split('@')[0].trim();
+    const brief = this.extractBrief(beforeTags);
+
+    return {
+      type,
+      rawContent,
+      brief,
+      detailed: beforeTags.substring(brief.length).trim(),
+      tags,
+      location: { filePath: '', line, column: 0 }
+    };
+  }
+
+  private getCommentHeight(content: string): number {
+    return content.split('\n').length;
+  }
+
+  private async parseCodeElements(
+    lines: string[], 
+    comments: Map<number, DocumentationComment>, 
+    filePath: string
+  ): Promise<ASTNode[]> {
+    const nodes: ASTNode[] = [];
+    
+    for (let i = 0; i < lines.length; i++) {
+      const line = lines[i];
+      
+      const classMatch = line.match(/^\s*(class|struct)\s+(\w+)/);
+      if (classMatch) {
+        const node = await this.parseClass(lines, i, comments, filePath);
+        if (node) {
+          nodes.push(node);
+        }
+      }
+      
+      const functionMatch = line.match(/^\s*(?:\w+\s+)*(\w+)\s*\([^)]*\)\s*(?:\{|;)/);
+      if (functionMatch && !line.includes('class')) {
+        const node = await this.parseFunction(lines, i, comments, filePath);
+        if (node) {
+          nodes.push(node);
+        }
+      }
+    }
+    
+    return nodes;
+  }
+
+  private async parseClass(
+    lines: string[], 
+    startIndex: number,
+    comments: Map<number, DocumentationComment>,
+    filePath: string
+  ): Promise<ASTNode | null> {
+    const line = lines[startIndex];
+    const match = line.match(/^\s*(class|struct)\s+(\w+)/);
+    
+    if (!match) return null;
+    
+    const [, type, name] = match;
+    
+    let comment = comments.get(startIndex);
+    if (!comment) {
+      comment = comments.get(startIndex - 1);
+    }
+    
+    return {
+      type: type as 'class' | 'struct',
+      name,
+      documentation: comment || this.createEmptyDocumentation(),
+      location: { filePath, line: startIndex + 1, column: 0 },
+      children: [],
+      isStruct: type === 'struct'
+    };
+  }
+
+  private async parseFunction(
+    lines: string[], 
+    startIndex: number,
+    comments: Map<number, DocumentationComment>,
+    filePath: string
+  ): Promise<ASTNode | null> {
+    const line = lines[startIndex];
+    const match = line.match(/^\s*(?:\w+\s+)*(\w+)\s*\([^)]*\)\s*(?:\{|;)/);
+    
+    if (!match) return null;
+    
+    const [, name] = match;
+    
+    let comment = comments.get(startIndex);
+    if (!comment) {
+      comment = comments.get(startIndex - 1);
+    }
+    
+    return {
+      type: 'function',
+      name,
+      documentation: comment || this.createEmptyDocumentation(),
+      location: { filePath, line: startIndex + 1, column: 0 },
+      children: [],
+      signature: line.trim()
+    };
+  }
+}
+```
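The tag regex in `parseCommentContent` is the subtle part: the lazy `.+?` combined with the `(?=\s+@|$)` lookahead (and the `s` flag) lets a tag value run across lines until the next `@tag` or the end of the cleaned comment. A standalone sketch, with illustrative names only:

```typescript
// Standalone sketch of the tag extraction performed in parseCommentContent.
function extractTags(cleaned: string): { name: string; value: string }[] {
  // Lazy value + lookahead: each value ends at the next "@tag" or at the end.
  const tagRegex = /@(\w+)(?:\s+(.+?))?(?=\s+@|$)/gs;
  const tags: { name: string; value: string }[] = [];
  let m: RegExpExecArray | null;
  while ((m = tagRegex.exec(cleaned)) !== null) {
    tags.push({ name: m[1], value: m[2]?.trim() ?? '' });
  }
  return tags;
}

const sample = [
  '@brief Add two numbers',
  '@param a First number',
  '@return Sum of a and b',
].join('\n');

console.log(extractTags(sample));
```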
+
+### Documentation Generator (src/lib/parser/core/documentation-generator.ts)
+
+```typescript
+import { ASTNode, DocumentationConfig } from '../../../types/parser-types';
+import { FileUtils } from '../utils/file-utils';
+import { MarkdownUtils } from '../utils/markdown-utils';
+
+export class DocumentationGenerator {
+  private config: DocumentationConfig;
+
+  constructor(config: DocumentationConfig) {
+    this.config = config;
+  }
+
+  async generate(nodes: ASTNode[]): Promise<void> {
+    await FileUtils.ensureDirectory(this.config.outputDirectory);
+    
+    if (this.config.generateIndex) {
+      await this.generateIndex(nodes);
+    }
+    
+    await this.generateModuleDocumentation(nodes);
+    await this.generateNodeFiles(nodes);
+  }
+
+  private async generateIndex(nodes: ASTNode[]): Promise<void> {
+    const modules = this.organizeByModule(nodes);
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${this.config.indexTitle}
+
+`;
+
+    for (const [moduleName, moduleNodes] of modules.entries()) {
+      content += this.generateModuleSection(moduleName, moduleNodes);
+    }
+
+    const indexPath = `${this.config.outputDirectory}/index.md`;
+    await FileUtils.writeFile(indexPath, content);
+  }
+
+  private organizeByModule(nodes: ASTNode[]): Map<string, ASTNode[]> {
+    const modules = new Map<string, ASTNode[]>();
+    
+    for (const node of nodes) {
+      const moduleName = this.extractModuleName(node);
+      if (!modules.has(moduleName)) {
+        modules.set(moduleName, []);
+      }
+      modules.get(moduleName)!.push(node);
+    }
+    
+    return modules;
+  }
+
+  private extractModuleName(node: ASTNode): string {
+    if (node.type === 'namespace') {
+      return node.name;
+    }
+    
+    const pathParts = node.location.filePath.split('/');
+    const fileName = pathParts[pathParts.length - 1];
+    const moduleName = fileName.replace(/\.(cpp|h|hpp|cxx|cc|c)$/, '');
+    
+    return moduleName;
+  }
+
+  private generateModuleSection(moduleName: string, nodes: ASTNode[]): string {
+    let section = `:material-package: [${moduleName}](${moduleName}/index.md)
+:   ${this.generateModuleDescription(nodes)}
+
+`;
+
+    const classes = nodes.filter(n => n.type === 'class' || n.type === 'struct');
+    const functions = nodes.filter(n => n.type === 'function' || n.type === 'method');
+
+    if (classes.length > 0) {
+      section += '## Types\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const cls of classes) {
+        const desc = cls.documentation.brief || '';
+        section += `| [${cls.name}](${moduleName}/${cls.name}.md) | ${desc} |\n`;
+      }
+      section += '\n';
+    }
+
+    return section;
+  }
+
+  private async generateModuleDocumentation(nodes: ASTNode[]): Promise<void> {
+    const modules = this.organizeByModule(nodes);
+    
+    for (const [moduleName, moduleNodes] of modules.entries()) {
+      await this.generateModuleFile(moduleName, moduleNodes);
+    }
+  }
+
+  private async generateModuleFile(moduleName: string, nodes: ASTNode[]): Promise<void> {
+    const modulePath = `${this.config.outputDirectory}/${moduleName}`;
+    await FileUtils.ensureDirectory(modulePath);
+    
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${moduleName}
+
+`;
+
+    const classes = nodes.filter(n => n.type === 'class' || n.type === 'struct');
+    const functions = nodes.filter(n => n.type === 'function' || n.type === 'method');
+
+    if (classes.length > 0) {
+      content += '## Types\n\n| Name | Description |\n| ---- | ----------- |\n';
+      for (const cls of classes) {
+        const desc = cls.documentation.brief || '';
+        content += `| [${cls.name}](${cls.name}.md) | ${desc} |\n`;
+      }
+      content += '\n';
+    }
+
+    const indexPath = `${this.config.outputDirectory}/${moduleName}/index.md`;
+    await FileUtils.writeFile(indexPath, content);
+  }
+
+  private async generateNodeFiles(nodes: ASTNode[]): Promise<void> {
+    for (const node of nodes) {
+      if (node.type === 'class' || node.type === 'struct') {
+        await this.generateClassFile(node);
+      } else if (node.type === 'function' || node.type === 'method') {
+        await this.generateFunctionFile(node);
+      }
+    }
+  }
+
+  private async generateClassFile(node: ASTNode): Promise<void> {
+    const moduleName = this.extractModuleName(node);
+    const filePath = `${this.config.outputDirectory}/${moduleName}/${node.name}.md`;
+    
+    let content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${node.name}
+
+**${node.type} ${node.name}**
+
+${this.formatDocumentation(node.documentation)}
+
+`;
+
+    const methods = node.children.filter(n => n.type === 'function' || n.type === 'method');
+    if (methods.length > 0) {
+      content += '## Functions\n\n';
+      content += this.generateFunctionTable(methods);
+      content += '\n## Function Details\n\n';
+      content += this.generateFunctionDetails(methods);
+    }
+
+    await FileUtils.writeFile(filePath, content);
+  }
+
+  private async generateFunctionFile(node: ASTNode): Promise<void> {
+    const moduleName = this.extractModuleName(node);
+    const filePath = `${this.config.outputDirectory}/${moduleName}/${node.name}.md`;
+    
+    const content = `---
+generator: ${this.config.generatorName}
+---
+
+# ${node.name}
+
+**function ${node.name}**
+
+${this.formatDocumentation(node.documentation)}
+
+## Signature
+
+${MarkdownUtils.formatCodeBlock(node.signature || node.name, 'cpp')}
+
+## Source Location
+
+File: \`${node.location.filePath}\`:${node.location.line}
+
+`;
+
+    await FileUtils.writeFile(filePath, content);
+  }
+
+  private generateFunctionTable(functions: ASTNode[]): string {
+    const headers = ['Name', 'Description'];
+    const rows = functions.map(func => {
+      const desc = func.documentation.brief || '';
+      const anchor = MarkdownUtils.generateAnchor(func.name);
+      return [`[${func.name}](#${anchor})`, desc];
+    });
+    
+    return MarkdownUtils.formatTable(headers, rows);
+  }
+
+  private generateFunctionDetails(functions: ASTNode[]): string {
+    let details = '';
+    
+    for (const func of functions) {
+      const anchor = MarkdownUtils.generateAnchor(func.name);
+      details += `### ${func.name}<a name="${anchor}"></a>\n`;
+      details += `!!! function "${this.formatFunctionSignature(func)}"\n\n`;
+      details += `    ${this.formatDocumentation(func.documentation).replace(/\n/g, '\n    ')}\n\n`;
+    }
+    
+    return details;
+  }
+
+  private formatDocumentation(doc: DocumentationComment): string {
+    if (!doc.brief && !doc.detailed) return '';
+    
+    let formatted = '';
+    if (doc.brief) formatted += `@brief ${doc.brief}\n\n`;
+    if (doc.detailed) formatted += `${doc.detailed}\n\n`;
+    
+    return formatted;
+  }
+
+  private formatFunctionSignature(func: ASTNode): string {
+    return func.signature || `${func.name}()`;
+  }
+
+  private generateModuleDescription(nodes: ASTNode[]): string {
+    const descriptions = nodes
+      .map(n => n.documentation.brief)
+      .filter(Boolean)
+      .slice(0, 2);
+    
+    return descriptions.join(' ') || 'Module containing various components and utilities.';
+  }
+}
+```
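The module-name heuristic in `extractModuleName` is worth seeing in isolation: for non-namespace nodes it is simply the source file's base name minus its C/C++ extension. A standalone sketch (the helper name is illustrative):

```typescript
// Standalone sketch of DocumentationGenerator.extractModuleName for
// non-namespace nodes: base file name, stripped of the C/C++ extension.
function moduleNameForFile(filePath: string): string {
  const parts = filePath.split('/');
  const fileName = parts[parts.length - 1];
  return fileName.replace(/\.(cpp|h|hpp|cxx|cc|c)$/, '');
}

console.log(moduleNameForFile('/data/docs-rag/test-calculator.h'));
console.log(moduleNameForFile('src/math/vec3.hpp'));
```

This is why the test header earlier in the diff lands under `test-output/test-calculator/`.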
+
+## 3. CLI Integration
+
+### Update Main CLI (src/cli/index.ts)
+
+```typescript
+#!/usr/bin/env node
+
+import { Command } from 'commander';
+import { DocumentService } from '../services/documentService';
+import { existsSync } from 'fs';
+import { parserCommands } from './parser-commands';
+
+const program = new Command();
+const documentService = new DocumentService();
+
+program
+  .name('docs-rag')
+  .description('CLI tool for managing markdown documents in Qdrant and parsing source code')
+  .version('1.0.0');
+
+// ... existing commands remain unchanged ...
+
+// Add parser commands
+parserCommands(program);
+
+program.parse();
+```
+
+### Parser Commands (src/cli/parser-commands.ts)
+
+```typescript
+#!/usr/bin/env node
+
+import { Command } from 'commander';
+import { ParseService } from '../services/parseService';
+import { existsSync } from 'fs';
+import { resolve } from 'path';
+
+const parseService = new ParseService();
+
+export const parserCommands = (program: Command) => {
+  program
+    .command('parse')
+    .description('Generate documentation from source code comments')
+    .requiredOption('-i, --input <path>', 'Input directory containing source files')
+    .requiredOption('-o, --output <path>', 'Output directory for generated documentation')
+    .option('-l, --languages <languages>', 'Comma-separated list of languages to parse', 'cpp')
+    .option('-w, --watch', 'Watch for file changes and regenerate', false)
+    .option('--dry-run', 'Show what would be parsed without generating files', false)
+    .action(async (options) => {
+      if (!existsSync(options.input)) {
+        console.error(`Error: Input directory '${options.input}' does not exist`);
+        process.exit(1);
+      }
+
+      try {
+        const result = await parseService.generateDocumentation({
+          inputPath: resolve(options.input),
+          outputPath: resolve(options.output),
+          languages: options.languages.split(',').map((l: string) => l.trim()),
+          watch: options.watch,
+          dryRun: options.dryRun
+        });
+
+        if (options.dryRun) {
+          console.log('Dry run results:');
+          console.log(`  Files to process: ${result.filesToProcess.length}`);
+          result.filesToProcess.forEach(file => console.log(`    - ${file}`));
+        } else {
+          console.log(`Documentation generated successfully!`);
+          console.log(`  Processed ${result.processedFiles} files`);
+          console.log(`  Generated ${result.generatedFiles} documentation files`);
+          console.log(`  Output directory: ${options.output}`);
+        }
+      } catch (error) {
+        console.error('Error:', error);
+        process.exit(1);
+      }
+    });
+
+  program
+    .command('parse-list-languages')
+    .description('List all supported parser languages')
+    .action(() => {
+      const languages = parseService.getSupportedLanguages();
+      console.log('Supported languages:');
+      languages.forEach(lang => {
+        console.log(`  ${lang.name}: ${lang.description}`);
+        console.log(`    Extensions: ${lang.extensions.join(', ')}`);
+        console.log();
+      });
+    });
+};
+```
+
+### Parse Service (src/services/parseService.ts)
+
+```typescript
+import { 
+  ILanguageParser, 
+  DocumentationConfig, 
+  ParseOptions, 
+  ParseResult,
+  ValidationResult,
+  LanguageInfo
+} from '../lib/parser/core/interfaces';
+import { CppParser } from '../lib/parser/parsers/cpp-parser';
+import { DocumentationGenerator } from '../lib/parser/core/documentation-generator';
+import { FileUtils } from '../lib/parser/utils/file-utils';
+
+export class ParseService {
+  private parsers: Map<string, ILanguageParser> = new Map();
+
+  constructor() {
+    this.registerDefaultParsers();
+  }
+
+  private registerDefaultParsers(): void {
+    this.parsers.set('cpp', new CppParser());
+  }
+
+  getSupportedLanguages(): LanguageInfo[] {
+    return Array.from(this.parsers.entries()).map(([key, parser]) => ({
+      name: key,
+      description: `${key.toUpperCase()} source code parser`,
+      extensions: parser.getFileExtensions()
+    }));
+  }
+
+  async generateDocumentation(options: ParseOptions): Promise<ParseResult> {
+    const config = this.createDefaultConfig(options);
+    const sourceFiles = await this.findSourceFiles(options.inputPath, options.languages);
+    
+    if (options.dryRun) {
+      return {
+        processedFiles: 0,
+        generatedFiles: 0,
+        filesToProcess: sourceFiles,
+        errors: [],
+        estimatedFiles: this.estimateOutputFiles(sourceFiles)
+      };
+    }
+
+    const nodes = [];
+    const errors = [];
+
+    for (const filePath of sourceFiles) {
+      try {
+        const fileNodes = await this.parseFile(filePath);
+        nodes.push(...fileNodes);
+      } catch (error) {
+        errors.push(`Failed to parse ${filePath}: ${error}`);
+      }
+    }
+
+    const generator = new DocumentationGenerator(config);
+    await generator.generate(nodes);
+
+    const generatedFiles = await this.countGeneratedFiles(config.outputDirectory);
+
+    return {
+      processedFiles: sourceFiles.length,
+      generatedFiles,
+      filesToProcess: sourceFiles,
+      errors
+    };
+  }
+
+  private createDefaultConfig(options: ParseOptions): DocumentationConfig {
+    return {
+      outputDirectory: options.outputPath,
+      indexTitle: 'Source Code Documentation',
+      generatorName: 'docs-rag-parser',
+      generateIndex: true,
+      generateModuleIndexes: true,
+      includePrivate: options.includePrivate || false,
+      includeSourceLinks: true,
+      sourceRootPath: options.inputPath,
+      theme: 'material'
+    };
+  }
+
+  private async findSourceFiles(inputPath: string, languages: string[]): Promise<string[]> {
+    const allFiles = await FileUtils.getAllFiles(inputPath);
+    
+    return allFiles.filter(file => {
+      const parser = this.getParserForFile(file);
+      return parser && languages.includes(parser.getLanguage());
+    });
+  }
+
+  private getParserForFile(filePath: string): ILanguageParser | null {
+    const ext = filePath.substring(filePath.lastIndexOf('.'));
+    
+    for (const parser of this.parsers.values()) {
+      if (parser.getFileExtensions().includes(ext)) {
+        return parser;
+      }
+    }
+    
+    return null;
+  }
+
+  private async parseFile(filePath: string): Promise<any[]> {
+    const parser = this.getParserForFile(filePath);
+    if (!parser) {
+      throw new Error(`No parser found for file: ${filePath}`);
+    }
+
+    return await parser.parseFile(filePath);
+  }
+
+  private estimateOutputFiles(sourceFiles: string[]): number {
+    const uniqueNames = new Set(sourceFiles.map(file => 
+      require('path').basename(file).replace(/\.(cpp|h|hpp|cxx|cc|c)$/, '')
+    ));
+    return uniqueNames.size + Math.floor(sourceFiles.length * 2);
+  }
+
+  private async countGeneratedFiles(outputDirectory: string): Promise<number> {
+    try {
+      const files = await FileUtils.getAllFiles(outputDirectory);
+      return files.filter(file => file.endsWith('.md')).length;
+    } catch {
+      return 0;
+    }
+  }
+}
+```
+
+## 4. Build and Run
+
+```bash
+# Install dependencies
+npm install
+
+# Build the project
+npm run build
+
+# Run the parser
+npm run parse -- -i /path/to/cpp/source -o ./output/docs
+
+# Dry run to see what would be processed
+npm run parse -- -i /path/to/cpp/source -o ./output/docs --dry-run
+
+# List supported languages
+node dist/cli/index.js parse-list-languages
+```
+
+## 5. Test Example
+
+Create a test C++ file:
+
+```cpp
+/**
+ * @brief A simple calculator class
+ * This class provides basic arithmetic operations.
+ */
+class Calculator {
+public:
+    /// @brief Add two numbers
+    int add(int a, int b);
+    
+    /**
+     * @brief Multiply two numbers
+     * @param a First number
+     * @param b Second number
+     * @return Product of a and b
+     */
+    int multiply(int a, int b);
+};
+```
+
+Run the parser:
+
+```bash
+npm run parse -- -i . -o ./test-docs
+```
+
+Expected output structure:
+```
+test-docs/
+├── index.md
+└── Calculator/
+    ├── index.md
+    └── Calculator.md
+```
+
+This implementation provides a solid foundation for a TypeScript-based source code parser integrated into the existing docs-rag project.