Driflyte MCP Server

Bridging AI assistants with deep, topic-aware knowledge from web and code sources.

9 Stars · 2 Forks · 9 Watchers · 2 Issues
Driflyte MCP Server acts as a bridge between AI-powered assistants and diverse, topic-aware content sources by exposing a Model Context Protocol (MCP) server. It enables retrieval-augmented generation workflows by crawling, indexing, and serving topic-specific documents from web pages and GitHub repositories. The system is extensible, with planned support for additional knowledge sources, and is designed for easy integration with popular AI tools such as ChatGPT, Claude, and VS Code.

Key Features

Recursively crawls and indexes web pages
Crawls GitHub repositories, issues, and discussions
Tags documents with topics for precise retrieval
Supports both STDIO and streamable HTTP transport protocols
Works with popular MCP clients like ChatGPT, Claude, Cursor, VS Code
Configurable rate limiting for fair usage
Extensible to future content sources (Slack, Google Docs, etc.)
No authentication or registration required
Built for RAG workflows
AI-powered and AI-native design

Use Cases

Enhancing AI assistant responses with grounded, topic-specific context
Indexing and searching knowledge from web pages and GitHub projects
Powering retrieval-augmented generation (RAG) workflows in AI tools
Seamlessly connecting external documentation to developer IDEs
Conducting deep research from diverse and recursively crawled online sources
Augmenting conversational AI with up-to-date external knowledge
Retrieving and surfacing relevant code or documentation from repositories
Supporting domain-specific question answering for enterprise knowledge bases
Providing contextually relevant data to LLM-based copilots
Rapidly prototyping AI tools that require external knowledge grounding

README

Driflyte MCP Server

MCP Server for Driflyte.

The Driflyte MCP Server exposes tools that allow AI assistants to query and retrieve topic-specific knowledge from recursively crawled and indexed web pages. With this MCP server, Driflyte acts as a bridge between diverse, topic-aware content sources (web, GitHub, and more) and AI-powered reasoning, enabling richer, more accurate answers.

What It Does

  • Deep Web Crawling: Recursively follows links to crawl and index web pages.
  • GitHub Integration: Crawls repositories, issues, and discussions.
  • Extensible Resource Support: Future support planned for Slack, Microsoft Teams, Google Docs/Drive, Confluence, JIRA, Zendesk, Salesforce, and more.
  • Topic-Aware Indexing: Each document is tagged with one or more topics, enabling targeted, topic-specific retrieval.
  • Designed for RAG with RAG: The server itself is built with Retrieval-Augmented Generation (RAG) in mind, and it powers RAG workflows by providing assistants with high-quality, topic-specific documents as grounding context.
  • Designed for AI with AI: The system is not just for AI assistants — it is also designed and evolved using AI itself, making it an AI-native component for intelligent knowledge retrieval.

Usage & Limits

  • Free Access: Driflyte is currently free to use.
  • No Signup Required: You can start using it immediately — no registration or subscription needed.
  • Rate Limits: To ensure fair usage, requests are limited by IP:
    • 100 API requests per 5 minutes per IP address.
  • Future changes to usage policies and limits may be introduced as new features and resource integrations become available.

Prerequisites

  • Node.js 18+
  • An AI assistant (with MCP client) like Cursor, Claude (Desktop or Code), VS Code, Windsurf, etc ...

Configurations

CLI Arguments

The Driflyte MCP server supports the following CLI arguments for configuration (a usage sketch follows the list):

  • --transport <stdio|streamable-http> – Configures the transport protocol (defaults to stdio).
  • --port <number> – Configures the port number to listen on when using the streamable-http transport (defaults to 3000).
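
As an illustration, the sketch below shows how an MCP client built with the official TypeScript SDK could launch the local server over STDIO and pass these flags on the command line. This is a minimal sketch, not part of the Driflyte codebase: it assumes the @modelcontextprotocol/sdk package is available, and the client name used is a placeholder.

typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the Driflyte MCP server as a child process and talk to it over STDIO.
// "--transport stdio" is the default, so the flag is shown only for clarity.
// For HTTP, you would instead start the server with:
//   npx -y @driflyte/mcp-server --transport streamable-http --port 3000
// and connect with an HTTP-capable client transport.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@driflyte/mcp-server", "--transport", "stdio"],
});

const client = new Client(
  { name: "driflyte-example-client", version: "0.1.0" }, // placeholder client identity
  { capabilities: {} }
);

await client.connect(transport);

// Ask the server which tools it exposes (expected: list-topics and search).
console.log(await client.listTools());

await client.close();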

Quick Start

This MCP server (using STDIO or Streamable HTTP transport) can be added to any MCP client such as VS Code, Claude, Cursor, Windsurf, or GitHub Copilot via the @driflyte/mcp-server NPM package.

ChatGPT

  • Navigate to Settings under your profile and enable Developer Mode under the Connectors option.
  • In the chat panel, click the + icon, and from the dropdown, select Developer Mode. You’ll see an option to add sources/connectors.
  • Enter the following MCP Server details and then click Create:
    • Name: Driflyte
    • MCP Server URL: https://mcp.driflyte.com/openai
    • Authentication: No authentication
    • Trust Setting: Check I trust this application

See "How to set up a remote MCP server and connect it to ChatGPT deep research" and "MCP server tools now in ChatGPT – developer mode" for more info.

Claude Code

Run the following command. See Claude Code MCP docs for more info.

Local Server

bash
claude mcp add driflyte -- npx -y @driflyte/mcp-server

Remote Server

bash
claude mcp add --transport http driflyte https://mcp.driflyte.com/mcp

Claude Desktop

Local Server

Add the following configuration into the claude_desktop_config.json file. See the Claude Desktop MCP docs for more info.

json
{
  "mcpServers": {
    "driflyte": {
      "command": "npx",
      "args": ["-y", "@driflyte/mcp-server"]
    }
  }
}

Remote Server

Go to Settings > Connectors > Add Custom Connector in Claude Desktop and add the new MCP server with the following fields:

  • Name: Driflyte
  • Remote MCP server URL: https://mcp.driflyte.com/mcp

Copilot Coding Agent

Add the following configuration to the mcpServers section of your Copilot Coding Agent configuration through Repository > Settings > Copilot > Coding agent > MCP configuration. See the Copilot Coding Agent MCP docs for more info.

Local Server

json
{
  "mcpServers": {
    "driflyte": {
      "type": "local",
      "command": "npx",
      "args": ["-y", "@driflyte/mcp-server"]
    }
  }
}

Remote Server

json
{
  "mcpServers": {
    "driflyte": {
      "type": "http",
      "url": "https://mcp.driflyte.com/mcp"
    }
  }
}

Cursor

Add the following configuration into the ~/.cursor/mcp.json file (or .cursor/mcp.json in your project folder), or set it up via 🖱️ One-Click Installation. See the Cursor MCP docs for more info.

Local Server

json
{
  "mcpServers": {
    "driflyte": {
      "command": "npx",
      "args": ["-y", "@driflyte/mcp-server"]
    }
  }
}

Remote Server

json
{
  "mcpServers": {
    "driflyte": {
      "url": "https://mcp.driflyte.com/mcp"
    }
  }
}

Gemini CLI

Add the following configuration into the ~/.gemini/settings.json file. See the Gemini CLI MCP docs for more info.

Local Server

json
{
  "mcpServers": {
    "driflyte": {
      "command": "npx",
      "args": ["-y", "@driflyte/mcp-server"]
    }
  }
}

Remote Server

json
{
  "mcpServers": {
    "driflyte": {
      "httpUrl": "https://mcp.driflyte.com/mcp"
    }
  }
}

Smithery

Run the following command. You can find your Smithery API key here. See the Smithery CLI docs for more info.

bash
npx -y @smithery/cli install @serkan-ozal/driflyte-mcp-server --client <SMITHERY-CLIENT-NAME> --key <SMITHERY-API-KEY>

VS Code

Add the following configuration into the .vscode/mcp.json file, or set it up via 🖱️ One-Click Installation. See the VS Code MCP docs for more info.

Local Server

json
{
  "mcp": {
    "servers": {
      "driflyte": {
        "type": "stdio",
        "command": "npx",
        "args": ["-y", "@driflyte/mcp-server"]
      }
    }
  }
}

Remote Server

json
{
  "mcp": {
    "servers": {
      "driflyte": {
        "type": "http",
        "url": "https://mcp.driflyte.com/mcp"
      }
    }
  }
}

Windsurf

Add the following configuration into the ~/.codeium/windsurf/mcp_config.json file. See the Windsurf MCP docs for more info.

Local Server

json
{
  "mcpServers": {
    "driflyte": {
      "command": "npx",
      "args": ["-y", "@driflyte/mcp-server"]
    }
  }
}

Remote Server

json
{
  "mcpServers": {
    "driflyte": {
      "serverUrl": "https://mcp.driflyte.com/mcp"
    }
  }
}

Components

Tools

  • list-topics: Returns a list of topics for which resources (web pages, etc ...) have been crawled and content is available. This allows AI assistants to discover the most relevant and up-to-date subject areas currently indexed by the crawler.
    • Input Schema: No input parameters are supported.
    • Output Schema:
      • topics:
        • Optional: false
        • Type: Array<string>
        • Description: List of the supported topics.
  • search: Given a list of topics and a user question, this tool retrieves the top-K most relevant documents from the crawled content. It is designed to help AI assistants surface the most contextually appropriate and up-to-date information for a specific topic and query. This enables more informed and accurate responses based on real-world, topic-tagged web content (see the usage sketch after this list).
    • Input Schema:
      • topics
        • Optional: false
        • Type: Array<string>
        • Description: A list of one or more topic identifiers to constrain the search space. Only documents tagged with at least one of these topics will be considered.
      • query
        • Optional: false
        • Type: string
        • Description: The natural language query or question for which relevant information is being sought. This will be used to rank documents by semantic relevance.
      • topK
        • Optional: true
        • Type: number
        • Default Value: 10
        • Min Value: 1
        • Max Value: 30
        • Description: The maximum number of relevant documents to return. Results are sorted by descending relevance score.
    • Output Schema:
      • documents:
        • Optional: false
        • Type: Array<Document>
        • Description: Matched documents to the search query.
        • Type: Document:
          • content - Optional: false - Type: string - Description: Related content (full or partial) of the matched document.
          • metadata - Optional: false - Type: Map<string, any> - Description: Metadata of the document and related content in key-value format.
          • score - Optional: false - Type: number - Min Value: 0 - Max Value: 1 - Description: Similarity score (between 0 and 1) for the content of the document.
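
As a usage illustration, the sketch below calls both tools against the hosted endpoint over streamable HTTP using the official TypeScript SDK. This is a minimal sketch rather than an official client: it assumes the @modelcontextprotocol/sdk package, and the client name, topic, and query values are placeholders ("opentelemetry" is taken from the repository's topic tags and may or may not be indexed).

typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Connect to the hosted Driflyte MCP server over streamable HTTP.
const transport = new StreamableHTTPClientTransport(new URL("https://mcp.driflyte.com/mcp"));
const client = new Client(
  { name: "driflyte-search-example", version: "0.1.0" }, // placeholder client identity
  { capabilities: {} }
);
await client.connect(transport);

// list-topics: discover which topics are currently indexed (no input parameters).
const topics = await client.callTool({ name: "list-topics", arguments: {} });
console.log(topics);

// search: retrieve the top 5 documents for a question, restricted to example topics.
const results = await client.callTool({
  name: "search",
  arguments: {
    topics: ["opentelemetry"],         // placeholder topic
    query: "How do I configure trace sampling?", // placeholder question
    topK: 5,
  },
});
console.log(JSON.stringify(results, null, 2));

await client.close();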

Resources

N/A

Roadmap

  • Support more content types (.pdf, .ppt/.pptx, .doc/.docx, and many other applicable formats, including audio and video)
  • Integrate with more data sources (Slack, Teams, Google Docs/Drive, Confluence, JIRA, Zendesk, Salesforce, etc ...)
  • Add more topics and their resources

Issues and Feedback

Please use GitHub Issues for bug reports, feature requests, and support.

Contribution

If you would like to contribute, please

  • Fork the repository on GitHub and clone your fork.
  • Create a branch and make your changes on it.
  • Send a pull request, clearly explaining your contribution.

Tip: Please check the existing pull requests for similar contributions and consider submitting an issue to discuss the proposed feature before writing code.

License

Licensed under MIT.


Repository Details

Language: TypeScript
Default Branch: master
Size: 333 KB
Contributors: 2
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

TypeScript: 62.55%
JavaScript: 29.58%
Shell: 6.33%
Dockerfile: 1.54%

Topics

ai, crawler, mcp, model-context-protocol, opentelemetry, rag


Related MCPs

Discover similar Model Context Protocol servers

  • GitHub MCP Server

    Connect AI tools directly to GitHub for repository, issue, and workflow management via natural language.

    GitHub MCP Server enables AI tools such as agents, assistants, and chatbots to interact natively with the GitHub platform. It allows these tools to access repositories, analyze code, manage issues and pull requests, and automate workflows using the Model Context Protocol (MCP). The server supports integration with multiple hosts, including VS Code and other popular IDEs, and can operate both remotely and locally. Built for developers seeking to enhance AI-powered development workflows through seamless GitHub context access.

    • 24,418
    • MCP
    • github/github-mcp-server
  • @growi/mcp-server

    Bridge GROWI wiki content to AI models with context-aware access and management.

    @growi/mcp-server acts as a Model Context Protocol (MCP) server that enables AI models to access, search, and manage GROWI wiki content within an organization. It facilitates seamless connection between multiple GROWI instances and language models, enhancing information retrieval and knowledge management capabilities. The platform provides comprehensive tools for page, tag, comment, and revision management as well as share link and user activity tracking. Its flexible configuration allows simultaneous operation with several GROWI apps for scalable deployment.

    • 10
    • MCP
    • growilabs/growi-mcp-server
  • MyMCP Server (All-in-One Model Context Protocol)

    Powerful and extensible Model Context Protocol server with developer and productivity integrations.

    MyMCP Server is a robust Model Context Protocol (MCP) server implementation that integrates with services like GitLab, Jira, Confluence, YouTube, Google Workspace, and more. It provides AI-powered search, contextual tool execution, and workflow automation for development and productivity tasks. The system supports extensive configuration and enables selective activation of grouped toolsets for various environments. Installation and deployment are streamlined, with both automated and manual setup options available.

    • 93
    • MCP
    • nguyenvanduocit/all-in-one-model-context-protocol
  • Google Workspace MCP Server

    Full natural language control of Google Workspace through the Model Context Protocol.

    Google Workspace MCP Server enables comprehensive natural language interaction with Google services such as Calendar, Drive, Gmail, Docs, Sheets, Slides, Forms, Tasks, and Chat via any MCP-compatible client or AI assistant. It supports both single-user and secure multi-user OAuth 2.1 authentication, providing a production-ready backend for custom apps. Built on FastMCP, it delivers high performance and advanced context handling, offering deep integration with the entire Google Workspace suite.

    • 890
    • MCP
    • taylorwilsdon/google_workspace_mcp
  • AgentQL MCP Server

    MCP-compliant server for structured web data extraction using AgentQL.

    AgentQL MCP Server acts as a Model Context Protocol (MCP) server that leverages AgentQL's data extraction capabilities to fetch structured information from web pages. It allows integration with applications supporting MCP, such as Claude Desktop, VS Code, and Cursor, by providing an accessible interface for extracting structured data based on user-defined prompts. With configurable API key support and streamlined installation, it simplifies the process of connecting web data extraction workflows to AI tools.

    • 120
    • MCP
    • tinyfish-io/agentql-mcp
  • Klavis

    One MCP server for AI agents to handle thousands of tools.

    Klavis provides an MCP (Model Context Protocol) server with over 100 prebuilt integrations for AI agents, enabling seamless connectivity with various tools and services. It offers both cloud-hosted and self-hosted deployment options and includes out-of-the-box OAuth support for secure authentication. Klavis is designed to act as an intelligent connector, streamlining workflow automation and enhancing agent capability through standardized context management.

    • 5,447
    • MCP
    • Klavis-AI/klavis