LLM Context

Reduce friction when providing context to LLMs with smart file selection and rule-based filtering.

Stars: 283
Forks: 26
Watchers: 283
Issues: 4
LLM Context streamlines sharing relevant project files and context with large language models. It uses smart file selection and customizable rule-based filtering to ensure only the most pertinent information is provided. Support for the Model Context Protocol (MCP) lets AI models request additional files through standardized commands, enabling instant project context sharing during conversations and reducing manual effort.

Key Features

Smart file selection based on project rules
Customizable rule system for project-specific contexts
Direct integration with Model Context Protocol (MCP)
Instant formatted context generation
Rule-based filtering and file inclusion/exclusion
Support for multiple context categories (prompt, filter, instruction, style, excerpt)
AI-assisted rule creation
Project-specific customization capabilities
Command-line interface for easy operations
Non-MCP environment support

Use Cases

Efficiently sharing project context with LLMs
Enabling AI-assisted code reviews and debugging
Rapid onboarding of AI models to new codebases
Automating context preparation for AI coding assistants
Customizing context for specific development tasks
Reducing manual effort in file selection for AI queries
Supporting team collaboration with standardized context delivery
Improving accuracy of AI responses by tailoring provided context
Integrating AI directly into development workflows
Filtering non-essential files to maintain context relevance

README

LLM Context


Reduce friction when providing context to LLMs. Share relevant project files instantly through smart selection and rule-based filtering.

The Problem

Getting project context into LLM chats is tedious:

  • Manually copying/pasting files takes forever
  • Hard to identify which files are relevant
  • Including too much hits context limits, too little misses important details
  • AI requests for additional files require manual fetching
  • Repeating this process for every conversation

The Solution

```bash
lc-select   # Smart file selection
lc-context  # Instant formatted context
# Paste and work - AI can access additional files seamlessly
```

Result: From "I need to share my project" to productive AI collaboration in seconds.

Note: This project was developed in collaboration with several Claude Sonnets (3.5, 3.6, 3.7 and 4.0), as well as Groks (3 and 4), using LLM Context itself to share code during development. All code in the repository is heavily human-curated (by me 😇, @restlessronin).

Installation

```bash
uv tool install "llm-context>=0.5.0"
```

Quick Start

Basic Usage

```bash
# One-time setup
cd your-project
lc-init

# Daily usage
lc-select
lc-context
```

MCP Integration (Recommended)

```jsonc
{
  "mcpServers": {
    "llm-context": {
      "command": "uvx",
      "args": ["--from", "llm-context", "lc-mcp"]
    }
  }
}
```

With MCP, AI can access additional files directly during conversations.

Project Customization

```bash
# Create project-specific filters
cat > .llm-context/rules/flt-repo-base.md << 'EOF'
---
compose:
  filters: [lc/flt-base]
gitignores:
  full-files: ["*.md", "/tests", "/node_modules"]
---
EOF

# Customize main development rule
cat > .llm-context/rules/prm-code.md << 'EOF'
---
instructions: [lc/ins-developer, lc/sty-python]
compose:
  filters: [flt-repo-base]
  excerpters: [lc/exc-base]
---
Additional project-specific guidelines and context.
EOF
```
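The rule files written above pair YAML frontmatter with a Markdown body. As a rough illustration of that layout (a minimal sketch, not llm-context's actual loader), the two parts can be separated like this in Python:

```python
# Illustrative sketch only: splitting a rule file such as
# .llm-context/rules/prm-code.md into YAML frontmatter and Markdown body.
# llm-context's real parsing logic may differ.
rule_text = """---
instructions: [lc/ins-developer, lc/sty-python]
compose:
  filters: [flt-repo-base]
  excerpters: [lc/exc-base]
---
Additional project-specific guidelines and context.
"""

def split_rule(text: str) -> tuple[str, str]:
    """Return (frontmatter, body) from a '---'-delimited rule file."""
    _, frontmatter, body = text.split("---", 2)
    return frontmatter.strip(), body.strip()

frontmatter, body = split_rule(rule_text)
print(body)  # Additional project-specific guidelines and context.
```

A real implementation would feed the frontmatter to a YAML parser; the point here is just the two-part file structure that both rule files above share.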

Core Commands

| Command | Purpose |
|---------|---------|
| `lc-init` | Initialize project configuration |
| `lc-select` | Select files based on current rule |
| `lc-context` | Generate and copy context |
| `lc-context -nt` | Generate context for non-MCP environments |
| `lc-set-rule <name>` | Switch between rules |
| `lc-missing` | Handle file and context requests (non-MCP) |

Rule System

Rules use a systematic five-category structure:

  • Prompt Rules (prm-): Generate project contexts (e.g., lc/prm-developer, lc/prm-rule-create)
  • Filter Rules (flt-): Control file inclusion (e.g., lc/flt-base, lc/flt-no-files)
  • Instruction Rules (ins-): Provide guidelines (e.g., lc/ins-developer, lc/ins-rule-framework)
  • Style Rules (sty-): Enforce coding standards (e.g., lc/sty-python, lc/sty-code)
  • Excerpt Rules (exc-): Configure extractions for context reduction (e.g., lc/exc-base)

Example Rule

```yaml
---
description: "Debug API authentication issues"
compose:
  filters: [lc/flt-no-files]
  excerpters: [lc/exc-base]
also-include:
  full-files: ["/src/auth/**", "/tests/auth/**"]
---
Focus on authentication system and related tests.
```
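The `also-include.full-files` entries above are glob-style patterns. As a rough illustration of how such patterns select paths (using Python's `fnmatch`, which is only an approximation since the tool documents gitignore-style semantics):

```python
# Illustration only: matching paths against glob-style include patterns.
# fnmatch's '*' also matches '/', which approximates '**' here, but
# llm-context's actual gitignore-style matching may differ.
from fnmatch import fnmatch

patterns = ["/src/auth/**", "/tests/auth/**"]

def included(path: str) -> bool:
    """True if any include pattern matches the given path."""
    return any(fnmatch(path, pat) for pat in patterns)

print(included("/src/auth/login.py"))   # True
print(included("/src/db/models.py"))    # False
```

With a rule like this, files under `/src/auth/` and `/tests/auth/` would be pulled in as full files even though `lc/flt-no-files` otherwise excludes everything.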

AI-Assisted Rule Creation

Let AI create focused rules for specific tasks:

```bash
# Automatic with Claude Skills (recommended)
lc-init  # Installs skill globally
# Then in Claude: "Create a rule for [your task]"
```

```bash
# Or prompt-based (any LLM)
lc-set-rule lc/prm-rule-create
lc-context -nt
# Describe your task to the AI
```

Both approaches analyze your codebase and generate optimized rules that can significantly reduce context size.

Workflow Patterns

Daily Development

```bash
lc-set-rule lc/prm-developer
lc-select
lc-context
# AI can review changes, access additional files as needed
```

Focused Tasks

```bash
# Let AI help create minimal context
lc-set-rule lc/prm-rule-create
lc-context -nt
# Work with AI to create task-specific rule using tmp-prm- prefix
```

MCP Benefits

  • Code review: AI examines your changes for completeness/correctness
  • Additional files: AI accesses initially excluded files when needed
  • Change tracking: See what's been modified during conversations
  • Zero friction: No manual file operations during development discussions

Key Features

  • Smart File Selection: Rules automatically include/exclude appropriate files
  • Instant Context Generation: Formatted context copied to clipboard in seconds
  • MCP Integration: AI can access additional files without manual intervention
  • Systematic Rule Organization: Five-category system for clear rule composition
  • AI-Assisted Rule Creation: Let AI help create minimal context for specific tasks
  • Code Excerpting: Extracts significant content to reduce context size while preserving structure

License

Apache License, Version 2.0. See LICENSE for details.

Repository Owner

cyberchitta (Organization)

Repository Details

Language: Python
Default Branch: main
Size: 879 KB
Contributors: 2
License: Apache License 2.0
MCP Verified: Nov 12, 2025

Programming Languages

Python: 85.43%
Jinja: 8.45%
Tree-sitter Query: 6.12%

Topics

claude-desktop cli coding model-context-protocol tools


Related MCPs

Discover similar Model Context Protocol servers

  • MCP CLI

    A powerful CLI for seamless interaction with Model Context Protocol servers and advanced LLMs.

    MCP CLI is a modular command-line interface designed for interacting with Model Context Protocol (MCP) servers and managing conversations with large language models. It integrates with the CHUK Tool Processor and CHUK-LLM to provide real-time chat, interactive command shells, and automation capabilities. The system supports a wide array of AI providers and models, advanced tool usage, context management, and performance metrics. Rich output formatting, concurrent tool execution, and flexible configuration make it suitable for both end-users and developers.

    • 1,755
    • MCP
    • chrishayuk/mcp-cli
  • VideoDB Agent Toolkit

    AI Agent toolkit that exposes VideoDB context to LLMs with MCP support

    VideoDB Agent Toolkit provides tools for exposing VideoDB context to large language models (LLMs) and agents, enabling integration with AI-driven IDEs and chat agents. It automates context generation, metadata management, and discoverability by offering structured context files like llms.txt and llms-full.txt, and standardized access via the Model Context Protocol (MCP). The toolkit ensures synchronization of SDK versions, comprehensive documentation, and best practices for seamless AI-powered workflows.

    • 43
    • MCP
    • video-db/agent-toolkit
  • MarkItDown

    Convert diverse files into Markdown for seamless LLM integration.

    MarkItDown is a lightweight Python utility for converting a wide range of file types—including PDF, Office documents, images, audio, websites, and more—into structured Markdown optimized for language models and text analysis tools. It includes an implementation of the Model Context Protocol (MCP) to facilitate integration with LLM applications, such as Claude Desktop. MarkItDown supports context-aware document conversions, prioritizing preservation of hierarchy and meaningful content, and can be used via CLI or as a library.

    • 82,918
    • MCP
    • microsoft/markitdown
  • Wanaku MCP Router

    A router connecting AI-enabled applications through the Model Context Protocol.

    Wanaku MCP Router serves as a middleware router facilitating standardized context exchange between AI-enabled applications and large language models via the Model Context Protocol (MCP). It streamlines context provisioning, allowing seamless integration and communication in multi-model AI environments. The tool aims to unify and optimize the way applications provide relevant context to LLMs, leveraging open protocol standards.

    • 87
    • MCP
    • wanaku-ai/wanaku
  • Lucidity MCP

    Intelligent prompt-based code quality analysis for AI coding assistants.

    Lucidity MCP is a Model Context Protocol (MCP) server that empowers AI coding assistants to deliver high-quality code through intelligent, prompt-driven analysis. It offers comprehensive detection of code issues across multiple quality dimensions, providing structured and actionable feedback. With language-agnostic capabilities, extensible framework, and flexible transport options, Lucidity MCP seamlessly integrates into developer workflows and AI systems.

    • 72
    • MCP
    • hyperb1iss/lucidity-mcp
  • Context7 MCP

    Up-to-date code docs for every AI prompt.

    Context7 MCP delivers current, version-specific documentation and code examples directly into large language model prompts. By integrating with model workflows, it ensures responses are accurate and based on the latest source material, reducing outdated and hallucinated code. Users can fetch relevant API documentation and examples by simply adding a directive to their prompts. This allows for more reliable, context-rich answers tailored to real-world programming scenarios.

    • 36,881
    • MCP
    • upstash/context7