just-mcp
A way to let LLMs speak Just
A production-ready MCP server that provides seamless integration with the Just command runner, enabling AI assistants to discover, execute, and introspect Justfile recipes through the standardized MCP protocol.
Why Just + MCP = Better Agent Execution
Context-Saving Abstraction
If it isn't immediately obvious, the benefit of having LLMs use Just rather than raw bash is that running Just commands via MCP provides a context-saving abstraction: the model doesn't need to waste context opening and reading shell scripts, Python scripts, or other build artifacts. Through MCP it simply receives the command, its parameters, and hints, so the recipes sit in its memory as "these are the commands available to you."
Eliminates the Justfile Learning Curve
No more watching LLMs run just -l to get a command list, inevitably start reading the justfile, then try to write justfile syntax as if it were a Makefile, corrupt the justfile, and create a bad experience. Just's evolving syntax simply doesn't have a large enough corpus in today's frontier models; we need more popular repos with justfiles in the training data.
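For context, Just syntax only looks superficially like Make; a tiny illustrative recipe (the name and body are hypothetical, not from this repo) shows the differences that tend to trip models up:

# recipes take parameters (with defaults) directly; there are no .PHONY targets
greet name="world":
    echo "hello {{name}}"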
Safer Than Raw Bash Access
Just-mcp is fundamentally safer than bash. If you read Hacker News, there's a story at least once a day about operators whose LLMs start forgetting, hallucinating, and eventually breaking down: deleting files and doing nasty, unwanted things. Giving LLMs unsupervised, unrestricted bash access without carefully monitoring context consumption is a recipe for disaster.
Using a justfile fixes that. Even if the LLM modifies its own justfile, the next session's context is memoized by the justfile (ideally in an idempotent git repo). The abstraction shields the LLM from command-line complexity, where hallucinations or losing track of the current working directory can send it off the rails and over the cliff.
Powerful Agent Execution Tool
Just-mcp is perfect for anybody doing agent execution:
- Ultra-low overhead - lighter weight than most comparable tooling
- Human-friendly - justfiles are easy for humans to read and low overhead for LLMs
- Quick and dirty - while some prefer full Python FastAPI servers, just-mcp is just easy-as
- sm0l model friendly - works great with self-hostable GPU/CPU open-source models with 8k-32k context limits
Built-in Safety Patterns
Just has useful patterns for introducing the following; a sketch in Just syntax follows the list:
- Transparent logging without distracting the agent
- Secondary model inspection - use sm0l models to scan commands asking "is this harmful?" before execution
- Python decorator-like patterns for command validation
- Idempotent execution backed by git repos
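As one illustration, here is a minimal, hypothetical justfile sketch of the patterns above; the recipe names and the scripts/guard.sh and deploy.sh helpers are assumptions for demonstration, not part of just-mcp:

# transparent logging: append each invocation to an audit log without extra output for the agent
_log msg:
    @echo "$(date -u +%FT%TZ) {{msg}}" >> .agent-audit.log

# secondary-model inspection: ask a sm0l model whether a command looks harmful before running it
# (scripts/guard.sh is an assumed helper that exits non-zero on "harmful")
_guard cmd:
    @scripts/guard.sh "{{cmd}}"

# git-backed idempotence: refuse to run if the working tree has uncommitted changes
_clean-tree:
    @test -z "$(git status --porcelain)"

# decorator-style wrapping: the public recipe depends on the guard recipes above
deploy env: _clean-tree (_log "deploy") (_guard "deploy")
    ./deploy.sh "{{env}}"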
b00t
b00t mcp create just-mcp -- bash just-mcp --stdio "${REPO_ROOT}"
b00t mcp export just-mcp
Current Status: 67% Complete (8/12 core tasks)
Implemented Features
- Complete MCP Server - Full rmcp 0.3.0 integration with the MCP 2024-11-05 protocol
- Recipe Discovery - Parse and list all available Justfile recipes
- Recipe Execution - Execute recipes with parameters and capture structured output
- Recipe Introspection - Get detailed recipe information, parameters, and documentation
- Justfile Validation - Syntax and semantic validation with error reporting
- Environment Management - Comprehensive .env file support and variable expansion (see the example after this list)
- Full Test Coverage - 33 passing tests across integration and unit test suites
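For example, the environment handling above pairs naturally with Just's own dotenv support. A hypothetical justfile relying on a .env file might look like this (DATABASE_URL and migrate.sh are illustrative assumptions, not part of this repo):

set dotenv-load := true

# DATABASE_URL is loaded from .env (or the process environment) and expanded by the shell
migrate:
    @echo "Running migrations against $DATABASE_URL"
    ./migrate.sh "$DATABASE_URL"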
MCP Tools Available
- list_recipes - List all available recipes in the justfile
- run_recipe - Execute a specific recipe with optional arguments
- get_recipe_info - Get detailed information about a specific recipe
- validate_justfile - Validate the justfile for syntax and semantic errors
Quick Start
Installation & Setup
# Clone and build
git clone <repository-url>
cd just-mcp
cargo build --release
# Test the server
cargo run -- --stdio
Claude Desktop Integration
Add to your Claude Desktop MCP configuration:
{
  "mcpServers": {
    "just-mcp": {
      "command": "/path/to/just-mcp",
      "args": ["--stdio"]
    }
  }
}
Usage Examples
# Run as MCP server
just-mcp --stdio
# Run in specific directory
just-mcp --directory /path/to/project --stdio
Testing
Comprehensive Test Suite
# Run all tests (33 tests)
cargo test
# Run specific test suites
cargo test --test basic_mcp_test # Protocol compliance testing
cargo test --test mcp_integration_working # SDK integration testing
Test Architecture
- basic_mcp_test.rs - Direct protocol compliance testing using raw JSON-RPC
- mcp_integration_working.rs - Type-safe SDK integration testing with the rmcp client
- Unit tests - 25+ tests covering the parser, executor, validator, and environment modules
Architecture
Project Structure
just-mcp/
├── src/main.rs          # CLI binary
├── just-mcp-lib/        # Core library
│   ├── parser.rs        # Justfile parsing
│   ├── executor.rs      # Recipe execution
│   ├── validator.rs     # Validation logic
│   ├── environment.rs   # Environment management
│   └── mcp_server.rs    # MCP protocol implementation
├── tests/               # Integration tests
└── justfile             # Demo recipes
Tech Stack
- Rust 1.82+ with async/await support
- rmcp 0.3.0 - Official MCP SDK for Rust
- serde/serde_json - JSON serialization
- snafu - Structured error handling
- tokio - Async runtime
Development Roadmap
Next Priority Tasks (Remaining 33%)
- LSP-Style Completion System - Intelligent autocompletion for recipes and parameters
- Enhanced Diagnostics - Advanced syntax error reporting and suggestions
- Virtual File System - Support for stdin, remote sources, and in-memory buffers
- Release Preparation - Documentation, CI/CD, and crate publication
Future Enhancements
- Plugin system for custom recipe types
- Integration with other build tools
- Performance optimizations for large justfiles
- Advanced dependency visualization
Usage Patterns
Recipe Execution
// List available recipes
await client.callTool("list_recipes", {});
// Execute recipe with parameters
await client.callTool("run_recipe", {
"recipe_name": "build",
"args": "[\"--release\"]"
});
// Get recipe information
await client.callTool("get_recipe_info", {
"recipe_name": "test"
});
Validation
// Validate justfile
await client.callTool("validate_justfile", {
"justfile_path": "./custom.justfile"
});
Contributing
This project follows the b00t development methodology:
- TDD Approach - Tests first, implementation second
- Feature Branches - Never work directly on main branch
- Structured Errors - Use snafu for error management
- Git Workflow - Clean commits with descriptive messages
Development Commands
just build # Build the project
just test # Run tests
just server # Start MCP server
just clean # Clean build artifacts
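These commands are backed by the demo justfile at the repository root. A minimal sketch of what such recipes typically look like (the actual file may differ) is:

build:
    cargo build --release

test:
    cargo test

server:
    cargo run -- --stdio

clean:
    cargo clean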
License
This project is licensed under the MIT License; see the LICENSE file for details.
Release Setup & CI/CD
Completed Setup
Cocogitto & Conventional Commits
- Installed cocogitto for conventional commit enforcement
- Configured cog.toml with proper commit types and changelog settings
- Set up git hooks for commit message linting (commit-msg) and pre-push testing
GitHub Actions CI/CD
- CI Pipeline (ci.yml): Multi-platform testing (Ubuntu, Windows, macOS), formatting, clippy, commit linting
- Release Pipeline (release.yml): Automated versioning, changelog generation, GitHub releases, and crates.io publishing
Crates.io Preparation
- Updated both Cargo.toml files with complete metadata (description, keywords, categories, license, etc.)
- Added proper exclusions for development-only files
- Verified the MIT license is in place
Documentation & Structure
- README.md is production-ready with installation and usage instructions
- Created initial CHANGELOG.md for release tracking
- Updated .gitignore with Rust-specific entries
π Production Deployment
Development Workflow:
- All commits must follow conventional commit format (enforced by git hooks)
- Use feat:, fix:, docs:, etc. prefixes for automatic versioning
- Pushing to the main branch triggers automated releases and crates.io publishing
- Library tests pass (25/25) with comprehensive test coverage
Release Process:
- Automated Versioning: Cocogitto analyzes commit messages for semantic versioning
- GitHub Releases: Automatic changelog generation and GitHub release creation
- Crates.io Publishing: Library crate (just-mcp-lib) publishes first, then binary crate (just-mcp)
- CI/CD Pipeline: Multi-platform testing (Ubuntu, Windows, macOS) with formatting and clippy checks
Installation:
# Install from crates.io
cargo install just-mcp
# Or download from GitHub releases
wget https://github.com/promptexecution/just-mcp/releases/latest/download/just-mcp
π Related Projects
- Just - The command runner this integrates with
- Model Context Protocol - The protocol specification
- rmcp - Official Rust MCP SDK
Friends of just-mcp
- just-vscode - VSCode extension with LSP integration for enhanced Just authoring
- just-awesome-agents - Collection of patterns and tools for agent execution with Just