MCP Prompt Engine

A dynamic MCP server for managing and serving reusable, logic-driven AI prompt templates.
MCP Prompt Engine is a Model Context Protocol (MCP) server designed to manage and serve dynamic prompt templates using Go's text/template system. It lets users create reusable, logic-driven prompt templates with support for variables, partials, and conditionals. The engine works with any compatible MCP client, exposing template variables as prompt arguments, providing rich CLI tools, and hot-reloading templates automatically. Docker support and intelligent argument parsing round out its integration and deployment options.

Key Features

Compatible with all MCP clients supporting prompt standards
Dynamic prompt templates using Go's text/template syntax
Support for reusable partial templates
Automatic exposure of template variables as prompt arguments
Hot-reload of prompt files without server restart
Modern command-line interface for prompt management and validation
Smart argument parsing including JSON support
Automatic injection of environment variables as argument fallbacks
Dockerized deployment for easy integration
Validation of template syntax and structure

Use Cases

Managing a central repository of AI prompt templates for developers
Serving dynamic prompts to AI coding assistants like Claude Code or Gemini CLI
Validating and rendering custom prompts during LLM workflow development
Integrating reusable partials for standardized prompt logic
Deploying prompt management servers in cloud environments using Docker
Automating prompt argument population via environment variables
Enabling hot-reload for rapid template iteration in production environments
Supporting multi-user teams with shared prompt templates
Providing prompt management tools for VSCode extensions and other MCP-based tools
Seamless prompt delivery to LLM-powered applications and plugins

README

MCP Prompt Engine


A Model Context Protocol (MCP) server for managing and serving dynamic prompt templates using Go's elegant and powerful text/template engine. Create reusable, logic-driven prompts with variables, partials, and conditionals that can be served to any compatible MCP client, such as Claude Code, Claude Desktop, Gemini CLI, or VS Code with Copilot.

Key Features

  • MCP Compatible: Works out-of-the-box with any MCP client that supports prompts.
  • Powerful Go Templates: Utilizes the full power of Go text/template syntax, including variables, conditionals, loops, and more.
  • Reusable Partials: Define common components in partial templates (e.g., _header.tmpl) and reuse them across your prompts.
  • Prompt Arguments: All template variables are automatically exposed as MCP prompt arguments, allowing dynamic input from clients.
  • Hot-Reload: Automatically detects changes to your prompt files and reloads them without restarting the server.
  • Rich CLI: A modern command-line interface to list, validate, and render templates for easy development and testing.
  • Smart Argument Handling:
    • Automatically parses JSON arguments (booleans, numbers, arrays, objects).
    • Injects environment variables as fallbacks for template arguments.
  • Containerized: Full Docker support for easy deployment and integration.

Getting Started

1. Installation

Install using Go:

```bash
go install github.com/vasayxtx/mcp-prompt-engine@latest
```

(For other methods like Docker or pre-built binaries, see the Installation section below.)

2. Create a Prompt

Create a prompts directory and add a template file. Let's create a prompt to help write a Git commit message.

First, create a reusable partial named prompts/_git_commit_role.tmpl:

````go
{{ define "_git_commit_role" }}
You are an expert programmer specializing in writing clear, concise, and conventional Git commit messages.
The commit message must strictly follow the Conventional Commits specification.

The final commit message you generate must be formatted exactly as follows:

```
<type>: A brief, imperative-tense summary of changes

[Optional longer description, explaining the "why" of the change. Use dash points for clarity.]
```
{{ if .type -}}
Use {{.type}} as a type.
{{ end }}
{{ end }}
````

Now, create a main prompt prompts/git_stage_commit.tmpl that uses this partial:

```go
{{- /* Commit currently staged changes */ -}}

{{- template "_git_commit_role" . -}}

Your task is to commit all currently staged changes.
To understand the context, analyze the staged code using the command: `git diff --staged`
Based on that analysis, commit staged changes using a suitable commit message.
```

3. Validate Your Prompt

Validate your prompt to ensure it has no syntax errors:

```bash
mcp-prompt-engine validate git_stage_commit
✓ git_stage_commit.tmpl - Valid
```

4. Connect MCP Server to Your Client

Add MCP Server to your MCP client. See Connecting to Clients for configuration examples.

5. Use Your Prompt

Your git_stage_commit prompt will now be available in your client!

For example, in Claude Desktop, you can select the git_stage_commit prompt, provide the type argument, and receive a generated prompt that guides you to a commit with a well-formed message.

In Claude Code or Gemini CLI, you can start typing /git_stage_commit and the client will suggest the prompt; once you select it and supply the arguments, it is executed.


Installation

Pre-built Binaries

Download the latest release for your OS from the GitHub Releases page.

Build from Source

```bash
git clone https://github.com/vasayxtx/mcp-prompt-engine.git
cd mcp-prompt-engine
make build
```

Docker

A pre-built Docker image is available. Mount your local prompts and logs directories to the container.

```bash
# Pull and run the pre-built image from GHCR
docker run -i --rm \
  -v /path/to/your/prompts:/app/prompts:ro \
  -v /path/to/your/logs:/app/logs \
  ghcr.io/vasayxtx/mcp-prompt-engine
```

You can also build the image locally with make docker-build.


Usage

Creating Prompt Templates

Create a directory to store your prompt templates. Each template should be a .tmpl file using Go's text/template syntax with the following format:

```go
{{/* Brief description of the prompt */}}
Your prompt text here with {{.template_variable}} placeholders.
```

The first line comment ({{/* description */}}) is used as the prompt description, and the rest of the file is the prompt template.

Partial templates should be prefixed with an underscore (e.g., _header.tmpl) and can be included in other templates using {{template "partial_name" .}}.
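To illustrate the define/include mechanics that partials build on, here is a minimal, self-contained Go sketch. The partial name, variable names, and sample data are illustrative, not taken from the engine's code:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// renderPrompt parses a template source and executes it with the given
// data -- the same text/template mechanics the engine builds on.
func renderPrompt(src string, data any) string {
	tmpl := template.Must(template.New("prompt").Parse(src))
	var buf bytes.Buffer
	if err := tmpl.Execute(&buf, data); err != nil {
		panic(err)
	}
	return buf.String()
}

func main() {
	// A partial defined the way _header.tmpl would be, included from
	// the main prompt body with the current data context (".").
	const src = `{{define "_header"}}You are a {{.role}}.{{end}}{{template "_header" .}} Task: {{.task}}.`
	fmt.Println(renderPrompt(src, map[string]string{
		"role": "code reviewer",
		"task": "review the diff",
	}))
	// Prints: You are a code reviewer. Task: review the diff.
}
```

Passing `.` to `{{template}}` forwards the caller's data, which is why a partial can read the same variables as the prompt that includes it.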

Template Syntax

The server uses Go's text/template engine, which provides powerful templating capabilities:

  • Variables: {{.variable_name}} - Access template variables
  • Built-in variables:
    • {{.date}} - Current date and time
  • Conditionals: {{if .condition}}...{{end}}, {{if .condition}}...{{else}}...{{end}}
  • Logical operators: {{if and .condition1 .condition2}}...{{end}}, {{if or .condition1 .condition2}}...{{end}}
  • Loops: {{range .items}}...{{end}}
  • Template inclusion: {{template "partial_name" .}} or {{template "partial_name" dict "key" "value"}}

See the Go text/template documentation for more details on syntax and features.
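As a quick sanity check of these constructs, the following standalone Go program (a sketch independent of the engine; the variable names are illustrative) renders a template combining a conditional, a logical operator, and a range loop:

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// render parses and executes a template against the given data,
// panicking on any error. Illustrative helper, not part of the engine.
func render(src string, data any) string {
	tmpl := template.Must(template.New("demo").Parse(src))
	var buf bytes.Buffer
	if err := tmpl.Execute(&buf, data); err != nil {
		panic(err)
	}
	return buf.String()
}

func main() {
	// "and" treats a non-empty slice as truthy, so the branch is taken
	// only when .verbose is true and .items has elements.
	const src = `{{if and .verbose .items}}Items:{{range .items}} {{.}};{{end}}{{else}}nothing to show{{end}}`
	fmt.Println(render(src, map[string]any{
		"verbose": true,
		"items":   []string{"alpha", "beta"},
	}))
	// Prints: Items: alpha; beta;
}
```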

JSON Argument Parsing

The server automatically parses argument values as JSON when possible, enabling rich data types in templates:

  • Booleans: true, false → Go boolean values
  • Numbers: 42, 3.14 → Go numeric values
  • Arrays: ["item1", "item2"] → Go slices for use with {{range}}
  • Objects: {"key": "value"} → Go maps for structured data
  • Strings: Invalid JSON falls back to string values

This allows for advanced template operations like:

```go
{{range .items}}Item: {{.}}{{end}}
{{if .enabled}}Feature is enabled{{end}}
{{.config.timeout}} seconds
```

To disable JSON parsing and treat all arguments as strings, use the --disable-json-args flag for the serve and render commands.
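The "try JSON, fall back to string" behavior described above can be sketched in a few lines of Go. This is an illustration of the idea, not the engine's actual implementation:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// parseArg attempts to decode a raw argument as JSON; anything that is
// not valid JSON falls back to the original string. Unmarshalling into
// `any` yields bool, float64, []any, map[string]any, or string.
func parseArg(raw string) any {
	var v any
	if err := json.Unmarshal([]byte(raw), &v); err == nil {
		return v
	}
	return raw
}

func main() {
	fmt.Printf("%T %v\n", parseArg("true"), parseArg("true"))
	fmt.Printf("%T %v\n", parseArg("3.14"), parseArg("3.14"))
	fmt.Printf("%T\n", parseArg(`["item1","item2"]`))
	fmt.Printf("%T %v\n", parseArg("not json"), parseArg("not json"))
}
```

Note that JSON numbers decode to float64 when the target is `any`, which is why `{{.config.timeout}}` above renders a numeric value.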

CLI Commands

The CLI is your main tool for managing and testing templates. By default, it looks for templates in the ./prompts directory, but you can specify a different directory with the --prompts flag.

1. List Templates

```bash
# See a simple list of available prompts
mcp-prompt-engine list

# See a detailed view with descriptions and variables
mcp-prompt-engine list --verbose
```

2. Render a Template

Render a prompt directly in your terminal, providing arguments with the -a or --arg flag. Environment variables are automatically injected as fallbacks for any missing arguments; for example, an environment variable TYPE=fix will be available in the template as {{.type}}.

```bash
# Render the git commit prompt, providing the 'type' variable
mcp-prompt-engine render git_stage_commit --arg type=feat
```

3. Validate Templates

Check all your templates for syntax errors. The command will return an error if any template is invalid.

```bash
# Validate all templates in the directory
mcp-prompt-engine validate

# Validate a single template
mcp-prompt-engine validate git_stage_commit
```

4. Start the Server

Run the MCP server to make your prompts available to clients.

```bash
# Run with default settings (looks for ./prompts)
mcp-prompt-engine serve

# Specify a different prompts directory and a log file
mcp-prompt-engine --prompts /path/to/prompts serve --log-file ./server.log
```

Connecting to Clients

To use this engine with any client that supports MCP Prompts, add a new entry to its MCP servers configuration.

Global configuration locations (macOS):

  • Claude Code: ~/.claude.json (mcpServers section)
  • Claude Desktop: ~/Library/Application\ Support/Claude/claude_desktop_config.json (mcpServers section)
  • Gemini CLI: ~/.gemini/settings.json (mcpServers section)

Example for a local binary:

```json
{
  "prompts": {
    "command": "/path/to/your/mcp-prompt-engine",
    "args": [
      "--prompts", "/path/to/your/prompts",
      "serve",
      "--quiet"
    ]
  }
}
```

Example for Docker:

```json
{
  "mcp-prompt-engine-docker": {
    "command": "docker",
    "args": [
      "run", "-i", "--rm",
      "-v", "/path/to/your/prompts:/app/prompts:ro",
      "-v", "/path/to/your/logs:/app/logs",
      "ghcr.io/vasayxtx/mcp-prompt-engine"
    ]
  }
}
```

License

This project is licensed under the MIT License - see the LICENSE file for details.


Repository Owner

vasayxtx (User)

Repository Details

Language: Go
Default Branch: main
Size: 136 KB
Contributors: 1
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

Go: 97.42%
Makefile: 1.84%
Dockerfile: 0.75%

Topics

go, go-template, golang, mcp, mcp-prompt, mcp-server

