Langfuse Prompt Management MCP Server

Bridge Langfuse prompts into the Model Context Protocol ecosystem for seamless management and discovery.

146 Stars · 37 Forks · 146 Watchers · 9 Issues
Langfuse Prompt Management MCP Server enables users to access and manage prompts stored in Langfuse via standardized endpoints defined by the Model Context Protocol (MCP). It allows prompt discovery, retrieval, and compilation, translating Langfuse's prompt formats into MCP-compliant objects. The server supports cursor-based pagination, exposes a range of prompt-related operations, and includes compatibility features for MCP clients without prompt support. Integration instructions are provided for popular platforms like Claude Desktop and Cursor.

Key Features

Implements MCP Prompts specification for discovery and retrieval
Supports listing all available prompts with optional pagination
Allows retrieval and compilation of specific prompts with variables
Transforms Langfuse prompts into MCP-compliant prompt objects
Exports additional MCP tool commands for broader client compatibility
Handles both text and chat prompts from Langfuse
Includes integration instructions for Claude Desktop and Cursor editors
Requires and supports Langfuse API authentication via environment variables
Pagination support via cursor mechanism
Identifies and lists only prompts labeled as 'production' in Langfuse

Use Cases

Centralized management of AI prompt templates in Langfuse leveraging MCP
Seamless integration of Langfuse prompt libraries into MCP-enabled clients
Standardized prompt retrieval and compilation for context-aware AI applications
Discovering and exploring available prompts in a Langfuse instance via MCP tools
Batch processing or bulk management of prompts through paginated listing
Custom prompt variable injection and dynamic prompt compilation
Augmenting tools like Claude Desktop and Cursor with Langfuse prompt access
Extending proprietary AI systems with interoperable prompt management using MCP
Bridging existing Langfuse workflows into new MCP-supported contexts
Enhancing collaborative AI development environments through shared prompt access

README

Langfuse Prompt Management MCP Server

Model Context Protocol (MCP) Server for Langfuse Prompt Management. This server allows you to access and manage your Langfuse prompts through the Model Context Protocol.

Demo

Quick demo of Langfuse Prompts MCP in Claude Desktop (unmute for voice-over explanations):

https://github.com/user-attachments/assets/61da79af-07c2-4f69-b28c-ca7c6e606405

Features

MCP Prompts

This server implements the MCP Prompts specification for prompt discovery and retrieval.

  • prompts/list: List all available prompts

    • Optional cursor-based pagination
    • Returns prompt names and their required arguments. Limitation: all arguments are assumed to be optional and lack descriptions, since prompt variables have no specification in Langfuse
    • Includes a next cursor for pagination if there is more than one page of prompts (a client-side sketch follows this list)
  • prompts/get: Get a specific prompt

    • Transforms Langfuse prompts (text and chat) into MCP prompt objects
    • Compiles prompt with provided variables
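
The sketch below illustrates how an MCP client might drive these two endpoints against this server using the official TypeScript SDK (@modelcontextprotocol/sdk); it is an illustrative sketch, not part of this repository. The prompt name movie-critic and its movie variable are hypothetical, so substitute a prompt that exists in your Langfuse project.

typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the built server over stdio. Path and key values are placeholders.
  const transport = new StdioClientTransport({
    command: process.execPath, // the current Node.js binary
    args: ["./build/index.js"],
    env: {
      LANGFUSE_PUBLIC_KEY: "your-public-key",
      LANGFUSE_SECRET_KEY: "your-secret-key",
      LANGFUSE_BASEURL: "https://cloud.langfuse.com",
    },
  });

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // prompts/list: walk all pages using the optional cursor.
  let cursor: string | undefined;
  do {
    const page = await client.listPrompts({ cursor });
    for (const prompt of page.prompts) {
      console.log(prompt.name, prompt.arguments ?? []);
    }
    cursor = page.nextCursor;
  } while (cursor);

  // prompts/get: fetch one prompt and compile it with variables.
  // "movie-critic" and "movie" are hypothetical names.
  const compiled = await client.getPrompt({
    name: "movie-critic",
    arguments: { movie: "Dune" },
  });
  console.log(JSON.stringify(compiled.messages, null, 2));

  await client.close();
}

main().catch(console.error);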

Tools

To increase compatibility with other MCP clients that do not support the prompt capability, the server also exports tools that replicate the functionality of the MCP Prompts.

  • get-prompts: List available prompts

    • Optional cursor parameter for pagination
    • Returns a list of prompts with their arguments
  • get-prompt: Retrieve and compile a specific prompt

    • Required name parameter: Name of the prompt to retrieve
    • Optional arguments parameter: JSON object with prompt variables (a usage sketch follows this list)
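
For clients that only support tools, a call pattern along these lines could be used (again via the TypeScript SDK). The prompt name and variable are hypothetical, and the exact shape expected for the nested arguments value is an assumption of this sketch.

typescript
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Replicates the prompt capability through tool calls, for MCP clients that
// do not support prompts. Assumes `client` is already connected (see the
// earlier sketch); the prompt name and variable are hypothetical.
async function listAndGetViaTools(client: Client) {
  // get-prompts: list available prompts; pass a `cursor` to fetch later pages.
  const list = await client.callTool({ name: "get-prompts", arguments: {} });
  console.log(JSON.stringify(list, null, 2));

  // get-prompt: retrieve and compile a single prompt with variables.
  const prompt = await client.callTool({
    name: "get-prompt",
    arguments: { name: "movie-critic", arguments: { movie: "Dune" } },
  });
  console.log(JSON.stringify(prompt, null, 2));
}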

Development

bash
npm install

# build current file
npm run build

# test in mcp inspector
npx @modelcontextprotocol/inspector node ./build/index.js

Usage

Step 1: Build

bash
npm install
npm run build

Step 2: Add the server to your MCP servers:

Claude Desktop

Configure Claude for Desktop by editing claude_desktop_config.json

json
{
  "mcpServers": {
    "langfuse": {
      "command": "node",
      "args": ["<absolute-path>/build/index.js"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "your-public-key",
        "LANGFUSE_SECRET_KEY": "your-secret-key",
        "LANGFUSE_BASEURL": "https://cloud.langfuse.com"
      }
    }
  }
}

Make sure to replace the environment variables with your actual Langfuse API keys. The server will now be available to use in Claude Desktop.

Cursor

Add new server to Cursor:

  • Name: Langfuse Prompts
  • Type: command
  • Command:
    bash
    LANGFUSE_PUBLIC_KEY="your-public-key" LANGFUSE_SECRET_KEY="your-secret-key" LANGFUSE_BASEURL="https://cloud.langfuse.com" node absolute-path/build/index.js
    

Limitations

The MCP Server is a work in progress and has some limitations:

  • Only prompts with a production label in Langfuse are returned
  • All arguments are assumed to be optional and lack descriptions, since prompt variables have no specification in Langfuse
  • List operations fetch each prompt individually in the background to extract its arguments; this works but is not efficient (a sketch of the extraction idea follows this list)
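
To illustrate why arguments come back optional and undocumented, a server like this one can derive them by scanning the prompt text for {{variable}} placeholders. The snippet below is a hypothetical sketch of that idea, not the server's actual implementation.

typescript
// Hypothetical sketch: derive prompt arguments from a Langfuse prompt by
// scanning its text for {{variable}} placeholders. Because Langfuse stores no
// schema for these variables, every argument is reported as optional and
// without a description.
function extractPromptArguments(promptText: string) {
  const names = new Set<string>();
  for (const match of promptText.matchAll(/\{\{\s*([\w.-]+)\s*\}\}/g)) {
    names.add(match[1]);
  }
  return [...names].map((name) => ({ name, required: false }));
}

// Example: a text prompt with two placeholders.
console.log(
  extractPromptArguments("Review {{movie}} in the style of {{critic}}.")
);
// -> [ { name: "movie", required: false }, { name: "critic", required: false } ]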

Contributions are welcome! Please open an issue or a PR in the repo if you have any suggestions or feedback.


Repository Owner

langfuse (Organization)

Repository Details

Language TypeScript
Default Branch main
Size 122 KB
Contributors 1
License MIT License
MCP Verified Nov 12, 2025

Programming Languages

TypeScript 100%

Topics

langfuse llm llmops mcp model-context-protocol prompt-management prompting


Related MCPs

Discover similar Model Context Protocol servers

  • MCP Prompt Engine

    MCP Prompt Engine

    A dynamic MCP server for managing and serving reusable, logic-driven AI prompt templates.

    MCP Prompt Engine is a Model Context Protocol (MCP) server designed to manage and serve dynamic prompt templates using Go's text/template system. It enables users to create reusable, logic-driven prompt templates with support for variables, partials, and conditionals. The engine interacts seamlessly with any compatible MCP client, providing prompt arguments, rich CLI tools, and automatic hot-reloading of templates. Docker support and intelligent argument parsing enhance its integration and deployment capabilities.

    • 15
    • MCP
    • vasayxtx/mcp-prompt-engine
  • LlamaCloud MCP Server

    LlamaCloud MCP Server

    Connect multiple LlamaCloud indexes as tools for your MCP client.

    LlamaCloud MCP Server is a TypeScript-based implementation of a Model Context Protocol server that allows users to connect multiple managed indexes from LlamaCloud as separate tools in MCP-compatible clients. Each tool is defined via command-line parameters, enabling flexible and dynamic access to different document indexes. The server automatically generates tool interfaces, each capable of querying its respective LlamaCloud index, with customizable parameters such as index name, description, and result limits. Designed for seamless integration, it works with clients like Claude Desktop, Windsurf, and Cursor.

    • 82
    • MCP
    • run-llama/mcp-server-llamacloud
  • Context7 MCP

    Context7 MCP

    Up-to-date code docs for every AI prompt.

    Context7 MCP delivers current, version-specific documentation and code examples directly into large language model prompts. By integrating with model workflows, it ensures responses are accurate and based on the latest source material, reducing outdated and hallucinated code. Users can fetch relevant API documentation and examples by simply adding a directive to their prompts. This allows for more reliable, context-rich answers tailored to real-world programming scenarios.

    • 36,881
    • MCP
    • upstash/context7
  • piapi-mcp-server

    piapi-mcp-server

    TypeScript-based MCP server for PiAPI media content generation

    piapi-mcp-server is a TypeScript implementation of a Model Context Protocol (MCP) server that connects with PiAPI to enable media generation workflows from MCP-compatible applications. It handles image, video, music, TTS, 3D, and voice generation tasks using a wide range of supported models like Midjourney, Flux, Kling, LumaLabs, Udio, and more. Designed for easy integration with clients such as Claude Desktop, it includes an interactive MCP Inspector for development, testing, and debugging.

    • 62
    • MCP
    • apinetwork/piapi-mcp-server
  • box-mcp-server

    box-mcp-server

    Expose your Box files to AI with a Model Context Protocol server.

    box-mcp-server lets users connect their Box accounts to AI applications via the Model Context Protocol (MCP). It securely authenticates to Box with enterprise credentials or developer tokens and serves file search and reading capabilities to downstream clients. Designed for use with Claude Desktop and the MCP Inspector, it provides seamless integration of Box documents into AI workflows.

    • 10
    • MCP
    • hmk/box-mcp-server
  • mcp-server-chatsum

    mcp-server-chatsum

    Summarize and query chat messages using the MCP Server protocol.

    mcp-server-chatsum is an MCP Server designed to summarize and query chat messages. It provides tools to interact with chat data, enabling users to extract and summarize message content based on specified prompts. The server can be integrated with Claude Desktop and supports communication over stdio, offering dedicated debugging tools via the MCP Inspector. Environment variable support and database integration ensure flexible deployment for chat data management.

    • 1,024
    • MCP
    • chatmcp/mcp-server-chatsum