Semgrep MCP Server

A Model Context Protocol server powered by Semgrep for seamless code analysis integration.

611 Stars · 54 Forks · 611 Watchers · 29 Issues
Semgrep MCP Server implements the Model Context Protocol (MCP) to provide standardized communication for code analysis tasks. It integrates with platforms such as LM Studio, Cursor, and Visual Studio Code, and can be deployed via Docker or as a Python package from PyPI. The server is now maintained in the main Semgrep repository, where it continues to receive updates and compatibility improvements across developer tools.

Key Features

Implements the Model Context Protocol (MCP)
Semgrep-powered code analysis
Integration with popular platforms (LM Studio, Cursor, VS Code)
Supports Docker and Python (PyPI) deployment
Contextual handling for AI model integration
Standardized protocol communication
Easy installation and updates via main Semgrep repository
Documentation and community support
Quick deployment through badges and platform-specific links
Enhanced compatibility across development environments

Use Cases

Automated static code analysis in CI/CD pipelines
Integration of Semgrep analysis into AI-driven developer tools
Standardized context management between tools and models
Security checks and vulnerability detection via MCP workflow
Context sharing across IDE extensions and plugins
Enabling third-party applications to invoke code analysis features
Cybersecurity auditing within development workflows
Developer productivity enhancement through integrated analysis services
Custom MCP server deployments for enterprise environments
Rapid prototyping and testing of context-aware coding models

README

⚠️ The Semgrep MCP server has been moved from a standalone repo to the main semgrep repository! ⚠️

This repository has been deprecated, and further updates to the Semgrep MCP server will be made via the official semgrep binary.

Semgrep MCP Server

Badges: Add MCP Server to LM Studio · Install in Cursor · Install in VS Code (UV / Docker) · semgrep.ai · PyPI · Docker · Install in VS Code Insiders

A Model Context Protocol (MCP) server for using Semgrep to scan code for security vulnerabilities. Secure your vibe coding! 😅

Model Context Protocol (MCP) is a standardized API for LLMs, Agents, and IDEs like Cursor, VS Code, Windsurf, or anything that supports MCP, to get specialized help, get context, and harness the power of tools. Semgrep is a fast, deterministic static analysis tool that semantically understands many languages and comes with over 5,000 rules. 🛠️

[!NOTE] This beta project is under active development. We would love your feedback, bug reports, feature requests, and code. Join the #mcp community Slack channel!

Getting started

Run the Python package as a CLI command using uv:

bash
uvx semgrep-mcp # see --help for more options

Or, run as a Docker container:

bash
docker run -i --rm ghcr.io/semgrep/mcp -t stdio

Cursor

Example mcp.json

json
{
  "mcpServers": {
    "semgrep": {
      "command": "uvx",
      "args": ["semgrep-mcp"],
      "env": {
        "SEMGREP_APP_TOKEN": "<token>"
      }
    }
  }
}

Add an instruction to your .cursor/rules to use automatically:

text
Always use Semgrep to scan generated code for security vulnerabilities

ChatGPT

  1. Go to the Connector Settings page (direct link)
  2. Name the connection Semgrep
  3. Set MCP Server URL to https://mcp.semgrep.ai/sse
  4. Set Authentication to No authentication
  5. Check the I trust this application checkbox
  6. Click Create

See more details at the official docs.

Hosted Server

[!WARNING] mcp.semgrep.ai is an experimental server that may break unexpectedly. It will rapidly gain new functionality.🚀

Cursor

  1. Cmd + Shift + J to open Cursor Settings
  2. Select MCP Tools
  3. Click New MCP Server.
json
{
  "mcpServers": {
    "semgrep": {
      "type": "streamable-http",
      "url": "https://mcp.semgrep.ai/mcp"
    }
  }
}

API

Tools

Enable LLMs to perform actions, make deterministic computations, and interact with external services.

Scan Code

  • security_check: Scan code for security vulnerabilities
  • semgrep_scan: Scan code files for security vulnerabilities with a given config string
  • semgrep_scan_with_custom_rule: Scan code files using a custom Semgrep rule

Understand Code

  • get_abstract_syntax_tree: Output the Abstract Syntax Tree (AST) of code

Cloud Platform (login and Semgrep token required)

  • semgrep_findings: Fetch Semgrep findings from the Semgrep AppSec Platform API

Meta

  • supported_languages: Return the list of languages Semgrep supports
  • semgrep_rule_schema: Fetch the latest Semgrep rule JSON Schema

Prompts

Reusable prompts to standardize common LLM interactions.

  • write_custom_semgrep_rule: Return a prompt to help write a Semgrep rule

Resources

Expose data and content to LLMs

  • semgrep://rule/schema: Specification of the Semgrep rule YAML syntax using JSON schema
  • semgrep://rule/{rule_id}/yaml: Full Semgrep rule in YAML format from the Semgrep registry
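
The tools and resources above can be exercised from any MCP client. As an illustration, here is a minimal sketch using the MCP Python SDK (the mcp package on PyPI) that launches the server over stdio, reads the semgrep://rule/schema resource, and calls the supported_languages tool; the exact shape of the returned objects depends on the SDK version.

python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    # Launch semgrep-mcp over stdio, as shown in "Getting started"
    params = StdioServerParameters(command="uvx", args=["semgrep-mcp"])
    async with stdio_client(params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()

            # Read the rule schema resource
            schema = await session.read_resource("semgrep://rule/schema")

            # Call a metadata tool (no arguments required)
            languages = await session.call_tool("supported_languages", {})

            print(schema)
            print(languages)


asyncio.run(main())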

Usage

This Python package is published to PyPI as semgrep-mcp and can be installed and run with pip, pipx, uv, poetry, or any Python package manager.

text
$ pipx install semgrep-mcp
$ semgrep-mcp --help

Usage: semgrep-mcp [OPTIONS]

  Entry point for the MCP server

  Supports both stdio and sse transports. For stdio, it will read from stdin
  and write to stdout. For sse, it will start an HTTP server on port 8000.

Options:
  -v, --version                Show version and exit.
  -t, --transport [stdio|sse]  Transport protocol to use (stdio or sse)
  -h, --help                   Show this message and exit.

Standard Input/Output (stdio)

The stdio transport enables communication through standard input and output streams. This is particularly useful for local integrations and command-line tools. See the spec for more details.

Python

bash
semgrep-mcp

By default, the Python package will run in stdio mode. Because it's using the standard input and output streams, it will look like the tool is hanging without any output, but this is expected.
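
To see what the server is waiting for, the rough sketch below writes a hand-built JSON-RPC initialize request to the server's stdin and reads the response from stdout. Real integrations should use an MCP client library instead, which also sends the follow-up notifications/initialized message before calling tools; the protocol version string here is simply one the server is expected to accept.

python
import json
import subprocess

# Start the server exactly as above; it waits silently for JSON-RPC messages on stdin.
proc = subprocess.Popen(
    ["semgrep-mcp"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# Minimal MCP initialize request (newline-delimited JSON-RPC over stdio).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.0.0"},
    },
}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()

# The first line back is the server's initialize response.
print(proc.stdout.readline())
proc.terminate()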

Docker

This server is published to the GitHub Container Registry (ghcr.io/semgrep/mcp).

bash
docker run -i --rm ghcr.io/semgrep/mcp -t stdio

By default, the Docker container is in SSE mode, so you will have to include -t stdio after the image name and run with -i to run in interactive mode.

Streamable HTTP

The Streamable HTTP transport carries JSON-RPC messages over HTTP POST requests and supports streaming responses. See the spec for more details.

By default, the server listens on 127.0.0.1:8000/mcp for client connections. To change any of this, set FASTMCP_* environment variables. The server must be running for clients to connect to it.

Python

bash
semgrep-mcp -t streamable-http

By default, the Python package will run in stdio mode, so you will have to include -t streamable-http.

Docker

bash
docker run -p 8000:8000 ghcr.io/semgrep/mcp
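
Once the server is listening on port 8000 (using either command above), a client can connect over Streamable HTTP. Below is a minimal sketch; it assumes the streamablehttp_client helper available in recent versions of the MCP Python SDK under mcp.client.streamable_http.

python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main():
    # Connect to a server started with `semgrep-mcp -t streamable-http`
    # or the Docker command above (listening on 127.0.0.1:8000/mcp).
    async with streamablehttp_client("http://localhost:8000/mcp") as (
        read_stream,
        write_stream,
        _get_session_id,
    ):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())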

Server-sent events (SSE)

[!WARNING] The MCP community considers this a legacy transport protocol; it is intended mainly for backwards compatibility. Streamable HTTP is the recommended replacement.

The SSE transport uses Server-Sent Events for server-to-client streaming, with HTTP POST requests for client-to-server communication. See the spec for more details.

By default, the server listens on 127.0.0.1:8000/sse for client connections. To change any of this, set FASTMCP_* environment variables. The server must be running for clients to connect to it.

Python

bash
semgrep-mcp -t sse

By default, the Python package will run in stdio mode, so you will have to include -t sse.

Docker

bash
docker run -p 8000:8000 ghcr.io/semgrep/mcp -t sse

Semgrep AppSec Platform

Optionally, to connect to Semgrep AppSec Platform:

  1. Login or sign up
  2. Generate a token from Settings
  3. Add the token to your environment variables:
    • CLI (export SEMGREP_APP_TOKEN=<token>)

    • Docker (docker run -e SEMGREP_APP_TOKEN=<token>)

    • MCP config JSON

json
    "env": {
      "SEMGREP_APP_TOKEN": "<token>"
    }

[!TIP] Please reach out for support if needed. ☎️

Integrations

Cursor IDE

Add the following JSON block to your ~/.cursor/mcp.json global or .cursor/mcp.json project-specific configuration file:

json
{
  "mcpServers": {
    "semgrep": {
      "command": "uvx",
      "args": ["semgrep-mcp"]
    }
  }
}

cursor MCP settings

See cursor docs for more info.

VS Code / Copilot

Click the install buttons at the top of this README for the quickest installation.

Manual Configuration

Add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open User Settings (JSON).

json
{
  "mcp": {
    "servers": {
      "semgrep": {
        "command": "uvx",
        "args": ["semgrep-mcp"]
      }
    }
  }
}

Optionally, you can add it to a file called .vscode/mcp.json in your workspace:

json
{
  "servers": {
    "semgrep": {
      "command": "uvx",
        "args": ["semgrep-mcp"]
    }
  }
}

Using Docker

json
{
  "mcp": {
    "servers": {
      "semgrep": {
        "command": "docker",
        "args": [
          "run",
          "-i",
          "--rm",
          "ghcr.io/semgrep/mcp",
          "-t",
          "stdio"
        ]
      }
    }
  }
}

See VS Code docs for more info.

Windsurf

Add the following JSON block to your ~/.codeium/windsurf/mcp_config.json file:

json
{
  "mcpServers": {
    "semgrep": {
      "command": "uvx",
      "args": ["semgrep-mcp"]
    }
  }
}

See Windsurf docs for more info.

Claude Desktop

Here is a short video showing Claude Desktop using this server to write a custom rule.

Add the following JSON block to your claude_desktop_config.json file:

json
{
  "mcpServers": {
    "semgrep": {
      "command": "uvx",
      "args": ["semgrep-mcp"]
    }
  }
}

See Anthropic docs for more info.

Claude Code

bash
claude mcp add semgrep uvx semgrep-mcp

See Claude Code docs for more info.

OpenAI

See the official docs:

Agents SDK

python
# Requires the OpenAI Agents SDK (the openai-agents package on PyPI)
from agents.mcp import MCPServerStdio

async with MCPServerStdio(
    params={
        "command": "uvx",
        "args": ["semgrep-mcp"],
    }
) as server:
    tools = await server.list_tools()

See OpenAI Agents SDK docs for more info.

Custom clients

Example Python SSE client

See a full example in examples/sse_client.py

python
import asyncio

from mcp.client.session import ClientSession
from mcp.client.sse import sse_client


async def main():
    async with sse_client("http://localhost:8000/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            results = await session.call_tool(
                "semgrep_scan",
                {
                    "code_files": [
                        {
                            "path": "hello_world.py",
                            "content": "def hello(): print('Hello, World!')",
                        }
                    ]
                },
            )
            print(results)


if __name__ == "__main__":
    asyncio.run(main())

[!TIP] Some client libraries want the URL: http://localhost:8000/sse and others only want the HOST: localhost:8000. Try out the URL in a web browser to confirm the server is running, and there are no network issues.

See official SDK docs for more info.

Contributing, community, and running from source

[!NOTE] We love your feedback, bug reports, feature requests, and code. Join the #mcp community Slack channel!

See CONTRIBUTING.md for more info and details on how to run the MCP server from source code.

Similar tools 🔍

Community projects 🌟

MCP server registries


Made with ❤️ by the Semgrep Team


Repository Owner

semgrep

Organization

Repository Details

Language Python
Default Branch main
Size 950 KB
Contributors 17
License MIT License
MCP Verified Nov 12, 2025

Programming Languages

Python 91.1%
Makefile 5.72%
Dockerfile 1.95%
Smarty 1.22%

Topics

mcp, semgrep


Related MCPs

Discover similar Model Context Protocol servers

  • MyMCP Server (All-in-One Model Context Protocol)

    Powerful and extensible Model Context Protocol server with developer and productivity integrations.

    MyMCP Server is a robust Model Context Protocol (MCP) server implementation that integrates with services like GitLab, Jira, Confluence, YouTube, Google Workspace, and more. It provides AI-powered search, contextual tool execution, and workflow automation for development and productivity tasks. The system supports extensive configuration and enables selective activation of grouped toolsets for various environments. Installation and deployment are streamlined, with both automated and manual setup options available.

    • 93
    • MCP
    • nguyenvanduocit/all-in-one-model-context-protocol
  • GitHub MCP Server

    Connect AI tools directly to GitHub for repository, issue, and workflow management via natural language.

    GitHub MCP Server enables AI tools such as agents, assistants, and chatbots to interact natively with the GitHub platform. It allows these tools to access repositories, analyze code, manage issues and pull requests, and automate workflows using the Model Context Protocol (MCP). The server supports integration with multiple hosts, including VS Code and other popular IDEs, and can operate both remotely and locally. Built for developers seeking to enhance AI-powered development workflows through seamless GitHub context access.

    • 24,418
    • MCP
    • github/github-mcp-server
  • Unichat MCP Server

    Universal MCP server providing context-aware AI chat and code tools across major model vendors.

    Unichat MCP Server enables sending standardized requests to leading AI model vendors, including OpenAI, MistralAI, Anthropic, xAI, Google AI, DeepSeek, Alibaba, and Inception, utilizing the Model Context Protocol. It features unified endpoints for chat interactions and provides specialized tools for code review, documentation generation, code explanation, and programmatic code reworking. The server is designed for seamless integration with platforms like Claude Desktop and installation via Smithery. Vendor API keys are required for secure access to supported providers.

    • 37
    • MCP
    • amidabuddha/unichat-mcp-server
  • FastMCP

    The fast, Pythonic way to build MCP servers and clients.

    FastMCP is a production-ready framework for building Model Context Protocol (MCP) applications in Python. It streamlines the creation of MCP servers and clients, providing advanced features such as enterprise authentication, composable tools, OpenAPI/FastAPI generation, server proxying, deployment tools, and comprehensive client libraries. Designed for ease of use, it offers both standard protocol support and robust utilities for production deployments.

    • 20,201
    • MCP
    • jlowin/fastmcp
  • TeslaMate MCP Server

    Query your TeslaMate data using the Model Context Protocol

    TeslaMate MCP Server implements the Model Context Protocol to enable AI assistants and clients to securely access and query Tesla vehicle data, statistics, and analytics from a TeslaMate PostgreSQL database. The server exposes a suite of tools for retrieving vehicle status, driving history, charging sessions, battery health, and more using standardized MCP endpoints. It supports local and Docker deployments, includes bearer token authentication, and is intended for integration with MCP-compatible AI systems like Claude Desktop.

    • 106
    • MCP
    • cobanov/teslamate-mcp
  • mcp-graphql

    Enables LLMs to interact dynamically with GraphQL APIs via Model Context Protocol.

    mcp-graphql provides a Model Context Protocol (MCP) server that allows large language models to discover and interact with GraphQL APIs. The implementation facilitates schema introspection, exposes the GraphQL schema as a resource, and enables secure query and mutation execution based on configuration. It supports configuration through environment variables, automated or manual installation options, and offers flexibility in using local or remote schema files. By default, mutation operations are disabled for security, but can be enabled if required.

    • 319
    • MCP
    • blurrah/mcp-graphql