Braintree MCP Server

Enables AI assistants to securely access and manage Braintree payment processing via MCP.

3 Stars · 3 Forks · 3 Watchers · 1 Issue
Braintree MCP Server implements the Model Context Protocol (MCP) to provide AI assistants with structured, direct access to PayPal Braintree's payment services. It supports both STDIO and Server-Sent Events (SSE) transports, facilitating integration with AI clients and standalone web-based deployments. The server allows execution of GraphQL queries against Braintree, supports multi-client access, and handles authentication through environment-based configuration. It is designed for secure, extensible, and automated payment operations in AI-driven workflows.

Key Features

Implements the Model Context Protocol for AI integrations
Supports both STDIO and SSE (Server-Sent Events) transports
Enables execution of GraphQL queries against Braintree
Multi-client support with persistent web server mode
Environment variable-based authentication
Modular tool design for payment operations and testing
Configurable host and port for SSE server
Academic citation requirement for research usage
Python 3.13+ compatibility
Simple connectivity testing via ping tool

Use Cases

Integrating payment processing capabilities into AI assistants
Automating transaction management for Braintree accounts
Providing AI-driven management of customer payment data
Enabling secure financial operations from AI chat interfaces
Deploying as a local server for multi-client access to Braintree APIs
Facilitating research projects requiring programmable payment operations
Testing Braintree API credentials and connectivity
Executing custom GraphQL queries for advanced reporting
Developing tools that bridge AI assistants and payment gateways
Supporting chatbot workflows that require live payment functionality

README

Braintree MCP Server

An unofficial Model Context Protocol (MCP) server for interacting with PayPal Braintree payment processing services.

License and Citation

This project is available under the MIT License with an Academic Citation Requirement. This means you can freely use, modify, and distribute the code, but any academic or scientific publication that uses this software must provide appropriate attribution.

For academic/research use:

If you use this software in a research project that leads to a publication, presentation, or report, you must cite this work according to the format provided in CITATION.md.

For commercial/non-academic use:

Commercial and non-academic use follows the standard MIT License terms without the citation requirement.

By using this software, you agree to these terms. See LICENSE.md for the complete license text.

Server Versions

There are two versions of the Braintree MCP server available:

1. STDIO Transport Server (braintree_server.py)

  • Uses standard input/output (STDIO) for communication
  • Designed for integrations with Claude Desktop and other MCP clients that support STDIO
  • Each client session spawns a new server process
  • The server terminates when the client disconnects

Usage with Claude Desktop:

  1. Configure claude_desktop_config.json to point to this server
  2. Open Claude Desktop and select the Braintree tool
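For reference, a Claude Desktop entry might look like the following sketch (the script path is a placeholder, and the env block mirrors the variables described under Configuration below; adjust both to your setup):

```json
{
  "mcpServers": {
    "braintree": {
      "command": "python",
      "args": ["/absolute/path/to/braintree_server.py"],
      "env": {
        "BRAINTREE_MERCHANT_ID": "your_merchant_id",
        "BRAINTREE_PUBLIC_KEY": "your_public_key",
        "BRAINTREE_PRIVATE_KEY": "your_private_key",
        "BRAINTREE_ENVIRONMENT": "sandbox"
      }
    }
  }
}
```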

2. SSE Transport Server (braintree_sse_server.py)

  • Uses Server-Sent Events (SSE) for communication
  • Designed as a standalone web server that can handle multiple client connections
  • Server runs persistently until manually stopped
  • Binds to 127.0.0.1:8001 by default (configurable)

Manual Usage:

bash
python braintree_sse_server.py

Connecting to the SSE server: Use an MCP client that supports SSE transport and connect to http://127.0.0.1:8001/sse

Overview

This server implements the Model Context Protocol (MCP) specification to provide AI assistant models with direct, structured access to Braintree's payment processing capabilities via GraphQL API. It enables AI systems to perform payment operations like fetching transactions, creating payments, and managing customer data through MCP tools.

Installation

  1. Clone this repository
bash
git clone https://github.com/yourusername/braintree-mcp-server.git
cd braintree-mcp-server
  2. Set up a Python 3.13+ environment
bash
# If using pyenv
pyenv install 3.13.0
pyenv local 3.13.0

# Or using another method to ensure Python 3.13+
  3. Install dependencies
bash
pip install -e .

Configuration

Create a .env file in the project root with your Braintree credentials:

BRAINTREE_MERCHANT_ID=your_merchant_id
BRAINTREE_PUBLIC_KEY=your_public_key
BRAINTREE_PRIVATE_KEY=your_private_key
BRAINTREE_ENVIRONMENT=sandbox  # or production

You can obtain these credentials from your Braintree Control Panel.
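As a sketch of what environment-based authentication can look like on the server side (the function name and validation rules are illustrative, not the server's actual code):

```python
import os

REQUIRED_VARS = (
    "BRAINTREE_MERCHANT_ID",
    "BRAINTREE_PUBLIC_KEY",
    "BRAINTREE_PRIVATE_KEY",
    "BRAINTREE_ENVIRONMENT",
)


def load_braintree_config() -> dict:
    """Collect Braintree credentials from the environment, failing fast if any are missing."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing Braintree settings: {', '.join(missing)}")
    environment = os.environ["BRAINTREE_ENVIRONMENT"]
    if environment not in ("sandbox", "production"):
        raise ValueError(
            f"BRAINTREE_ENVIRONMENT must be 'sandbox' or 'production', got {environment!r}"
        )
    return {name: os.environ[name] for name in REQUIRED_VARS}
```

Failing fast at startup makes misconfigured credentials surface immediately, rather than later as opaque GraphQL errors.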

Usage

Running the server

Default STDIO Transport

bash
python braintree_server.py

The server runs using stdio transport by default, which is suitable for integration with AI assistant systems that support MCP.

Server-Sent Events (SSE) Transport

bash
python braintree_sse_server.py

The SSE server provides a web-based transport layer that allows multiple persistent client connections. This is useful for standalone deployments where multiple clients need to access the Braintree functionality.

Default configuration:

  • Host: 127.0.0.1 (localhost)
  • Port: 8001
  • Environment: Defined in your .env file

See requirements.txt for the required dependencies.

Available MCP Tools

braintree_ping

Simple connectivity test to check if your Braintree credentials are working.

python
response = await braintree_ping()
# Returns "pong" if successful

braintree_execute_graphql

Execute arbitrary GraphQL queries against the Braintree API.

python
query = """
query GetTransactionDetails($id: ID!) {
  node(id: $id) {
    ... on Transaction {
      id
      status
      amount {
        value
        currencyCode
      }
      createdAt
    }
  }
}
"""

variables = {"id": "transaction_id_here"}

response = await braintree_execute_graphql(query, variables)
# Returns JSON response from Braintree

Common GraphQL Operations

Fetch Customer

graphql
query GetCustomer($id: ID!) {
  node(id: $id) {
    ... on Customer {
      id
      firstName
      lastName
      email
      paymentMethods {
        edges {
          node {
            id
            details {
              ... on CreditCardDetails {
                last4
                expirationMonth
                expirationYear
                cardType
              }
            }
          }
        }
      }
    }
  }
}

Create Transaction

graphql
mutation CreateTransaction($input: ChargePaymentMethodInput!) {
  chargePaymentMethod(input: $input) {
    transaction {
      id
      status
      amount {
        value
        currencyCode
      }
    }
  }
}

With variables:

json
{
  "input": {
    "paymentMethodId": "payment_method_id_here",
    "transaction": {
      "amount": "10.00",
      "orderId": "order123",
      "options": {
        "submitForSettlement": true
      }
    }
  }
}

Troubleshooting

  • Ensure your Braintree credentials are correct in the .env file
  • Verify your network connection can reach Braintree's API endpoints
  • Check for any rate limiting or permission issues with your Braintree account

Repository Details

Language: Python
Default Branch: main
Size: 56 KB
Contributors: 1
License: Other
MCP Verified: Nov 12, 2025

Programming Languages

Python 100%


Related MCPs

Discover similar Model Context Protocol servers

  • Vectara MCP Server

    Secure RAG server enabling seamless AI integration via Model Context Protocol.

    Vectara MCP Server implements the open Model Context Protocol to enable AI systems and agentic applications to connect securely with Vectara's Trusted RAG platform. It supports multiple transport modes, including secure HTTP, Server-Sent Events (SSE), and local STDIO for development. The server provides fast, reliable retrieval-augmented generation (RAG) operations with built-in authentication, rate limiting, and optional CORS configuration. Integration is compatible with Claude Desktop and any other MCP client.

    • 25
    • MCP
    • vectara/vectara-mcp
  • Substrate MCP Server

    A Rust-based MCP server for dynamic Substrate blockchain operations.

    Substrate MCP Server provides a Model Context Protocol (MCP) compliant server enabling dynamic interaction with Substrate blockchains. It supports querying balances, pallets, and storage, as well as submitting transactions and accessing block and event data. The server is fully configurable via environment variables and designed for seamless integration with tools such as Cursor, Claude, and development dashboards. Built in Rust, it interfaces with Substrate nodes using the subxt crate.

    • 11
    • MCP
    • ThomasMarches/substrate-mcp-rs
  • Offorte MCP Server

    Bridge AI agents with Offorte proposal automation via the Model Context Protocol.

    Offorte MCP Server enables external AI models to create and send proposals through Offorte by implementing the Model Context Protocol. It facilitates automation workflows between AI agents and Offorte's proposal engine, supporting seamless integration with chat interfaces and autonomous systems. The server provides a suite of tools for managing contacts, proposals, templates, and automation sets, streamlining the proposal creation and delivery process via standardized context handling. Designed for extensibility and real-world automation, it leverages Offorte's public API to empower intelligent business proposals.

    • 4
    • MCP
    • offorte/offorte-mcp-server
  • Bitcoin & Lightning Network MCP Server

    Enable AI models to safely interact with Bitcoin and Lightning Network in a standardized way.

    The Bitcoin & Lightning Network MCP Server implements the Model Context Protocol, allowing AI models to interface with Bitcoin and Lightning Network functionalities such as key generation, address validation, transaction decoding, blockchain queries, and lightning payments. It provides standardized endpoints for AI model integration, including support for Claude Desktop and Goose. The solution supports querying blockchain data, parsing transactions and invoices, and managing cryptographic operations in a secure and extensible manner.

    • 65
    • MCP
    • AbdelStark/bitcoin-mcp
  • Weather MCP Server

    A Model Context Protocol server delivering weather and air quality data via multiple transport modes.

    Weather MCP Server is a Model Context Protocol (MCP) implementation that provides comprehensive weather and air quality information using the Open-Meteo API. It supports various transport modes including standard stdio for desktop clients, HTTP Server-Sent Events (SSE), and Streamable HTTP for modern web integration. The server offers both real-time and historical weather metrics, as well as timezone and time conversion functionalities. Installation and integration options are available for both MCP desktop clients and web applications.

    • 26
    • MCP
    • isdaniel/mcp_weather_server
  • Daisys MCP server

    A beta server implementation for the Model Context Protocol supporting audio context with Daisys integration.

    Daisys MCP server provides a beta implementation of the Model Context Protocol (MCP), enabling seamless integration between the Daisys AI platform and various MCP clients. It allows users to connect MCP-compatible clients to Daisys by configurable authentication and environment settings, with out-of-the-box support for audio file storage and playback. The server is designed to be extensible, including support for both user-level deployments and developer contributions, with best practices for secure authentication and dependency management.

    • 10
    • MCP
    • daisys-ai/daisys-mcp