Telephony MCP Server

MCP-based telephony server enabling LLMs to make voice calls and send SMS via Vonage.

12 Stars · 3 Forks · 12 Watchers · 0 Issues
Telephony MCP Server provides Model Context Protocol-based tools that integrate telephony operations, such as voice calls and SMS, with Large Language Model applications. Built with Python on top of the Vonage API, it exposes callable functions that LLMs can use to act in the real world. The server defines tools like 'voice_call' and 'send_sms' for secure, programmatic communication, served over FastAPI interfaces. It is designed to let agentic, tool-using AI systems extend their capabilities beyond text generation.

Key Features

Model Context Protocol (MCP) tool definitions for telephony
Voice calling via Vonage API
SMS sending and receiving via Vonage API
Integrates with LLM agents (GPT, Claude, etc.)
FastAPI-based HTTP server for tool invocation
Tool calling workflows for function execution
Support for secure operation with API credential handling
Demo integrations with Claude and GitHub Copilot
Speech recognition integration capabilities
Extensible framework for adding new MCP tools

Use Cases

Agentic telephone conversations powered by LLMs
Automated outbound calling from AI assistants
Sending automated SMS notifications or alerts
Two-way SMS-based chatbots using LLMs
Voice call initiation from customer service bots
Enabling LLMs to perform real-world telephony actions
Interactive voice and SMS support for AI-driven apps
Prototyping LLM-driven telephony agents
Connecting enterprise AI workflows to telephony APIs
Building conversational interfaces that bridge digital and phone communications

README

📖 Blog Post: Learn more about this project in the detailed blog post: Telephony MCP Server for Agentic AI and Language Models

Telephony MCP Server

Demo Using Claude Desktop

Agentic Telephony Conversation with Speech Recognition

Using SMS mid-conversation

SMS Enquiry (Send and Receive)

Demo Using GitHub Copilot

Introduction

This directory contains MCP (Model Context Protocol) Server tools for telephony operations, such as making voice calls and sending SMS messages using the Vonage API. These tools are designed to be integrated with Large Language Model (LLM) applications, enabling LLMs to perform real-world actions beyond simple text generation.

LLMs and Tool Integration

LLMs (Large Language Models) are advanced token generators—they can generate text, images, or even video based on input prompts. However, their core capability is limited to generating content; they cannot access external data or perform actions in the real world on their own.

To extend their functionality, LLMs can be connected to external tools. For example, when a user asks, "What's the weather today?" the LLM can invoke a backend API tool like get_weather(city) via a system prompt, parse the response, and return the result to the user. This tool-calling mechanism transforms a basic LLM into a powerful LLM Application.
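
For illustration, a minimal sketch of such a tool: a hypothetical get_weather function that the LLM application registers and executes on the model's behalf. The function and its stubbed response are invented for this example; a real implementation would call a weather API.

python
# Hypothetical tool registered by the LLM application. The model emits a call
# such as {"name": "get_weather", "arguments": {"city": "London"}}; the
# application runs the function and feeds the result back into the model's
# context.
def get_weather(city: str) -> dict:
    """Return a weather report for the given city (stubbed for illustration)."""
    return {"city": city, "forecast": "sunny", "temperature_c": 21}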

Tool Calling with MCP and LangChain

  • LangChain is a popular framework for developing applications powered by LLMs. It provides a collection of pre-built tools (called a Toolkit) that LLMs can use to interact with external systems.
  • MCP (Model Context Protocol) follows the same concept: it offers a collection of pre-built tools and a framework for writing new tools and handling function calling.
  • Both frameworks allow LLMs to invoke tools, parse their outputs, and integrate the results into their responses.

How This Works

  1. Tool Definition: In this project, tools like voice_call and send_sms are defined using the MCP framework. Each tool is a function that can be called by an LLM application; a minimal sketch follows this list.
  2. LLM Application: When integrated with an LLM (such as OpenAI's GPT, Anthropic's Claude, etc.), the LLM can decide to call these tools based on user prompts.
  3. Execution Flow:
    • The LLM receives a prompt (e.g., "Call Alice and say hello").
    • The LLM determines that a tool invocation is needed and calls the appropriate MCP tool (e.g., voice_call).
    • The tool executes (e.g., initiates a phone call via Vonage) and returns the result.
    • The LLM parses the response and presents it to the user.
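
As a concrete sketch of step 1, here is how a tool can be defined with the official MCP Python SDK (the mcp[cli] dependency listed under Prerequisites). The parameter names and the stubbed body are assumptions for illustration; see telephony_server.py for the actual definitions.

python
# Minimal MCP tool definition using the official Python SDK's FastMCP helper.
# Parameter names and the stubbed body are illustrative, not the project's
# actual implementation.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("telephony")

@mcp.tool()
def voice_call(to: str, message: str) -> str:
    """Place a voice call to the given number and speak the message aloud."""
    # The real tool would invoke the Vonage Voice API here.
    return f"Call to {to} initiated with message: {message!r}"

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio by default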

Running the MCP Tools

Prerequisites

  • Python 3.13+
  • MCP CLI (mcp[cli]), FastAPI, httpx, pyjwt, python-dotenv, uvicorn, pydantic (see pyproject.toml for details)
  • Vonage API credentials (API key, secret, application ID, private key)
  • Public URL for callback server (for production use)

Setup

  1. Install dependencies:

    bash
    pip install -r requirements.txt
    

    Or, if using Poetry:

    bash
    poetry install
    
  2. Configure environment variables:

    • Create a .env file with your Vonage credentials (a loading sketch follows these steps):

      VONAGE_API_KEY=your_api_key
      VONAGE_API_SECRET=your_api_secret
      VONAGE_APPLICATION_ID=your_app_id
      VONAGE_PRIVATE_KEY_PATH=path/to/private.key
      VONAGE_LVN=your_virtual_number
      VONAGE_API_URL=https://api.nexmo.com/v1/calls
      VONAGE_SMS_URL=https://rest.nexmo.com/sms/json
      CALLBACK_SERVER_URL=https://your-public-url  # URL for Vonage event callbacks
      

      For the CALLBACK_SERVER_URL:

      • In development: You can use http://localhost:8080 (default if not specified)
      • In production: Use a public URL (such as an ngrok URL or your deployed server)
  3. Run the MCP server:

    bash
    python telephony_server.py
    

    The server will start and expose the defined tools for LLM applications.
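
The credentials from step 2 can be loaded at startup with python-dotenv, which is already listed under Prerequisites. A minimal sketch follows; the variable handling is assumed, not taken from the actual server code.

python
# Minimal sketch of loading the .env credentials with python-dotenv; the
# actual server may structure its configuration differently.
import os

from dotenv import load_dotenv

load_dotenv()  # reads variables from .env in the working directory
VONAGE_API_KEY = os.environ["VONAGE_API_KEY"]  # fails fast if missing
CALLBACK_SERVER_URL = os.getenv("CALLBACK_SERVER_URL", "http://localhost:8080")  # development default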

Running with Docker

You can also run the telephony MCP server using Docker:

  1. Build and start the Docker container:

    bash
    docker compose up --build
    

    Or to run in the background:

    bash
    docker compose up --build -d
    
  2. Stop the Docker container:

    bash
    docker compose down
    
  3. View logs from the Docker container:

    bash
    docker compose logs -f
    

Using with LLM Applications

  • Direct Integration: Connect your LLM application (e.g., using a LangChain MCP adapter or a custom MCP client) to the running MCP server. The LLM can then invoke telephony tools as needed; a minimal client sketch follows this list.
  • Example: When the LLM receives a prompt like "Dial this number +123 and read latest news from today", it will call the voice_call tool, passing the required parameters.
  • Example: When the LLM receives a prompt like "Call this number using a British accent", it will call the voice_call tool with specific language and style parameters.
  • Example: When the LLM receives a prompt like "Text the news instead", it will call the send_sms tool, passing the required parameters.

Using with Claude Desktop or other MCP clients

To configure an MCP client (like Claude Desktop) to use your telephony MCP server:

  1. Update your MCP client configuration file (e.g., claude_desktop_config.json); a non-Docker variant is sketched after these steps:

    json
    {
      "mcpServers": {
        "telephony": {
          "command": "docker",
          "args": ["run", "-i", "--rm", "--init", "-e", "DOCKER_CONTAINER=true", "telephony-mcp-server"]
        }
      }
    }
    
  2. Build the Docker image (if not using docker compose):

    bash
    docker build -t telephony-mcp-server .
    
  3. Restart your MCP client to apply the changes.
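
If you run the server directly with Python rather than in Docker, a configuration along these lines should also work (the absolute path is a placeholder; adjust it for your environment):

json
{
  "mcpServers": {
    "telephony": {
      "command": "python",
      "args": ["/absolute/path/to/telephony_server.py"]
    }
  }
}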

Key Concepts

  • LLMs are content generators: They generate text, images, or video, but need external tools for actions like web search, telephony, or database access.
  • Tool calling: LLMs can invoke backend APIs (tools) to fetch data or perform actions, then parse and present the results.
  • Frameworks: Both LangChain and MCP provide a structure for defining, registering, and invoking tools from LLMs.
  • MCP: Helps you write new tools and manage function calling, making it easy to extend LLM applications with custom capabilities.

Callback Server for Vonage Events

The Telephony MCP Server also includes a Vonage Callback Server that listens on port 8080. This server receives event notifications from the Vonage Voice API, which are sent when voice calls are initiated, completed, or encounter errors.
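
For reference, a Vonage voice event callback typically carries fields like the following (values are illustrative; consult the Vonage Voice API documentation for the authoritative schema):

json
{
  "from": "447700900000",
  "to": "447700900001",
  "uuid": "aaaaaaaa-bbbb-cccc-dddd-0123456789ab",
  "conversation_uuid": "CON-aaaaaaaa-bbbb-cccc-dddd-0123456789ab",
  "status": "answered",
  "direction": "outbound",
  "timestamp": "2024-01-01T12:00:00.000Z"
}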

Features

  • Receives and stores Vonage event callbacks
  • Provides endpoints to view and manage stored events
  • Runs as a separate service within the same application

Endpoints

  • GET / - Health check endpoint
  • POST /event - Main endpoint for receiving Vonage callbacks
  • GET /events - List all stored events (with pagination)
  • GET /events/{event_id} - Get a specific event by ID
  • DELETE /events - Clear all stored events
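
Assuming the callback server is reachable locally on port 8080, the endpoints can be exercised with curl:

bash
curl http://localhost:8080/                   # health check
curl -X POST http://localhost:8080/event \
  -H "Content-Type: application/json" \
  -d '{"status": "answered"}'                 # simulate a Vonage callback
curl http://localhost:8080/events             # list stored events
curl -X DELETE http://localhost:8080/events   # clear all stored events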

Configuration

To use the callback server with Vonage Voice API, you need to set the CALLBACK_SERVER_URL environment variable to your server's public URL. This URL will be used as the event_url parameter in Vonage API calls.

bash
export CALLBACK_SERVER_URL="https://your-public-url"

For local development, you can use a service like ngrok to expose your local server to the internet:

bash
ngrok http 8080

Then set the CALLBACK_SERVER_URL to the ngrok URL.

Repository Owner

khan2a (User)

Repository Details

Language: Python
Default Branch: main
Size: 12,499 KB
Contributors: 2
License: MIT License
MCP Verified: Nov 11, 2025

Programming Languages

Python: 99.1%
Dockerfile: 0.9%

Topics

mcp · mcp-server · python · telephony · vonage · vonage-voice

Related MCPs

Discover similar Model Context Protocol servers

  • Didlogic MCP Server

    Standardized MCP interface for Didlogic API services

    Didlogic MCP Server provides a Model Context Protocol-compliant server that enables Large Language Models to interact with Didlogic services through a standardized interface. It supports multiple transport modes, including STDIO, HTTP, and SSE, and offers tools for managing SIP accounts, balances, IP restrictions, and transaction histories. The server ensures secure API access via environment variables or bearer tokens and is easily configurable for LLM platforms, such as Claude.

    3 Stars · UserAd/didlogic_mcp
  • FastMCP

    The fast, Pythonic way to build MCP servers and clients.

    FastMCP is a production-ready framework for building Model Context Protocol (MCP) applications in Python. It streamlines the creation of MCP servers and clients, providing advanced features such as enterprise authentication, composable tools, OpenAPI/FastAPI generation, server proxying, deployment tools, and comprehensive client libraries. Designed for ease of use, it offers both standard protocol support and robust utilities for production deployments.

    20,201 Stars · jlowin/fastmcp
  • HarmonyOS MCP Server

    Enables HarmonyOS device manipulation via the Model Context Protocol.

    HarmonyOS MCP Server provides an MCP-compatible server that allows programmatic control of HarmonyOS devices. It integrates with tools and frameworks such as OpenAI's openai-agents SDK and LangGraph to facilitate LLM-powered automation workflows. The server supports execution through standard interfaces and can be used with agent platforms to process natural language instructions for device actions. Its design allows for seamless interaction with HarmonyOS systems using the Model Context Protocol.

    25 Stars · XixianLiang/HarmonyOS-mcp-server
  • TrackMage MCP Server

    Shipment and logistics tracking MCP server with multi-carrier support.

    TrackMage MCP Server implements the Model Context Protocol (MCP) to provide shipment tracking, logistics management, and API integration for over 1,600 carriers worldwide. It allows integration with major LLMs, supports resources such as workspaces, shipments, orders, carriers, and tracking statuses, and offers tools to create, update, and monitor shipments and orders. The server supports OAuth-based authentication, flexible configuration via environment variables, and can be deployed locally for customized logistics operations.

    1 Star · trackmage/trackmage-mcp-server
  • LINE Bot MCP Server

    MCP server connecting LINE Messaging API with AI agents

    Provides a Model Context Protocol (MCP) server implementation for integrating AI agents with the LINE Messaging API. Enables sending text and flex messages, accessing user profiles, and managing features like rich menus via MCP-compatible endpoints. Designed for connecting AI-driven context management with LINE Official Accounts for experimental and production scenarios.

    493 Stars · line/line-bot-mcp-server
  • MCP-Geo

    Geocoding and reverse geocoding MCP server for LLMs.

    MCP-Geo provides geocoding and reverse geocoding capabilities to AI models using the Model Context Protocol, powered by the GeoPY library. It offers various tools such as address lookup, reverse lookup from coordinates, distance calculations, and batch processing of locations, all accessible via standard MCP tool interfaces. Safety features like rate limiting and robust error handling ensure reliable and compliant usage of geocoding services. The server is compatible with environments like Claude Desktop and can be easily configured elsewhere.

    28 Stars · webcoderz/MCP-Geo