MCP Internet Speed Test
An implementation of a Model Context Protocol (MCP) for internet speed testing. It allows AI models and agents to measure, analyze, and report network performance metrics through a standardized interface.
📦 Available on PyPI: https://pypi.org/project/mcp-internet-speed-test/
🚀 Quick Start:
pip install mcp-internet-speed-test
mcp-internet-speed-test
What is MCP?
The Model Context Protocol (MCP) provides a standardized way for Large Language Models (LLMs) to interact with external tools and data sources. Think of it as the "USB-C for AI applications" - a common interface that allows AI systems to access real-world capabilities and information.
Features
- Smart Incremental Testing: Uses SpeedOf.Me methodology with 8-second threshold for optimal accuracy
- Download Speed Testing: Measures bandwidth using files from 128KB to 100MB from GitHub repository
- Upload Speed Testing: Tests upload bandwidth using generated data from 128KB to 100MB
- Latency Testing: Measures network latency with detailed server location information
- Jitter Analysis: Calculates network stability using multiple latency samples (default: 5)
- Multi-CDN Support: Detects and provides info for Fastly, Cloudflare, and AWS CloudFront
- Geographic Location: Maps POP codes to physical locations (50+ locations worldwide)
- Cache Analysis: Detects HIT/MISS status and cache headers
- Server Metadata: Extracts detailed CDN headers including `x-served-by`, `via`, and `x-cache`
- Comprehensive Testing: Single function to run all tests with complete metrics
Installation
Prerequisites
- Python 3.12 or higher (required for async support)
- pip or uv package manager
Option 1: Install from PyPI with pip (Recommended)
# Install the package globally
pip install mcp-internet-speed-test
# Run the MCP server
mcp-internet-speed-test
Option 2: Install from PyPI with uv
# Add the package to your project
uv add mcp-internet-speed-test
# Or run directly without installing
uvx mcp-internet-speed-test
Option 3: Using docker
# Build the Docker image
docker build -t mcp-internet-speed-test .
# Run the MCP server in a Docker container
docker run -it --rm -v $(pwd):/app -w /app mcp-internet-speed-test
Option 4: Development/Local Installation
If you want to contribute or modify the code:
# Clone the repository
git clone https://github.com/inventer-dev/mcp-internet-speed-test.git
cd mcp-internet-speed-test
# Install in development mode
pip install -e .
# Or using uv
uv sync
uv run python -m mcp_internet_speed_test.main
Dependencies
The package automatically installs these dependencies:
- `mcp[cli]>=1.6.0`: MCP server framework with CLI integration
- `httpx>=0.27.0`: Async HTTP client for speed tests
Configuration
To use this MCP server with Claude Desktop or other MCP clients, add it to your MCP configuration file.
Claude Desktop Configuration
Edit your Claude Desktop MCP configuration file:
Option 1: Using pip installed package (Recommended)
{
  "mcpServers": {
    "mcp-internet-speed-test": {
      "command": "mcp-internet-speed-test"
    }
  }
}
Option 2: Using uvx
{
  "mcpServers": {
    "mcp-internet-speed-test": {
      "command": "uvx",
      "args": ["mcp-internet-speed-test"]
    }
  }
}
API Tools
The MCP Internet Speed Test provides the following tools:
Testing Functions
- measure_download_speed: Measures download bandwidth (in Mbps) with server location info
- measure_upload_speed: Measures upload bandwidth (in Mbps) with server location info
- measure_latency: Measures network latency (in ms) with server location info
- measure_jitter: Measures network jitter by analyzing latency variations with server info
- get_server_info: Gets detailed CDN server information for any URL without running speed tests
- run_complete_test: Comprehensive test with all metrics and server metadata
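For orientation, here is a minimal, hypothetical client sketch using the official `mcp` Python SDK to launch the server over stdio and invoke one of the tools above; tool arguments are assumed to be optional and are omitted:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    # Launch the installed console script as a stdio MCP server
    params = StdioServerParameters(command="mcp-internet-speed-test", args=[])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])
            # Invoke one of the tools listed above; arguments are assumed optional
            result = await session.call_tool("measure_latency", arguments={})
            print(result.content)


asyncio.run(main())
```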
CDN Server Detection
This speed test now provides detailed information about the CDN servers serving your tests:
What You Get
- CDN Provider: Identifies if you're connecting to Fastly, Cloudflare, or Amazon CloudFront
- Geographic Location: Shows the physical location of the server (e.g., "Mexico City, Mexico")
- POP Code: Three-letter code identifying the Point of Presence (e.g., "MEX", "QRO", "DFW")
- Cache Status: Whether content is served from cache (HIT) or fetched from origin (MISS)
- Server Headers: Full HTTP headers including `x-served-by`, `via`, and `x-cache`
Technical Implementation
Smart Testing Methodology
- Incremental Approach: Starts with small files (128KB) and progressively increases
- Time-Based Optimization: Uses 8-second base threshold + 4-second additional buffer
- Accuracy Focus: Selects optimal file size that provides reliable measurements
- Multi-Provider Support: Tests against geographically distributed endpoints
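A rough sketch of this incremental loop, assuming a hypothetical `<size>.bin` naming scheme in the test-files repository and the default thresholds listed later in this README (the actual logic lives in `mcp_internet_speed_test/main.py` and may differ):

```python
import time

import httpx

# Defaults described in this README; the file naming scheme is an assumption
GITHUB_RAW_URL = "https://raw.githubusercontent.com/inventer-dev/speed-test-files/main"
SIZE_PROGRESSION = ["128KB", "256KB", "512KB", "1MB", "2MB", "5MB", "10MB", "20MB"]
BASE_TEST_DURATION = 8.0  # seconds


async def incremental_download_test() -> dict:
    """Download progressively larger files until a sample exceeds the base threshold."""
    result = {}
    async with httpx.AsyncClient(timeout=60.0) as client:
        for size in SIZE_PROGRESSION:
            url = f"{GITHUB_RAW_URL}/{size}.bin"  # hypothetical file name
            start = time.perf_counter()
            response = await client.get(url)
            elapsed = time.perf_counter() - start
            mbps = (len(response.content) * 8) / (elapsed * 1_000_000)
            result = {"file_size": size, "elapsed_s": round(elapsed, 2), "mbps": round(mbps, 2)}
            if elapsed >= BASE_TEST_DURATION:
                break  # this sample already ran long enough to be considered reliable
    return result
```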
CDN Detection Capabilities
- Fastly: Detects POP codes and maps to 50+ global locations
- Cloudflare: Identifies data centers and geographic regions
- AWS CloudFront: Recognizes edge locations across continents
- Header Analysis: Parses `x-served-by`, `via`, `x-cache`, and custom CDN headers
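Not the project's exact logic, but an illustrative best-effort classifier based on well-known CDN response headers (`x-served-by` for Fastly, `cf-ray` for Cloudflare, `x-amz-cf-pop` for CloudFront) might look like this:

```python
def classify_cdn(headers: dict[str, str]) -> dict[str, str]:
    """Best-effort CDN identification from common response headers (illustrative only).

    Assumes header names have been lower-cased by the HTTP client.
    """
    served_by = headers.get("x-served-by", "")
    x_cache = headers.get("x-cache", "")

    if "x-amz-cf-pop" in headers:
        provider = "Amazon CloudFront"
        pop = headers["x-amz-cf-pop"][:3]           # e.g. "DFW55-P1" -> "DFW"
    elif "cf-ray" in headers:
        provider = "Cloudflare"
        pop = headers["cf-ray"].rsplit("-", 1)[-1]  # colo code follows the ray ID
    elif served_by.startswith("cache-"):
        provider = "Fastly"
        pop = served_by.rsplit("-", 1)[-1]          # e.g. "cache-mex4329-MEX" -> "MEX"
    else:
        provider, pop = "Unknown", ""

    return {
        "cdn_provider": provider,
        "pop_code": pop,
        "served_by": served_by,
        "cache_status": "HIT" if "HIT" in x_cache.upper() else "MISS",
    }
```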
Why This Matters
- Network Diagnostics: Understand which server is actually serving your tests
- Performance Analysis: Correlate speed results with server proximity
- CDN Optimization: Identify if your ISP's routing is optimal
- Geographic Awareness: Know if tests are running from your expected region
- Troubleshooting: Identify routing issues and CDN misconfigurations
Example Server Info Output
{
  "cdn_provider": "Fastly",
  "pop_code": "MEX",
  "pop_location": "Mexico City, Mexico",
  "served_by": "cache-mex4329-MEX",
  "cache_status": "HIT",
  "x_cache": "HIT, HIT"
}
Technical Configuration
Default Test Files Repository
- GitHub Repository: inventer-dev/speed-test-files
- Branch: main
- File Sizes: 128KB, 256KB, 512KB, 1MB, 2MB, 5MB, 10MB, 20MB, 40MB, 50MB, 100MB
Upload Endpoints Priority
- Cloudflare Workers (httpi.dev) - Global distribution, highest priority
- HTTPBin (httpbin.org) - AWS-based, secondary endpoint
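A minimal sketch of the fallback behaviour, assuming hypothetical endpoint paths (the actual URLs are defined by `UPLOAD_ENDPOINTS` in the server configuration):

```python
import httpx

# Priority order as described above; the exact paths are assumptions
UPLOAD_ENDPOINTS = [
    "https://httpi.dev/post",    # Cloudflare Workers, tried first
    "https://httpbin.org/post",  # AWS-based HTTPBin fallback
]


async def upload_with_fallback(payload: bytes) -> httpx.Response:
    """POST the payload to each endpoint in priority order, falling back on failure."""
    last_error: Exception | None = None
    async with httpx.AsyncClient(timeout=30.0) as client:
        for endpoint in UPLOAD_ENDPOINTS:
            try:
                return await client.post(endpoint, content=payload)
            except httpx.HTTPError as exc:
                last_error = exc  # try the next endpoint
    raise last_error
```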
Supported CDN Locations (150+ POPs)
Fastly POPs: MEX, QRO, DFW, LAX, NYC, MIA, LHR, FRA, AMS, CDG, NRT, SIN, SYD, GRU, SCL, BOG, MAD, MIL...
Cloudflare Centers: DFW, LAX, SJC, SEA, ORD, MCI, IAD, ATL, MIA, YYZ, LHR, FRA, AMS, CDG, ARN, STO...
AWS CloudFront: ATL, BOS, ORD, CMH, DFW, DEN, IAD, LAX, MIA, MSP, JFK, SEA, SJC, AMS, ATH, TXL...
Performance Thresholds
- Base Test Duration: 8.0 seconds
- Additional Buffer: 4.0 seconds
- Maximum File Size: Configurable (default: 100MB)
- Jitter Samples: 5 measurements (configurable)
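Jitter is derived from the variation between consecutive latency samples; a common way to compute it (the project's exact formula may differ) is the mean absolute difference:

```python
import statistics


def compute_jitter(latency_samples_ms: list[float]) -> float:
    """Mean absolute difference between consecutive latency samples, in milliseconds."""
    diffs = [abs(b - a) for a, b in zip(latency_samples_ms, latency_samples_ms[1:])]
    return statistics.mean(diffs) if diffs else 0.0


# Example with the default of 5 samples
print(compute_jitter([23.1, 25.4, 22.8, 24.0, 23.5]))  # ≈ 1.65 ms
```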
Troubleshooting
Common Issues
MCP Server Connection
- Path Configuration: Ensure absolute path is used in MCP configuration
- Directory Permissions: Verify read/execute permissions for the project directory
- Python Version: Requires Python 3.12+ with async support
- Dependencies: Install the `fastmcp` and `httpx` packages
Speed Test Issues
- GitHub Repository Access: Ensure `inventer-dev/speed-test-files` is accessible
- Firewall/Proxy: Check if corporate firewalls block test endpoints
- CDN Routing: Some ISPs may route differently to CDNs
- Network Stability: Jitter tests require stable connections
Performance Considerations
- File Size Limits: Large files (>50MB) may time out on slow connections
- Upload Endpoints: If primary endpoint fails, fallback is automatic
- Geographic Accuracy: POP detection depends on CDN header consistency
Development
Project Structure
mcp-internet-speed-test/
├── mcp_internet_speed_test/    # Main package directory
│   ├── __init__.py             # Package initialization
│   └── main.py                 # MCP server implementation
├── README.md                   # This documentation
├── Dockerfile                  # Container configuration
└── pyproject.toml              # Python project configuration
Key Components
Configuration Constants
- GITHUB_RAW_URL: Base URL for the test files repository
- UPLOAD_ENDPOINTS: Prioritized list of upload test endpoints
- SIZE_PROGRESSION: Ordered list of file sizes for incremental testing
- *_POP_LOCATIONS: Mappings of CDN POP codes to geographic locations
Core Functions
- extract_server_info(): Parses HTTP headers to identify CDN providers
- measure_*(): Individual test functions for the different metrics
- run_complete_test(): Orchestrates the comprehensive testing suite
Configuration Customization
You can customize the following in mcp_internet_speed_test/main.py if you clone the repository:
# GitHub repository settings
GITHUB_USERNAME = "your-username"
GITHUB_REPO = "your-speed-test-files"
GITHUB_BRANCH = "main"
# Test duration thresholds
BASE_TEST_DURATION = 8.0 # seconds
ADDITIONAL_TEST_DURATION = 4.0 # seconds
# Default endpoints
DEFAULT_UPLOAD_URL = "your-upload-endpoint"
DEFAULT_LATENCY_URL = "your-latency-endpoint"
Contributing
This is an experimental project and contributions are welcome:
- Issues: Report bugs or request features
- Pull Requests: Submit code improvements
- Documentation: Help improve this README
- Testing: Test with different network conditions and CDNs
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- MCP Framework maintainers for standardizing AI tool interactions
- The Model Context Protocol community for documentation and examples