# 🚀 ⚡️ k6-mcp-server
A Model Context Protocol (MCP) server implementation for running k6 load tests.
## ✨ Features
- Simple integration with Model Context Protocol framework
- Support for custom test durations and virtual users (VUs)
- Easy-to-use API for running k6 load tests
- Configurable through environment variables
- Real-time test execution output
## 🔧 Prerequisites
Before you begin, ensure you have the following installed:
- Python 3.12 or higher
- k6 load testing tool (Installation guide)
- uv package manager (Installation guide)
## 📦 Installation
- Clone the repository:

  ```bash
  git clone https://github.com/qainsights/k6-mcp-server.git
  ```

- Install the required dependencies:

  ```bash
  uv pip install -r requirements.txt
  ```

- Set up environment variables (optional) by creating a `.env` file in the project root (how the server might consume this is sketched after this list):

  ```bash
  K6_BIN=/path/to/k6  # Optional: defaults to 'k6' in system PATH
  ```
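The server resolves the k6 binary from this variable at startup. A minimal sketch of that lookup, assuming the `.env` file is loaded with python-dotenv (illustrative only; the actual `k6_server.py` may differ):

```python
import os

from dotenv import load_dotenv  # python-dotenv

# Read .env from the project root, then resolve the k6 binary,
# falling back to 'k6' on the system PATH when K6_BIN is unset.
load_dotenv()
K6_BIN = os.getenv("K6_BIN", "k6")
```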
## 🚀 Getting Started
- Create a k6 test script (e.g., `test.js`):

  ```javascript
  import http from "k6/http";
  import { sleep } from "k6";

  export default function () {
    http.get("http://test.k6.io");
    sleep(1);
  }
  ```
- Configure the MCP server in your favorite MCP client (Claude Desktop, Cursor, Windsurf, and more) using the spec below:
  ```json
  {
    "mcpServers": {
      "k6": {
        "command": "/path/to/bin/uv",
        "args": [
          "--directory",
          "/path/to/k6-mcp-server",
          "run",
          "k6_server.py"
        ]
      }
    }
  }
  ```
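  With this entry in place, the client spawns the server with `uv run k6_server.py` and communicates with it over standard input/output, the usual transport for locally launched MCP servers.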
- Now ask the LLM to run a test, e.g., `run k6 test for hello.js`. The k6 MCP server will use one of the following tools to start the test (a sketch of the underlying invocation follows this list):
  - `execute_k6_test`: Run a test with default options (30s duration, 10 VUs)
  - `execute_k6_test_with_options`: Run a test with custom duration and VUs
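Under the hood, either tool amounts to invoking the k6 CLI against the given script. A minimal Python sketch of such an invocation, assuming the server shells out to k6's `run` subcommand with its standard `--duration` and `--vus` flags (`run_k6` is a hypothetical helper for illustration, not the server's actual code):

```python
import os
import subprocess

def run_k6(script_file: str, duration: str = "30s", vus: int = 10) -> str:
    """Hypothetical helper: run a k6 script and capture its output."""
    # Resolve the binary via K6_BIN as documented above, falling back
    # to 'k6' on the system PATH.
    k6_bin = os.getenv("K6_BIN", "k6")
    result = subprocess.run(
        [k6_bin, "run", "--duration", duration, "--vus", str(vus), script_file],
        capture_output=True,
        text=True,
    )
    # Return the combined output so the client (and the LLM) can inspect
    # the test execution summary.
    return result.stdout + result.stderr
```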
## 📝 API Reference
### Execute K6 Test

```python
execute_k6_test(
    script_file: str,
    duration: str = "30s",  # Optional
    vus: int = 10           # Optional
)
```
### Execute K6 Test with Custom Options

```python
execute_k6_test_with_options(
    script_file: str,
    duration: str,
    vus: int
)
```
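Both signatures map naturally onto MCP tool registrations. A hedged sketch of how they might be exposed, assuming the server uses the `FastMCP` helper from the official MCP Python SDK and a `run_k6` helper like the one sketched under Getting Started (the actual `k6_server.py` may differ):

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("k6")

# run_k6 is the hypothetical helper sketched in "Getting Started" above.

@mcp.tool()
def execute_k6_test(script_file: str, duration: str = "30s", vus: int = 10) -> str:
    """Run a k6 test with default options (30s duration, 10 VUs)."""
    return run_k6(script_file, duration, vus)

@mcp.tool()
def execute_k6_test_with_options(script_file: str, duration: str, vus: int) -> str:
    """Run a k6 test with a custom duration and VU count."""
    return run_k6(script_file, duration, vus)

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```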
## ✨ Use cases
- LLM-powered analysis of test results
- Effective debugging of load tests
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.