k6-mcp-server

A Model Context Protocol server for orchestrating k6 load tests via MCP-enabled clients.

17 Stars · 8 Forks · 17 Watchers · 6 Issues

k6-mcp-server implements the Model Context Protocol, allowing users to execute and manage k6 load testing scripts through standardized MCP clients. It provides a simple API, supports custom test durations and virtual users, and offers real-time execution output. The system is configurable via environment variables and can be easily integrated into existing MCP-compatible tooling.

Key Features

Implements Model Context Protocol for interoperability
Run k6 load tests through MCP-enabled clients
Supports customizable duration and virtual users
Real-time execution output streaming
API methods for test management
Environment variable configuration
Easy integration with common developer tools
Command line and programmatic execution
Reference implementation for MCP server-side execution
Supports LLM-powered analysis workflows

Use Cases

Orchestrating k6 load tests from LLM assistants
Automating performance testing in development workflows
Integrating load testing into MCP-compatible IDEs
Providing real-time load test feedback to AI copilots
Scripting k6 tests via standardized protocol clients
LLM-driven debugging of load test executions
Facilitating reproducible test scenarios through MCP configuration
Remote execution of k6 tests from conversational clients
Analyzing load test results using automated tooling
Centralized load test management for organizations using MCP

README

🚀 ⚡️ k6-mcp-server

A Model Context Protocol (MCP) server implementation for running k6 load tests.

✨ Features

  • Simple integration with Model Context Protocol framework
  • Support for custom test durations and virtual users (VUs)
  • Easy-to-use API for running k6 load tests
  • Configurable through environment variables
  • Real-time test execution output

🔧 Prerequisites

Before you begin, ensure you have the following installed:

  • k6 (available on your system PATH, or pointed to via K6_BIN)
  • Python 3.x
  • uv (used to install dependencies and run the server)

📦 Installation

  1. Clone the repository:

```bash
git clone https://github.com/qainsights/k6-mcp-server.git
```

  2. Install the required dependencies:

```bash
uv pip install -r requirements.txt
```

  3. Set up environment variables (optional) by creating a .env file in the project root:

```bash
K6_BIN=/path/to/k6  # Optional: defaults to 'k6' in system PATH
```
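
The server resolves the k6 binary at startup. A minimal sketch of how such a lookup typically works, assuming python-dotenv is used to read the .env file (the exact handling in k6_server.py may differ):

```python
import os

from dotenv import load_dotenv  # assumption: python-dotenv is installed

# Read .env from the project root, if one exists; real environment
# variables still take precedence over values loaded from the file.
load_dotenv()

# Fall back to 'k6' on the system PATH when K6_BIN is not set.
K6_BIN = os.getenv("K6_BIN", "k6")
```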

🚀 Getting Started

  1. Create a k6 test script (e.g., test.js):

```javascript
import http from "k6/http";
import { sleep } from "k6";

export default function () {
  http.get("http://test.k6.io");
  sleep(1);
}
```
  2. Configure the MCP server using the spec below in your favorite MCP client (Claude Desktop, Cursor, Windsurf, and more):

```json
{
  "mcpServers": {
    "k6": {
      "command": "/path/to/bin/uv",
      "args": [
        "--directory",
        "/path/to/k6-mcp-server",
        "run",
        "k6_server.py"
      ]
    }
  }
}
```

  3. Now ask the LLM to run the test, e.g. "run k6 test for hello.js". The k6 MCP server will use one of the tools below to start the test (a sketch of how they might be registered follows the list):
  • execute_k6_test: Run a test with default options (30s duration, 10 VUs)
  • execute_k6_test_with_options: Run a test with custom duration and VUs
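
As a rough, illustrative sketch (not the actual k6_server.py), this is how such tools are typically registered with the official Python MCP SDK's FastMCP helper, shelling out to the k6 CLI:

```python
import os
import subprocess

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("k6")


def _run(script_file: str, duration: str, vus: int) -> str:
    """Illustrative helper: invoke the k6 CLI and return its combined output."""
    k6_bin = os.getenv("K6_BIN", "k6")
    cmd = [k6_bin, "run", "--duration", duration, "--vus", str(vus), script_file]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stdout + result.stderr


@mcp.tool()
def execute_k6_test(script_file: str) -> str:
    """Run a k6 test with default options (30s duration, 10 VUs)."""
    return _run(script_file, duration="30s", vus=10)


@mcp.tool()
def execute_k6_test_with_options(script_file: str, duration: str, vus: int) -> str:
    """Run a k6 test with custom duration and virtual users."""
    return _run(script_file, duration, vus)


if __name__ == "__main__":
    mcp.run(transport="stdio")
```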

[Screenshot: k6-MCP]

📝 API Reference

Execute K6 Test

```python
execute_k6_test(
    script_file: str,
    duration: str = "30s",  # Optional
    vus: int = 10           # Optional
)
```

Execute K6 Test with Custom Options

```python
execute_k6_test_with_options(
    script_file: str,
    duration: str,
    vus: int
)
```
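
Both tools can also be driven programmatically. Below is a hedged client-side sketch using the official `mcp` Python SDK's stdio client; the uv path, server directory, and test.js are placeholders you would adapt to your setup:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder paths: point these at your local uv binary and checkout.
server = StdioServerParameters(
    command="/path/to/bin/uv",
    args=["--directory", "/path/to/k6-mcp-server", "run", "k6_server.py"],
)


async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Run a 1-minute test with 25 virtual users.
            result = await session.call_tool(
                "execute_k6_test_with_options",
                {"script_file": "test.js", "duration": "1m", "vus": 25},
            )
            print(result.content)


asyncio.run(main())
```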

✨ Use cases

  • LLM-powered results analysis
  • Effective debugging of load tests

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

Repository Owner

QAInsights (User)

Repository Details

Language: Python
Default Branch: main
Size: 376 KB
Contributors: 1
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

Python: 93.84%
JavaScript: 6.16%

Topics

grafana, k6, mcp, model-context-protocol, performance-testing

Related MCPs

Discover similar Model Context Protocol servers

  • locust-mcp-server

    Run Locust load tests via Model Context Protocol integration.

    locust-mcp-server provides a Model Context Protocol (MCP) server for executing Locust load tests, allowing seamless connection between Locust and AI-powered development environments. It offers easy configuration, real-time test output, and both headless and UI testing modes. The project features a simple API for customizable load testing scenarios and supports various runtime and user parameters.

    • 9
    • MCP
    • QAInsights/locust-mcp-server
  • metoro-mcp-server

    Bridge Kubernetes observability data to LLMs via the Model Context Protocol.

    Metoro MCP Server is an implementation of the Model Context Protocol (MCP) that enables seamless integration between Kubernetes observability data and large language models. It connects Metoro’s eBPF-based telemetry APIs to LLM applications such as the Claude Desktop App, allowing AI systems to query and analyze Kubernetes clusters. This solution supports both authenticated and demo modes for accessing real-time cluster insights.

    • 45
    • MCP
    • metoro-io/metoro-mcp-server
  • Parallel Task MCP

    Launch deep research or task groups for Parallel APIs via the Model Context Protocol.

    Parallel Task MCP provides a way to initiate and manage research or task groups through LLM clients using the Model Context Protocol. It enables seamless integration with Parallel’s APIs for flexible experimentation and production development. The tool supports both remote and local deployment, and offers connection capabilities for context-aware AI workflows.

    • 4
    • MCP
    • parallel-web/task-mcp
  • nerve

    The Simple Agent Development Kit for LLM-based automation with native MCP support

    Nerve provides a platform for building, running, evaluating, and orchestrating large language model (LLM) agents using declarative YAML configurations. It supports both client and server roles for the Model Context Protocol (MCP), allowing seamless integration, team collaboration, and advanced agent orchestration. With extensible tool support, benchmarking, and LLM-agnostic handling via LiteLLM, it enables programmable and reproducible workflows for technical users.

    • 1,278
    • MCP
    • evilsocket/nerve
  • GrowthBook MCP Server

    Interact with GrowthBook from your LLM client via MCP.

    GrowthBook MCP Server enables seamless integration between GrowthBook and LLM clients by implementing the Model Context Protocol. It allows users to view experiment details, add feature flags, and manage GrowthBook configurations directly from AI applications. The server is configurable via environment variables and leverages GrowthBook's API for functionality. This integration streamlines experimentation and feature management workflows in AI tools.

    • 15
    • MCP
    • growthbook/growthbook-mcp
  • Optuna MCP Server

    Automated model optimization and analysis via the Model Context Protocol using Optuna.

    Optuna MCP Server is an implementation of the Model Context Protocol (MCP) that enables automated hyperparameter optimization and analysis workflows through Optuna. It acts as a server providing standardized tools and endpoints for creating studies, managing trials, and visualizing optimization results. The server facilitates integration with MCP clients and supports deployment via both Python environments and Docker. It streamlines study creation, metric management, and result handling using Optuna’s capabilities.

    • 65
    • MCP
    • optuna/optuna-mcp