MCP servers tagged with infrastructure automation
-
mcp-server-docker
Natural language management of Docker containers via Model Context Protocol.
mcp-server-docker lets users manage Docker containers with natural language instructions over the Model Context Protocol. It supports composing, introspecting, and debugging containers, as well as managing persistent Docker volumes. The tool targets server administrators, tinkerers, and AI enthusiasts who want to apply LLM capabilities to Docker management. It integrates with MCP hosts such as Claude Desktop and deploys flexibly alongside Docker, connecting directly to Docker engines; a minimal client sketch follows this entry.
- ⭐ 648
- MCP
- ckreiling/mcp-server-docker
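A minimal sketch of driving this server from the official MCP Python SDK, assuming the server can be launched as a stdio subprocess with `uvx mcp-server-docker` (the exact invocation is documented in the repository README and may differ):

```python
# Sketch: connect to mcp-server-docker over stdio and list its tools.
# The launch command ("uvx mcp-server-docker") is an assumption; check the
# repository README for the real invocation.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Spawn the server as a stdio subprocess, the way an MCP host would.
    params = StdioServerParameters(command="uvx", args=["mcp-server-docker"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the container-management tools the server exposes.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


asyncio.run(main())
```

In a host like Claude Desktop the same launch command would instead go into the host's MCP server configuration, and the host performs this handshake for you.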
-
MCP Nutanix
An MCP server enabling LLM access to Nutanix Prism Central APIs.
MCP Nutanix is an experimental Model Context Protocol (MCP) server that lets large language models interact with Nutanix Prism Central APIs. It exposes resources such as VMs, clusters, and hosts through the standard MCP client-server flow, using the Prism Go Client for backend communication. Credentials can be supplied interactively or statically, and the server works with MCP clients such as Claude and Cursor; a hedged connection sketch follows this entry.
- ⭐ 11
- MCP
- thunderboltsid/mcp-nutanix
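A sketch of connecting with static credentials from a Python MCP client. The binary path, the environment variable names, and the `vms` tool name are illustrative assumptions; the project README documents the actual configuration keys and tool catalog:

```python
# Sketch: launch the MCP Nutanix server with static credentials and call a
# VM-listing tool. Binary path, env var names, and tool name are hypothetical.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def list_vms() -> None:
    params = StdioServerParameters(
        command="./mcp-nutanix",  # assumed path to the built Go binary
        env={
            **os.environ,
            "PRISM_CENTRAL_ENDPOINT": "https://pc.example.com:9440",  # hypothetical
            "PRISM_CENTRAL_USERNAME": "admin",                        # hypothetical
            "PRISM_CENTRAL_PASSWORD": "********",                     # hypothetical
        },
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "vms" is a placeholder name for the VM-listing tool/resource.
            result = await session.call_tool("vms", arguments={})
            print(result.content)


asyncio.run(list_vms())
```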
-
Portainer MCP
Connect AI assistants securely to Portainer environments using the Model Context Protocol.
Portainer MCP implements the Model Context Protocol (MCP) to connect AI assistants to Portainer-managed container environments. It lets assistants manage Portainer resources and run Docker and Kubernetes commands through a secure, standardized interface, giving protocol-level access to environment data for automation and operational insight across container infrastructures; see the sketch after this entry.
- ⭐ 81
- MCP
- portainer/portainer-mcp
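A sketch of issuing a Portainer query through the server from a Python MCP client. The binary path, the `-server`/`-token` flags, and the `listEnvironments` tool name are assumptions made for illustration; the real flags and tool names are defined in the portainer/portainer-mcp repository:

```python
# Sketch: launch Portainer MCP pointed at a Portainer instance and call a
# hypothetical environment-listing tool.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    params = StdioServerParameters(
        command="./portainer-mcp",                      # assumed binary path
        args=["-server", "portainer.example.com:9443",  # hypothetical flags
              "-token", "ptr_xxxxxxxx"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Hypothetical tool returning the environments Portainer manages.
            result = await session.call_tool("listEnvironments", arguments={})
            print(result.content)


asyncio.run(main())
```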
-
awslabs/mcp
Specialized MCP servers for seamless AWS integration in AI and development environments.
AWS MCP Servers is a suite of specialized servers implementing the open Model Context Protocol (MCP) that bridges large language model (LLM) applications with AWS services, tools, and data sources. It gives AI assistants, IDEs, and developer tools a standardized way to access up-to-date AWS documentation, perform cloud operations, and automate workflows with context-aware intelligence. The catalog spans many domain-specific servers, offers quick installation on popular platforms, and supports both local and remote deployment, covering cloud-native development, infrastructure management, and workflow automation. The project documents Docker, Lambda, and direct integrations for environments such as Amazon Q CLI, Cursor, Windsurf, Kiro, and VS Code; a hedged launch sketch follows this entry.
- ⭐ 6,220
- MCP
- awslabs/mcp
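A sketch of wiring one server from the suite (the AWS documentation server) into a Python MCP client. The `uvx awslabs.aws-documentation-mcp-server@latest` command, the `FASTMCP_LOG_LEVEL` setting, and the `search_documentation` tool name follow the patterns described in the awslabs/mcp README, but treat them as assumptions to verify against the current documentation:

```python
# Sketch: launch the AWS documentation MCP server via uvx and run a search.
# Command, env setting, and tool name are assumptions based on the project's
# documented installation pattern.
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    params = StdioServerParameters(
        command="uvx",
        args=["awslabs.aws-documentation-mcp-server@latest"],
        env={**os.environ, "FASTMCP_LOG_LEVEL": "ERROR"},  # assumed log knob
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Assumed tool name; list_tools() shows what is actually exposed.
            result = await session.call_tool(
                "search_documentation",
                arguments={"search_phrase": "S3 bucket lifecycle rules"},
            )
            print(result.content)


asyncio.run(main())
```

For day-to-day use the same command line goes into the MCP server configuration of hosts such as Amazon Q CLI, Cursor, or VS Code rather than into hand-written client code.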