Higress

AI Native API Gateway with Built-in Model Context Protocol (MCP) Support

Higress is a cloud-native API gateway built on Istio and Envoy, extensible with Wasm plugins in Go, Rust, or JS. It enables unified management and hosting of both LLM APIs and MCP Servers, allowing AI agents to easily call tools and services via standard protocols. The platform supports seamless conversion of OpenAPI specs to remote MCP servers and provides robust AI gateway features for enterprise and mainstream model providers. Higress is widely adopted in production environments, notably within Alibaba Cloud's core AI applications.

Key Features

Native support for Model Context Protocol (MCP) Servers
Plug-in extensibility through Wasm (Go, Rust, JS)
Integration with multiple mainstream AI model providers
Automatic OpenAPI to MCP server conversion tool
Unified dashboard for LLM and MCP API management
Cloud-native, scalable architecture
Enterprise-grade reliability and high availability
Out-of-the-box management console
Support for gRPC and Dubbo protocols
Ready-to-use plugin marketplace

Use Cases

Hosting and managing remote MCP Servers for AI toolchains
Enabling AI agents to access multiple tools and services via standardized protocols
Converting OpenAPI specifications into deployable MCP APIs for external integration
Centralized management of LLM and MCP APIs in enterprise environments
Extending API gateway capabilities with custom Wasm plugins
Providing reliable, scalable AI gateway services for critical applications
Integrating diverse AI models into a unified infrastructure
Supporting real-time API traffic and load-balancing for long-lived connections
Offering out-of-the-box solutions for cloud-based AI services
Facilitating rapid prototyping and deployment of AI-powered APIs

README

Official Site   |   Docs   |   Blog   |   MCP Server QuickStart   |   Developer Guide   |   Wasm Plugin Hub

What is Higress?

Higress is a cloud-native API gateway based on Istio and Envoy, which can be extended with Wasm plugins written in Go/Rust/JS. It provides dozens of ready-to-use general-purpose plugins and an out-of-the-box console (try the demo here).

Core Use Cases

Higress's AI gateway capabilities support all mainstream model providers, both domestic and international. It also hosts MCP (Model Context Protocol) Servers through its plugin mechanism, enabling AI Agents to easily call various tools and services. With the openapi-to-mcp tool, you can quickly convert OpenAPI specifications into remote MCP servers for hosting. Higress provides unified management for both LLM APIs and MCP APIs.
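The idea behind openapi-to-mcp can be pictured with a toy sketch (this is not the actual tool; the mapping and field names below are illustrative): each OpenAPI operation becomes an MCP tool whose input schema is derived from the operation's parameters.

```python
def openapi_to_mcp_tools(spec: dict) -> list[dict]:
    """Map each OpenAPI operation to an MCP-style tool definition (toy sketch)."""
    tools = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            params = op.get("parameters", [])
            # Build a JSON-Schema-like input schema from the operation parameters
            props = {
                p["name"]: {
                    "type": p.get("schema", {}).get("type", "string"),
                    "description": p.get("description", ""),
                }
                for p in params
            }
            required = [p["name"] for p in params if p.get("required")]
            tools.append({
                "name": op.get("operationId") or f"{method}_{path.strip('/')}",
                "description": op.get("summary", ""),
                "inputSchema": {
                    "type": "object",
                    "properties": props,
                    "required": required,
                },
            })
    return tools

# A minimal, hypothetical OpenAPI fragment with a single GET operation
spec = {"paths": {"/weather": {"get": {
    "operationId": "getWeather",
    "summary": "Get current weather",
    "parameters": [{"name": "city", "in": "query", "required": True,
                    "schema": {"type": "string"}}],
}}}}

print(openapi_to_mcp_tools(spec)[0]["name"])  # getWeather
```

The real tool handles request bodies, authentication, and server hosting; this sketch only shows the core operation-to-tool mapping.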

🌟 Try it now at https://mcp.higress.ai/ to experience Higress-hosted Remote MCP Servers firsthand:

Higress MCP Server Platform

Enterprise Adoption

Higress was born within Alibaba to solve the problems of Tengine reloads breaking long-lived connections and insufficient load-balancing capabilities for gRPC/Dubbo. Within Alibaba Cloud, Higress's AI gateway capabilities support core AI applications such as the Tongyi Bailian model studio, the PAI machine learning platform, and other critical AI services. Alibaba Cloud has built its cloud-native API gateway product on Higress, providing a 99.99% high-availability guarantee to a large number of enterprise customers.

You can click the button below to install the enterprise version of Higress:

Deploy on AlibabaCloud

If you use open-source Higress and wish to obtain enterprise-level support, you can contact the project maintainer johnlanni by email (zty98751@alibaba-inc.com) or via social media (WeChat ID: nomadao; DingTalk ID: chengtanzty). Please mention Higress when sending a friend request :)

Quick Start

Higress can be started with just Docker, making it convenient for individual developers to set up locally for learning or for building simple sites:

bash
# Create a working directory
mkdir higress; cd higress
# Start higress, configuration files will be written to the working directory
docker run -d --rm --name higress-ai -v ${PWD}:/data \
        -p 8001:8001 -p 8080:8080 -p 8443:8443  \
        higress-registry.cn-hangzhou.cr.aliyuncs.com/higress/all-in-one:latest

Port descriptions:

  • Port 8001: Higress UI console entry
  • Port 8080: Gateway HTTP protocol entry
  • Port 8443: Gateway HTTPS protocol entry

All Higress Docker images are hosted in Higress's own image repository and are not affected by Docker Hub rate limits. In addition, image publication and updates are protected by a security scanning mechanism (powered by Alibaba Cloud ACR), making them safe for use in production environments.

If you experience timeouts when pulling images from higress-registry.cn-hangzhou.cr.aliyuncs.com, you can try the following Docker registry mirror instead:

Southeast Asia: higress-registry.ap-southeast-7.cr.aliyuncs.com

For other installation methods such as Helm deployment under K8s, please refer to the official Quick Start documentation.

If you are deploying on the cloud, it is recommended to use the Enterprise Edition.

Use Cases

  • MCP Server Hosting:

    Higress hosts MCP Servers through its plugin mechanism, enabling AI Agents to easily call various tools and services. With the openapi-to-mcp tool, you can quickly convert OpenAPI specifications into remote MCP servers.

    Key benefits of hosting MCP Servers with Higress:

    • Unified authentication and authorization mechanisms

    • Fine-grained rate limiting to prevent abuse

    • Comprehensive audit logs for all tool calls

    • Rich observability for monitoring performance

    • Simplified deployment through Higress's plugin mechanism

    • Dynamic updates without disruption or connection drops

      Learn more...

  • AI Gateway:

    Higress connects to all LLM model providers using a unified protocol, with AI observability, multi-model load balancing, token rate limiting, and caching capabilities.

  • Kubernetes ingress controller:

    Higress can function as a feature-rich ingress controller, compatible with many annotations of Kubernetes' NGINX Ingress Controller.

    Gateway API support is coming soon and will enable smooth migration from the Ingress API to the Gateway API.

  • Microservice gateway:

    Higress can function as a microservice gateway, discovering microservices from various service registries such as Nacos, ZooKeeper, Consul, and Eureka.

    It deeply integrates with Dubbo, Nacos, Sentinel and other microservice technology stacks.

  • Security gateway:

    Higress can be used as a security gateway, supporting WAF and various authentication strategies, such as key-auth, hmac-auth, jwt-auth, basic-auth, oidc, etc.
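For the Kubernetes ingress controller use case above, nginx-style annotations can often be reused unchanged. A minimal sketch, assuming a Service named demo-service exists and that the canary annotations are among those Higress supports (check the annotation compatibility docs):

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: demo-canary
  annotations:
    # nginx-ingress-style canary annotations, reused as-is
    nginx.ingress.kubernetes.io/canary: "true"
    nginx.ingress.kubernetes.io/canary-weight: "30"
spec:
  ingressClassName: higress
  rules:
    - host: demo.example.com
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: demo-service
                port:
                  number: 80
```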
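For the security gateway use case, authentication strategies are configured declaratively through plugins. The sketch below shows the general shape of a key-auth setup via a WasmPlugin resource; the field names and plugin image URL are illustrative, so consult the key-auth plugin documentation for the exact schema:

```yaml
apiVersion: extensions.higress.io/v1alpha1
kind: WasmPlugin
metadata:
  name: key-auth
  namespace: higress-system
spec:
  defaultConfig:
    # Illustrative config: consumers identified by an API key header
    consumers:
      - name: consumer-a
        credential: example-api-key
    keys:
      - x-api-key
  url: oci://higress-registry.cn-hangzhou.cr.aliyuncs.com/plugins/key-auth:1.0.0
```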
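The multi-model load balancing mentioned under the AI Gateway use case is implemented inside the gateway itself. As a rough conceptual sketch only (the provider names and weights below are made up), weighted distribution across providers behaves like a weighted round-robin:

```python
import itertools

def weighted_cycle(providers: dict[str, int]):
    # Expand each provider name according to its weight, then cycle forever.
    expanded = [name for name, weight in providers.items() for _ in range(weight)]
    return itertools.cycle(expanded)

# Send 3 of every 4 requests to "qwen" and 1 of every 4 to "openai"
lb = weighted_cycle({"qwen": 3, "openai": 1})
picks = [next(lb) for _ in range(8)]
print(picks.count("qwen"), picks.count("openai"))  # 6 2
```

Higress additionally layers health checking, token-aware rate limiting, and caching on top of this basic distribution idea.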

Core Advantages

  • Production Grade

    Born from Alibaba's internal product with over 2 years of production validation, supporting large-scale scenarios with hundreds of thousands of requests per second.

    Completely eliminates the traffic jitter caused by Nginx reloads; configuration changes take effect in milliseconds and are transparent to the business. Especially friendly to long-connection scenarios such as AI workloads.

  • Streaming Processing

    Supports true end-to-end streaming processing of request/response bodies; Wasm plugins can easily customize the handling of streaming protocols such as SSE (Server-Sent Events).

    In high-bandwidth scenarios such as AI businesses, it can significantly reduce memory overhead.

  • Easy to Extend

    Provides a rich official plugin library covering AI, traffic management, security protection and other common functions, meeting more than 90% of business scenario requirements.

    Focuses on Wasm plugin extensions, ensuring memory safety through sandbox isolation, supporting multiple programming languages, allowing plugin versions to be upgraded independently, and achieving traffic-lossless hot updates of gateway logic.

  • Secure and Easy to Use

    Based on the Ingress API and Gateway API standards, it provides an out-of-the-box UI console, with WAF protection and IP/Cookie CC protection plugins ready to use.

    Supports Let's Encrypt integration for automatic issuance and renewal of free certificates, and can be deployed outside of K8s and started with a single Docker command, making it convenient for individual developers.

Community

Join our Discord community! This is where you can connect with developers and other enthusiastic users of Higress.

Thanks

Higress would not be possible without the valuable open-source work of projects in the community. We would like to extend a special thank you to Envoy and Istio.


Repository Owner

alibaba

Organization

Repository Details

Language Go
Default Branch main
Size 28,991 KB
Contributors 30
License Apache License 2.0
MCP Verified Nov 12, 2025

Programming Languages

Go
76.14%
Rust
2.7%
Shell
0.84%
Makefile
0.68%
TypeScript
0.59%
Starlark
0.47%
C
0.42%
Smarty
0.23%
Python
0.14%
Dockerfile
0.07%

Topics

ai-gateway ai-native api-gateway cloud-native envoy

Related MCPs

Discover similar Model Context Protocol servers

  • Apache APISIX

    Dynamic, high-performance API Gateway with AI and Model Context Protocol support

    Apache APISIX is a real-time, high-performance API gateway that offers advanced traffic management features such as load balancing, dynamic routing, and authentication. It serves as both a traditional API gateway and an AI Gateway, supporting seamless proxying for AI workloads and managing LLM traffic. Notably, it includes the 'mcp-bridge' plugin, which allows for integration and conversion of stdio-based Model Context Protocol (MCP) servers to scalable HTTP SSE services, enabling robust context management for AI applications.

    • 15,837
    • MCP
    • apache/apisix
  • APISIX Model Context Protocol Server

    Bridge LLMs with APISIX for natural language API management.

    APISIX Model Context Protocol (MCP) Server enables large language models to interact with and manage APISIX resources via natural language commands. It provides a standardized protocol for connecting AI clients like Claude, Cursor, and Copilot to the APISIX Admin API. The server supports a range of operations including CRUD for routes, services, upstreams, plugins, security configurations, and more. Installation is streamlined via Smithery, npm, or direct source setup with customizable environment variables.

    • 29
    • MCP
    • api7/apisix-mcp
  • Insforge MCP Server

    A Model Context Protocol server for seamless integration with Insforge and compatible AI clients.

    Insforge MCP Server implements the Model Context Protocol (MCP), enabling smooth integration with various AI tools and clients. It allows users to configure and manage connections to the Insforge platform, providing automated and manual installation methods. The server supports multiple AI clients such as Claude Code, Cursor, Windsurf, Cline, Roo Code, and Trae via standardized context management. Documentation and configuration guidelines are available for further customization and usage.

    • 3
    • MCP
    • InsForge/insforge-mcp
  • Kanboard MCP Server

    MCP server for seamless AI integration with Kanboard project management.

    Kanboard MCP Server is a Go-based server implementing the Model Context Protocol (MCP) for integrating AI assistants with the Kanboard project management system. It enables users to manage projects, tasks, users, and workflows in Kanboard directly via natural language commands through compatible AI tools. With built-in support for secure authentication and high performance, it facilitates streamlined project operations between Kanboard and AI-powered clients like Cursor or Claude Desktop. The server is configurable and designed for compatibility with MCP standards.

    • 15
    • MCP
    • bivex/kanboard-mcp
  • MCP Swagger Server (mss)

    Seamlessly convert OpenAPI/Swagger specs into Model Context Protocol tools for AI integration.

    MCP Swagger Server converts OpenAPI/Swagger API specifications into Model Context Protocol (MCP) compatible tools, enabling REST APIs to become directly callable by AI systems. It supports zero-configuration conversion, multiple transport protocols (SSE, Streamable, Stdio), and secure API access through Bearer Token authentication. The tool offers an interactive command-line interface and configuration options to filter operations, customize transports, and manage API security. Its modular structure includes OpenAPI parsing, web UI, and backend services.

    • 38
    • MCP
    • zaizaizhao/mcp-swagger-server
  • Mastra

    A TypeScript framework for building scalable AI-powered agents and applications.

    Mastra is a modern TypeScript-based framework designed for developing AI-powered applications and autonomous agents. It offers model routing to integrate over 40 AI providers, a graph-based workflow engine, advanced context management, and production-ready tools for observability and evaluation. Mastra features built-in support for authoring Model Context Protocol (MCP) servers, enabling standardized exposure of agents, tools, and structured AI resources via the MCP interface.

    • 18,189
    • MCP
    • mastra-ai/mastra