HarmonyOS MCP Server


Enables HarmonyOS device manipulation via the Model Context Protocol.

25 stars · 8 forks · 25 watchers · 2 issues
HarmonyOS MCP Server provides an MCP-compatible server for programmatic control of HarmonyOS devices. It integrates with tools and frameworks such as OpenAI's openai-agents SDK and LangGraph to support LLM-powered automation workflows. The server runs over standard interfaces (stdio) and can be plugged into agent platforms that translate natural language instructions into device actions via the Model Context Protocol.

Key Features

MCP-compliant server for HarmonyOS device control
Integration with OpenAI openai-agents SDK
Support for LangGraph and Langchain-based agent workflows
Python 3.13 compatibility
Natural language instruction processing
Executable via command line with uv and server.py
Asynchronous server management
Stdio and API-based interaction interfaces
Flexible workflow state management
Open-source under MIT license

Use Cases

Automated testing of HarmonyOS devices using LLM-driven agents
Remote operation and control of HarmonyOS smartphones or tablets
Natural language automation of device functions for end users
Integration into multi-agent orchestration environments
Developing custom workflow applications that require HarmonyOS interaction
Educational demonstrations of agent-based mobile device management
Building custom tools for batch configuration or app deployment
Enhancing LLM assistant capabilities with HarmonyOS manipulation
Enabling context-aware device diagnostics and problem solving
Research on agent workflows for mobile operating systems

README


Intro

This is an MCP server for manipulating HarmonyOS devices.

https://github.com/user-attachments/assets/7af7f5af-e8c6-4845-8d92-cd0ab30bfe17

Quick Start

Installation

1. Clone this repo:

```bash
git clone https://github.com/XixianLiang/HarmonyOS-mcp-server.git
cd HarmonyOS-mcp-server
```

2. Set up the environment:

```bash
uv python install 3.13
uv sync
```

Usage

1. Claude Desktop

You can use Claude Desktop to try our tool.
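Claude Desktop discovers MCP servers through its `claude_desktop_config.json`. A sketch of the entry you would add (the server name and paths below are placeholders; point them at your `uv` binary and your checkout):

```json
{
  "mcpServers": {
    "harmonyos-mcp-server": {
      "command": "<...>/bin/uv",
      "args": [
        "--directory",
        "<...>/HarmonyOS-mcp-server",
        "run",
        "server.py"
      ]
    }
  }
}
```

After restarting Claude Desktop, the server's tools should appear in the tools menu.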

2. OpenAI SDK

You can also use the openai-agents SDK to try the MCP server. Here's an example:

```python
"""
Example: Use the openai-agents SDK to call HarmonyOS-mcp-server.
"""
import asyncio

from agents import Agent, Runner, gen_trace_id, trace
from agents.mcp import MCPServerStdio, MCPServer


async def run(mcp_server: MCPServer):
    agent = Agent(
        name="Assistant",
        instructions="Use the tools to manipulate the HarmonyOS device and finish the task.",
        mcp_servers=[mcp_server],
    )

    message = "Launch the app `settings` on the phone"
    print(f"Running: {message}")
    result = await Runner.run(starting_agent=agent, input=message)
    print(result.final_output)


async def main():
    # Use an async context manager to initialize the server.
    async with MCPServerStdio(
        params={
            "command": "<...>/bin/uv",
            "args": [
                "--directory",
                "<...>/harmonyos-mcp-server",
                "run",
                "server.py"
            ]
        }
    ) as server:
        trace_id = gen_trace_id()
        with trace(workflow_name="MCP HarmonyOS", trace_id=trace_id):
            print(f"View trace: https://platform.openai.com/traces/trace?trace_id={trace_id}\n")
            await run(server)


if __name__ == "__main__":
    asyncio.run(main())
```
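Under the hood, `MCPServerStdio` launches the server as a subprocess and exchanges JSON-RPC 2.0 messages with it over stdin/stdout, which is the wire format the Model Context Protocol specifies. A minimal sketch of that message shape (illustrative only, not the SDK's implementation; the tool name `launch_app` and its arguments are hypothetical):

```python
import json

# Illustrative JSON-RPC 2.0 request, the wire format MCP uses over stdio.
# "tools/call" is the MCP method for invoking a server-side tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "launch_app", "arguments": {"bundle": "settings"}},
}

# Messages travel as one JSON object per line over the subprocess pipe.
line = json.dumps(request)
decoded = json.loads(line)
print(decoded["method"])  # → tools/call
```

The SDK hides this framing entirely; you only see the tool-call results.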

3. LangGraph

You can also use LangGraph, a flexible LLM agent framework, to design your own workflows. Here's an example:

```python
"""
langgraph_mcp.py
"""
import asyncio
import logging
from typing import Annotated, List

from typing_extensions import TypedDict

from langchain_core.messages import AnyMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables import RunnableConfig
from langchain_mcp_adapters.prompts import load_mcp_prompt
from langchain_mcp_adapters.tools import load_mcp_tools
from langchain_ollama import ChatOllama
from langgraph.checkpoint.memory import MemorySaver
from langgraph.errors import GraphRecursionError
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="/home/chad/.local/bin/uv",
    args=["--directory",
          ".",
          "run",
          "server.py"],
)


# This function uses LangGraph to build your own agent workflow.
async def create_graph(session):
    llm = ChatOllama(model="qwen2.5:7b", temperature=0)
    # load_mcp_tools (from langchain-mcp-adapters) converts the MCP server's tools
    # into LangChain tools; bind_tools then lets the LLM call them.
    tools = await load_mcp_tools(session)
    llm_with_tool = llm.bind_tools(tools)

    system_prompt = await load_mcp_prompt(session, "system_prompt")
    prompt_template = ChatPromptTemplate.from_messages([
        ("system", system_prompt[0].content),
        MessagesPlaceholder("messages")
    ])
    chat_llm = prompt_template | llm_with_tool

    # State management
    class State(TypedDict):
        messages: Annotated[List[AnyMessage], add_messages]

    # Nodes
    def chat_node(state: State) -> State:
        state["messages"] = chat_llm.invoke({"messages": state["messages"]})
        return state

    # Build the graph, i.e. the workflow of your agent.
    # For LangGraph basics, see
    # https://langchain-ai.github.io/langgraph/tutorials/get-started/1-build-basic-chatbot/#3-add-a-node
    graph_builder = StateGraph(State)
    graph_builder.add_node("chat_node", chat_node)
    graph_builder.add_node("tool_node", ToolNode(tools=tools))
    graph_builder.add_edge(START, "chat_node")
    graph_builder.add_conditional_edges("chat_node", tools_condition,
                                        {"tools": "tool_node", "__end__": END})
    graph_builder.add_edge("tool_node", "chat_node")
    graph = graph_builder.compile(checkpointer=MemorySaver())
    return graph


async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            config = RunnableConfig(configurable={"thread_id": "1234"}, recursion_limit=15)
            # Use the MCP server in the graph.
            agent = await create_graph(session)

            while True:
                message = input("User: ")
                try:
                    response = await agent.ainvoke({"messages": message}, config=config)
                    print("AI: " + response["messages"][-1].content)
                except GraphRecursionError:
                    logging.error("Graph recursion limit reached.")


if __name__ == "__main__":
    asyncio.run(main())
```
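The `State` above annotates `messages` with LangGraph's `add_messages` reducer, which appends each node's new messages to the running history instead of overwriting it. A simplified stand-in (not the real LangGraph implementation, which also handles message IDs and deduplication) illustrates the idea:

```python
# Simplified stand-in for LangGraph's `add_messages` reducer.
def add_messages(existing: list, new) -> list:
    # A node may return a single message or a list of messages.
    if not isinstance(new, list):
        new = [new]
    # Append rather than replace, so the conversation history accumulates.
    return existing + new

state = {"messages": []}
state["messages"] = add_messages(state["messages"], "User: open settings")
state["messages"] = add_messages(state["messages"], ["AI: launching settings", "Tool: done"])
print(state["messages"])  # → ['User: open settings', 'AI: launching settings', 'Tool: done']
```

This is why `chat_node` can return only the model's reply: the reducer merges it into the existing history rather than discarding earlier turns.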

Write the system prompt in `server.py`:

```python
"""
server.py
"""
@mcp.prompt()
def system_prompt() -> str:
    """System prompt description."""
    return """
    You are an AI assistant. Use the tools if needed.
    """
```

Use the `load_mcp_prompt` function to fetch the prompt from the MCP server:

```python
"""
langgraph_mcp.py
"""
prompts = await load_mcp_prompt(session, "system_prompt")
```



Repository Details

- Language: Python
- Default Branch: main
- Size: 766 KB
- Contributors: 3
- License: MIT License
- MCP Verified: Nov 12, 2025



Related MCPs

Discover similar Model Context Protocol servers

  • Model Context Protocol Server for Home Assistant


    Seamlessly connect Home Assistant to LLMs for natural language smart home control via MCP.

    Enables integration between a local Home Assistant instance and language models using the Model Context Protocol (MCP). Facilitates natural language monitoring and control of smart home devices, with robust API support for state management, automation, real-time updates, and system administration. Features secure, token-based access, and supports mobile and HTTP clients. Designed to bridge Home Assistant environments with modern AI-driven automation.

    • 468
    • MCP
    • tevonsb/homeassistant-mcp
  • OPC UA MCP Server


    Bridge AI agents with OPC UA industrial systems in real time.

    OPC UA MCP Server enables seamless integration of AI agents with OPC UA-enabled industrial automation systems. It provides real-time monitoring, analysis, and control of operational data through a set of standardized tool APIs. Supporting both reading and writing of OPC UA nodes, the server facilitates natural language interaction by exposing tools for AI-driven automation and control workflows.

    • 20
    • MCP
    • kukapay/opcua-mcp
  • FastMCP


    The fast, Pythonic way to build MCP servers and clients.

    FastMCP is a production-ready framework for building Model Context Protocol (MCP) applications in Python. It streamlines the creation of MCP servers and clients, providing advanced features such as enterprise authentication, composable tools, OpenAPI/FastAPI generation, server proxying, deployment tools, and comprehensive client libraries. Designed for ease of use, it offers both standard protocol support and robust utilities for production deployments.

    • 20,201
    • MCP
    • jlowin/fastmcp
  • MetaTrader MCP Server


    Let AI assistants trade for you using natural language.

    MetaTrader MCP Server is a bridge that connects AI assistants such as Claude and ChatGPT to the MetaTrader 5 trading platform via the Model Context Protocol (MCP). It enables users to perform trading actions on MetaTrader 5 through natural language instructions. The system supports real-time data access, full account management, and secure local credential handling, offering both MCP and REST API interfaces.

    • 120
    • MCP
    • ariadng/metatrader-mcp-server
  • Modbus MCP Server


    Standardizes Modbus data for seamless AI integration via the Model Context Protocol.

    Modbus MCP Server provides an MCP-compliant interface that standardizes and contextualizes Modbus device data for use with AI agents and industrial IoT systems. It supports flexible Modbus connections over TCP, UDP, or serial interfaces and offers a range of Modbus tools for reading and writing registers and coils. With customizable prompts and structured tool definitions, it enables natural language-driven interactions and analysis of Modbus data within AI workflows. The solution is designed to ensure interoperability and easy configuration within MCP-compatible environments.

    • 18
    • MCP
    • kukapay/modbus-mcp
  • MCP Server for Odoo


    Connect AI assistants to Odoo ERP systems using the Model Context Protocol.

    MCP Server for Odoo enables AI assistants such as Claude to interact seamlessly with Odoo ERP systems via the Model Context Protocol (MCP). It provides endpoints for searching, creating, updating, and deleting Odoo records using natural language while respecting access controls and security. The server supports integration with any Odoo instance, includes smart features like pagination and LLM-optimized output, and offers both demo and production-ready modes.

    • 101
    • MCP
    • ivnvxd/mcp-server-odoo