mcp-aoai-web-browsing

Bridge Model Context Protocol with Azure OpenAI and web browsing automation.

30 Stars · 10 Forks · 30 Watchers · 2 Issues
Implements an MCP (Model Context Protocol) server and client to enable secure, controlled AI interactions with web resources using Azure OpenAI and Playwright. Utilizes FastMCP for server-side handling and converts MCP responses to OpenAI function calling format. Integrates a custom MCP-LLM Bridge for seamless communication between MCP and OpenAI-compatible models, supporting advanced web automation tasks. Provides a minimal yet extensible example for combining protocol-driven AI, cloud LLM services, and browser automation.

Key Features

Implements MCP server using FastMCP
Client application with GUI
Integration with Azure OpenAI APIs
Web automation through Playwright
MCP to OpenAI function-calling response bridging
Custom MCP-LLM bridge implementation
Tool definition and description for LLMs
Structured environment configuration via .env
Modern Python project using uv dependency manager
Sample use of function-calling style automation tools

Use Cases

Automating browser tasks with LLM-driven instructions
Integrating Azure OpenAI language models with web automation workflows
Facilitating secure connections between AI models and web interfaces
Demonstrating end-to-end protocol-based AI/web automation pipelines
Converting custom tool definitions into OpenAI-compatible formats
Rapid prototyping of AI-assisted web navigation and testing tasks
Showcasing bridging techniques between MCP servers and LLM APIs
Testing and debugging web automation flows via AI clients
Enabling contextual interaction with live web data for LLMs
Providing reference implementation for MCP-based automation projects

README

MCP Server & Client implementation for using Azure OpenAI

  • A minimal server/client application implementation utilizing the Model Context Protocol (MCP) and Azure OpenAI.

    1. The MCP server is built with FastMCP.
    2. Playwright is an open-source, end-to-end testing framework from Microsoft for modern web applications (see the sketch after this list).
    3. The MCP server's tool responses are converted to the OpenAI function-calling format.
    4. The bridge that performs this conversion is a customized version of the MCP-LLM Bridge implementation.
    5. To ensure a stable connection, the server object is passed directly into the bridge.
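
A minimal sketch of points 1 and 2, assuming the standalone fastmcp package and Playwright's async API; the server name, tool body, and return value are illustrative and not this repository's actual code:

python
from fastmcp import FastMCP                      # assumption: FastMCP package used for the server
from playwright.async_api import async_playwright

mcp = FastMCP("aoai-web-browsing")               # illustrative server name

@mcp.tool()
async def playwright_navigate(url: str, timeout: int = 30000, wait_until: str = "load") -> str:
    """Navigate to a URL and return the page title."""
    async with async_playwright() as p:
        browser = await p.chromium.launch(headless=True)
        page = await browser.new_page()
        await page.goto(url, timeout=timeout, wait_until=wait_until)
        title = await page.title()
        await browser.close()
        return title

if __name__ == "__main__":
    mcp.run()                                    # serves the tool over stdio by default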

Model Context Protocol (MCP)

MCP (Model Context Protocol) is an open protocol that enables secure, controlled interactions between AI applications and local or remote resources.

Related Projects

  • FastMCP: The fast, Pythonic way to build MCP servers.
  • Chat MCP: An MCP client.
  • MCP-LLM Bridge: An MCP implementation that enables communication between MCP servers and OpenAI-compatible LLMs.

MCP Playwright

Configuration

As of the development phase in December 2024, the Python project should be initialized with 'uv'; other dependency management tools, such as 'pip' and 'poetry', were not yet fully supported by the MCP CLI.

  1. Rename .env.template to .env, then fill in the values in .env for Azure OpenAI (a sketch of loading these values follows these steps):

    bash
    AZURE_OPEN_AI_ENDPOINT=
    AZURE_OPEN_AI_API_KEY=
    AZURE_OPEN_AI_DEPLOYMENT_MODEL=
    AZURE_OPEN_AI_API_VERSION=
    
  2. Install uv for Python library management:

    bash
    pip install uv
    uv sync
    
  3. Execute python chatgui.py

    • The sample screen shows the client launching a browser to navigate to the URL.
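
A minimal sketch of loading the .env values from step 1 and creating an Azure OpenAI client, assuming the python-dotenv and openai packages; the request at the end is illustrative and not necessarily how chatgui.py wires things up:

python
import os
from dotenv import load_dotenv
from openai import AzureOpenAI

load_dotenv()  # reads the .env file created in step 1

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPEN_AI_ENDPOINT"],
    api_key=os.environ["AZURE_OPEN_AI_API_KEY"],
    api_version=os.environ["AZURE_OPEN_AI_API_VERSION"],
)

response = client.chat.completions.create(
    model=os.environ["AZURE_OPEN_AI_DEPLOYMENT_MODEL"],  # the Azure deployment name
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)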

w.r.t. 'stdio'

stdio is a transport layer (raw data flow), while JSON-RPC is an application protocol (structured communication). They are distinct layers that are often combined, e.g., "JSON-RPC over stdio", which is how MCP's stdio transport carries its messages.
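
A rough illustration of the distinction, assuming a hypothetical MCP server started as a child process; tools/list is a standard MCP method, but the handshake and framing are simplified here:

python
import json
import subprocess

# stdio is the transport: raw bytes flow over the child process's stdin/stdout.
proc = subprocess.Popen(
    ["python", "server.py"],   # hypothetical command that starts an MCP server
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# JSON-RPC is the application protocol: a structured message carried over that transport.
# (A real MCP session performs an 'initialize' exchange before calling tools/list.)
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()

print(proc.stdout.readline())  # the server's JSON-RPC response, one JSON object per line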

Tool description

python
@self.mcp.tool()
async def playwright_navigate(url: str, timeout=30000, wait_until="load"):
    """Navigate to a URL."""
    # The docstring provides the tool description, which may be used in a
    # mechanism similar to function calling in LLMs.

# Output (truncated)
Tool(name='playwright_navigate', description='Navigate to a URL.', inputSchema={'properties': {'url': {'title': 'Url', 'type': 'string'}, 'timeout': {'default': 30000, 'title': 'timeout', 'type': 'string'}, ...
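
The bridge's conversion can be sketched roughly as follows; mcp_tool_to_openai is a hypothetical helper name, and the exact field handling in this repository's bridge may differ:

python
def mcp_tool_to_openai(tool) -> dict:
    """Map an MCP Tool (name, description, inputSchema) to an OpenAI function-calling tool entry."""
    return {
        "type": "function",
        "function": {
            "name": tool.name,
            "description": tool.description or "",
            # MCP's inputSchema is already JSON Schema, which is what OpenAI expects for parameters.
            "parameters": tool.inputSchema or {"type": "object", "properties": {}},
        },
    }

The converted definitions can then be passed as the tools argument of a chat.completions.create call, with the model's tool calls routed back to the MCP server.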

Tip: uv

uv run: Run a script.
uv venv: Create a new virtual environment ('.venv' by default).
uv add: Add a dependency to the project (pyproject.toml).
uv remove: Remove a dependency from the project.
uv sync: Sync (install) the project's dependencies into the environment.
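
A typical sequence for this project, assuming the repository has been cloned and chatgui.py is the entry point (see step 3 above):

bash
uv venv             # create .venv
uv sync             # install dependencies into it
uv run chatgui.py   # run the client GUI inside the environment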

Tip

  • taskkill command to force-terminate python.exe:

    cmd
    taskkill /IM python.exe /F

  • VS Code Python Debugger: Debugging with launch.json starts the debugger using the configuration from .vscode/launch.json.
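
A minimal .vscode/launch.json along those lines; treat it as a sketch, since the exact "type" value depends on the installed VS Code Python extension (newer versions use debugpy):

json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Python: chatgui",
      "type": "debugpy",
      "request": "launch",
      "program": "${workspaceFolder}/chatgui.py",
      "console": "integratedTerminal",
      "envFile": "${workspaceFolder}/.env"
    }
  ]
}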

Repository Owner

kimtth (User)

Repository Details

Language: Python
Default Branch: main
Size: 439 KB
Contributors: 2
License: MIT License
MCP Verified: Sep 5, 2025

Programming Languages

Python: 96.81%
Dockerfile: 3.19%

Topics

azure-openai mcp mcp-client mcp-server model-context-protocol playwright web-automation

Related MCPs

Discover similar Model Context Protocol servers

  • cloudflare/mcp-server-cloudflare

    Connect Cloudflare services to Model Context Protocol (MCP) clients for AI-powered management.

    Cloudflare MCP Server enables integration between Cloudflare's suite of services and clients using the Model Context Protocol (MCP). It provides multiple specialized servers that allow AI models to access, analyze, and manage configurations, logs, analytics, and other features across Cloudflare's platform. Users can leverage natural language interfaces in compatible MCP clients to read data, gain insights, and perform automated actions on their Cloudflare accounts. This project aims to streamline the orchestration of security, development, monitoring, and infrastructure tasks through standardized MCP connections.

  • awslabs/mcp

    Specialized MCP servers for seamless AWS integration in AI and development environments.

    AWS MCP Servers is a suite of specialized servers implementing the open Model Context Protocol (MCP) to bridge large language model (LLM) applications with AWS services, tools, and data sources. It provides a standardized way for AI assistants, IDEs, and developer tools to access up-to-date AWS documentation, perform cloud operations, and automate workflows with context-aware intelligence. Featuring a broad catalog of domain-specific servers, quick installation for popular platforms, and both local and remote deployment options, it enhances cloud-native development, infrastructure management, and workflow automation for AI-driven tools. The project includes Docker, Lambda, and direct integration instructions for environments such as Amazon Q CLI, Cursor, Windsurf, Kiro, and VS Code.
