MCP-wolfram-alpha

An MCP server for querying the Wolfram Alpha API.

Stars: 64 · Forks: 17 · Watchers: 64 · Issues: 1

MCP-wolfram-alpha provides an implementation of the Model Context Protocol, enabling integration with the Wolfram Alpha API. It exposes prompts and tools that help AI systems answer natural language queries by leveraging Wolfram Alpha's computational knowledge engine. The server requires an API key and ships with configuration examples for straightforward setup and development.

Key Features

Implements Model Context Protocol server
Integration with Wolfram Alpha API
Custom prompts for query formulation
Tool-based approach for querying knowledge engine
API key configuration via environment variable
Detailed setup instructions for MCP environments
JSON configuration examples
Support for both Windows and UNIX-like systems
CLI-based debugging and inspection support
Responsive to natural language input

Use Cases

Answering complex mathematical queries using Wolfram Alpha
Automated data retrieval for AI chatbots
Embedding computational knowledge into virtual assistants
Educational tools seeking real-time answers from Wolfram Alpha
Enhancing research automation by programmatically accessing knowledge
Configuring custom AI pipelines with external data sources
Generating responses to user questions via prompt-based workflows
Facilitating context-aware model interactions
Rapid prototyping of AI systems with computational backends
Integrating advanced calculation functions into conversational interfaces

README

MCP-wolfram-alpha

An MCP server to connect to the Wolfram Alpha API.

Components

Prompts

This is analogous to the !wa bang in DuckDuckGo search.

python
def wa(query: str) -> str:
    return f"Use wolfram alpha to answer the following question: {query}"
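
For illustration, a prompt like this could be registered through the MCP Python SDK's FastMCP decorators. The sketch below is an assumption about the wiring, not this repository's actual code: the server name, decorator usage, and run call are illustrative only.

python
# Hypothetical sketch: exposing the prompt via the MCP Python SDK's FastMCP helper.
# Server name and decorator style are assumptions, not this repo's implementation.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("MCP-wolfram-alpha")

@mcp.prompt()
def wa(query: str) -> str:
    """Ask the model to route the question through Wolfram Alpha."""
    return f"Use wolfram alpha to answer the following question: {query}"

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default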

Tools

Query the Wolfram Alpha API.

python
def query_wolfram_alpha(query: str) -> str
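
As a rough sketch of what this tool might do internally, the snippet below calls Wolfram Alpha's v2 Full Results API over HTTP and joins the plaintext pods into one answer string. The endpoint, query parameters, and the requests dependency are assumptions for illustration; the real implementation may differ.

python
# Minimal sketch only: assumes the v2 Full Results endpoint with JSON output
# and the requests library; the actual server code may differ.
import os

import requests

def query_wolfram_alpha(query: str) -> str:
    response = requests.get(
        "https://api.wolframalpha.com/v2/query",
        params={
            "input": query,
            "appid": os.environ["WOLFRAM_API_KEY"],  # app ID from the Wolfram developer portal
            "output": "json",
            "format": "plaintext",
        },
        timeout=30,
    )
    response.raise_for_status()
    pods = response.json()["queryresult"].get("pods", [])
    # Join the plaintext of every subpod into a single answer string.
    return "\n".join(
        subpod["plaintext"]
        for pod in pods
        for subpod in pod.get("subpods", [])
        if subpod.get("plaintext")
    )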

Configuration

You must set the WOLFRAM_API_KEY environment variable. Get an API key from Wolfram Alpha.

This was tested with the Full Results API, but that tier might not be required.

json
{
    "mcpServers": {
        "MCP-wolfram-alpha": {
            "command": "uv",
            "args": [
                "--directory",
                "C:\\Users\\root\\Documents\\MCP-wolfram-alpha",
                "run",
                "MCP-wolfram-alpha"
            ],
            "env": {
                "WOLFRAM_API_KEY": "your-app-id"
            }
        }
    }
}
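
If you are configuring Claude Desktop specifically, this block typically belongs in its claude_desktop_config.json file (under %APPDATA%\Claude on Windows, or ~/Library/Application Support/Claude on macOS); check your MCP client's documentation for the exact location.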

Development

Debugging

Since the official MCP Inspector does not have good environment support, I recommend using wong2's mcp-cli-inspector.

Create a config.json file in the same style as the Claude Desktop configuration.

json
{
    "mcpServers": {
        "MCP-wolfram-alpha": {
            "command": "uv",
            "args": [
                "--directory",
                "/full/path/to/MCP-wolfram-alpha",
                "run",
                "MCP-wolfram-alpha"
            ],
            "env": {
                "WOLFRAM_API_KEY": "your-app-id"
            }
        }
    }
}

Then run:

bash
npx @wong2/mcp-cli -c .\config.json
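
On a UNIX-like shell, the equivalent command simply uses a forward-slash path:

bash
npx @wong2/mcp-cli -c ./config.json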

Repository Details

Language: Python
Default Branch: master
Size: 94 KB
Contributors: 4
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

Python: 98.21%
Dockerfile: 1.79%

Topics

mcp mcp-server wolfram wolfram-alpha

Related MCPs

Discover similar Model Context Protocol servers

  • Perplexity MCP Server

    MCP Server integration for accessing the Perplexity API with context-aware chat completion.

    Perplexity MCP Server provides a Model Context Protocol (MCP) compliant server that interfaces with the Perplexity API, enabling chat completion with citations. Designed for seamless integration with clients such as Claude Desktop, it allows users to send queries and receive context-rich responses from Perplexity. Environment configuration for API key management is supported, and limitations with long-running requests are noted. Future updates are planned to enhance support for client progress reporting.

    • 85
    • MCP
    • tanigami/mcp-server-perplexity
  • ws-mcp

    WebSocket bridge for MCP stdio servers.

    ws-mcp wraps Model Context Protocol (MCP) stdio servers with a WebSocket interface, enabling seamless integration with web-based clients and tools. It allows users to configure and launch multiple MCP servers via a flexible configuration file or command-line arguments. The tool is designed to be compatible with services such as wcgw, fetch, and other MCP-compliant servers, providing standardized access to system operations, HTTP requests, and more. Integration with tools like Kibitz enables broader applications in model interaction workflows.

    • 19
    • MCP
    • nick1udwig/ws-mcp
  • Google Workspace MCP Server

    Full natural language control of Google Workspace through the Model Context Protocol.

    Google Workspace MCP Server enables comprehensive natural language interaction with Google services such as Calendar, Drive, Gmail, Docs, Sheets, Slides, Forms, Tasks, and Chat via any MCP-compatible client or AI assistant. It supports both single-user and secure multi-user OAuth 2.1 authentication, providing a production-ready backend for custom apps. Built on FastMCP, it delivers high performance and advanced context handling, offering deep integration with the entire Google Workspace suite.

    • 890
    • MCP
    • taylorwilsdon/google_workspace_mcp
  • OpenAI MCP Server

    Bridge between Claude and OpenAI models using the MCP protocol.

    OpenAI MCP Server enables direct querying of OpenAI language models from Claude via the Model Context Protocol (MCP). It provides a configurable Python server that exposes OpenAI APIs as MCP endpoints. The server is designed for seamless integration, requiring simple configuration updates and environment variable setup. Automated testing is supported to verify connectivity and response from the OpenAI API.

    • 77
    • MCP
    • pierrebrunelle/mcp-server-openai
  • mcp-server-atlassian-confluence

    Seamlessly connect AI assistants to your Atlassian Confluence knowledge base.

    Enables integration of AI assistants like Claude and Cursor AI directly with Atlassian Confluence, allowing users to interact with their documentation and knowledge base using natural language queries. Supports instant answers, search across all spaces, and access to specific Confluence content and discussions. Follows the Model Context Protocol (MCP) for standardized model context management and easy configuration with various AI assistants via STDIO transport or config files.

    • 39
    • MCP
    • aashari/mcp-server-atlassian-confluence
  • TeslaMate MCP Server

    Query your TeslaMate data using the Model Context Protocol

    TeslaMate MCP Server implements the Model Context Protocol to enable AI assistants and clients to securely access and query Tesla vehicle data, statistics, and analytics from a TeslaMate PostgreSQL database. The server exposes a suite of tools for retrieving vehicle status, driving history, charging sessions, battery health, and more using standardized MCP endpoints. It supports local and Docker deployments, includes bearer token authentication, and is intended for integration with MCP-compatible AI systems like Claude Desktop.

    • 106
    • MCP
    • cobanov/teslamate-mcp