TeslaMate MCP Server

Query your TeslaMate data using the Model Context Protocol

106 Stars · 14 Forks · 106 Watchers · 1 Issue
TeslaMate MCP Server implements the Model Context Protocol to enable AI assistants and clients to securely access and query Tesla vehicle data, statistics, and analytics from a TeslaMate PostgreSQL database. The server exposes a suite of tools for retrieving vehicle status, driving history, charging sessions, battery health, and more using standardized MCP endpoints. It supports local and Docker deployments, includes bearer token authentication, and is intended for integration with MCP-compatible AI systems like Claude Desktop.

Key Features

Model Context Protocol (MCP) compliant server
Accesses TeslaMate PostgreSQL databases
Provides 20 query tools for Tesla data (18 pre-defined queries plus schema inspection and safe custom SQL)
Exposes endpoints for vehicle status, charging history, and analytics
Supports bearer token authentication for secure access
Compatible with AI assistants for natural language queries
Local and Docker deployment options
Environment variable-based configuration
Quality and security assessment badges integrated
Location and efficiency analytics

Use Cases

Querying Tesla vehicle information via AI assistants
Retrieving and analyzing driving and charging statistics
Monitoring battery health and efficiency metrics
Integrating Tesla data with personal dashboards
Automating vehicle status reporting
Enabling secure remote access to TeslaMate data
Building custom analytic tools using MCP endpoints
Natural language queries about vehicle history
Combining Tesla data into smart home or automation systems
Providing data for fleet management or energy analysis

README

TeslaMate MCP Server

A Model Context Protocol (MCP) server that provides access to your TeslaMate database, allowing AI assistants to query Tesla vehicle data and analytics.

Overview

This MCP server connects to your TeslaMate PostgreSQL database and exposes various tools to retrieve Tesla vehicle information, driving statistics, charging data, battery health, efficiency metrics, and location analytics. It's designed to work with MCP-compatible AI assistants like Claude Desktop, enabling natural language queries about your Tesla data.

Prerequisites

  • TeslaMate running with a PostgreSQL database
  • Python 3.11 or higher
  • Access to your TeslaMate database

Installation

Option 1: Local Installation

  1. Clone this repository:

    bash
    git clone https://github.com/yourusername/teslamate-mcp.git
    cd teslamate-mcp
    
  2. Install dependencies using uv (recommended):

    bash
    uv sync
    

    Or using pip:

    bash
    pip install -r requirements.txt
    
  3. Create a .env file in the project root:

    env
    DATABASE_URL=postgresql://username:password@hostname:port/teslamate
    

Option 2: Docker Deployment (Remote Access)

For remote deployment, use Docker. Quick start:

bash
# Clone and navigate to the repository
git clone https://github.com/yourusername/teslamate-mcp.git
cd teslamate-mcp

# Run the deployment script
./deploy.sh deploy

# Or manually:
cp env.example .env
# Edit .env with your database credentials
docker-compose up -d

The remote server will be available at:

  • Streamable HTTP: http://localhost:8888/mcp

Configuring Authentication (Optional)

To secure your remote MCP server with bearer token authentication:

  1. Set a bearer token in your .env file:

    env
    AUTH_TOKEN=your-secret-bearer-token-here
    

    Generate a secure token:

    bash
    # Use the provided token generator
    python3 generate_token.py
    
    # Or generate manually with openssl
    openssl rand -base64 32
    
    # Or use any other method to create a secure random string
    
  2. When connecting from MCP clients, include the Authorization header:

    json
    {
      "mcpServers": {
        "teslamate-remote": {
          "url": "http://your-server:8888/mcp",
          "transport": "streamable_http",
          "headers": {
            "Authorization": "Bearer your-secret-bearer-token-here"
          }
        }
      }
    }
    
  3. Or use curl for testing:

    bash
    curl -H "Authorization: Bearer your-secret-bearer-token-here" \
         http://localhost:8888/mcp
    
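The repository ships a generate_token.py helper; its exact contents are not shown here, but a minimal equivalent using Python's standard-library secrets module (the function name is an assumption, not the project's actual code) would be:

```python
import secrets

def generate_token(n_bytes: int = 32) -> str:
    """Return a URL-safe random string suitable for use as a bearer token."""
    return secrets.token_urlsafe(n_bytes)
```

With 32 bytes of entropy, `token_urlsafe` produces a string of roughly 43 characters, comfortably above the 32-character minimum recommended below.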

Security Considerations

  • Use HTTPS in production: Bearer tokens are sent in plain text. Always use HTTPS/TLS in production environments.
  • Strong tokens: Use long, random tokens (at least 32 characters).
  • Environment variables: Never commit tokens to version control. Use environment variables or secrets management.
  • Network security: Consider using a VPN or restricting access by IP address for additional security.
  • Token rotation: Regularly rotate your bearer tokens.

Available Tools

The MCP server provides 20 tools for querying your TeslaMate data:

Pre-defined Query Tools

  1. get_basic_car_information - Basic vehicle details (VIN, model, name, color, etc.)
  2. get_current_car_status - Current state, location, battery level, and temperature
  3. get_software_update_history - Timeline of software updates
  4. get_battery_health_summary - Battery degradation and health metrics
  5. get_battery_degradation_over_time - Historical battery capacity trends
  6. get_daily_battery_usage_patterns - Daily battery consumption patterns
  7. get_tire_pressure_weekly_trends - Tire pressure history and trends
  8. get_monthly_driving_summary - Monthly distance, efficiency, and driving time
  9. get_daily_driving_patterns - Daily driving habits and patterns
  10. get_longest_drives_by_distance - Top drives by distance with details
  11. get_total_distance_and_efficiency - Overall driving statistics
  12. get_drive_summary_per_day - Daily drive summaries
  13. get_efficiency_by_month_and_temperature - Efficiency analysis by temperature
  14. get_average_efficiency_by_temperature - Temperature impact on efficiency
  15. get_unusual_power_consumption - Anomalous power usage detection
  16. get_charging_by_location - Charging statistics by location
  17. get_all_charging_sessions_summary - Complete charging history summary
  18. get_most_visited_locations - Frequently visited places

Custom Query Tools

  1. get_database_schema - Returns complete database schema (tables, columns, data types)
  2. run_sql - Execute custom SELECT queries with safety validation
    • Only SELECT statements allowed
    • Prevents DROP, CREATE, INSERT, UPDATE, DELETE, ALTER, etc.
    • Blocks multiple statement execution
    • Safely handles strings and comments
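The safety checks listed above could be sketched as follows. This is a simplified illustration, not the server's actual implementation; the function and constant names are assumptions:

```python
import re

# Keywords that indicate a write or DDL operation
FORBIDDEN = re.compile(
    r"\b(INSERT|UPDATE|DELETE|DROP|CREATE|ALTER|TRUNCATE|GRANT|REVOKE)\b",
    re.IGNORECASE,
)

def strip_literals(sql: str) -> str:
    """Blank out string literals and comments so keywords inside them are ignored."""
    sql = re.sub(r"'(?:[^']|'')*'", "''", sql)        # single-quoted strings
    sql = re.sub(r"--[^\n]*", "", sql)                # line comments
    sql = re.sub(r"/\*.*?\*/", "", sql, flags=re.S)   # block comments
    return sql

def is_safe_select(sql: str) -> bool:
    """Allow only a single SELECT statement containing no write keywords."""
    cleaned = strip_literals(sql).strip().rstrip(";")
    if ";" in cleaned:  # a remaining semicolon means multiple statements
        return False
    if not cleaned.upper().startswith("SELECT"):
        return False
    return not FORBIDDEN.search(cleaned)
```

Note how stripping literals first means a query like `SELECT '; DROP TABLE x' AS note` is still accepted, while a genuine second statement is rejected.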

Configuration

Environment Variables

  • DATABASE_URL: PostgreSQL connection string for your TeslaMate database
  • AUTH_TOKEN (optional): bearer token required to authenticate clients against the remote server
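A connection string of the form `postgresql://username:password@hostname:port/teslamate` can be split into its components with the standard library, which is useful for debugging configuration (a sketch; the helper name is an assumption):

```python
from urllib.parse import urlsplit

def parse_database_url(url: str) -> dict:
    """Split a postgresql:// connection string into its components."""
    parts = urlsplit(url)
    return {
        "user": parts.username,
        "password": parts.password,
        "host": parts.hostname,
        "port": parts.port,                 # parsed as an int
        "database": parts.path.lstrip("/"),
    }
```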

MCP Client Configuration

To use this server with Claude Desktop, add the following to your MCP configuration file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json Windows: %APPDATA%\Claude\claude_desktop_config.json

Local Configuration (stdio transport)

json
{
  "mcpServers": {
    "teslamate": {
      "command": "uv",
      "args": ["run", "python", "/path/to/teslamate-mcp/main.py"],
      "env": {
        "DATABASE_URL": "postgresql://username:password@hostname:port/teslamate"
      }
    }
  }
}

Remote Configuration (streamable HTTP transport)

For connecting to a remote server:

json
{
  "mcpServers": {
    "TeslaMate": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://your-private-server:8888/mcp",
        "--allow-http"
      ]
    }
  }
}

With authentication enabled:

json
{
  "mcpServers": {
    "TeslaMate": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://your-private-server:8888/mcp",
        "--allow-http",
        "--header",
        "Authorization:${AUTH_HEADER}"
      ],
      "env": {
        "AUTH_HEADER": "Bearer <secret bearer token>"
      }
    }
  }
}

Usage

Running the Server (STDIO)

bash
uv run python main.py

Example Queries

Once configured with an MCP client, you can ask natural language questions organized by category:

Basic Vehicle Information

  • "What's my Tesla's basic information?"
  • "Show me my current car status"
  • "What software updates has my Tesla received?"

Battery and Health

  • "How is my battery health?"
  • "Show me battery degradation over time"
  • "What are my daily battery usage patterns?"
  • "How are my tire pressures trending?"

Driving Analytics

  • "Show me my monthly driving summary"
  • "What are my daily driving patterns?"
  • "What are my longest drives by distance?"
  • "What's my total distance driven and efficiency?"

Efficiency Analysis

  • "How does temperature affect my efficiency?"
  • "Show me efficiency trends by month and temperature"
  • "Are there any unusual power consumption patterns?"

Charging and Location Data

  • "Where do I charge most frequently?"
  • "Show me all my charging sessions summary"
  • "What are my most visited locations?"

Custom SQL Queries

  • "Show me the database schema"
  • "Run a SQL query to find drives longer than 100km"
  • "Query the average charging power by location"
  • "Find all charging sessions at superchargers"

Note: The run_sql tool only allows SELECT queries. All data modification operations (INSERT, UPDATE, DELETE, DROP, etc.) are strictly forbidden for safety.

Adding New Queries

  1. Create a new SQL file in the queries/ directory
  2. Add a corresponding tool function in main.py
  3. Follow the existing pattern for error handling and database connections
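Step 1's pattern might be captured in a small helper that reads a named query from the queries/ directory (illustrative only; the repository's actual loading code may differ):

```python
from pathlib import Path

def load_query(queries_dir: Path, name: str) -> str:
    """Read a named .sql file from the queries/ directory."""
    return (queries_dir / f"{name}.sql").read_text(encoding="utf-8")
```

A tool function in main.py would then load its SQL with `load_query(...)`, run it against the database connection, and return the rows, mirroring the error handling used by the existing tools.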

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

For bugs and feature requests, please open an issue on GitHub.


Repository Owner

cobanov (User)

Repository Details

Language: Python
Default Branch: main
Size: 10,884 KB
Contributors: 5
MCP Verified: Nov 12, 2025

Programming Languages

Python 81.82% · Shell 14.97% · Dockerfile 3.21%

Topics

mcp mcp-server tesla tesla-api teslamate


Related MCPs

Discover similar Model Context Protocol servers

  • mcp-graphql
    Enables LLMs to interact dynamically with GraphQL APIs via Model Context Protocol.

    mcp-graphql provides a Model Context Protocol (MCP) server that allows large language models to discover and interact with GraphQL APIs. The implementation facilitates schema introspection, exposes the GraphQL schema as a resource, and enables secure query and mutation execution based on configuration. It supports configuration through environment variables, automated or manual installation options, and offers flexibility in using local or remote schema files. By default, mutation operations are disabled for security, but can be enabled if required.

    319 · MCP · blurrah/mcp-graphql

  • Teamwork MCP Server
    Seamless Teamwork.com integration for Large Language Models via the Model Context Protocol.

    Teamwork MCP Server is an implementation of the Model Context Protocol (MCP) that enables Large Language Models to interact securely and programmatically with Teamwork.com. It offers standardized interfaces, including HTTP and STDIO, allowing AI agents to perform various project management operations. The server supports multiple authentication methods, an extensible toolset architecture, and is designed for production deployments. It provides read-only capability for safe integrations and robust observability features.

    11 · MCP · Teamwork/mcp

  • Yuque-MCP-Server
    Seamless integration of Yuque knowledge base with Model-Context-Protocol for AI model context management.

    Yuque-MCP-Server provides an MCP-compatible server for interacting with the Yuque knowledge base platform. It enables AI models to retrieve, manage, and analyze Yuque documents and user information through a standardized Model-Context-Protocol interface. The server supports operations such as document creation, reading, updating, deletion, advanced search, and team statistics retrieval, making it ideal for AI-powered workflows. Inspired by Figma-Context-MCP, it facilitates contextual awareness and dynamic knowledge management for AI applications.

    31 · MCP · HenryHaoson/Yuque-MCP-Server

  • MCP-wolfram-alpha
    An MCP server for querying the Wolfram Alpha API.

    MCP-wolfram-alpha provides an implementation of the Model Context Protocol, enabling integration with the Wolfram Alpha API. It exposes prompts and tools to facilitate AI systems in answering natural language queries by leveraging Wolfram Alpha's computational knowledge engine. The server requires an API key and offers configuration examples for seamless setup and development.

    64 · MCP · SecretiveShell/MCP-wolfram-alpha

  • Kanboard MCP Server
    MCP server for seamless AI integration with Kanboard project management.

    Kanboard MCP Server is a Go-based server implementing the Model Context Protocol (MCP) for integrating AI assistants with the Kanboard project management system. It enables users to manage projects, tasks, users, and workflows in Kanboard directly via natural language commands through compatible AI tools. With built-in support for secure authentication and high performance, it facilitates streamlined project operations between Kanboard and AI-powered clients like Cursor or Claude Desktop. The server is configurable and designed for compatibility with MCP standards.

    15 · MCP · bivex/kanboard-mcp

  • GitHub MCP Server
    Connect AI tools directly to GitHub for repository, issue, and workflow management via natural language.

    GitHub MCP Server enables AI tools such as agents, assistants, and chatbots to interact natively with the GitHub platform. It allows these tools to access repositories, analyze code, manage issues and pull requests, and automate workflows using the Model Context Protocol (MCP). The server supports integration with multiple hosts, including VS Code and other popular IDEs, and can operate both remotely and locally. Built for developers seeking to enhance AI-powered development workflows through seamless GitHub context access.

    24,418 · MCP · github/github-mcp-server