# TeslaMate MCP Server

Query your TeslaMate data using the Model Context Protocol.
A Model Context Protocol (MCP) server that provides access to your TeslaMate database, allowing AI assistants to query Tesla vehicle data and analytics.
## Overview
This MCP server connects to your TeslaMate PostgreSQL database and exposes various tools to retrieve Tesla vehicle information, driving statistics, charging data, battery health, efficiency metrics, and location analytics. It's designed to work with MCP-compatible AI assistants like Claude Desktop, enabling natural language queries about your Tesla data.
## Prerequisites
- TeslaMate running with a PostgreSQL database
- Python 3.11 or higher
- Access to your TeslaMate database
## Installation
### Option 1: Local Installation

1. Clone this repository:

   ```bash
   git clone https://github.com/yourusername/teslamate-mcp.git
   cd teslamate-mcp
   ```

2. Install dependencies using uv (recommended):

   ```bash
   uv sync
   ```

   Or using pip:

   ```bash
   pip install -r requirements.txt
   ```

3. Create a `.env` file in the project root:

   ```env
   DATABASE_URL=postgresql://username:password@hostname:port/teslamate
   ```
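The connection string follows the standard PostgreSQL URI format. As a quick sanity check before starting the server, the URL can be parsed with the standard library alone (an illustrative snippet, not part of the project):

```python
from urllib.parse import urlparse

def check_database_url(url: str) -> dict:
    """Parse a postgresql:// connection URL and return its components,
    raising ValueError if required parts are missing."""
    parts = urlparse(url)
    if parts.scheme not in ("postgresql", "postgres"):
        raise ValueError(f"unexpected scheme: {parts.scheme!r}")
    database = parts.path.lstrip("/")
    if not (parts.hostname and database):
        raise ValueError("host and database name are required")
    return {
        "user": parts.username,
        "host": parts.hostname,
        "port": parts.port or 5432,  # PostgreSQL's default port
        "database": database,
    }
```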
### Option 2: Docker Deployment (Remote Access)

For remote deployment using Docker, quick start:

```bash
# Clone and navigate to the repository
git clone https://github.com/yourusername/teslamate-mcp.git
cd teslamate-mcp

# Run the deployment script
./deploy.sh deploy

# Or manually:
cp env.example .env
# Edit .env with your database credentials
docker-compose up -d
```
The remote server will be available at:

- Streamable HTTP: `http://localhost:8888/mcp`
### Configuring Authentication (Optional)
To secure your remote MCP server with bearer token authentication:
1. Set a bearer token in your `.env` file:

   ```env
   AUTH_TOKEN=your-secret-bearer-token-here
   ```

   Generate a secure token:

   ```bash
   # Use the provided token generator
   python3 generate_token.py

   # Or generate manually with openssl
   openssl rand -base64 32

   # Or use any other method to create a secure random string
   ```

2. When connecting from MCP clients, include the Authorization header:

   ```json
   {
     "mcpServers": {
       "teslamate-remote": {
         "url": "http://your-server:8888/mcp",
         "transport": "streamable_http",
         "headers": {
           "Authorization": "Bearer your-secret-bearer-token-here"
         }
       }
     }
   }
   ```

3. Or use curl for testing:

   ```bash
   curl -H "Authorization: Bearer your-secret-bearer-token-here" \
     http://localhost:8888/mcp
   ```
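The same authenticated call can be made from Python; a minimal sketch using only the standard library (URL and token are placeholders):

```python
from urllib.request import Request, urlopen

def mcp_request(url: str, token: str) -> Request:
    """Build a request that carries the bearer token; pass the
    result to urlopen() to reach the running server."""
    return Request(url, headers={"Authorization": f"Bearer {token}"})

# Example (requires the Docker deployment to be running):
# with urlopen(mcp_request("http://localhost:8888/mcp", "your-token")) as resp:
#     print(resp.status)
```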
### Security Considerations

- **Use HTTPS in production**: Bearer tokens are sent in plain text. Always use HTTPS/TLS in production environments.
- **Strong tokens**: Use long, random tokens (at least 32 characters).
- **Environment variables**: Never commit tokens to version control. Use environment variables or secrets management.
- **Network security**: Consider using a VPN or restricting access by IP address for additional security.
- **Token rotation**: Regularly rotate your bearer tokens.
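If you prefer not to use the bundled script, Python's standard `secrets` module is enough. A sketch of what a generator like `generate_token.py` might do (the bundled script's exact output format may differ):

```python
import secrets

def generate_token(n_bytes: int = 32) -> str:
    """Return a URL-safe random token with n_bytes of entropy,
    comparable to `openssl rand -base64 32`."""
    return secrets.token_urlsafe(n_bytes)

print(generate_token())
```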
## Available Tools

The MCP server provides 20 tools for querying your TeslaMate data.

### Pre-defined Query Tools

- `get_basic_car_information` - Basic vehicle details (VIN, model, name, color, etc.)
- `get_current_car_status` - Current state, location, battery level, and temperature
- `get_software_update_history` - Timeline of software updates
- `get_battery_health_summary` - Battery degradation and health metrics
- `get_battery_degradation_over_time` - Historical battery capacity trends
- `get_daily_battery_usage_patterns` - Daily battery consumption patterns
- `get_tire_pressure_weekly_trends` - Tire pressure history and trends
- `get_monthly_driving_summary` - Monthly distance, efficiency, and driving time
- `get_daily_driving_patterns` - Daily driving habits and patterns
- `get_longest_drives_by_distance` - Top drives by distance with details
- `get_total_distance_and_efficiency` - Overall driving statistics
- `get_drive_summary_per_day` - Daily drive summaries
- `get_efficiency_by_month_and_temperature` - Efficiency analysis by temperature
- `get_average_efficiency_by_temperature` - Temperature impact on efficiency
- `get_unusual_power_consumption` - Anomalous power usage detection
- `get_charging_by_location` - Charging statistics by location
- `get_all_charging_sessions_summary` - Complete charging history summary
- `get_most_visited_locations` - Frequently visited places
### Custom Query Tools

- `get_database_schema` - Returns the complete database schema (tables, columns, data types)
- `run_sql` - Execute custom SELECT queries with safety validation:
  - Only SELECT statements allowed
  - Prevents DROP, CREATE, INSERT, UPDATE, DELETE, ALTER, etc.
  - Blocks multiple-statement execution
  - Safely handles strings and comments
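To illustrate the rules above, here is a simplified sketch of this kind of validation (an approximation for illustration, not the server's actual implementation):

```python
import re

# Keywords that indicate a data-modifying statement
FORBIDDEN = re.compile(
    r"\b(DROP|CREATE|INSERT|UPDATE|DELETE|ALTER|TRUNCATE|GRANT|REVOKE)\b",
    re.IGNORECASE,
)

def strip_literals_and_comments(sql: str) -> str:
    """Blank out string literals and comments so keywords inside them
    are not falsely flagged, and real keywords cannot hide in them."""
    sql = re.sub(r"'(?:[^']|'')*'", "''", sql)       # 'string' literals
    sql = re.sub(r"--[^\n]*", "", sql)               # -- line comments
    sql = re.sub(r"/\*.*?\*/", "", sql, flags=re.S)  # /* block comments */
    return sql

def is_safe_select(sql: str) -> bool:
    cleaned = strip_literals_and_comments(sql).strip()
    if not cleaned.upper().startswith("SELECT"):
        return False  # only SELECT statements allowed
    if FORBIDDEN.search(cleaned):
        return False  # block DDL/DML keywords
    if ";" in cleaned.rstrip().rstrip(";"):
        return False  # block multiple statements
    return True
```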
## Configuration

### Environment Variables

- `DATABASE_URL`: PostgreSQL connection string for your TeslaMate database

### MCP Client Configuration
To use this server with Claude Desktop, add the following to your MCP configuration file:
- **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows**: `%APPDATA%\Claude\claude_desktop_config.json`
#### Local Configuration (stdio transport)

```json
{
  "mcpServers": {
    "teslamate": {
      "command": "uv",
      "args": ["run", "python", "/path/to/teslamate-mcp/main.py"],
      "env": {
        "DATABASE_URL": "postgresql://username:password@hostname:port/teslamate"
      }
    }
  }
}
```
#### Remote Configuration (streamable HTTP transport)

For connecting to a remote server:

```json
{
  "mcpServers": {
    "TeslaMate": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://your-private-server:8888/mcp",
        "--allow-http"
      ]
    }
  }
}
```
With authentication enabled:
```json
{
  "mcpServers": {
    "TeslaMate": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "http://your-private-server:8888/mcp",
        "--allow-http",
        "--header",
        "Authorization:${AUTH_HEADER}"
      ],
      "env": {
        "AUTH_HEADER": "Bearer <secret bearer token>"
      }
    }
  }
}
```
## Usage

### Running the Server (STDIO)

```bash
uv run python main.py
```
### Example Queries

Once configured with an MCP client, you can ask natural-language questions, organized by category:

#### Basic Vehicle Information
- "What's my Tesla's basic information?"
- "Show me my current car status"
- "What software updates has my Tesla received?"
#### Battery and Health
- "How is my battery health?"
- "Show me battery degradation over time"
- "What are my daily battery usage patterns?"
- "How are my tire pressures trending?"
#### Driving Analytics
- "Show me my monthly driving summary"
- "What are my daily driving patterns?"
- "What are my longest drives by distance?"
- "What's my total distance driven and efficiency?"
#### Efficiency Analysis
- "How does temperature affect my efficiency?"
- "Show me efficiency trends by month and temperature"
- "Are there any unusual power consumption patterns?"
#### Charging and Location Data
- "Where do I charge most frequently?"
- "Show me all my charging sessions summary"
- "What are my most visited locations?"
#### Custom SQL Queries
- "Show me the database schema"
- "Run a SQL query to find drives longer than 100km"
- "Query the average charging power by location"
- "Find all charging sessions at superchargers"
**Note:** The `run_sql` tool only allows SELECT queries. All data modification operations (INSERT, UPDATE, DELETE, DROP, etc.) are strictly forbidden for safety.
## Adding New Queries

1. Create a new SQL file in the `queries/` directory
2. Add a corresponding tool function in `main.py`
3. Follow the existing pattern for error handling and database connections
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- TeslaMate - Tesla data logging software
- Model Context Protocol - Protocol for AI-tool integration
For bugs and feature requests, please open an issue on GitHub.