Pydantic Logfire MCP Server
This repository contains a Model Context Protocol (MCP) server with tools that can access the OpenTelemetry traces and metrics you've sent to Pydantic Logfire.
This MCP server enables LLMs to retrieve your application's telemetry data, analyze distributed traces, and run arbitrary SQL queries against your data through the Pydantic Logfire APIs.
Available Tools
- `find_exceptions_in_file` - Get details about the 10 most recent exceptions in the given file.
  - Arguments:
    - `filepath` (string) - The path to the file to find exceptions in.
    - `age` (integer) - Number of minutes to look back, e.g. 30 for the last 30 minutes. Maximum allowed value is 7 days.
- `arbitrary_query` - Run an arbitrary SQL query on the Pydantic Logfire database.
  - Arguments:
    - `query` (string) - The query to run, as a SQL string.
    - `age` (integer) - Number of minutes to look back, e.g. 30 for the last 30 minutes. Maximum allowed value is 7 days.
- `logfire_link` - Creates a link to help the user view the trace in the Logfire UI.
  - Arguments:
    - `trace_id` (string) - The trace ID to link to.
- `schema_reference` - The database schema for the Logfire DataFusion database.
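If you want to exercise these tools outside of an MCP client such as Cursor or Claude Desktop, you can connect to the server programmatically. The snippet below is a minimal sketch, assuming the official MCP Python SDK (the `mcp` package, not part of this project) and a valid read token; it launches the server over stdio and lists the tools above.

```python
# Minimal sketch: connect to the Logfire MCP server over stdio and list its tools.
# Assumes the official MCP Python SDK ("mcp" package) is installed and that
# YOUR_READ_TOKEN is replaced with a real Pydantic Logfire read token.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uvx",
    args=["logfire-mcp@latest"],
    env={"LOGFIRE_READ_TOKEN": "YOUR_READ_TOKEN"},
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


asyncio.run(main())
```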
Setup
Install uv
First, make sure uv is installed, since it is used to run the MCP server.
For installation instructions, see the uv installation docs.
If you already have an older version of uv installed, you might need to update it with uv self update.
Obtain a Pydantic Logfire read token
In order to make requests to the Pydantic Logfire APIs, the Pydantic Logfire MCP server requires a "read token".
You can create one under the "Read Tokens" section of your project settings in Pydantic Logfire: https://logfire.pydantic.dev/-/redirect/latest-project/settings/read-tokens
[!IMPORTANT] Pydantic Logfire read tokens are project-specific, so you need to create one for the specific project you want to expose to the Pydantic Logfire MCP server.
Manually run the server
Once you have uv installed and have a Pydantic Logfire read token, you can manually run the MCP server using uvx (which is provided by uv).
You can specify your read token using the LOGFIRE_READ_TOKEN environment variable:
```bash
LOGFIRE_READ_TOKEN=YOUR_READ_TOKEN uvx logfire-mcp@latest
```
You can also set LOGFIRE_READ_TOKEN in a .env file:
```
LOGFIRE_READ_TOKEN=pylf_v1_us_...
```
NOTE: for this to work, the MCP server needs to be run from the directory containing the .env file.
Or you can pass the token using the --read-token flag:
```bash
uvx logfire-mcp@latest --read-token=YOUR_READ_TOKEN
```
[!NOTE] If you are using Cursor, Claude Desktop, Cline, or other MCP clients that manage your MCP servers for you, you do NOT need to manually run the server yourself. The next section will show you how to configure these clients to make use of the Pydantic Logfire MCP server.
Base URL
If you are running Logfire in a self-hosted environment, you need to specify the base URL.
This can be done using the LOGFIRE_BASE_URL environment variable:
```bash
LOGFIRE_BASE_URL=https://logfire.my-company.com uvx logfire-mcp@latest --read-token=YOUR_READ_TOKEN
```
You can also use the --base-url argument:
```bash
uvx logfire-mcp@latest --base-url=https://logfire.my-company.com --read-token=YOUR_READ_TOKEN
```
Configuration with well-known MCP clients
Configure for Cursor
Create a .cursor/mcp.json file in your project root:
```json
{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": ["logfire-mcp@latest", "--read-token=YOUR-TOKEN"]
    }
  }
}
```
Cursor does not accept the `env` field, so you need to use the `--read-token` flag instead.
Configure for Claude Code
Run the following command:
```bash
claude mcp add logfire -e LOGFIRE_READ_TOKEN=YOUR_TOKEN -- uvx logfire-mcp@latest
```
Configure for Claude Desktop
Add to your Claude settings:
```json
{
  "command": ["uvx"],
  "args": ["logfire-mcp@latest"],
  "type": "stdio",
  "env": {
    "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
  }
}
```
Configure for Cline
Add to your Cline settings in cline_mcp_settings.json:
```json
{
  "mcpServers": {
    "logfire": {
      "command": "uvx",
      "args": ["logfire-mcp@latest"],
      "env": {
        "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
```
Configure for VS Code
Make sure you have enabled MCP support in VS Code.
Create a .vscode/mcp.json file in your project's root directory:
```json
{
  "servers": {
    "logfire": {
      "type": "stdio",
      "command": "uvx", // or the absolute /path/to/uvx
      "args": ["logfire-mcp@latest"],
      "env": {
        "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
      }
    }
  }
}
```
Configure for Zed
Create a .zed/settings.json file in your project's root directory:
```json
{
  "context_servers": {
    "logfire": {
      "source": "custom",
      "command": "uvx",
      "args": ["logfire-mcp@latest"],
      "env": {
        "LOGFIRE_READ_TOKEN": "YOUR_TOKEN"
      },
      "enabled": true
    }
  }
}
```
Example Interactions
- Get details about exceptions from traces in a specific file:
```json
{
  "name": "find_exceptions_in_file",
  "arguments": {
    "filepath": "app/api.py",
    "age": 1440
  }
}
```
Response:
```json
[
  {
    "created_at": "2024-03-20T10:30:00Z",
    "message": "Failed to process request",
    "exception_type": "ValueError",
    "exception_message": "Invalid input format",
    "function_name": "process_request",
    "line_number": "42",
    "attributes": {
      "service.name": "api-service",
      "code.filepath": "app/api.py"
    },
    "trace_id": "1234567890abcdef"
  }
]
```
- Run a custom query on traces:
```json
{
  "name": "arbitrary_query",
  "arguments": {
    "query": "SELECT trace_id, message, created_at, attributes->>'service.name' as service FROM records WHERE severity_text = 'ERROR' ORDER BY created_at DESC LIMIT 10",
    "age": 1440
  }
}
```
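If you are building your own MCP client rather than using one of the clients configured above, the same calls can be made with the MCP Python SDK. The following is a rough sketch under that assumption; it reuses the tool names and arguments from the examples in this section.

```python
# Sketch: issue the two example tool calls above from a custom MCP client.
# Assumes the official MCP Python SDK ("mcp" package); tool names and arguments
# are taken from the examples in this README.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

SQL = (
    "SELECT trace_id, message, created_at, attributes->>'service.name' AS service "
    "FROM records WHERE severity_text = 'ERROR' ORDER BY created_at DESC LIMIT 10"
)

server_params = StdioServerParameters(
    command="uvx",
    args=["logfire-mcp@latest"],
    env={"LOGFIRE_READ_TOKEN": "YOUR_READ_TOKEN"},  # replace with a real read token
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 10 most recent exceptions recorded for app/api.py in the last 24 hours.
            exceptions = await session.call_tool(
                "find_exceptions_in_file",
                arguments={"filepath": "app/api.py", "age": 1440},
            )

            # Arbitrary SQL over the records table, limited to the last 24 hours.
            errors = await session.call_tool(
                "arbitrary_query",
                arguments={"query": SQL, "age": 1440},
            )

            # Results come back as MCP content items; text content carries the rows.
            for result in (exceptions, errors):
                for item in result.content:
                    print(getattr(item, "text", item))


asyncio.run(main())
```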
Examples of Questions for Claude
- "What exceptions occurred in traces from the last hour across all services?"
- "Show me the recent errors in the file 'app/api.py' with their trace context"
- "How many errors were there in the last 24 hours per service?"
- "What are the most common exception types in my traces, grouped by service name?"
- "Get me the OpenTelemetry schema for traces and metrics"
- "Find all errors from yesterday and show their trace contexts"
Getting Started
- First, obtain a Pydantic Logfire read token from: https://logfire.pydantic.dev/-/redirect/latest-project/settings/read-tokens
- Run the MCP server:
  ```bash
  uvx logfire-mcp@latest --read-token=YOUR_TOKEN
  ```
- Configure your preferred client (Cursor, Claude Desktop, or Cline) using the configuration examples above.
- Start using the MCP server to analyze your OpenTelemetry traces and metrics!
Contributing
We welcome contributions to help improve the Pydantic Logfire MCP server. Whether you want to add new trace analysis tools, enhance metrics querying functionality, or improve documentation, your input is valuable.
For examples of other MCP servers and implementation patterns, see the Model Context Protocol servers repository.
License
Pydantic Logfire MCP is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License.