Mastra
A TypeScript framework for building scalable AI-powered agents and applications.
Mastra is a framework for building AI-powered applications and agents with a modern TypeScript stack.
It includes everything you need to go from early prototypes to production-ready applications. Mastra integrates with frontend and backend frameworks like React, Next.js, and Node.js, or you can deploy it anywhere as a standalone server. It's the easiest way to build, tune, and scale reliable AI products.
Why Mastra?
Purpose-built for TypeScript and designed around established AI patterns, Mastra gives you everything you need to build great AI applications out of the box.
Some highlights include:
- Model routing - Connect to 40+ providers through one standard interface. Use models from OpenAI, Anthropic, Gemini, and more.
- Agents - Build autonomous agents that use LLMs and tools to solve open-ended tasks. Agents reason about goals, decide which tools to use, and iterate internally until the model emits a final answer or an optional stopping condition is met. See the agent sketch after this list.
- Workflows - When you need explicit control over execution, use Mastra's graph-based workflow engine to orchestrate complex multi-step processes. Mastra workflows use an intuitive syntax for control flow (`.then()`, `.branch()`, `.parallel()`). See the workflow sketch after this list.
- Human-in-the-loop - Suspend an agent or workflow and await user input or approval before resuming. Mastra uses storage to remember execution state, so you can pause indefinitely and resume where you left off.
- Context management - Give your agents the right context at the right time. Provide conversation history, retrieve data from your sources (APIs, databases, files), and add human-like working and semantic memory so your agents behave coherently.
- Integrations - Bundle agents and workflows into existing React, Next.js, or Node.js apps, or ship them as standalone endpoints. When building UIs, integrate with agentic libraries like Vercel's AI SDK UI and CopilotKit to bring your AI assistant to life on the web.
- MCP servers - Author Model Context Protocol servers, exposing agents, tools, and other structured resources via the MCP interface. These can then be accessed by any system or agent that supports the protocol. See the MCP server sketch after this list.
- Production essentials - Shipping reliable agents takes ongoing insight, evaluation, and iteration. With built-in evals and observability, Mastra gives you the tools to observe, measure, and refine continuously.
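To make the agent and model-routing points concrete, here is a minimal sketch of defining an agent and registering it on a Mastra instance. It assumes the `Agent` class from `@mastra/core/agent` together with an AI SDK provider package such as `@ai-sdk/openai`; the agent name, instructions, and prompt are placeholders, so treat it as a shape reference rather than a drop-in example.

```typescript
import { Mastra } from "@mastra/core";
import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai"; // any supported provider can be swapped in here

// An agent pairs a model with instructions (and, optionally, tools and memory).
const supportAgent = new Agent({
  name: "support-agent", // placeholder name
  instructions: "You answer questions about our product concisely.",
  model: openai("gpt-4o-mini"), // change providers without touching agent logic
});

// Register the agent so your server, playground, and APIs can reach it.
export const mastra = new Mastra({
  agents: { supportAgent },
});

// Elsewhere in your app: the agent reasons, calls tools if needed, and responds.
const result = await supportAgent.generate("How do I reset my password?");
console.log(result.text);
```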
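The workflow syntax mentioned above can be sketched roughly as follows, assuming the `createWorkflow` and `createStep` helpers from `@mastra/core/workflows` and Zod schemas; the step ids and field names are invented for illustration.

```typescript
import { createStep, createWorkflow } from "@mastra/core/workflows";
import { z } from "zod";

// Each step declares typed input and output; the output of one step
// has to line up with the input of the next.
const fetchArticle = createStep({
  id: "fetch-article",
  inputSchema: z.object({ url: z.string() }),
  outputSchema: z.object({ text: z.string() }),
  execute: async ({ inputData }) => {
    const res = await fetch(inputData.url);
    return { text: await res.text() };
  },
});

const countWords = createStep({
  id: "count-words",
  inputSchema: z.object({ text: z.string() }),
  outputSchema: z.object({ words: z.number() }),
  execute: async ({ inputData }) => ({
    words: inputData.text.split(/\s+/).length,
  }),
});

// Steps are chained explicitly; .branch() and .parallel() use the same
// fluent style for conditional and concurrent execution.
export const articleWorkflow = createWorkflow({
  id: "article-workflow",
  inputSchema: z.object({ url: z.string() }),
  outputSchema: z.object({ words: z.number() }),
})
  .then(fetchArticle)
  .then(countWords)
  .commit();
```

Registering the workflow on your Mastra instance, alongside your agents, typically makes it runnable from your server code and visible in the local dev playground.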
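Similarly, here is a hedged sketch of an MCP server, assuming the `MCPServer` class from `@mastra/mcp` and the `createTool` helper from `@mastra/core/tools`; the tool id, schemas, and server metadata are invented for illustration, so check the MCP docs for the exact options.

```typescript
import { MCPServer } from "@mastra/mcp";
import { createTool } from "@mastra/core/tools";
import { z } from "zod";

// A plain Mastra tool: typed input, typed output, async execute.
const echoTool = createTool({
  id: "echo",
  description: "Echoes the provided message back to the caller.",
  inputSchema: z.object({ message: z.string() }),
  outputSchema: z.object({ message: z.string() }),
  execute: async ({ context }) => ({ message: context.message }),
});

// Expose the tool over the Model Context Protocol so any MCP-aware
// client (an IDE, another agent, or any other system) can call it.
const server = new MCPServer({
  name: "example-mcp-server",
  version: "1.0.0",
  tools: { echoTool },
});

// Serve over stdio, which suits local MCP clients such as IDE integrations.
await server.startStdio();
```

As the bullet above notes, the same server can expose agents and other structured resources as well as tools.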
Get started
The recommended way to get started with Mastra is by running the command below:
npm create mastra@latest
Follow the Installation guide for step-by-step setup with the CLI or a manual install.
If you're new to AI agents, check out our templates, course, and YouTube videos to start building with Mastra today.
Documentation
Visit our official documentation.
MCP Servers
Learn how to make your IDE a Mastra expert by following the @mastra/mcp-docs-server guide.
Contributing
Looking to contribute? All types of help are appreciated, from coding to testing and feature specification.
If you are a developer and would like to contribute code, please open an issue to discuss your idea before opening a Pull Request.
Information about the project setup can be found in the development documentation.
Support
We have an open community Discord. Come and say hello and let us know if you have any questions or need any help getting things running.
It's also super helpful if you give the project a star at the top of the page.
Security
We are committed to maintaining the security of this repo and of Mastra as a whole. If you discover a security issue, please disclose it responsibly to us at security@mastra.ai and we will get back to you.
Related MCPs
Discover similar Model Context Protocol servers
Wanaku MCP Router
A router connecting AI-enabled applications through the Model Context Protocol.
Wanaku MCP Router serves as a middleware router facilitating standardized context exchange between AI-enabled applications and large language models via the Model Context Protocol (MCP). It streamlines context provisioning, allowing seamless integration and communication in multi-model AI environments. The tool aims to unify and optimize the way applications provide relevant context to LLMs, leveraging open protocol standards.
- ⭐ 87
- MCP
- wanaku-ai/wanaku
Klavis
One MCP server for AI agents to handle thousands of tools.
Klavis provides an MCP (Model Context Protocol) server with over 100 prebuilt integrations for AI agents, enabling seamless connectivity with various tools and services. It offers both cloud-hosted and self-hosted deployment options and includes out-of-the-box OAuth support for secure authentication. Klavis is designed to act as an intelligent connector, streamlining workflow automation and enhancing agent capability through standardized context management.
- ⭐ 5,447
- MCP
- Klavis-AI/klavis
FastMCP
TypeScript framework for building robust MCP servers with minimal setup.
FastMCP is a TypeScript framework designed for building servers that adhere to the Model Context Protocol (MCP), enabling efficient management of client sessions and context. It streamlines the creation of MCP servers by providing intuitive APIs, built-in authentication, session and request tracking, and support for handling various content types such as images and audio. The framework also enforces best practices around error handling, logging, and streaming outputs. Developers benefit from reduced boilerplate and a focus on core MCP functionality.
- ⭐ 2,738
- MCP
- punkpeye/fastmcp
Godot MCP
A Model Context Protocol (MCP) server implementation using Godot and Node.js.
Godot MCP implements the Model Context Protocol (MCP) as a server, leveraging the Godot game engine along with Node.js and TypeScript technologies. Designed for seamless integration and efficient context management, it aims to facilitate standardized communication between AI models and applications. This project offers a ready-to-use MCP server for developers utilizing Godot and modern JavaScript stacks.
- ⭐ 1,071
- MCP
- Coding-Solo/godot-mcp
Azure MCP Server
Connect AI agents with Azure services through Model Context Protocol.
Azure MCP Server provides a seamless interface between AI agents and Azure services by implementing the Model Context Protocol (MCP) specification. It enables integration with tools like GitHub Copilot for Azure and supports a wide range of Azure resource management tasks directly via conversational AI interfaces. Designed for extensibility and compatibility, it offers enhanced contextual capabilities for agents working with Azure environments.
- ⭐ 1,178
- MCP
- Azure/azure-mcp
Weblate MCP Server
Seamlessly connect AI assistants to Weblate for translation management via the Model Context Protocol.
Weblate MCP Server enables AI assistants and clients to directly manage Weblate translation projects through the Model Context Protocol (MCP). It integrates with the Weblate REST API, allowing natural language interaction for project and translation management. The tool offers multiple transport options including HTTP, SSE, and STDIO, and is optimized for large language model workflows. Full support for project, component, and translation operations is provided, with a focus on type safety and flexible environment configuration.
- ⭐ 9
- MCP
- mmntm/weblate-mcp