Framelink MCP for Figma

Seamlessly connect Figma design data to AI coding agents via the Model Context Protocol

11,747 Stars · 958 Forks · 11,747 Watchers · 33 Issues
Framelink MCP for Figma enables AI-powered coding tools to access and process context-rich Figma design data through the Model Context Protocol. It acts as a mediator, simplifying and translating Figma API responses into the concise layout and styling information a model actually needs. The server is designed for integration with editors like Cursor, helping AI agents implement designs accurately in any framework directly from a Figma link, and it can be configured on different platforms with a personal Figma access token.

Key Features

Transforms Figma API data into concise, model-friendly context
Seamless integration with AI-powered code editors like Cursor
Supports one-shot design implementation across frameworks
Customizable configuration for macOS, Linux, and Windows
Requires personal Figma API access tokens
Reduces irrelevant context to improve AI accuracy
Supports links to Figma files, frames, or groups
Simplifies design data for faster model ingestion
Open source and MIT licensed
Documentation and multilingual support

Use Cases

Allow AI code assistants to access Figma design files in IDEs
Automate conversion of Figma designs to code within development environments
Provide streamlined Figma context to AI agents for accurate code generation
Enhance one-shot implementation of UI designs from Figma through AI tools
Integrate Figma-driven workflows in AI-aided coding platforms
Reduce manual extraction and formatting of design data for AI consumption
Support multi-language teams with multilingual documentation
Enable rapid prototyping based on live Figma files
Facilitate design hand-off by connecting Figma with AI coding agents
Customize context extraction to match project or model requirements

README

Give Cursor and other AI-powered coding tools access to your Figma files with this Model Context Protocol server.

When Cursor has access to Figma design data, it's way better at one-shotting designs accurately than alternative approaches like pasting screenshots.

Demo

Watch a demo of building a UI in Cursor with Figma design data


How it works

  1. Open your IDE's chat (e.g. agent mode in Cursor).
  2. Paste a link to a Figma file, frame, or group.
  3. Ask Cursor to do something with the Figma file—e.g. implement the design.
  4. Cursor will fetch the relevant metadata from Figma and use it to write your code.

This MCP server is specifically designed for use with Cursor. Before responding with context from the Figma API, it simplifies and translates the response so only the most relevant layout and styling information is provided to the model.

Reducing the amount of context provided to the model helps make the AI more accurate and the responses more relevant.
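
To make this concrete, below is a rough, hypothetical illustration of the kind of simplified context the server might hand to the model in place of a raw Figma API response (which typically carries many additional fields such as constraints, vector geometry, and plugin data). The field names here are illustrative only, not the server's actual output schema.

```json
{
  "name": "Primary Button",
  "type": "FRAME",
  "layout": { "width": 160, "height": 48, "padding": "12px 24px", "direction": "row" },
  "style": { "fill": "#3B82F6", "cornerRadius": 8 },
  "children": [
    {
      "name": "Label",
      "type": "TEXT",
      "text": "Sign up",
      "style": { "color": "#FFFFFF", "fontSize": 16, "fontWeight": 600 }
    }
  ]
}
```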

Getting Started

Many code editors and other AI clients use a configuration file to manage MCP servers.

The figma-developer-mcp server can be configured by adding the following to your configuration file.

NOTE: You will need to create a Figma access token to use this server. Instructions on how to create a Figma API access token can be found here.

macOS / Linux

```json
{
  "mcpServers": {
    "Framelink MCP for Figma": {
      "command": "npx",
      "args": ["-y", "figma-developer-mcp", "--figma-api-key=YOUR-KEY", "--stdio"]
    }
  }
}
```

Windows

```json
{
  "mcpServers": {
    "Framelink MCP for Figma": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "figma-developer-mcp", "--figma-api-key=YOUR-KEY", "--stdio"]
    }
  }
}
```

Or you can set FIGMA_API_KEY and PORT in the env field.
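
For example, a configuration that passes the key through the env field instead of a CLI flag might look like the sketch below. The variable names FIGMA_API_KEY and PORT are the ones mentioned above; the port value is an arbitrary placeholder, so check the Framelink docs for the exact behavior.

```json
{
  "mcpServers": {
    "Framelink MCP for Figma": {
      "command": "npx",
      "args": ["-y", "figma-developer-mcp", "--stdio"],
      "env": {
        "FIGMA_API_KEY": "YOUR-KEY",
        "PORT": "3333"
      }
    }
  }
}
```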

If you need more information on how to configure the Framelink MCP for Figma, see the Framelink docs.

Learn More

The Framelink MCP for Figma is simple but powerful. Get the most out of it by learning more at the Framelink site.


Repository Owner

GLips (User)

Repository Details

Language: TypeScript
Default Branch: main
Size: 588 KB
Contributors: 23
License: MIT License
MCP Verified: Nov 12, 2025

Programming Languages

TypeScript: 98.46%
JavaScript: 1.54%

Topics

ai, cursor, figma, mcp, typescript


Related MCPs

Discover similar Model Context Protocol servers

  • Bifrost


    VSCode Dev Tools exposed via the Model Context Protocol for AI tool integration.

    Bifrost is a Visual Studio Code extension that launches a Model Context Protocol (MCP) server, enabling external AI coding assistants to access advanced code navigation, analysis, and manipulation features from VSCode. It exposes language server capabilities, symbol search, semantic code analysis, and refactoring tools through MCP-compatible HTTP and SSE endpoints. The extension is designed for seamless integration with AI assistants, supporting multi-project environments and configuration via JSON files.

    • 184
    • MCP
    • biegehydra/BifrostMCP
  • Webvizio MCP Server


    Bridge between Webvizio feedback and AI coding agents via the Model Context Protocol

    Webvizio MCP Server is a TypeScript-based server implementing the Model Context Protocol to securely and efficiently interface with the Webvizio API. It transforms web page feedback and bug reports into structured, actionable developer tasks, providing AI coding agents with comprehensive task context and data. It offers methods to fetch project and task details, retrieve logs and screenshots, and manage task statuses. The server standardizes communication between Webvizio and AI agent clients, facilitating automated issue resolution.

    • 4
    • MCP
    • Webvizio/mcp
  • Firefly MCP Server


    Seamless resource discovery and codification for Cloud and SaaS with Model Context Protocol integration.

    Firefly MCP Server is a TypeScript-based server implementing the Model Context Protocol to enable integration with the Firefly platform for discovering and managing resources across Cloud and SaaS accounts. It supports secure authentication, resource codification into infrastructure as code, and easy integration with tools such as Claude and Cursor. The server can be configured via environment variables or command line and communicates using standardized MCP interfaces. Its features facilitate automation and codification workflows for cloud resource management.

    • 15
    • MCP
    • gofireflyio/firefly-mcp
  • Mindpilot MCP


    Visualize and understand code structures with on-demand diagrams for AI coding assistants.

    Mindpilot MCP provides AI coding agents with the capability to visualize, analyze, and understand complex codebases through interactive diagrams. It operates as a Model Context Protocol (MCP) server, enabling seamless integration with multiple development environments such as VS Code, Cursor, Windsurf, Zed, and Claude Code. Mindpilot ensures local processing for privacy, supports multi-client connections, and offers robust configuration options for server operation and data management. Users can export diagrams and adjust analytics settings for improved user control.

    • 61
    • MCP
    • abrinsmead/mindpilot-mcp
  • MCP Linear


    MCP server for AI-driven control of Linear project management.

    MCP Linear is a Model Context Protocol (MCP) server implementation that enables AI assistants to interact with the Linear project management platform. It provides a bridge between AI systems and the Linear GraphQL API, allowing the retrieval and management of issues, projects, teams, and more. With MCP Linear, users can create, update, assign, and comment on Linear issues, as well as manage project and team structures directly through AI interfaces. The tool supports seamless integration via Smithery and can be configured for various AI clients like Cursor and Claude Desktop.

    • 117
    • MCP
    • tacticlaunch/mcp-linear
  • Sequa MCP


    Bridge Sequa's advanced context engine to any MCP-capable AI client.

    Sequa MCP acts as a seamless integration layer, connecting Sequa’s knowledge engine with various AI coding assistants and IDEs via the Model Context Protocol (MCP). It enables tools to leverage Sequa’s contextual knowledge streams, enhancing code understanding and task execution across multiple repositories. The solution provides a simple proxy command to interface with standardized MCP transports, supporting configuration in popular environments such as Cursor, Claude, VSCode, and others. Its core purpose is to deliver deep, project-specific context to LLM agents through a unified and streamable endpoint.

    • 16
    • MCP
    • sequa-ai/sequa-mcp