Lara Translate MCP Server

Context-aware translation server implementing the Model Context Protocol.

76 stars · 13 forks · 76 watchers · 1 issue
Lara Translate MCP Server enables AI applications to seamlessly access professional translation services via the standardized Model Context Protocol. It supports features such as language detection, context-aware translations, and translation memory integration. The server acts as a secure bridge between AI models and Lara Translate, managing credentials and facilitating structured translation requests and responses.

Key Features

Implements Model Context Protocol for standardized integration
Context-aware machine translation
Language detection support
Translation memory utilization
Secure API credential management
Supports multiple access methods (HTTP and STDIO servers)
Structured request and response handling
Extensible for advanced translation configuration
Seamless connection to Lara Translate API
Natural language instructions for adjusting translation parameters

Use Cases

Enhancing LLMs with accurate multilingual support
Automating context-sensitive text translation in AI-driven workflows
Building AI tools requiring secure access to professional translation services
Integrating translation features into chatbots or virtual assistants
Leveraging translation memory for consistency in large multilingual projects
Providing real-time translation for collaborative applications
Powering content localization pipelines
Enabling AI models to detect and translate multiple languages dynamically
Simplifying translation API integration in enterprise software
Supporting domain-specific translation for specialized industries

README

Lara Translate MCP Server

A Model Context Protocol (MCP) Server for the Lara Translate API, enabling powerful translation capabilities with support for language detection, context-aware translations, and translation memories.



📖 Introduction

Model Context Protocol (MCP) is an open standardized communication protocol that enables AI applications to connect with external tools, data sources, and services. Think of MCP like a USB-C port for AI applications - just as USB-C provides a standardized way to connect devices to various peripherals, MCP provides a standardized way to connect AI models to different data sources and tools.

Lara Translate MCP Server enables AI applications to access Lara Translate's powerful translation capabilities through this standardized protocol.

More info about the Model Context Protocol: https://modelcontextprotocol.io/

Lara Translate MCP Server implements the Model Context Protocol to provide seamless translation capabilities to AI applications. The integration follows this flow:

  1. Connection Establishment: When an MCP-compatible AI application starts, it connects to configured MCP servers, including the Lara Translate MCP Server
  2. Tool & Resource Discovery: The AI application discovers available translation tools and resources provided by the Lara Translate MCP Server
  3. Request Processing: When translation needs are identified:
    • The AI application formats a structured request with text to translate, language pairs, and optional context
    • The MCP server validates the request and transforms it into Lara Translate API calls
    • The request is securely sent to Lara Translate's API using your credentials
  4. Translation & Response: Lara Translate processes the translation using advanced AI models
  5. Result Integration: The translation results are returned to the AI application, which can then incorporate them into its response

This integration architecture allows AI applications to access professional-grade translations without implementing the API directly, while maintaining the security of your API credentials and offering flexibility to adjust translation parameters through natural language instructions.
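
To make this flow concrete, here is a minimal sketch of a client driving the steps above with the official MCP TypeScript SDK over the streamable HTTP endpoint. The header names match the HTTP setup shown later in this README; the tool name and argument shape are illustrative assumptions based on the Available Tools section, and exact SDK option names may differ across versions.

```typescript
// Sketch only: connect to the Lara MCP endpoint, discover tools, and send a
// structured translation request. Tool name and arguments are assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(
  new URL("https://mcp.laratranslate.com/v1"),
  {
    requestInit: {
      headers: {
        "x-lara-access-key-id": process.env.LARA_ACCESS_KEY_ID ?? "",
        "x-lara-access-key-secret": process.env.LARA_ACCESS_KEY_SECRET ?? "",
      },
    },
  }
);

const client = new Client({ name: "example-client", version: "1.0.0" });
await client.connect(transport);             // steps 1-2: connect and discover
console.log(await client.listTools());       // available translation tools
const result = await client.callTool({       // steps 3-5: request and response
  name: "translate",                         // assumed tool name
  arguments: {
    text: [{ text: "Hello world", translatable: true }],
    target: "it-IT",
  },
});
console.log(result);
```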

Integrating Lara with LLMs creates a powerful synergy that significantly enhances translation quality for non-English languages.

Why General LLMs Fall Short in Translation

While large language models possess broad linguistic capabilities, they often lack the specialized expertise and up-to-date terminology required for accurate translations in specific domains and languages.

Lara's Domain-Specific Advantage

Lara overcomes this limitation by leveraging Translation Language Models (T-LMs) trained on billions of professionally translated segments. These models provide domain-specific machine translation that captures cultural nuances and industry terminology that generic LLMs may miss. The result: translations that are contextually accurate and sound natural to native speakers.

Designed for Non-English Strength

Lara has a strong focus on non-English languages, addressing the performance gap found in models such as GPT-4. The dominance of English in datasets such as Common Crawl and Wikipedia results in lower quality output in other languages. Lara helps close this gap by providing higher quality understanding, generation, and restructuring in a multilingual context.

Faster, Smarter Multilingual Performance

By offloading complex translation tasks to specialized T-LMs, Lara reduces computational overhead and minimizes latency, a common issue for LLMs handling non-English input. Its architecture processes translations in parallel with the LLM, enabling real-time, high-quality output without compromising speed or efficiency.

Cost-Efficient Translation at Scale

Lara also lowers the cost of using models like GPT-4 in non-English workflows. Since tokenization (and pricing) is optimized for English, using Lara allows translation to take place before hitting the LLM, meaning that only the translated English content is processed. This improves cost efficiency and supports competitive scalability for global enterprises.

🛠 Available Tools

Translation Tools

Inputs:

  • text (array): An array of text blocks to translate, each with:
    • text (string): The text content
    • translatable (boolean): Whether this block should be translated
  • source (optional string): Source language code (e.g., 'en-EN')
  • target (string): Target language code (e.g., 'it-IT')
  • context (optional string): Additional context to improve translation quality
  • instructions (optional string[]): Instructions to adjust translation behavior
  • source_hint (optional string): Guidance for language detection

Returns: Translated text blocks maintaining the original structure
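
As a sketch, a request to the translation tool with the common fields populated might look like the following TypeScript object. The field names come from the input list above; all concrete values (including the non-translatable block) are invented for illustration.

```typescript
// Illustrative arguments for a translation request. Field names follow the
// input list above; every value is a placeholder.
const translateArgs = {
  text: [
    { text: "All systems are operational.", translatable: true },
    { text: "STATUS_OK", translatable: false }, // e.g., keep identifiers as-is
  ],
  // source: "en-EN",                       // optional explicit source language
  source_hint: "en",                        // optional guidance for detection
  target: "it-IT",
  context: "Status page of a cloud monitoring service", // optional
  instructions: ["Use a formal register"],               // optional
};
```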

Translation Memories Tools

List memories: takes no inputs.

Returns: Array of memories and their details

Create a memory

Inputs:

  • name (string): Name of the new memory
  • external_id (optional string): ID of the memory to import from MyMemory (e.g., 'ext_my_[MyMemory ID]')

Returns: Created memory data
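
For instance, creating a memory (optionally seeded from a MyMemory export) could take arguments shaped like this sketch; the external_id value is a made-up placeholder following the 'ext_my_' format noted above.

```typescript
// Illustrative arguments for creating a translation memory.
const createMemoryArgs = {
  name: "product-docs-glossary",
  external_id: "ext_my_123456", // optional; hypothetical MyMemory ID
};
```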

Update a memory

Inputs:

  • id (string): ID of the memory to update
  • name (string): The new name for the memory

Returns: Updated memory data

Delete a memory

Inputs:

  • id (string): ID of the memory to delete

Returns: Deleted memory data

Add a translation

Inputs:

  • id (string | string[]): ID or IDs of the memories to add the translation unit to
  • source (string): Source language code
  • target (string): Target language code
  • sentence (string): The source sentence
  • translation (string): The translated sentence
  • tuid (optional string): Translation Unit unique identifier
  • sentence_before (optional string): Context sentence before
  • sentence_after (optional string): Context sentence after

Returns: Added translation details
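
A sketch of adding a single translation unit, with optional context sentences, might look as follows; the memory IDs and sentences are placeholders.

```typescript
// Illustrative arguments for adding a translation unit to two memories.
const addTranslationArgs = {
  id: ["mem_example_1", "mem_example_2"],    // hypothetical memory IDs
  source: "en-EN",
  target: "it-IT",
  sentence: "The weather is nice today.",
  translation: "Oggi il tempo è bello.",
  tuid: "greeting-001",                      // optional unique identifier
  sentence_before: "Good morning.",          // optional preceding context
  sentence_after: "Shall we go for a walk?", // optional following context
};
```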

Delete a translation

Inputs:

  • id (string): ID of the memory
  • source (string): Source language code
  • target (string): Target language code
  • sentence (string): The source sentence
  • translation (string): The translated sentence
  • tuid (optional string): Translation Unit unique identifier
  • sentence_before (optional string): Context sentence before
  • sentence_after (optional string): Context sentence after

Returns: Removed translation details

Import a TMX file

Inputs:

  • id (string): ID of the memory to update
  • tmx_content (string): The content of the TMX file to upload
  • gzip (boolean): Indicates if the file is compressed (.gz)

Returns: Import details
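
As an illustration, importing a tiny uncompressed TMX payload could look like the sketch below; the memory ID is a placeholder and the TMX body is a minimal hand-written example rather than a real export.

```typescript
// Illustrative arguments for a TMX import into an existing memory.
const importTmxArgs = {
  id: "mem_example_1", // hypothetical memory ID
  tmx_content: `<?xml version="1.0" encoding="UTF-8"?>
<tmx version="1.4">
  <header srclang="en" datatype="plaintext" segtype="sentence"
          adminlang="en" creationtool="example" creationtoolversion="1.0" o-tmf="tmx"/>
  <body>
    <tu>
      <tuv xml:lang="en"><seg>Hello world</seg></tuv>
      <tuv xml:lang="it"><seg>Ciao mondo</seg></tuv>
    </tu>
  </body>
</tmx>`,
  gzip: false, // true only when tmx_content holds gzip-compressed (.gz) data
};
```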

Check import status

Inputs:

  • id (string): The ID of the import job

Returns: Import details

🚀 Getting Started

Lara supports both the STDIO and streamable HTTP protocols. For a hassle-free setup, we recommend the HTTP protocol. If you prefer STDIO, the server must be installed locally on your machine.

You'll find setup instructions for both protocols in the sections below.

HTTP Server 🌐

Using mcp-remote

This installation guide is intended for clients that do NOT support URL-based configuration. This option requires Node.js to be installed on your system.

If you're unsure how to configure an MCP with your client, please refer to your MCP client's official documentation.


  1. Open your client's MCP configuration JSON file with a text editor, then copy and paste the following snippet:
```json
{
  "mcpServers": {
    "lara": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.laratranslate.com/v1",
        "--header",
        "x-lara-access-key-id: ${X_LARA_ACCESS_KEY_ID}",
        "--header",
        "x-lara-access-key-secret: ${X_LARA_ACCESS_KEY_SECRET}"
      ],
      "env": {
        "X_LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "X_LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
```

  2. Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your Lara Translate API credentials. Refer to the Official Documentation for details.

  3. Restart your MCP client.

Direct URL configuration

This installation guide is intended for clients that support URL-based configuration. These clients can connect to Lara through a remote HTTP endpoint by specifying a simple configuration object.

Some examples of supported clients include Cursor, Continue, OpenDevin, and Aider.

If you're unsure how to configure an MCP with your client, please refer to your MCP client's official documentation.


  1. Open your client's MCP configuration JSON file with a text editor, then copy and paste the following snippet:
```json
{
  "mcpServers": {
    "lara": {
      "url": "https://mcp.laratranslate.com/v1",
      "headers": {
        "x-lara-access-key-id": "<YOUR_ACCESS_KEY_ID>",
        "x-lara-access-key-secret": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
```

  2. Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your Lara Translate API credentials. Refer to the Official Documentation for details.

  3. Restart your MCP client.


STDIO Server 🖥️

Using NPX

This option requires Node.js to be installed on your system.

  1. Add the following to your MCP configuration file:
```json
{
  "mcpServers": {
    "lara-translate": {
      "command": "npx",
      "args": ["-y", "@translated/lara-mcp@latest"],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
```

  2. Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.

Using Docker

This option requires Docker to be installed on your system.

  1. Add the following to your MCP configuration file:
```json
{
  "mcpServers": {
    "lara-translate": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "LARA_ACCESS_KEY_ID",
        "-e",
        "LARA_ACCESS_KEY_SECRET",
        "translatednet/lara-mcp:latest"
      ],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
```

  2. Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.

Building from Source (Node.js)

  1. Clone the repository:
```bash
git clone https://github.com/translated/lara-mcp.git
cd lara-mcp
```

  2. Install dependencies and build:
```bash
# Install dependencies
pnpm install

# Build
pnpm run build
```

  3. Add the following to your MCP configuration file:
```json
{
  "mcpServers": {
    "lara-translate": {
      "command": "node",
      "args": ["<FULL_PATH_TO_PROJECT_FOLDER>/dist/index.js"],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
```

  4. Replace:
    • <FULL_PATH_TO_PROJECT_FOLDER> with the absolute path to your project folder
    • <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual Lara API credentials.

Building a Docker Image

  1. Clone the repository:
```bash
git clone https://github.com/translated/lara-mcp.git
cd lara-mcp
```

  2. Build the Docker image:
```bash
docker build -t lara-mcp .
```

  3. Add the following to your MCP configuration file:
```json
{
  "mcpServers": {
    "lara-translate": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "LARA_ACCESS_KEY_ID",
        "-e",
        "LARA_ACCESS_KEY_SECRET",
        "lara-mcp"
      ],
      "env": {
        "LARA_ACCESS_KEY_ID": "<YOUR_ACCESS_KEY_ID>",
        "LARA_ACCESS_KEY_SECRET": "<YOUR_ACCESS_KEY_SECRET>"
      }
    }
  }
}
```

  4. Replace <YOUR_ACCESS_KEY_ID> and <YOUR_ACCESS_KEY_SECRET> with your actual credentials.

🧪 Verify Installation

After restarting your MCP client, you should see Lara Translate MCP in the list of available MCPs.

The method for viewing installed MCPs varies by client. Please consult your MCP client's documentation.

To verify that Lara Translate MCP is working correctly, try translating with a simple prompt:

```text
Translate with Lara "Hello world" to Spanish
```

Your MCP client will begin generating a response. If Lara Translate MCP is properly installed and configured, your client will either request approval for the action or display a notification that Lara Translate is being used.

💻 Popular Clients that Support MCPs

For a complete list of MCP clients and their feature support, visit the official MCP clients page.

| Client | Description |
|--------|-------------|
| Claude Desktop | Desktop application for Claude AI |
| Aixplain | Production-ready AI Agents |
| Cursor | AI-first code editor |
| Cline for VS Code | VS Code extension for AI assistance |
| GitHub Copilot MCP | VS Code extension for GitHub Copilot MCP integration |
| Windsurf | AI-powered code editor and development environment |

🆘 Support


Repository Owner

translated (Organization)

Repository Details

  • Language: TypeScript
  • Default branch: main
  • Size: 163 KB
  • Contributors: 3
  • License: MIT License
  • MCP verified: Nov 12, 2025

Programming Languages

  • TypeScript: 92.65%
  • JavaScript: 3.5%
  • Shell: 3.48%
  • Dockerfile: 0.37%



Related MCPs

Discover similar Model Context Protocol servers

  • Teamwork MCP Server (Teamwork/mcp) · ⭐ 11

    Seamless Teamwork.com integration for Large Language Models via the Model Context Protocol.

    Teamwork MCP Server is an implementation of the Model Context Protocol (MCP) that enables Large Language Models to interact securely and programmatically with Teamwork.com. It offers standardized interfaces, including HTTP and STDIO, that allow AI agents to perform a range of project management operations. The server supports multiple authentication methods and an extensible toolset architecture, is designed for production deployments, and provides a read-only capability for safe integrations along with robust observability features.

  • Wanaku MCP Router (wanaku-ai/wanaku) · ⭐ 87

    A router connecting AI-enabled applications through the Model Context Protocol.

    Wanaku MCP Router serves as a middleware router facilitating standardized context exchange between AI-enabled applications and large language models via the Model Context Protocol (MCP). It streamlines context provisioning, allowing seamless integration and communication in multi-model AI environments. The tool aims to unify and optimize the way applications provide relevant context to LLMs, leveraging open protocol standards.

  • Kanboard MCP Server (bivex/kanboard-mcp) · ⭐ 15

    MCP server for seamless AI integration with Kanboard project management.

    Kanboard MCP Server is a Go-based server implementing the Model Context Protocol (MCP) for integrating AI assistants with the Kanboard project management system. It enables users to manage projects, tasks, users, and workflows in Kanboard directly via natural language commands through compatible AI tools. With built-in support for secure authentication and high performance, it facilitates streamlined project operations between Kanboard and AI-powered clients like Cursor or Claude Desktop. The server is configurable and designed for compatibility with MCP standards.

  • TeslaMate MCP Server (cobanov/teslamate-mcp) · ⭐ 106

    Query your TeslaMate data using the Model Context Protocol.

    TeslaMate MCP Server implements the Model Context Protocol to enable AI assistants and clients to securely access and query Tesla vehicle data, statistics, and analytics from a TeslaMate PostgreSQL database. The server exposes a suite of tools for retrieving vehicle status, driving history, charging sessions, battery health, and more using standardized MCP endpoints. It supports local and Docker deployments, includes bearer token authentication, and is intended for integration with MCP-compatible AI systems like Claude Desktop.

  • Make MCP Server (legacy) (integromat/make-mcp-server) · ⭐ 142

    Enable AI assistants to utilize Make automation workflows as callable tools.

    Make MCP Server (legacy) provides a Model Context Protocol (MCP) server that connects AI assistants with Make scenarios configured for on-demand execution. It parses and exposes scenario parameters, allowing AI systems to invoke automation workflows and receive structured JSON outputs. The server supports secure integration through API keys and facilitates seamless communication between AI and Make's automation platform.

  • Plane MCP Server (kelvin6365/plane-mcp-server) · ⭐ 32

    Enables LLMs to manage Plane.so projects and issues via the Model Context Protocol.

    Plane MCP Server provides a standardized interface connecting large language models with Plane.so project management APIs. It enables LLMs to interact directly with project and issue data, supporting tasks such as listing projects, retrieving detailed information, and creating and updating issues, while prioritizing user control and security. Installation is streamlined through tools like Smithery, and configuration supports multiple clients, including Claude for Desktop.