Introduction

Lara offers flexible integration options for developers to build branded translation AI experiences powered by our fine-tuned language models.

Lara is a new adaptive translation AI that combines the fluency, reasoning, context handling and instruction-following capabilities of LLMs with the low hallucination rate and low latency of machine translation (MT). On top of this, Lara is adaptive: it requires no training and instead adapts to any domain on the fly by leveraging previously translated content or context.

Whether you need standalone integration through our SDK or seamless LLM enhancement via MCP, Lara provides powerful translation capabilities to elevate your multilingual applications.

SDK

Build custom translation AI with complete control.

Our SDK empowers you to build your own branded translation AI on top of our fine-tuned translation language model. All major translation features are accessible, making it easy to integrate and customize for your needs.

The Lara SDKs allow you to programmatically translate text, optionally providing context and translation examples (translation memories) in order to further improve quality.
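For illustration, a minimal Python sketch of a basic translation call is shown below. The module, class, and parameter names here are assumptions based on common SDK patterns, not a definitive reference; consult the SDK documentation for the exact API.

```python
# Minimal sketch of a translation call with the Lara Python SDK.
# NOTE: module, class, and parameter names are assumptions; check the
# official SDK documentation for the exact API.
from lara_sdk import Credentials, Translator

# Credentials are issued from your Lara account.
credentials = Credentials(
    access_key_id="YOUR_ACCESS_KEY_ID",
    access_key_secret="YOUR_ACCESS_KEY_SECRET",
)
lara = Translator(credentials)

# Translate a string between two locales; context and translation
# memories can also be supplied to further improve quality (see the
# SDK reference for the relevant options).
result = lara.translate("Hello, world!", source="en-US", target="fr-FR")
print(result.translation)
```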

Available Programming Languages

The SDK is available for Python, NodeJS, Java and PHP. More SDKs will be available soon for other languages, with the Go and C# SDKs on the way. All SDKs expose the same methods and support the same features.

Ready to build with Lara? Get Started with the SDK and start creating your custom translation experience today.

Why SDKs and not REST APIs?

Although low-level REST APIs can be made available for specific use cases (contact us), we highly recommend using the SDKs for the following reasons:

  • They are better documented.
  • The SDKs are easier and faster to integrate.
  • They natively implement enterprise-grade security: requests are encrypted and signed.
  • They will include upcoming audiovisual features such as audio-to-audio, subtitles and dubbing, which will not be available in the text-only REST APIs.

MCP Server

Lara is also available as an MCP-compatible server exposing its translation capabilities via the Model Context Protocol (MCP). In this setup, Lara acts as a specialized translation agent, seamlessly integrating into MCP-aware environments like Claude, ChatGPT, or any other system that supports MCP agents.

By integrating Lara directly into LLM workflows, you gain superior translation quality for domain-specific content and the ability to maintain context and terminology consistency across multilingual conversations, all without disrupting the user experience or requiring external API calls. To find out more about why to use Lara inside an LLM, or to set it up and start translating, check out the Getting Started guide for the MCP Server.
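As an illustration, the snippet below shows how an MCP-aware client such as Claude Desktop is typically pointed at an MCP server through its configuration file. The package name and environment variable names are assumptions; use the exact values from the Getting Started guide.

```json
{
  "mcpServers": {
    "lara": {
      "command": "npx",
      "args": ["-y", "@translated/lara-mcp"],
      "env": {
        "LARA_ACCESS_KEY_ID": "<your-access-key-id>",
        "LARA_ACCESS_KEY_SECRET": "<your-access-key-secret>"
      }
    }
  }
}
```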

Subscription Options

Both the SDK and MCP server work seamlessly with any Lara Subscription (Free, Pro and Team), whether you're building standalone applications or enhancing AI workflows. For advanced functionality, expert-level APIs are available by requesting a customized enterprise plan.