
Bring Your Own LLM

Overview

Lingo.dev CLI supports direct integration with AI providers such as OpenAI and Anthropic, so you can run translations through your existing LLM provider account without needing a Lingo.dev account.

Key Features

  • Provider Flexibility: Decide which provider handles your translations
  • Customizable Prompts: Write and tune your own translation prompt
  • Model Selection: Choose specific models from each provider
  • No Vendor Lock-in: Use your own API keys and accounts
  • Full Control: Manage your own API usage and preferences

Configuration Guide

Setting Up a Provider in i18n.json

```json
{
  "version": 1.5,
  "provider": {
    "id": "anthropic",
    "model": "claude-3-7-sonnet-latest",
    "prompt": "You're translating text from {source} to {target}."
  },
  "locale": {
    "source": "en",
    "targets": ["es", "fr", "it", "de", "ar"]
  }
}
```

Note: The {source} and {target} tokens in the prompt will be automatically replaced with the actual locale codes during translation execution.

Available Provider Options

Lingo.dev (Managed Provider)

```json
"provider": {
  "id": "lingo",
  "model": "best",
  "prompt": ""
}
```

OpenAI

```json
"provider": {
  "id": "openai",
  "model": "gpt-4",
  "prompt": "Translate accurately from {source} to {target}, maintaining the original meaning.",
  "baseUrl": "https://api.openai.com/v1"
}
```

The `baseUrl` field is optional.
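Because `baseUrl` is configurable, the OpenAI provider entry can, presumably, point at an OpenAI-compatible endpoint such as a proxy or gateway (the host below is a placeholder, not a real service):

```json
"provider": {
  "id": "openai",
  "model": "gpt-4",
  "prompt": "Translate accurately from {source} to {target}.",
  "baseUrl": "https://llm-proxy.example.com/v1"
}
```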

Anthropic

```json
"provider": {
  "id": "anthropic",
  "model": "claude-3-opus-20240229",
  "prompt": "You're an expert translator from {source} to {target}. Maintain the tone and style."
}
```

Environment Setup

Before using custom provider integration, set up the appropriate API keys as environment variables:

For OpenAI

```bash
export OPENAI_API_KEY=your_openai_api_key
```

For Anthropic

```bash
export ANTHROPIC_API_KEY=your_anthropic_api_key
```
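A quick way to fail fast before kicking off a translation run is to check that the key matching the configured provider is actually set. A minimal sketch (the `missingKey` helper and the mapping are illustrative, based only on the variables documented above; this is not part of the CLI):

```typescript
// Map each provider id to the environment variable it needs, per the docs above.
// "lingo" is the managed provider, so no local API key is required for it here.
const requiredKey: Record<string, string | undefined> = {
  openai: "OPENAI_API_KEY",
  anthropic: "ANTHROPIC_API_KEY",
  lingo: undefined,
};

// Returns the name of the missing variable, or null if nothing is missing.
function missingKey(providerId: string): string | null {
  const envVar = requiredKey[providerId];
  if (envVar === undefined) return null; // nothing to check for this provider
  return process.env[envVar] ? null : envVar;
}

const missing = missingKey("anthropic");
if (missing) {
  console.error(`Set ${missing} before running lingo.dev i18n`);
}
```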

Feature Comparison

| Feature | Managed by Lingo.dev | Own LLM |
| --- | --- | --- |
| Translation | ✅ | ✅ |
| Translation Memory | ✅ | ❌ (managed manually) |
| Glossary Support | ✅ | ❌ (managed manually) |
| Context-aware Translation | ✅ (tailored to industry + product) | ✅ (via files + system prompt) |
| Custom Prompts | Limited (+ brand voice) | ✅ |
| Model Selection | Limited (+ routed to the best model) | ✅ |
| Cost Control | Managed | Self-managed |
| Quality Assurance | ✅ (automated) | ❌ (requires manual checking) |
| Format Preservation | ✅ (automatic) | Partial (prompt-dependent) |
| Consistency Management | ✅ (recalls past translations) | ❌ (manual effort required) |
| Error Handling | ✅ (robust recovery + Enterprise SLA) | Limited (basic retries) |

Use Cases

When to use Lingo.dev Provider:

  • For projects requiring consistent translations across many files
  • When you'd benefit from our translation memory capabilities
  • For enterprise projects with glossary requirements
  • When you prefer a simplified workflow with quality assurance built-in
  • For teams collaborating on translations who need centralized management

When to use Custom AI Providers:

  • For highly customized translation styles
  • When specific AI models are required for your particular use case
  • If you already have API credits with OpenAI or Anthropic
  • For experimental or specialized translation approaches
  • When you have specific prompt engineering expertise you'd like to leverage

Usage Instructions

  1. Update your i18n.json file with your preferred provider configuration
  2. Set the appropriate environment variables for your provider's API key
  3. Run translations as usual with the `lingo.dev i18n` command

```bash
# Example workflow
export ANTHROPIC_API_KEY=your_api_key
lingo.dev i18n
```

Custom provider integration gives you the flexibility to choose your AI translation provider while still benefiting from some of Lingo.dev's localization workflow conveniences.

Request Additional Providers

If you'd like to see support for additional AI providers, we'd be grateful for your input:

  • Open an issue on our GitHub repository
  • Submit a pull request with the implementation for your desired provider

We're continuously expanding our provider options based on community feedback.