# Bring Your Own LLM

Connect your preferred AI provider directly to Lingo.dev for complete translation control.
## Benefits at a Glance
- **Use Your Own AI Account:** Connect OpenAI or Anthropic without a Lingo.dev account
- **Complete Model Control:** Select specific models that match your needs
- **Custom Translation Prompts:** Craft prompts for your exact translation style
- **Independent Management:** Control your own API usage and costs
- **No Vendor Lock-in:** Use your existing API keys and accounts
## Quick Setup
Add your provider configuration to `i18n.json`:

```json
{
  "version": 1.5,
  "provider": {
    "id": "anthropic",
    "model": "claude-3-7-sonnet-latest",
    "prompt": "You're translating text from {source} to {target}."
  },
  "locale": {
    "source": "en",
    "targets": ["es", "fr", "it", "de", "ar"]
  }
}
```
**Note:** The `{source}` and `{target}` tokens are automatically replaced with actual locale codes during translation.
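For illustration, the substitution can be mimicked in the shell. This is a sketch of the assumed behavior, not Lingo.dev's actual implementation:

```bash
# Sketch of the assumed {source}/{target} substitution (not Lingo.dev's code).
PROMPT_TEMPLATE="You're translating text from {source} to {target}."
SOURCE_LOCALE="en"
TARGET_LOCALE="es"

# Replace each token with its locale code.
PROMPT="${PROMPT_TEMPLATE//\{source\}/$SOURCE_LOCALE}"
PROMPT="${PROMPT//\{target\}/$TARGET_LOCALE}"

echo "$PROMPT"
# You're translating text from en to es.
```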
## Supported Providers
### Lingo.dev (Managed Service)

```json
"provider": {
  "id": "lingo",
  "model": "best",
  "prompt": ""
}
```
### OpenAI

```json
"provider": {
  "id": "openai",
  "model": "gpt-4",
  "prompt": "Translate accurately from {source} to {target}, maintaining the original meaning.",
  "baseUrl": "https://api.openai.com/v1"
}
```

The `baseUrl` field is optional.
### Anthropic

```json
"provider": {
  "id": "anthropic",
  "model": "claude-3-opus-20240229",
  "prompt": "You're an expert translator from {source} to {target}. Maintain the tone and style."
}
```
## Setting API Keys

Set the appropriate environment variable for your chosen provider:
### OpenAI

```bash
export OPENAI_API_KEY=your_openai_api_key
```
### Anthropic

```bash
export ANTHROPIC_API_KEY=your_anthropic_api_key
```
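A missing key typically only surfaces as a provider API error mid-run, so it can help to fail fast before invoking the CLI. A minimal sketch; the `require_env` helper is hypothetical and not part of the Lingo.dev CLI:

```bash
# Hypothetical helper: fail fast if a required API key is missing.
require_env() {
  # ${!1} is bash indirect expansion: look up the variable named by $1.
  if [ -z "${!1:-}" ]; then
    echo "error: environment variable $1 is not set" >&2
    return 1
  fi
}

# Example usage with a placeholder value.
export ANTHROPIC_API_KEY="placeholder_key"
require_env ANTHROPIC_API_KEY && echo "ANTHROPIC_API_KEY is set"
```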
## Feature Comparison

| Feature | Lingo.dev Managed | Your Own LLM |
|---|---|---|
| Translation | ✅ | ✅ |
| Translation Memory | ✅ | ❌ (requires manual tracking) |
| Glossary Support | ✅ | ❌ (requires manual management) |
| Context-aware Translation | ✅ (industry + product optimized) | ✅ (via files + system prompt) |
| Custom Prompts | Limited (with brand voice) | ✅ (complete control) |
| Model Selection | Limited (auto-routed to best model) | ✅ (any supported model) |
| Cost Control | Managed pricing | Self-managed (pay your provider) |
| Quality Assurance | ✅ (automated) | ❌ (manual verification needed) |
| Format Preservation | ✅ (guaranteed) | Partial (depends on prompt) |
| Consistency | ✅ (automatic) | ❌ (manual effort required) |
| Error Handling | ✅ (robust recovery + Enterprise SLA) | Basic retries only |
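When running your own LLM, some of the gaps above (format preservation, consistency) can be narrowed with a more explicit prompt. A hypothetical example; the wording is an assumption, not an official Lingo.dev recommendation:

```json
"provider": {
  "id": "openai",
  "model": "gpt-4",
  "prompt": "Translate from {source} to {target}. Preserve all placeholders, HTML tags, and Markdown formatting exactly as they appear. Keep terminology consistent across the whole file."
}
```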
## When to Choose Each Option
**Best for Lingo.dev Managed Service:**
- Projects requiring consistent translations across multiple files
- When translation memory and glossary features are important
- Enterprise projects with strict quality requirements
- Teams needing centralized translation management
- When you want a worry-free, quality-assured solution
**Best for Your Own LLM:**
- Projects needing highly customized translation styles
- When you require specific AI models
- If you already have API credits with OpenAI or Anthropic
- Experimental or specialized translation projects
- When you have prompt engineering expertise to leverage
## Using Your Own LLM

1. Configure your provider in `i18n.json`
2. Set the appropriate API key as an environment variable
3. Run translations with the standard command:

```bash
# Example workflow
export ANTHROPIC_API_KEY=your_api_key
npx lingo.dev@latest i18n
```
This approach gives you provider flexibility while still benefiting from Lingo.dev's workflow tools.
## Request New Provider Support
Want support for additional AI providers? We welcome your input:
- Open an issue on our GitHub repository
- Submit a pull request with your provider implementation
We continuously expand our provider options based on community feedback.