Running Translation Servers
This guide teaches you how to set up and run the background servers required for the various translation engines in Cheat UI.
Architecture Overview
The translation system works as follows:
- Game (Cheat UI): Sends Japanese text to a local or remote server.
- Background Server: Receives the text, translates it using its engine, and returns it.
- Cheat UI: Updates the game data (Map names, Variables, etc.) with the translated text.
IMPORTANT
You must have the background server running BEFORE you click "Start Translation" in-game.
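All of the setups below assume a reachable server. A quick way to verify one is actually listening before you click "Start Translation" is a plain TCP check. A minimal Python sketch (the host and port are examples — substitute your engine's values):

```python
import socket

def server_is_up(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP server accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check a local ezTransWeb instance on its default port.
# server_is_up("127.0.0.1", 5000)
```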
ezTrans Setup (JP → KR)
For high-quality Japanese to Korean translation, you need the original ezTrans XP software and a "wrapper" server.
1. Prerequisite: ezTrans XP
You must have ezTrans XP installed on your Windows machine. It is a commercial product and must be registered correctly in the Windows registry so that the wrapper servers can find it.
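If you want to verify the registry entry programmatically, a best-effort Python sketch follows. The key path `SOFTWARE\ChangShinSoft\ezTrans XP` and the `FilePath` value name are assumptions based on common installs, not a documented contract — check your own registry if the lookup returns nothing:

```python
def find_eztrans_path():
    """Best-effort lookup of the ezTrans XP install path in the Windows
    registry. Returns the path string, or None if not found or not
    running on Windows. The key/value names below are assumptions;
    adjust them to match your install."""
    try:
        import winreg  # only available on Windows
    except ImportError:
        return None
    for hive in (winreg.HKEY_LOCAL_MACHINE, winreg.HKEY_CURRENT_USER):
        try:
            with winreg.OpenKey(hive, r"SOFTWARE\ChangShinSoft\ezTrans XP") as key:
                path, _ = winreg.QueryValueEx(key, "FilePath")
                return path
        except OSError:
            continue
    return None
```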
2. Choose a Wrapper Server
Option A: ezTransWeb (Python)
- GitHub: HelloKS/ezTransWeb
- Setup:
  - Install Python 3.
  - Install Flask: `pip install flask`
  - Download `ezTransWeb.py` from the repository.
  - Run: `python ezTransWeb.py`
- Port: 5000 (default).
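Once the wrapper is running, Cheat UI (or any client) talks to it over plain HTTP. A hypothetical client sketch — the `/translate` route and `text` query parameter are assumptions for illustration; check the ezTransWeb README for the wrapper's actual API:

```python
import urllib.parse
import urllib.request

def translate(text: str, base_url: str = "http://127.0.0.1:5000") -> str:
    """Send text to a local ezTransWeb-style wrapper and return the reply.
    The /translate route and `text` parameter are assumptions; consult
    the wrapper's documentation for its real endpoint."""
    query = urllib.parse.urlencode({"text": text})
    with urllib.request.urlopen(f"{base_url}/translate?{query}", timeout=10) as resp:
        return resp.read().decode("utf-8")
```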
Option B: eztrans-server (Rust/High Performance)
- GitHub: nanikit/eztrans-server
- Setup:
  - Download the latest `.exe` from the Releases page.
  - Run: just double-click `eztrans-server.exe`.
- Port: 8000 (default).
Lingva Setup (Auto-detect → EN/Others)
Lingva is an open-source front-end for Google Translate. For bulk translations, you should always run it locally via Docker.
Running a Cluster (Recommended)
This repository includes a docker-compose.yml that runs a 3-node balanced cluster for extreme performance (~160 strings/sec).
- Install Docker Desktop.
- Open your terminal in the plugin root directory.
- Run: `docker-compose up -d`
- Cheat UI will now be able to use ports 3000, 3001, and 3002.
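A client can spread bulk requests across the three nodes in round-robin fashion. A small sketch of the rotation over the ports the compose file exposes:

```python
from itertools import cycle

# The three node ports exposed by the bundled docker-compose.yml.
_cluster_ports = cycle([3000, 3001, 3002])

def next_endpoint(host: str = "127.0.0.1") -> str:
    """Return the base URL of the next Lingva node in round-robin order."""
    return f"http://{host}:{next(_cluster_ports)}"
```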
Running a Single Instance
If you just want a simple local node:
`docker run -d -p 3000:3000 thedaviddelta/lingva-translate`

AI/LLM Setup (Generic)
For the highest-quality translations, use an AI language model.
Ollama (Local AI)
- Install Ollama.
- Pull a model: `ollama pull personal/llama3-8b-instruct` (or any other translation-capable model).
- Run: the Ollama server runs automatically in the background.
- In Cheat UI, select the Ollama (Local) preset.
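Under the hood, the preset talks to Ollama's local HTTP API (default port 11434). A sketch using the `/api/generate` endpoint — the prompt wording, target language, and model name are examples, not the plugin's actual implementation:

```python
import json
import urllib.request

def build_payload(text: str, model: str = "personal/llama3-8b-instruct") -> dict:
    """Build a non-streaming /api/generate request body for Ollama.
    The prompt wording is an example; tune it for your model."""
    return {
        "model": model,
        "prompt": f"Translate the following Japanese text to English:\n{text}",
        "stream": False,
    }

def translate_with_ollama(text: str, base_url: str = "http://127.0.0.1:11434") -> str:
    """POST the payload to a local Ollama server and return its reply."""
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=json.dumps(build_payload(text)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.loads(resp.read())["response"]
```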
Testing with Dummy Server
If you want to test the UI and connectivity without setting up a real engine, you can use the included dummy server:
- Open a terminal in `dummy-translator/`.
- Run: `python eztrans.py`
- In Cheat UI, select ezTransWeb as the endpoint.
- It will return "T: [Original Text]" to confirm the connection is working.
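For reference, a server with the same observable behavior ("T: " plus the original text) fits in a few lines of Python. This is a sketch, not the bundled `eztrans.py`, and the `/translate?text=...` request shape is an assumption mirroring the ezTransWeb-style API it stands in for:

```python
import urllib.parse
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class DummyTranslator(BaseHTTPRequestHandler):
    """Echoes 'T: <original text>' so clients can verify connectivity."""

    def do_GET(self):
        query = urllib.parse.urlparse(self.path).query
        text = urllib.parse.parse_qs(query).get("text", [""])[0]
        body = f"T: {text}".encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet
        pass

def run(port: int = 5000):
    """Serve forever on the given port (5000 matches the ezTransWeb default)."""
    ThreadingHTTPServer(("127.0.0.1", port), DummyTranslator).serve_forever()

if __name__ == "__main__":
    run()
```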