
Koharu

AI-powered manga translator, written in Rust.

Koharu introduces a new workflow for manga translation, utilizing the power of AI to automate the process. It combines the capabilities of object detection, OCR, inpainting, and LLMs to create a seamless translation experience.

Under the hood, Koharu uses candle for high-performance inference and Tauri for the GUI. All components are written in Rust, ensuring safety and speed.
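
As a rough sketch of that workflow (every type and function name below is a hypothetical stand-in, not Koharu's actual API), the pipeline detects speech bubbles, runs OCR on each region, translates the recognized text with an LLM, inpaints the original text away, and typesets the translation:

// Illustrative four-stage pipeline. All names here are placeholders.
type Image = Vec<u8>; // stand-in for a decoded page image

struct Bubble {
    bbox: (u32, u32, u32, u32), // x, y, width, height of a detected region
    text: String,               // source text recognized by OCR
    translation: String,        // target text produced by the LLM
}

// Stubs standing in for the detection, OCR, inpainting, and LLM components.
fn detect_bubbles(_page: &Image) -> Vec<Bubble> { Vec::new() }
fn run_ocr(_page: &Image, _bbox: (u32, u32, u32, u32)) -> String { String::new() }
fn translate_llm(text: &str) -> String { text.to_string() }
fn inpaint(page: &Image, _bubbles: &[Bubble]) -> Image { page.clone() }
fn render_text(page: Image, _bubbles: &[Bubble]) -> Image { page }

fn translate_page(page: &Image) -> Image {
    let mut bubbles = detect_bubbles(page);      // 1. speech bubble detection
    for b in &mut bubbles {
        b.text = run_ocr(page, b.bbox);          // 2. OCR on each region
        b.translation = translate_llm(&b.text);  // 3. LLM translation
    }
    let cleaned = inpaint(page, &bubbles);       // 4. remove the original text
    render_text(cleaned, &bubbles)               // 5. typeset the translations
}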


Screenshots: screenshot-1, screenshot-2

Note

For help and support, please join our Discord server.

Features

  • Automatic speech bubble detection and segmentation
  • OCR for manga text recognition
  • Inpainting to remove original text from images
  • LLM-powered translation
  • Vertical text layout for CJK languages

GPU acceleration

CUDA and Metal are supported for GPU acceleration, significantly improving performance on supported hardware.

CUDA

Koharu is built with CUDA support, allowing it to leverage the power of NVIDIA GPUs for faster processing.

Koharu bundles CUDA Toolkit 12.x and cuDNN 9.x; the dynamic libraries are automatically extracted to the application data directory on first run.

Supported NVIDIA GPUs

Koharu supports NVIDIA GPUs with compute capability 7.5 or higher.

Please make sure your GPU is supported by checking the CUDA GPU Compute Capability and the cuDNN Support Matrix.

Metal

Koharu supports Metal for GPU acceleration on macOS with Apple Silicon (M1, M2, etc.). This allows Koharu to run efficiently on a wide range of Apple devices.
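
For reference, device selection in candle typically follows a CUDA, then Metal, then CPU fallback order. The snippet below is a generic candle pattern, not a copy of Koharu's code:

// Generic candle device selection with CUDA -> Metal -> CPU fallback.
use candle_core::Device;

fn pick_device() -> Device {
    if let Ok(device) = Device::new_cuda(0) {
        return device; // NVIDIA GPU via CUDA (compute capability 7.5+ for Koharu)
    }
    if let Ok(device) = Device::new_metal(0) {
        return device; // Apple Silicon GPU via Metal
    }
    Device::Cpu // fall back to CPU inference
}

fn main() {
    let device = pick_device();
    println!("running inference on {device:?}");
}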

AI Models

Koharu relies on a mix of computer vision and natural language processing models to perform its tasks.

Computer Vision Models

Koharu uses several pre-trained models for speech bubble detection, OCR, and inpainting.

The models will be automatically downloaded when you run Koharu for the first time.

We convert the original models to safetensors format for better performance and compatibility with Rust. The converted models are hosted on Hugging Face.
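
For illustration, fetching a converted safetensors file from the Hugging Face Hub with the hf-hub crate looks roughly like the sketch below; the repository and file names are placeholders, not the actual model IDs Koharu downloads:

// Sketch of fetching a converted model from the Hugging Face Hub.
// The repository and file names are placeholders.
use hf_hub::api::sync::Api;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api = Api::new()?;
    // Hypothetical repository name, for illustration only.
    let repo = api.model("some-org/some-detector".to_string());
    let weights = repo.get("model.safetensors")?; // downloaded and cached locally
    println!("model weights cached at {}", weights.display());
    Ok(())
}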

Large Language Models

Koharu supports a range of quantized LLMs in GGUF format via candle.

LLMs will be automatically downloaded on demand when you select a model in the settings.
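
As a minimal sketch of how a GGUF model can be opened with candle (assuming a recent candle-transformers quantized Llama loader; the file path is a placeholder and Koharu manages downloads and model selection itself):

// Read a GGUF file and build a quantized Llama model with candle.
use candle_core::quantized::gguf_file;
use candle_core::Device;
use candle_transformers::models::quantized_llama::ModelWeights;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let path = "model.Q4_K_M.gguf"; // placeholder path
    let mut file = std::fs::File::open(path)?;
    let content = gguf_file::Content::read(&mut file)?;
    println!("gguf contains {} tensors", content.tensor_infos.len());
    let model = ModelWeights::from_gguf(content, &mut file, &Device::Cpu)?;
    let _ = model; // ready for token-by-token generation
    Ok(())
}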

Installation

You can download the latest release of Koharu from the releases page.

We provide pre-built binaries for Windows and macOS. For other platforms, you may need to build from source; see the Development section below.

Development

To build Koharu from source, follow the steps below.

Prerequisites

  • Rust (1.85 or later)
  • Bun (1.0 or later)

Install dependencies

bun install

Build

bun run build

The built binaries will be located in the target/release directory.

Sponsorship

If you find Koharu useful, consider sponsoring the project to support its development!

License

The Koharu application is licensed under the GNU General Public License v3.0.

The sub-crates of Koharu are licensed under the Apache License 2.0.
