Studio
A native desktop IDE for the LTHN platform: the Lemma model family, MLX inference, and LEM training. Built with Wails for macOS, Windows, and Linux.
Lemma Model Family
4 models · GGUF + MLX variants · EUPL-1.2
Studio ships with native support for the Lemma model family. Four models trained with LEK consent-based alignment, available in MLX and GGUF formats for local inference.
Lemer · Edge · 2.3B
Lemma · General · 4.5B
Lemmy · Agentic · 26B MoE
Lemrd · Research · 30.7B
Features
LEM Training
Fine-tune Lethean Ethical Models locally with LoRA adapters. Monitor training progress, loss curves, and checkpoint scoring in real time.
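The core idea behind LoRA adapters is to freeze the base weights and learn only a small low-rank update. A minimal numpy sketch of that idea (illustrative only, not Studio's training code; the dimensions and names here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

d, r, alpha = 64, 8, 16          # hidden size, LoRA rank, scaling factor
W = rng.standard_normal((d, d))  # frozen base weight matrix

# Trainable low-rank factors. B starts at zero, so at step 0 the
# adapted model behaves exactly like the base model.
A = rng.standard_normal((r, d)) * 0.01
B = np.zeros((d, r))

def forward(x):
    # Base path plus scaled low-rank delta: x @ (W + (alpha / r) * B @ A).T
    return x @ W.T + (alpha / r) * (x @ A.T) @ B.T

x = rng.standard_normal((1, d))
# With B = 0 the adapter contributes nothing yet.
assert np.allclose(forward(x), x @ W.T)
```

Only `A` and `B` are updated during fine-tuning, which is why LoRA checkpoints are small enough to train and score locally.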
Chat Interface
Interact with Lemma models through a native chat UI. Streaming responses, think panel for reasoning traces, and conversation history.
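Separating a reasoning trace from the visible reply can be pictured as routing a token stream into two channels. This is a generic sketch, not Studio's implementation, and the `<think>`/`</think>` delimiters are an assumption about how reasoning traces are marked:

```python
def route_stream(tokens):
    """Yield (channel, token) pairs from a streamed response:
    'think' for tokens inside <think>...</think>, 'chat' otherwise."""
    channel = "chat"
    for tok in tokens:
        if tok == "<think>":
            channel = "think"       # open the reasoning trace
        elif tok == "</think>":
            channel = "chat"        # back to the visible reply
        else:
            yield channel, tok

# Example: a fake streamed response containing an inline reasoning trace.
stream = ["<think>", "user", "greets", "</think>", "Hello", "!"]
routed = list(route_stream(stream))
assert routed == [("think", "user"), ("think", "greets"),
                  ("chat", "Hello"), ("chat", "!")]
```

A UI consuming this generator can render the `think` channel in a collapsible panel while streaming the `chat` channel into the conversation.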
Model Management
Browse, download, and manage models from the Lemma family and the wider LTHN catalogue. View benchmark results, parameter counts, and ethical scores.
MLX-Native Inference
Apple Silicon-native inference via MLX: Metal GPU acceleration and unified memory, optimised for the Lemma model family. On Linux, AMD GPUs are supported via ROCm.
Tech Stack
Go · Backend
Wails v3 · Desktop Framework
Web Components · Frontend
MLX / ROCm · GPU Inference
Platforms
macOS
Apple Silicon native. Metal GPU via MLX for local inference and training.
Linux
AMD GPU support via ROCm. Ideal for dedicated training rigs and headless labs.
Windows
Full desktop experience. GPU inference via compatible backends.