Game Engine · Unity · Local LLM

CorePlay

Game engine integration with local LLM inference. Unity plugin, NPC dialogue, and procedural content generation powered by the Lemma model family.

Local inference · Unity plugin · EUPL-1.2

Features

NPC Dialogue

Dynamic NPC conversations powered by local Lemma models. Context-aware dialogue that responds to game state, player history, and world events.

Unity Plugin

Drop-in Unity integration. C# API for inference calls, component-based NPC behaviours, and editor tools for prompt authoring and testing.

Procedural Content

Generate quests, lore, item descriptions, and world-building text at runtime. LoRA-tuned models keep the output consistent with the game's tone.
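One plausible shape for runtime content generation is a designer-authored template that frames the request, with the model filling in the flavour text. This is a sketch under that assumption; the template and function names are hypothetical, not part of CorePlay.

```go
package main

import (
	"bytes"
	"fmt"
	"text/template"
)

// itemPromptTmpl is a hypothetical authoring template: designers write the
// frame, and the LoRA-tuned model fills in the flavour text at runtime.
var itemPromptTmpl = template.Must(template.New("item").Parse(
	"Write a 2-sentence description for a {{.Rarity}} {{.Kind}} named \"{{.Name}}\" " +
		"found in {{.Region}}. Match the game's grim, understated tone.",
))

// Item holds the structured fields the game already knows about an object.
type Item struct {
	Name, Kind, Rarity, Region string
}

// RenderItemPrompt produces the prompt that would be sent to the local model.
func RenderItemPrompt(it Item) (string, error) {
	var buf bytes.Buffer
	if err := itemPromptTmpl.Execute(&buf, it); err != nil {
		return "", err
	}
	return buf.String(), nil
}

func main() {
	p, _ := RenderItemPrompt(Item{Name: "Ashveil", Kind: "dagger", Rarity: "rare", Region: "the Sunken Marches"})
	fmt.Println(p)
}
```

Keeping the structured fields in the template, rather than free-form prompting, is what makes output controllable enough to stay on-tone.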

Local-First

All inference runs on the player's hardware. No cloud dependency, no API latency, no data leaving the device. Privacy by architecture.

Architecture

Unity · Game Engine
Core Go · Inference Runtime
Lemma · Model Family
GGUF · Model Format
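The GGUF files the runtime loads start with a fixed header: a 4-byte magic ("GGUF") followed by a little-endian uint32 format version. A loader such as the Core Go runtime would validate this before reading metadata and tensors; the sketch below only shows that header check, and the function name is illustrative.

```go
package main

import (
	"bytes"
	"encoding/binary"
	"errors"
	"fmt"
)

// ggufMagic is the 4-byte magic at the start of every GGUF file.
var ggufMagic = []byte{'G', 'G', 'U', 'F'}

// ReadGGUFVersion checks the magic and returns the format version from the
// first 8 bytes of a GGUF model file.
func ReadGGUFVersion(header []byte) (uint32, error) {
	if len(header) < 8 {
		return 0, errors.New("header too short")
	}
	if !bytes.Equal(header[:4], ggufMagic) {
		return 0, errors.New("not a GGUF file")
	}
	// The version is a little-endian uint32 immediately after the magic.
	return binary.LittleEndian.Uint32(header[4:8]), nil
}

func main() {
	// Simulated header: magic "GGUF" followed by version 3.
	hdr := append([]byte("GGUF"), 3, 0, 0, 0)
	v, err := ReadGGUFVersion(hdr)
	fmt.Println(v, err) // prints: 3 <nil>
}
```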

Coming Soon

CorePlay is under active development. The Unity plugin and inference runtime are being built on top of the Core Go framework and the Lemma model family.

Follow on GitHub