MCP
The MCP layer publishes the registry, live server metadata, and tool contract that lets AI clients talk to LTHN safely.
mcp.lthn · Portal host
JSON registry · Discovery format
HTTP bridge · Remote execution
Tool analytics · Operational feedback
What ships with it
Registry and portal in one place
The MCP site publishes the machine-readable registry while also giving humans a browsable server catalogue.
Protocol-level contract
Servers, tools, resources, and connection examples are documented as part of the platform rather than hidden in a repo.
Shared auth and quotas
Remote MCP calls respect the same API key scopes, workspace context, and quota tracking as the REST endpoints.
Public discovery, private execution where needed
Clients can discover the registry openly while protected tools still enforce workspace and entitlement checks.
Surface area
Registry
.well-known/mcp-servers.json
Expose the current server list in the standard discovery format used by MCP-capable clients.
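A client can consume that discovery document with a few lines of code. The sketch below parses a sample registry payload offline; the exact field names (`mcpServers`, `url`, `description`) and the sample values are assumptions for illustration, not the published schema.

```python
import json

# Sample of what https://mcp.lthn/.well-known/mcp-servers.json might
# return. Field names and values here are illustrative assumptions.
sample = """
{
  "mcpServers": {
    "lthn": {
      "url": "https://mcp.lthn.ai",
      "description": "LTHN platform tools"
    }
  }
}
"""

def list_servers(doc: str) -> dict[str, str]:
    """Map each registered server name to its endpoint URL."""
    registry = json.loads(doc)
    return {name: entry["url"] for name, entry in registry["mcpServers"].items()}

print(list_servers(sample))  # {'lthn': 'https://mcp.lthn.ai'}
```

In practice a client would fetch the document over HTTPS first; the parsing step stays the same.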
Portal
GET /servers · /connect · /openapi.json
Browse servers, inspect their tools, and generate client connection snippets from the portal.
HTTP bridge
POST /mcp/tools/call
Invoke registered tools remotely over HTTP with the same scoped auth model used elsewhere in the stack.
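A minimal sketch of what a bridge call could look like from a plain HTTP client. The base URL `https://api.lthn.ai` and the payload field names (`server`, `tool`, `arguments`) are assumptions, not the documented contract; the request is built but deliberately not sent.

```python
import json
import urllib.request

API_BASE = "https://api.lthn.ai"  # assumption: bridge lives on the API domain
API_KEY = "YOUR_API_KEY"          # same scoped key used for REST endpoints

def build_tool_call(server: str, tool: str, arguments: dict) -> urllib.request.Request:
    """Build (but do not send) a POST /mcp/tools/call request.

    The JSON body shape is an illustrative assumption about the contract.
    """
    body = json.dumps({"server": server, "tool": tool, "arguments": arguments}).encode()
    return urllib.request.Request(
        f"{API_BASE}/mcp/tools/call",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_tool_call("lthn", "echo", {"text": "hello"})
print(req.get_method(), req.full_url)
```

Sending it is a single `urllib.request.urlopen(req)` once the key and payload match the real contract.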
Tool metadata
GET /mcp/servers/{id}/tools
Read tool lists, version metadata, and resource endpoints for clients that need explicit introspection.
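For clients that introspect before calling, the sketch below extracts version metadata from a hypothetical tools response. The response shape (a `tools` array with `name` and `version` fields) is an assumption about what `GET /mcp/servers/{id}/tools` returns, not the published schema.

```python
import json

# Illustrative sample of a tools response; names, versions, and the
# overall shape are assumptions for the sake of the example.
sample_response = """
{
  "tools": [
    {"name": "search", "version": "1.2.0"},
    {"name": "fetch", "version": "0.9.1"}
  ]
}
"""

def tool_versions(doc: str) -> dict[str, str]:
    """Map each advertised tool name to its version string."""
    return {t["name"]: t["version"] for t in json.loads(doc)["tools"]}

print(tool_versions(sample_response))  # {'search': '1.2.0', 'fetch': '0.9.1'}
```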
Remote MCP registry
Use the portal for discovery and onboarding, then call the bridge endpoints on the API domain once the client is configured.
{
  "mcpServers": {
    "lthn": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.lthn.ai"],
      "env": {
        "API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
Read the contract
Start from the live registry, then read the package-level contract: the first gives clients the current discovery document, the second gives developers the security model behind it.