OpenAI API, but it points to your machine.
Drop-in compatible. Runs on your laptop. No cloud. No data leaving your device. Free to start.
```bash
# Install and start in 30 seconds
npm install -g locai
locai start --model=llama-3-8b
```
```javascript
// Your existing code, unchanged: only the base URL points at localhost
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "http://localhost:8000",
  apiKey: "not-needed", // the SDK requires a key; a local server ignores it
});
```
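From there, the rest of the call is the standard chat-completions flow. A minimal sketch using Node's built-in `fetch`, assuming locai follows the OpenAI convention of serving chat at `/v1/chat/completions` and registers the model name passed to `locai start` (both are assumptions, not confirmed locai behavior):

```javascript
// Matches the snippets above; the /v1 path is an assumption.
const BASE_URL = "http://localhost:8000/v1";
const MODEL = "llama-3-8b";

// Build the JSON body for POST /chat/completions.
function buildChatRequest(model, userPrompt) {
  return {
    model,
    messages: [{ role: "user", content: userPrompt }],
  };
}

// Send one chat turn and return the assistant's reply text.
async function chat(prompt) {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(MODEL, prompt)),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

The request and response shapes are the standard OpenAI ones, which is why existing client code keeps working.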
Why developers use Loc.ai
OpenAI-compatible
You already know the API. Point it at localhost instead.
5-minute setup
One install command. No Kubernetes. No configuration hell.
Privacy by default
Your prompts never leave your machine. Not a setting — the architecture.
Swap models freely
Run Llama, Mistral, Qwen, or any open-source model. You're not locked in.
Zero network latency
Local inference means real-time UX. No round-trips to a remote data center.
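The last two points show up in the request shape itself: the model is just a string in the JSON body, and streaming is the standard `stream` flag of the chat-completions API. A hedged sketch (the model names below are illustrative, not a list locai necessarily ships):

```javascript
// Each call names its model; switching models is a one-line change.
// Use whatever model name you passed to `locai start --model=...`.
const MODELS = {
  fast: "llama-3-8b",
  quality: "mistral-7b-instruct",
};

// Build a chat-completions body for the given tier.
// stream: true asks the server to emit tokens as they are generated,
// which is where local inference pays off: no network hop per chunk.
function buildRequest(tier, prompt, stream = false) {
  return {
    model: MODELS[tier],
    messages: [{ role: "user", content: prompt }],
    stream,
  };
}
```

Swapping Llama for Mistral, in this sketch, is editing one entry in `MODELS` and restarting the server with the new model loaded.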
