Private, Fast, Local‑first AI for Coding

Run high‑quality coding models locally or on your LAN. Minimal setup. Privacy by default. Built for real work.

New to models and hardware sizing? See Quick Tips.

Why Cline Local

  • Plan/Act workflows in VS Code
  • MCP tools and extensible providers
  • No data sent to external services
  • Works with LM Studio

Recommended Models

  • Qwen2.5‑Coder 32B (or Qwen3 Coder 30B A3B)
  • GPT‑OSS‑120B for top‑tier quality (server‑class)

For resource‑constrained hardware, see the low‑resource local options in Quick Tips.

Two Common Setups

  1. Corporate Laptop: VS Code + Cline Local + LM Studio on the same machine.
  2. Remote GPU Server: the model runs on a workstation or server; the client machine runs Cline Local and connects to it over LAN/VPN (see the connectivity check below).
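
Before wiring up the remote setup, it is worth confirming that the client machine can actually reach the model server. The sketch below is a minimal check, assuming LM Studio's OpenAI‑compatible server is running on the GPU machine on its default port (1234); the IP address is a placeholder for your own LAN/VPN address, and the exact provider settings in Cline Local may differ by version.

  # check_server.py - confirm a remote LM Studio endpoint is reachable from the client.
  # Assumes LM Studio's local server is enabled on the GPU machine; 192.168.1.50 is a
  # placeholder -- substitute your server's LAN/VPN address.
  import json
  import urllib.request

  BASE_URL = "http://192.168.1.50:1234/v1"  # LM Studio's OpenAI-compatible endpoint

  def list_models(base_url: str) -> list[str]:
      """Return the IDs of the models the server currently exposes."""
      with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
          payload = json.load(resp)
      return [m["id"] for m in payload.get("data", [])]

  if __name__ == "__main__":
      models = list_models(BASE_URL)
      if models:
          print("Server reachable. Loaded models:", ", ".join(models))
      else:
          print("Server reachable, but no model is loaded in LM Studio.")

If the model list comes back, point the LM Studio provider in Cline Local at the same base URL; the single‑machine setup works the same way with localhost.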

Get Started