Msty

Local-first desktop app for running and chatting with LLMs privately

Msty is a desktop application for macOS, Windows, and Linux that lets users run open-source LLMs locally or connect to cloud providers through a single unified interface. It is privacy-focused: all local conversations stay on-device. On top of that it offers a polished chat experience with conversation folders, side-by-side model comparison, and prompt library management. Msty supports Ollama-managed local models as well as OpenAI, Anthropic, and other cloud API providers.
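For context on what "Ollama-managed local models" means in practice: Ollama exposes a local HTTP API (default port 11434) that desktop clients can talk to. The sketch below builds a request body for Ollama's `/api/chat` endpoint; the model name `llama3` is an example stand-in for whatever model has been pulled locally, and this is an illustration of the underlying API, not Msty's internal code.

```python
import json

# Build a chat request for Ollama's local HTTP API (default port 11434).
# "llama3" is an example model name; any model pulled via `ollama pull` works.
payload = {
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "Summarize local-first software in one sentence."}
    ],
    "stream": False,  # ask for a single JSON response instead of a token stream
}
body = json.dumps(payload)

# To actually send it, a running Ollama instance is required:
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat",
#     data=body.encode(),
#     headers={"Content-Type": "application/json"},
#     method="POST",
# )
# print(urllib.request.urlopen(req).read().decode())

print(body)
```

Because the endpoint is plain HTTP on localhost, any client (Msty, `curl`, or a few lines of code like the above) can drive the same locally hosted models.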

Key Features

  • Local LLM support
  • Privacy-first
  • Model comparison
  • Cloud API support
  • Conversation management
  • Prompt library
Tags: local-llm, privacy, desktop-app, ollama, chat-interface

Get Started

Visit Msty
Freemium
Free plan + paid upgrades

Quick Info

Category
Code & Development
Pricing
Freemium
