Sanctum
Run LLMs locally
Sanctum lets users download and run open‑source large language models directly on their desktop computers, with no internet connection required after installation. The application supports macOS 12+ on both Apple Silicon and Intel, as well as Windows 10+, with Linux support planned. It provides a simple setup process that loads full‑featured LLMs in seconds, enabling conversational AI, PDF summarization, code assistance, data analysis, and other productivity tasks entirely on‑device.
All processing and storage occur locally, so user data never leaves the machine, and stored data is encrypted on‑device. This privacy‑first approach gives users complete control over their information while still providing access to a wide range of models through built‑in Hugging Face integration, which can check model compatibility and download GGUF model files.
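The compatibility check mentioned above is not documented in detail, but a GGUF downloader could plausibly start by validating the file header: per the GGUF specification, every GGUF file begins with the 4‑byte ASCII magic "GGUF" followed by a little‑endian uint32 format version. A minimal sketch (the function name and behavior are illustrative, not Sanctum's actual code):

```python
import struct

GGUF_MAGIC = b"GGUF"  # every GGUF file starts with this 4-byte magic

def read_gguf_version(path):
    """Return the GGUF format version if `path` looks like a GGUF file, else None."""
    with open(path, "rb") as f:
        if f.read(4) != GGUF_MAGIC:
            return None
        # The magic is followed by a little-endian uint32 format version.
        (version,) = struct.unpack("<I", f.read(4))
        return version
```

A tool could reject a download early when this returns None or an unsupported version, before parsing the rest of the model metadata.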
The software is positioned for individuals who require a private, offline AI assistant for tasks such as brainstorming, content creation, personal health tracking, financial analysis, and collaborative coding, without relying on cloud services. It is released as a stable product for macOS and Windows.
Similar apps
- GPT4All (AI Coding Agents): Run LLMs locally
- Msty (AI Coding Agents): Run LLMs locally
- LM Studio (AI Coding Agents): Discover, download, and run local LLMs
- Osaurus (AI Coding Agents): LLM server built on MLX
- LocalChatAI (Window & Desktop Management): Private, offline AI chatbot powered by Apple Intelligence
- HuggingChat (AI Chat & Voice Agents): Chat client for models on HuggingFace