Discover, download, and run local LLMs
LM Studio is a powerful desktop application that allows you to run Large Language Models (LLMs) completely offline on your local machine.
With LM Studio, you can:
🤖 • Run LLMs on your laptop, entirely offline
📚 • Chat with your local documents (new in 0.3)
👾 • Use models through the in-app Chat UI or an OpenAI compatible local server
📂 • Download any compatible model files from Hugging Face 🤗 repositories
🔭 • Discover new & noteworthy LLMs right inside the app's Discover page
LM Studio supports any GGUF-format model on Hugging Face, including Llama, Mistral, Phi, Gemma, StarCoder, and more.
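The local server mentioned above speaks the OpenAI chat-completions protocol, so any OpenAI-compatible client can talk to it. The sketch below uses only the Python standard library; the address (`localhost:1234` is LM Studio's default, but check the app's server tab for yours) and the placeholder model name are assumptions, not fixed values.

```python
import json
import urllib.request

# Assumed default address of LM Studio's local server; verify the host
# and port shown in the app's server tab before running.
BASE_URL = "http://localhost:1234/v1"


def build_payload(prompt: str, model: str = "local-model") -> dict:
    """Build an OpenAI-style chat-completions request body.

    "local-model" is a placeholder; LM Studio responds with whichever
    model you currently have loaded.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat(prompt: str) -> str:
    """POST one user message to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Say hello in one short sentence."))
```

Because the endpoint mirrors the OpenAI API, the official `openai` client library also works by pointing its `base_url` at the local server.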
Minimum requirements: M1/M2/M3/M4 Mac, or a Windows / Linux PC with a processor that supports AVX2.
LM Studio is completely free to use.
Does LM Studio collect any data?
No. One of the main reasons for using a local LLM is privacy, and LM Studio is designed for that. Your data remains private and local to your machine.
Can I use LM Studio at work?
We'd love to help. Please fill out the LM Studio @ Work request form and we'll get back to you as soon as we can.
What are the minimum hardware / software requirements?
Visit the System Requirements page for the most up to date information.