What is LM Studio?
LM Studio is a powerful desktop application that allows you to run Large Language Models (LLMs) completely offline on your local machine.
Models you can run on your computer include:
- Llama 3.2
- Mistral
- Phi 3.1
- Gemma 2
- DeepSeek 2.5
- Qwen 2.5
What are the main features of LM Studio?
- Run LLMs on your laptop, entirely offline
- Chat with your local documents (new in 0.3)
- Use models through the in-app Chat UI or an OpenAI-compatible local server (see the example below)
- Download any compatible model files from Hugging Face repositories
- Discover new & noteworthy LLMs right inside the app's Discover page
LM Studio supports any GGUF-format model on Hugging Face, including Llama, Mistral, Phi, Gemma, StarCoder, and more.
Minimum requirements: an M1/M2/M3/M4 Mac, or a Windows/Linux PC with a processor that supports AVX2.
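If you'd rather script against a loaded model than use the Chat UI, the OpenAI-compatible local server can be reached from any OpenAI client library. Below is a minimal sketch in Python; it assumes the server is running on LM Studio's default port (1234) and uses "your-model-identifier" as a placeholder for whichever model you have loaded.

```python
# Minimal sketch: call LM Studio's OpenAI-compatible local server.
# Assumptions: the server has been started from within LM Studio on the
# default port 1234, and "your-model-identifier" is a placeholder for the
# identifier of the model you loaded.
from openai import OpenAI

# The API key is just a placeholder string; requests never leave your machine.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="your-model-identifier",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```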
How much does LM Studio cost?
LM Studio is completely free to use.
How to use LM Studio?
- Visit https://lmstudio.ai/ and click the download button in the top right corner.
- Install the application on your system.
- During the initial setup, you can download recommended models. For more models, click the "Discover" button in the left sidebar.
- Click "Load a Selected Model" to initialize your chosen model.
- Start chatting with the model through the interface.
For an optimal experience:
- Choose models based on your hardware capabilities
- Wait for the model to fully load before starting a conversation
- You can adjust various parameters in the settings for better performance
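If you use the local server rather than the in-app Chat UI, generation parameters can also be tuned per request. Here is a hedged sketch of a raw HTTP call; it again assumes the default port 1234 and a placeholder model identifier, plus the third-party requests package.

```python
# Sketch: set generation parameters per request against the local server.
# Assumptions: default port 1234, a placeholder model identifier, and the
# third-party `requests` package installed (pip install requests).
import requests

payload = {
    "model": "your-model-identifier",
    "messages": [{"role": "user", "content": "Explain GGUF in two sentences."}],
    "temperature": 0.7,  # lower values give more deterministic output
    "max_tokens": 256,   # cap on the length of the reply
}
resp = requests.post(
    "http://localhost:1234/v1/chat/completions", json=payload, timeout=120
)
print(resp.json()["choices"][0]["message"]["content"])
```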
Frequently Asked Questions
Does LM Studio collect any data?
No. One of the main reasons for using a local LLM is privacy, and LM Studio is designed for that. Your data remains private and local to your machine.
Can I use LM Studio at work?
We'd love to enable you. Please fill out the LM Studio @ Work request form and we will get back to you as soon as we can.
What are the minimum hardware / software requirements?
Visit the System Requirements page for the most up to date information.
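On Windows or Linux, one quick way to tell whether your processor supports AVX2 is to inspect its CPU flags. The sketch below assumes the third-party py-cpuinfo package, which is not something LM Studio provides.

```python
# Rough AVX2 check, assuming the third-party `py-cpuinfo` package is
# installed (pip install py-cpuinfo). Apple Silicon Macs (M1/M2/M3/M4) are
# supported without AVX2, so this check only matters on Windows/Linux PCs.
import cpuinfo

flags = cpuinfo.get_cpu_info().get("flags", [])
print("AVX2 supported" if "avx2" in flags else "AVX2 flag not found")
```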