LM Studio
Run AI models locally with LM Studio for secure, private, and efficient performance
About LM Studio
LM Studio lets you run AI models such as gpt-oss, Qwen3, Gemma 3, and DeepSeek directly on your own hardware, keeping your data private and under your control. It supports headless deployments for servers and CI environments, offers SDKs for JavaScript and Python, and provides an OpenAI-compatible API for seamless integration. LM Studio also introduces LM Link for remote model access and a CLI tool for advanced users.
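Because the local server speaks the OpenAI chat-completions wire format, existing OpenAI-style client code can simply point at LM Studio instead of a remote service. Below is a minimal sketch using only the Python standard library; it assumes the server's default port (1234) and uses a placeholder model identifier, so adjust both to match your setup.

```python
import json
from urllib import request

# LM Studio's local server exposes an OpenAI-compatible API.
# The base URL assumes the default port (1234); change it if you
# configured the server differently.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model, messages, base_url=BASE_URL):
    """Build an (unsent) OpenAI-style chat completion request."""
    payload = {"model": model, "messages": messages}
    return request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(
    "your-loaded-model",  # placeholder: use the model you have loaded
    [{"role": "user", "content": "Say hello."}],
)
print(req.full_url)

# To actually send it, the LM Studio server must be running:
#   with request.urlopen(req) as resp:
#       reply = json.load(resp)["choices"][0]["message"]["content"]
```

The request is constructed but not sent here, so the sketch works even without a running server; swapping in the official `openai` Python package with `base_url="http://localhost:1234/v1"` is an equivalent approach.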
FAQ
Q: Is LM Studio free to use?
A: Yes, LM Studio is free for home and work use.

Q: Which operating systems does LM Studio support?
A: LM Studio supports Windows, Mac, and Linux.

Q: Can LM Studio run headless, without the GUI?
A: Yes, you can deploy LM Studio on servers, Linux boxes, cloud servers, or even in CI using llmster, which is LM Studio's core without the GUI.

Q: How do I install LM Studio on Linux?
A: Run the following command: curl -fsSL https://lmstudio.ai/install.sh | bash

Q: How do I install LM Studio on Windows?
A: Run the following command in PowerShell: irm https://lmstudio.ai/install.ps1 | iex

Q: Does LM Studio support Apple MLX models?
A: Yes, LM Studio supports running Apple MLX models.