Local Models
Run Ollama and LMStudio models on Chorus
Chorus supports local models running via Ollama or LMStudio.
Ollama
First, set up Ollama. Then run it:
Run Ollama
ollama serve
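If no Ollama models show up in Chorus, the usual causes are that the server isn't reachable or that no models have been pulled yet. A quick sanity check from the terminal (a sketch assuming Ollama's default port of 11434; llama3.2 is just an example model name):

# Pull an example model so Ollama has something to serve
ollama pull llama3.2

# List installed models; a JSON response confirms the server is up on the default port
curl http://localhost:11434/api/tags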
Next, open the model picker (⌘ + J) and scroll to the LOCAL section. You may need to click the refresh icon.
LMStudio
Download LMStudio and load your first model.
Next, go to the Developer tab and start the LMStudio server. Make sure that CORS is enabled.
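If you prefer the terminal, LMStudio also ships an lms command-line tool that can start the server. This is a sketch, assuming your LMStudio version includes the lms CLI with a --cors flag and serves on the default port 1234:

# Start the LMStudio server with CORS enabled (flag availability may vary by version)
lms server start --cors

# The server exposes an OpenAI-compatible API; this should list the loaded models
curl http://localhost:1234/v1/models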
Next, open the model picker (⌘ + J) and scroll to the LOCAL section. You may need to click the refresh icon.
Base URL
To change the LMStudio base URL, go to Settings -> API KEYS -> Open the LMStudio settings at the bottom of the tab.
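As an assumption based on LMStudio's standard configuration, the default base URL is the OpenAI-compatible endpoint at http://localhost:1234/v1. If your LMStudio server runs on another machine or port, verify the new address is reachable before entering it in Chorus, for example:

# Hypothetical address for an LMStudio server on another machine on your network
curl http://192.168.1.42:1234/v1/models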