Local Models
Run Ollama and LM Studio models in Chorus
Chorus supports local models running via Ollama or LM Studio.
Ollama
First, download and set up Ollama, then make sure it's running.
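If you installed the Ollama CLI, you can start the server and pull a model from a terminal. A minimal sketch, assuming a standard Ollama install (llama3.1 is just an example model name):

```bash
# Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve

# In another terminal, pull a model so Chorus has something to list
ollama pull llama3.1
```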
Next, open the model picker (⌘ + J) and scroll to the LOCAL section. You may need to click the refresh icon.
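If no local models appear, it's worth confirming that Ollama is reachable. A quick check, assuming the default port:

```bash
# Lists the models Ollama has available locally
curl http://localhost:11434/api/tags
```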
LM Studio
Download LM Studio and load your first model.
Next, go to the Developer tab and start the LM Studio server. Make sure that CORS is enabled.
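If you prefer the terminal, LM Studio also ships an lms CLI that can start the server with CORS enabled. A sketch, assuming you have bootstrapped the CLI (lms bootstrap on first use):

```bash
# Start the LM Studio server with CORS enabled (default port 1234)
lms server start --cors
```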
Next, open the model picker (⌘ + J) and scroll to the LOCAL section. You may need to click the refresh icon.
Base URL
To change the LM Studio base URL, go to Settings -> API KEYS and open the LM Studio settings at the bottom of the tab.
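After changing the base URL, you can verify that the server responds at that address. A quick check, assuming LM Studio's default of http://localhost:1234/v1:

```bash
# LM Studio exposes an OpenAI-compatible API; this lists the loaded models
curl http://localhost:1234/v1/models
```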