This short post explains how to set up open-webui alongside LM Studio for daily use.
Prerequisites
- Make sure you already have LM Studio installed
- Make sure you have Docker installed
- You have downloaded at least one model in LM Studio
Steps
- Download the open-webui Docker image (a sample pull command is sketched after this list)
- Before you run the Docker image, make sure you've started the server and added a model in LM Studio. Turn on Enable CORS and Serve on Local Network
- Start your server by toggling Status so that it shows Running, then copy your local server address
- Run the following command in your terminal to start the open-webui server (an example command is sketched after this list). It's likely you'll have the same API URL for LM Studio, but please double-check it before running the command
- Now, wait a few seconds until the server starts; you can monitor it from the container logs in the Docker app (a sample logs command is sketched below). Then visit http://localhost:3000 to start chatting. Note that you'll be asked to create an account with an email address. Go ahead and make one; no data is shared with anyone, and all details stay locally on your computer.
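If you want to pull the image ahead of time, open-webui is published on GitHub Container Registry. The image name and tag below reflect the current upstream default, so verify them against the open-webui docs before pulling:

```bash
# Pull the open-webui image (the "main" tag is the rolling default build)
docker pull ghcr.io/open-webui/open-webui:main
```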
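Here is a minimal sketch of the run command, assuming LM Studio is serving on its default address http://localhost:1234/v1. The container name, volume name, and placeholder API key are illustrative, so adjust them, along with the server address you copied from LM Studio, to match your setup:

```bash
# Start open-webui on http://localhost:3000 and point it at LM Studio's
# OpenAI-compatible endpoint. host.docker.internal lets the container reach
# the LM Studio server running on the host; on Linux you may also need
# --add-host=host.docker.internal:host-gateway.
docker run -d \
  -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:1234/v1 \
  -e OPENAI_API_KEY=lm-studio \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

LM Studio typically doesn't enforce an API key, so the OPENAI_API_KEY value here is just a placeholder for open-webui's OpenAI-style connection settings.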
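If you'd rather watch the startup from the terminal instead of the Docker app, the standard logs command works, assuming you kept the container name open-webui from the sketch above:

```bash
# Follow the container logs until the web server reports it is up
docker logs -f open-webui
```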