This short post explains how to set up open-webui alongside LM Studio for daily use.

Prerequisites

  1. Make sure you already have LM Studio installed
  2. Make sure you have Docker installed
  3. Make sure you have downloaded at least one model in LM Studio
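Before going any further, it can help to confirm the command-line tools are actually on your PATH. A minimal sketch of such a check (the loop body is just an illustration; extend the list with anything else you rely on):

```shell
# Report whether each required command is available on this machine
for cmd in docker; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: missing - install it before continuing"
  fi
done
```

LM Studio itself is a desktop app, so it won't show up in a PATH check like this; verify it by launching it normally.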

Steps

  1. Download the open-webui docker image
docker pull ghcr.io/open-webui/open-webui:main
  2. Before you run the image, start the server in LM Studio and load a model. Make sure to turn on Enable CORS and Serve on Local Network
  3. Start your server by toggling the Status so that it shows Running, then copy your local server address
  4. Run the following command in your terminal to start the open-webui server. Your LM Studio API URL may differ from the one shown below, so double-check it before running the command
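To confirm the LM Studio server is actually reachable before wiring open-webui to it, you can hit its OpenAI-compatible `/v1/models` endpoint. A quick sketch, assuming the default `localhost:1234` address (replace it with the address shown in your LM Studio server tab):

```shell
# Assumed default LM Studio address; replace with your own local server address
LMSTUDIO_URL="http://localhost:1234/v1"
if curl -sf "$LMSTUDIO_URL/models" >/dev/null; then
  echo "LM Studio server is reachable at $LMSTUDIO_URL"
else
  echo "Could not reach $LMSTUDIO_URL - check that the server is running"
fi
```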
docker run -d -p 3000:8080 -e OPENAI_API_BASE_URL=http://172.20.164.102:1234/v1 -e OPENAI_API_KEY=lm-studio -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  5. Now wait a few seconds for the server to start; you can monitor its progress in the container logs in the Docker app. Then visit http://localhost:3000 to start chatting. Note that you'll be asked to create an account with an email address. Go ahead and make one: no data is shared with anyone, and the account details are stored only on your computer.
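If you prefer the terminal to the Docker app, a small sketch like the following checks whether the container came up (the message text here is just illustrative):

```shell
# Check whether the open-webui container is running before opening the UI
if docker ps --filter name=open-webui --format '{{.Names}}' 2>/dev/null | grep -q open-webui; then
  msg="open-webui container is running - open http://localhost:3000"
else
  msg="open-webui container not found - check 'docker ps -a' and 'docker logs open-webui'"
fi
echo "$msg"
```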