oobabooga

wget https://github.com/oobabooga/text-generation-webui/archive/refs/heads/main.zip

unzip main.zip, cd into the extracted text-generation-webui-main directory, then run:

./start_linux.sh

To allow access on the network:

edit CMD_FLAGS.txt (e.g. vi CMD_FLAGS.txt)

uncomment the --listen line
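After the edit, CMD_FLAGS.txt should have --listen as an active (uncommented) line, something like this (the surrounding comment lines vary by version):

```text
# CMD_FLAGS.txt -- flags passed to the server on startup
--listen
```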

run

./start_linux.sh

If you want to run on a different port, add --listen-port 7401 (or whatever port you want to start on).

You know the port is network-accessible when the startup log shows http://0.0.0.0:7401 instead of http://127.0.0.1:7401.
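The two addresses describe which interface the server socket is bound to; this is standard socket behavior, not anything oobabooga-specific. A minimal sketch of the difference:

```python
import socket

# Loopback bind: reachable only from this machine
lb = socket.socket()
lb.bind(("127.0.0.1", 0))        # port 0 = pick any free port
lb_addr = lb.getsockname()[0]

# Wildcard bind: reachable from other machines on the network
wild = socket.socket()
wild.bind(("0.0.0.0", 0))
wild_addr = wild.getsockname()[0]

print(lb_addr, wild_addr)        # 127.0.0.1 0.0.0.0
lb.close()
wild.close()
```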

Downloading Models from the command line

In the app's root directory:

python download-model.py meta-llama/Meta-Llama-3-8B

(Note: meta-llama models are gated on Hugging Face, so you may need to accept the license and authenticate with a token first.)

Gotchas

Error:

AssertionError: Torch not compiled with CUDA enabled

In my case I was running without a GPU, so oobabooga had been installed without CUDA support, and I had to tell it to run CPU-only:

edit CMD_FLAGS.txt (e.g. vi CMD_FLAGS.txt) and add --cpu to the flags:

--listen --api --cpu
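To confirm this is your situation, you can check what torch reports from inside the environment start_linux.sh set up. A quick diagnostic sketch (run it with the project's own Python so you see the same torch build the server uses):

```python
# Does the installed torch build see a CUDA device?
try:
    import torch
    cuda_ok = torch.cuda.is_available()   # False on a CPU-only build
except ImportError:
    cuda_ok = None                        # torch not installed in this environment

print("CUDA available:", cuda_ok)
# If this prints False, --cpu avoids the AssertionError above.
```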