wget https://github.com/oobabooga/text-generation-webui/archive/refs/heads/main.zip
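The full fetch-and-first-run sequence might look like this (a sketch assuming Linux; the repo also ships start_windows.bat and start_macos.sh for other platforms):

```shell
# Fetch and unpack the webui source
wget https://github.com/oobabooga/text-generation-webui/archive/refs/heads/main.zip
unzip main.zip
cd text-generation-webui-main

# One-shot installer/launcher; sets up its own environment on first run
./start_linux.sh
```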


Allowing access on the network

Edit CMD_FLAGS.txt (e.g. with vi):

vi CMD_FLAGS.txt

Uncomment the --listen line.



If you want to run on a different port, add --listen-port 7401, or whatever port you want to start on.
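With both flags set, CMD_FLAGS.txt might look like this (7401 is just an example port):

```
--listen
--listen-port 7401
```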

You know the port is network-accessible when the startup log shows the server listening on 0.0.0.0 (all interfaces) instead of 127.0.0.1 (localhost only).

Downloading Models from the command line

In the app's root directory:

python download-model.py meta-llama/Meta-Llama-3-8B
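As far as I can tell, download-model.py saves files under models/ in the app directory, turning the Hugging Face repo id into a folder name with the slash replaced by an underscore. A small sketch of that mapping (the helper name is mine, not from the script):

```python
def local_model_dir(repo_id: str) -> str:
    # download-model.py stores "org/model" under models/org_model
    return "models/" + repo_id.replace("/", "_")

print(local_model_dir("meta-llama/Meta-Llama-3-8B"))
# models/meta-llama_Meta-Llama-3-8B
```

Note that Meta-Llama-3-8B is a gated model, so the download will only succeed if your Hugging Face account has accepted the license and a valid access token is available.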



AssertionError: Torch not compiled with CUDA enabled

In my case I was running without a GPU, so oobabooga was installed without CUDA support, and I had to tell it to run in CPU mode.

Edit CMD_FLAGS.txt again and add --cpu:

--listen --api --cpu