ollama

install

curl -fsSL https://ollama.com/install.sh | sh
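to check the install worked (on Linux the script registers ollama as a systemd service, so the second command is Linux-only):

ollama --version
systemctl status ollama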

run

ollama run llama3

ollama run llama3:8b
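you can also pass a one-shot prompt instead of dropping into the interactive session:

ollama run llama3 "why is the sky blue?"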

what models do i have?

ollama list
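related housekeeping: pull a model without running it, inspect its details, or delete it:

ollama pull llama3
ollama show llama3
ollama rm llama3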

batch questions

ollama run llama3 < sourcequestions.txt
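the file contents are fed to the model as the prompt; a heredoc is a quick way to make one (questions here are just placeholders):

cat > sourcequestions.txt <<'EOF'
why is the sky blue?
how do transformers work?
EOF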

compare models

https://github.com/ollama/ollama/tree/main/examples/bash-comparemodels
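the linked example is more elaborate, but the core idea is just a loop over model names (the model list below is illustrative):

for model in llama3 starcoder2:3b; do
  echo "=== $model ==="
  ollama run "$model" < sourcequestions.txt
done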

api access

lynx http://localhost:11434
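lynx just confirms the server answers ("Ollama is running"); the REST API proper lives under /api, e.g. a non-streaming generate call:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "why is the sky blue?",
  "stream": false
}'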

webui (Open WebUI)

assuming ollama is already installed and running on the same box:

docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main

then browse to http://localhost:8080
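if you'd rather not use host networking, the port-mapped variant from the Open WebUI docs looks roughly like this (the UI is then on port 3000):

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://host.docker.internal:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main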

if you want to get rid of this container:

docker ps
docker container stop <container-id>
docker container remove <container-id>
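since the container was started with --name open-webui, you can skip the ID lookup:

docker container rm -f open-webui
docker volume rm open-webui   # also drops the data volume, if you want a clean slate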

interesting models

ollama run rfc/whiterabbitneo
ollama run starcoder2:3b
ollama run starcoder2:7b
ollama run starcoder2:15b
