| tags | date | author |
|---|---|---|
|  | 2025-04-29 | The Bunker Admin |
This note covers setting up the Ollama service, including a few starter models, as follows:
## Configure Ollama
> [!warning] Ollama on Network
> The following configuration allows Ollama to be reached from any machine on your local network. Exposing this endpoint increases your system's attack surface, so only do this on a network you trust.
Create/edit the configuration file:

```bash
sudo mkdir -p /etc/ollama
sudo nano /etc/ollama/config
```
Add the following content:

```json
{
  "host": "0.0.0.0"
}
```
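If you prefer to skip the editor, a heredoc writes the same file non-interactively; this is just a convenience equivalent to the `nano` step above:

```bash
# Write /etc/ollama/config in one step (same JSON as above)
sudo tee /etc/ollama/config >/dev/null <<'EOF'
{
  "host": "0.0.0.0"
}
EOF
```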
## Ollama System Service

```bash
sudo nano /etc/systemd/system/ollama.service
```
> [!note]
> If you already have a working unit file, you can also just add the
> `Environment="OLLAMA_HOST=0.0.0.0"` line to its `[Service]` section and
> leave the rest as-is; the service will work fine.
```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
Environment="OLLAMA_HOST=0.0.0.0"

[Install]
WantedBy=default.target
```
Reload systemd, then enable and start the service:

```bash
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama
```
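With the service running, the API should now be listening on all interfaces. A quick sanity check, assuming Ollama's default port of 11434:

```bash
# Confirm the service started cleanly
systemctl status ollama --no-pager

# Ollama listens on port 11434 by default; this should return a JSON version string
curl http://localhost:11434/api/version
```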
## Models

A set of starter models that will all run on build.homelab:
```bash
ollama pull gemma3:12b
ollama pull qwen3
ollama pull deepseek-r1
ollama pull mistral-small3.1
ollama pull llama3.2
```
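Once the pulls finish, a short smoke test confirms both the models and the network exposure. This assumes `build.homelab` resolves on your LAN; substitute the server's IP address otherwise:

```bash
# On the server: list the locally available models
ollama list

# From any machine on the LAN: request a completion over the network
curl http://build.homelab:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Reply with one short sentence.",
  "stream": false
}'
```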