# Ollama: Local AI Model Server

Ollama is a tool that allows you to run large language models (LLMs) locally on your own server or computer. It simplifies the process of downloading, setting up, and interacting with powerful open-source AI models, providing AI capabilities without relying on third-party cloud services and ensuring data privacy.
## Key Features

* **Run LLMs Locally**: Host and run various open-source large language models (like Llama, Gemma, Mistral, etc.) on your own hardware.
* **Simple CLI**: Easy-to-use command-line interface for downloading models (`ollama pull`), running them (`ollama run`), and managing them (`ollama list`).
* **API Server**: Ollama serves models through a local HTTP API, allowing other applications (like OpenWebUI) to interact with them; see the example request after this list.
* **Data Privacy**: Since models run locally, your data doesn't leave your server when you interact with them.
* **Growing Model Library**: Access a growing library of popular open-source models.
* **Customization**: Create custom model files (Modelfiles) to tailor model behavior; a short sketch follows below.
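
Because the API server is what tools like OpenWebUI build on, it helps to see what a raw request looks like. The example below is a minimal sketch: it assumes the Ollama container publishes its default port `11434` on the host and that `gemma:2b` has already been pulled.

```bash
# Ask the local Ollama API for a single, non-streaming completion.
# Assumes Ollama is reachable on its default port 11434.
curl http://localhost:11434/api/generate -d '{
  "model": "gemma:2b",
  "prompt": "Explain in one sentence what running an LLM locally means.",
  "stream": false
}'
```

The API also exposes `GET /api/tags`, which returns the locally available models as JSON.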
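
The Customization feature works through Modelfiles. Below is a minimal sketch of creating a customized model; the model name `changemaker-assistant`, the system prompt, and the temperature value are illustrative, and `ollama-changemaker` is the container name used in the commands later on this page.

```bash
# Write a Modelfile that layers a system prompt and a sampling parameter
# on top of the base gemma:2b model.
cat > Modelfile <<'EOF'
FROM gemma:2b
PARAMETER temperature 0.7
SYSTEM "You are a concise writing assistant for Changemaker campaigns."
EOF

# Copy the Modelfile into the Ollama container and build the custom model.
docker cp Modelfile ollama-changemaker:/tmp/Modelfile
docker exec -it ollama-changemaker ollama create changemaker-assistant -f /tmp/Modelfile
```

Once created, the custom model shows up in `ollama list` and can be run like any other model.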
## Documentation

For more detailed information about Ollama, visit the [official repository](https://github.com/ollama/ollama).

## Getting Started with Ollama (within Changemaker)

Ollama itself is primarily a command-line tool and an API server. You typically interact with it via a terminal or through a UI like OpenWebUI.
### Managing Ollama via Terminal (e.g., in Code Server)

1. **Access a Terminal**:

    * Open the integrated terminal in Code Server.
    * Alternatively, SSH directly into your Changemaker server.

2. **Common Ollama Commands**:

    * **List Downloaded Models**: See which models you currently have.

      ```bash
      docker exec -it ollama-changemaker ollama list
      ```

      *(The `docker exec -it ollama-changemaker` part is necessary if Ollama is running in a Docker container named `ollama-changemaker`, which is common. If Ollama is installed directly on the host, you'd just run `ollama list`.)*
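
      If you aren't sure what the Ollama container is called on your instance, you can check instead of guessing; a quick sketch using standard Docker tooling:

      ```bash
      # List running containers whose name contains "ollama" to confirm
      # the container name used in the commands on this page.
      docker ps --filter "name=ollama" --format "table {{.Names}}\t{{.Image}}\t{{.Status}}"
      ```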
    * **Pull (Download) a New Model**: Download a model from the Ollama library. Replace `gemma:2b` with the desired model name and tag.

      ```bash
      docker exec -it ollama-changemaker ollama pull gemma:2b
      ```

      (Example: `ollama pull llama3`, `ollama pull mistral`)
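
      The part after the colon is a tag, which usually encodes the variant or parameter size, and re-running `ollama pull` for a model you already have simply updates it. For instance, assuming the larger Gemma variant is available in the library:

      ```bash
      # Pull the 7B-parameter Gemma variant alongside the 2B one.
      docker exec -it ollama-changemaker ollama pull gemma:7b
      ```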
    * **Run a Model (Interactive Chat in Terminal)**: Chat directly with a model in the terminal.

      ```bash
      docker exec -it ollama-changemaker ollama run gemma:2b
      ```

      (Press `Ctrl+D` or type `/bye` to exit the chat.)
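
      You can also pass a prompt directly on the command line to get a single answer without starting an interactive session, which is handy in scripts (the prompt text here is only an example):

      ```bash
      # Send one prompt, print the model's reply, and exit.
      docker exec -it ollama-changemaker ollama run gemma:2b "Summarize what Ollama does in two sentences."
      ```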
    * **Remove a Model**: Delete a downloaded model to free up space.

      ```bash
      docker exec -it ollama-changemaker ollama rm gemma:2b
      ```

### Interacting with Ollama via OpenWebUI
For a more user-friendly chat experience, use OpenWebUI, which connects to your Ollama service. See the `apps/openwebui.md` documentation for details.
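
How OpenWebUI reaches Ollama depends on your Changemaker compose setup. A common pattern, shown here as an assumption to verify against your own `docker-compose.yml` and the OpenWebUI documentation, is to point OpenWebUI's `OLLAMA_BASE_URL` environment variable at the Ollama container:

```bash
# Illustrative only: OpenWebUI typically reads the Ollama endpoint from an
# environment variable, usually set in its service definition in the compose file.
OLLAMA_BASE_URL=http://ollama-changemaker:11434
```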
## Use Cases within Changemaker

* **Powering OpenWebUI**: Ollama is the backend engine that OpenWebUI uses to provide its chat interface.
* **AI-Assisted Content Creation**: Generate text, summaries, ideas, or code snippets with privacy.
* **Custom AI Applications**: Developers can build custom applications that leverage the Ollama API for various AI tasks (see the sketch after this list).
* **Offline AI Capabilities**: Use AI models even without an active internet connection (once models are downloaded).
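
As a starting point for the custom-application use case above, the sketch below calls Ollama's chat endpoint from a shell script. It assumes the API is reachable on the default port `11434` and that `gemma:2b` is installed; adapt the model and prompt to your needs.

```bash
# Minimal chat-style request against the local Ollama API.
curl http://localhost:11434/api/chat -d '{
  "model": "gemma:2b",
  "messages": [
    {"role": "user", "content": "Draft a short invitation to a community meeting."}
  ],
  "stream": false
}'
```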
## Editing the Site

Ollama is an AI model server. It is not used for editing this documentation site. Site editing is done via **Code Server**.

## Further Information

* **Ollama Official Website**: [https://ollama.ai/](https://ollama.ai/)
* **Ollama Documentation**: [https://ollama.ai/docs](https://ollama.ai/docs)
* **Ollama GitHub**: [https://github.com/ollama/ollama](https://github.com/ollama/ollama)
* The existing `ollama.md` at the root of the `docs` folder in your project might also contain specific setup notes for your Changemaker instance.