
Using Ollama Models in VS Code (Code-Server)

You can integrate Ollama models with your VS Code environment (code-server) in several ways:

Option 1: Install a VS Code Extension

The easiest approach is to install a VS Code extension that connects to Ollama:

  1. In code-server (your VS Code interface), open the Extensions panel
  2. Search for "Continue" or "Ollama" and install an extension like "Continue" or "Ollama Chat"
  3. Configure the extension to connect to Ollama using the internal Docker network URL:
    http://ollama-changemaker:11434
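
If you install Continue, the connection details go in its configuration file (~/.continue/config.json in older versions of the extension). A sketch of what that entry might look like, assuming "llama3" is a model you have already pulled:

```json
{
  "models": [
    {
      "title": "Llama 3 (Ollama)",
      "provider": "ollama",
      "model": "llama3",
      "apiBase": "http://ollama-changemaker:11434"
    }
  ]
}
```

The important field is apiBase: it overrides the default of http://localhost:11434 so the extension talks to the Ollama container over the Docker network.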
    

Option 2: Use the API Directly from the VS Code Terminal

Since the Docker CLI isn't available inside the code-server container, you can interact with the Ollama API directly using curl from the integrated terminal:

# List available models
curl http://ollama-changemaker:11434/api/tags

# Generate text with a model (stream disabled so you get one JSON response
# instead of newline-delimited chunks)
curl -X POST http://ollama-changemaker:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Write a function to calculate Fibonacci numbers",
  "stream": false
}'

# Pull a new model (download progress is streamed as JSON lines)
curl -X POST http://ollama-changemaker:11434/api/pull -d '{
  "name": "mistral:7b"
}'

Option 3: Write Code That Uses the Ollama API

You can write scripts that connect to Ollama's API. For example, in Python:

import requests

def ask_ollama(prompt, model="llama3"):
    # Disable streaming so the endpoint returns a single JSON object;
    # otherwise response.json() fails on the newline-delimited chunks
    response = requests.post(
        "http://ollama-changemaker:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False}
    )
    response.raise_for_status()
    return response.json()["response"]

# Example usage
result = ask_ollama("What is the capital of France?")
print(result)

# List available models
def list_models():
    response = requests.get("http://ollama-changemaker:11434/api/tags")
    models = response.json()["models"]
    return [model["name"] for model in models]

# Pull a new model
def pull_model(model_name):
    response = requests.post(
        "http://ollama-changemaker:11434/api/pull",
        json={"name": model_name, "stream": False}
    )
    # This will block for a long time on large models
    return response.status_code
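
If you leave streaming on (the API default), /api/generate returns newline-delimited JSON, one object per text fragment, with a final object marked "done": true. A minimal sketch of reassembling such a stream, assuming each line carries a "response" fragment as the API documents:

```python
import json

def assemble_stream(lines):
    # Each line of a streaming Ollama response is a JSON object;
    # "response" holds the next text fragment, "done" marks the end.
    parts = []
    for line in lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Example with the kind of lines the API emits:
sample = [
    '{"response": "Hello", "done": false}',
    '{"response": ", world", "done": false}',
    '{"response": "!", "done": true}',
]
print(assemble_stream(sample))  # Hello, world!
```

With requests you would feed this from a streaming call, e.g. requests.post(..., stream=True) and response.iter_lines(), which lets you print tokens as they arrive instead of waiting for the full reply.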

From Your Host Machine's Terminal (Not VS Code)

If you want to use Docker commands, you'll need to run them from your host machine's terminal, not from inside VS Code:

# List available models
docker exec -it ollama-changemaker ollama list

# Pull models
docker exec -it ollama-changemaker ollama pull llama3
docker exec -it ollama-changemaker ollama pull mistral:7b
docker exec -it ollama-changemaker ollama pull codellama

The key is using the Docker network hostname ollama-changemaker with port 11434 as your connection point; it should be reachable from the code-server container because both containers are on the same Docker network.
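
Before wiring up an extension or script, it can help to confirm the container is actually reachable. A small sketch, relying on the Ollama server answering a plain GET on its root URL with HTTP 200 when it is up (the hostname is the one assumed throughout this setup):

```python
import requests

def ollama_reachable(base_url="http://ollama-changemaker:11434"):
    # The Ollama server answers GET / with status 200 when it is running;
    # connection errors or timeouts mean the container isn't reachable
    try:
        return requests.get(base_url, timeout=5).status_code == 200
    except requests.RequestException:
        return False
```

If this returns False from inside code-server, check that both containers are attached to the same Docker network before debugging anything else.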