GET /api/v1/ollama/list-models
curl --request GET \
  --url 'http://localhost:3000/api/v1/ollama/list-models' \
  --header 'Authorization: Bearer <token>'
{
  "success": true,
  "models": [
    {
      "name": "llama2:latest",
      "size": 3826793677,
      "digest": "sha256:78e26419b4469263f75331927a00a0284ef6544c1975b826b15abdaef17bb962",
      "modified_at": "2024-01-15T10:30:00Z"
    },
    {
      "name": "mistral:latest",
      "size": 4109865159,
      "digest": "sha256:61e88e884507ba5e06c49b40e6226884b2a16e872382dca1224f4d8f5e5f2f53",
      "modified_at": "2024-01-14T08:20:00Z"
    }
  ]
}

Query Parameters

baseUrl (string, optional)
Ollama server URL. Defaults to http://localhost:11434.
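
The baseUrl parameter only needs to be sent when the Ollama server is not on its default address. A minimal sketch of building the request URL with Python's standard library (the API base `http://localhost:3000` and the remote host `gpu-box` are illustrative assumptions, not part of the API):

```python
from urllib.parse import urlencode

def list_models_url(api_base, base_url=None):
    """Build the list-models request URL, appending the optional
    baseUrl query parameter when a non-default Ollama server is used."""
    url = f"{api_base}/api/v1/ollama/list-models"
    if base_url is not None:
        # urlencode percent-escapes the embedded URL (':' -> %3A, '/' -> %2F)
        url += "?" + urlencode({"baseUrl": base_url})
    return url

# Default Ollama server: no query parameter needed
print(list_models_url("http://localhost:3000"))
# → http://localhost:3000/api/v1/ollama/list-models

# Pointing at a remote Ollama instance (hypothetical host)
print(list_models_url("http://localhost:3000", "http://gpu-box:11434"))
# → http://localhost:3000/api/v1/ollama/list-models?baseUrl=http%3A%2F%2Fgpu-box%3A11434
```

Percent-encoding matters here because the parameter value is itself a URL; sending it raw would corrupt the query string.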

Response

success (boolean)
Indicates whether the request was successful.

models (array)
Array of installed model objects, each with name, size, digest, and modified_at fields (see the example response above).

Notes

  • Ensure Ollama is running before calling this endpoint. The default Ollama server listens on http://localhost:11434.
  • Returns an empty array if no models are installed.
  • Model sizes are in bytes (convert to GB by dividing by 1,073,741,824).
  • The digest can be used to verify model integrity.
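
The byte-to-GB conversion mentioned above can be sketched as a one-line helper, applied here to the sizes from the example response (the function name is illustrative, not part of the API):

```python
def size_in_gb(size_bytes):
    """Convert a model size in bytes to gibibytes (1 GiB = 1,073,741,824 bytes)."""
    return size_bytes / 1_073_741_824

# Sizes taken from the example response above
print(f"llama2:latest  {size_in_gb(3826793677):.2f} GB")   # ≈ 3.56 GB
print(f"mistral:latest {size_in_gb(4109865159):.2f} GB")   # ≈ 3.83 GB
```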