
Minibase Ollama - Overview

An overview of the Minibase Ollama implementation.

Written by Michael McCarty
Updated over a month ago

Minibase Ollama lets you run your Minibase-trained models locally on your computer with full privacy and control.


What is it?

A customized version of Ollama that connects directly to your Minibase account to download and run your custom-trained models.


Key Features

  • Private & Local - All inference runs on your machine, data never leaves your computer

  • Pre-configured - Models come with their training instructions built-in

  • Custom Instructions - Override default behavior with `--instruction` flag

  • Compatible - Works with all standard Ollama tools and libraries
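
Because Minibase Ollama keeps the standard Ollama interface, anything that can talk to a local Ollama server should work unchanged. The sketch below is an illustrative example rather than part of the official Minibase documentation: it uses Python's `requests` library against Ollama's default local endpoint (`http://localhost:11434`), and the model name `my-minibase-model` is a hypothetical placeholder for whatever name your downloaded model is registered under.

```python
# Minimal sketch: calling a locally served Minibase model through the
# standard Ollama HTTP API. Assumes the server is running on its default
# port and that "my-minibase-model" is replaced with your model's name.
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(prompt: str, model: str = "my-minibase-model") -> str:
    """Send one prompt to the locally running model and return its reply."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask("Summarize what you were trained to do."))
```

Any other tool that already speaks to a local Ollama server should behave the same way.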

Quick Links

Official Ollama Documentation

For complete documentation of the underlying Ollama runtime, see the official Ollama project docs.

What Makes Minibase Ollama Different?

| Feature | Standard Ollama | Minibase Ollama |
| --- | --- | --- |
| Model Source | Public registry (ollama.com) | Your Minibase account |
| Authentication | None (public models) | API key authentication |
| Model Instructions | Manual via Modelfile | Automatic from training |
| Custom Instructions | Via Modelfile only | `--instruction` flag |
| Model Format | Community models | Your fine-tuned models |
| Support | | |


