papi-ai/ollama

Ollama (local) provider for PapiAI

Package info

github.com/papi-ai/ollama

pkg:composer/papi-ai/ollama


v0.9.1 2026-03-10 18:12 UTC

This package is not auto-updated.

Last update: 2026-03-10 18:25:55 UTC


README

Ollama provider for PapiAI, a simple but powerful PHP library for building AI agents.

Installation

composer require papi-ai/ollama
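
The provider talks to a locally running Ollama server (by default on http://localhost:11434), so Ollama itself must be installed and the models you plan to use pulled first. A minimal setup sketch, assuming the defaults listed under Available Models below:

```shell
# Start the Ollama server (skip if it already runs as a background service)
ollama serve &

# Download the default chat model and the embedding model
ollama pull llama3.1
ollama pull nomic-embed-text
```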

Usage

use PapiAI\Core\Agent;
use PapiAI\Ollama\OllamaProvider;

$provider = new OllamaProvider();

$agent = new Agent(
    provider: $provider,
    instructions: 'You are a helpful assistant.',
);

$response = $agent->run('Hello!');
echo $response->text;

Available Models

The following models are referenced in OllamaProvider:

Model               Type
llama3.1            General purpose (default chat model)
codellama           Code generation
mistral             General purpose
mixtral             Mixture of experts
qwen2.5-coder       Code generation
nomic-embed-text    Embeddings (default embedding model)
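
The usage example above relies on the default model. To pick a different one, OllamaProvider presumably accepts a model choice at construction time; the sketch below is hypothetical — the `model` parameter name is an assumption, so check the class constructor for the actual signature:

use PapiAI\Core\Agent;
use PapiAI\Ollama\OllamaProvider;

// Assumed named constructor argument; verify against OllamaProvider's signature.
$provider = new OllamaProvider(model: 'qwen2.5-coder');

$agent = new Agent(
    provider: $provider,
    instructions: 'You are a coding assistant.',
);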

Features

  • Tool/function calling
  • Streaming support
  • Embeddings support
  • Vision support
  • Structured output / JSON mode
  • Runs locally via Ollama (no API key required)
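
Streaming and structured output are capabilities of the underlying Ollama HTTP API that the provider wraps. At the transport level (raw Ollama API, independent of PapiAI) they look like this:

```shell
# Streamed chat completion (Ollama streams by default)
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.1",
  "messages": [{"role": "user", "content": "Hello!"}]
}'

# Structured output: constrain the reply to valid JSON
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "List three colors.",
  "format": "json",
  "stream": false
}'
```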

License

MIT