papi-ai / ollama
Ollama (local) provider for PapiAI
v0.9.1
2026-03-10 18:12 UTC
Requires
- php: ^8.2
- ext-curl: *
- papi-ai/papi-core: ^0.9
Requires (Dev)
- friendsofphp/php-cs-fixer: ^3.0
- pestphp/pest: ^3.0
- vimeo/psalm: ^6.0
README
Ollama provider for PapiAI, a simple but powerful PHP library for building AI agents.
Installation
composer require papi-ai/ollama
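The provider talks to a locally running Ollama server, so you also need Ollama itself installed and the models you intend to use pulled. A typical setup with the standard Ollama CLI (assuming a default install listening on localhost:11434):

```shell
# Pull the default chat model
ollama pull llama3.1

# Pull the embedding model if you plan to use embeddings
ollama pull nomic-embed-text

# Start the server if it is not already running
ollama serve
```

On most desktop installs the Ollama server starts automatically, in which case `ollama serve` is unnecessary.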
Usage
use PapiAI\Core\Agent;
use PapiAI\Ollama\OllamaProvider;

$provider = new OllamaProvider();

$agent = new Agent(
    provider: $provider,
    instructions: 'You are a helpful assistant.',
);

$response = $agent->run('Hello!');
echo $response->text;
Available Models
The following models are supported (as referenced in OllamaProvider):
| Model | Type |
|---|---|
| llama3.1 (default) | General purpose |
| codellama | Code generation |
| mistral | General purpose |
| mixtral | Mixture of experts |
| qwen2.5-coder | Code generation |
| nomic-embed-text | Embeddings (default embedding model) |
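Any of these models can presumably be selected when constructing the provider. A minimal sketch, assuming the OllamaProvider constructor accepts a model name and base URL as named arguments (the parameter names `model` and `baseUrl` are illustrative assumptions, not confirmed against the papi-ai/ollama API):

```php
use PapiAI\Core\Agent;
use PapiAI\Ollama\OllamaProvider;

// 'model' and 'baseUrl' are hypothetical parameter names; check the
// OllamaProvider constructor signature in papi-ai/ollama before relying on them.
$provider = new OllamaProvider(
    model: 'qwen2.5-coder',            // a code-generation model from the table above
    baseUrl: 'http://localhost:11434', // the default Ollama endpoint
);

$agent = new Agent(
    provider: $provider,
    instructions: 'You are a senior PHP developer.',
);

echo $agent->run('Write a function that reverses a string.')->text;
```

If no model is specified, the table above suggests the provider falls back to llama3.1 for chat and nomic-embed-text for embeddings.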
Features
- Tool/function calling
- Streaming support
- Embeddings support
- Vision support
- Structured output / JSON mode
- Runs locally via Ollama (no API key required)
License
MIT