tenqz / ollama
PHP library for working with the Ollama server.
v0.1.4
2025-07-20 05:37 UTC
Requires
- php: ^7.2|^8.0
- ext-curl: *
- ext-json: *
Requires (Dev)
- friendsofphp/php-cs-fixer: ^2.19
- phpstan/phpstan: ^1.12
- phpunit/phpunit: ^8.5
- squizlabs/php_codesniffer: ^3.7
README
Ollama PHP Client Library
About
Ollama PHP Client Library is a PHP client for the Ollama API. It lets PHP developers integrate large language models (LLMs) served by an Ollama server into their applications.
Features
- Clean, domain-driven architecture
- Simple HTTP transport layer
- Full support for Ollama API endpoints (see the sketch after this list)
- Type-safe request and response handling
- PSR standards compliance
- Comprehensive test coverage
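For example, the transport client shown under Basic Usage below can be pointed at other Ollama endpoints such as /api/chat. The following is a minimal sketch that assumes only the post() and getData() methods demonstrated in this README; the request payload and response shape follow Ollama's public API documentation rather than anything specific to this library.

use Tenqz\Ollama\Transport\Infrastructure\Http\Client\CurlTransportClient;

// Point the client at a local Ollama server
$client = new CurlTransportClient('http://localhost:11434');

// Call the chat endpoint with a message history (payload per Ollama's /api/chat docs)
$response = $client->post('/api/chat', [
    'model'    => 'llama2',
    'messages' => [
        ['role' => 'user', 'content' => 'Explain what an embedding is in one sentence.'],
    ],
    'stream'   => false, // ask for a single JSON response instead of a stream
]);

// Per the Ollama API, the reply text is found under message.content
$result = $response->getData();
echo $result['message']['content'];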
Installation
You can install the package via Composer:
composer require tenqz/ollama
Basic Usage
use Tenqz\Ollama\Transport\Infrastructure\Http\Client\CurlTransportClient;

// Initialize the client
$client = new CurlTransportClient('http://localhost:11434');

// Generate text using a model
$response = $client->post('/api/generate', [
    'model'  => 'llama2',
    'prompt' => 'What is artificial intelligence?',
]);

// Get the generated text
$result = $response->getData();
echo $result['response'];
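The generate endpoint also accepts Ollama's standard request fields, such as options for sampling parameters and stream to disable streaming. The sketch below uses only the post() and getData() methods shown above; the field names come from Ollama's API documentation, not from this library, so treat them as an assumption about your Ollama version.

use Tenqz\Ollama\Transport\Infrastructure\Http\Client\CurlTransportClient;

$client = new CurlTransportClient('http://localhost:11434');

// Request a single (non-streamed) completion with custom sampling options
$response = $client->post('/api/generate', [
    'model'   => 'llama2',
    'prompt'  => 'Summarize the benefits of caching in one paragraph.',
    'stream'  => false,       // return one JSON object instead of a stream
    'options' => [
        'temperature' => 0.2, // lower temperature for more deterministic output
        'num_predict' => 200, // cap the number of generated tokens
    ],
]);

$result = $response->getData();
echo $result['response'];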
Requirements
- PHP 7.2 or higher
- cURL extension
- JSON extension
License
The MIT License (MIT). Please see the License File for more information.