modelflow-ai / ollama
Client for ollama API.
Requires
- php: ^8.2
- modelflow-ai/api-client: ^0.2
- webmozart/assert: ^1.11
Requires (Dev)
- asapo/remove-vendor-plugin: ^0.1
- jangregor/phpstan-prophecy: ^1.0
- php-cs-fixer/shim: ^3.15
- phpspec/prophecy-phpunit: ^2.1@stable
- phpstan/extension-installer: ^1.2
- phpstan/phpstan: ^1.10, <1.10.55
- phpstan/phpstan-phpunit: ^1.3@stable
- phpunit/phpunit: ^10.3
- rector/rector: ^0.18.1
README
Modelflow AI
Ollama
Ollama is a PHP package that provides an easy-to-use client for the ollama API.
Note: This package is part of the modelflow-ai project. Please create issues in the main repository.
Note: This project is under heavy development and any feedback is greatly appreciated.
Installation
To install the Ollama package, you need to have PHP 8.2 or higher and Composer installed on your machine. Then, you can add the package to your project by running the following command:
composer require modelflow-ai/ollama
Examples
Here are some examples of how you can use Ollama in your PHP applications. You can find more detailed examples in the examples directory.
Usage
use ModelflowAi\Ollama\Ollama;

// Create a client instance
$client = Ollama::client();

// Use the client
$chat = $client->chat();
$completion = $client->completion();
$embeddings = $client->embeddings();

// Example usage of chat
$chatResponse = $chat->create([
    'model' => 'llama2',
    'messages' => [['role' => 'user', 'content' => 'Hello, world!']],
]);
echo $chatResponse->message->content;

// Example usage of completion
$completionResponse = $completion->create([
    'model' => 'llama2',
    'prompt' => 'Once upon a time',
]);
echo $completionResponse->response;

// Example usage of embeddings
$embeddingsResponse = $embeddings->create([
    'model' => 'llama2',
    'prompt' => 'Hello, world!',
]);
print_r($embeddingsResponse->embedding); // the embedding is an array of floats
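As a further illustration, the following sketch compares two prompts by computing the cosine similarity of their embeddings. It relies only on the create call and the embedding property shown above; the model name llama2 and the two prompts are assumptions made for the example.

use ModelflowAi\Ollama\Ollama;

$client = Ollama::client();
$embeddings = $client->embeddings();

// Fetch an embedding (an array of floats) for each prompt.
$a = $embeddings->create(['model' => 'llama2', 'prompt' => 'The quick brown fox'])->embedding;
$b = $embeddings->create(['model' => 'llama2', 'prompt' => 'A fast auburn fox'])->embedding;

// Cosine similarity: dot(a, b) / (|a| * |b|).
$dot = $normA = $normB = 0.0;
foreach ($a as $i => $value) {
    $dot += $value * $b[$i];
    $normA += $value * $value;
    $normB += $b[$i] * $b[$i];
}
echo 'Cosine similarity: ' . $dot / (sqrt($normA) * sqrt($normB)) . PHP_EOL;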
For more examples, see the examples directory.
Testing & Code Quality
To run the tests and all the code quality tools, use the following commands:
composer fix
composer lint
composer test
Open Points
Model API
The Model API is another area that we are actively working on. Once completed, this will provide users with the ability to manage and interact with their AI models directly from the Ollama package.
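Until the Model API is available, a possible stopgap is to call Ollama's HTTP endpoints directly. The sketch below lists locally installed models via Ollama's /api/tags endpoint; the default base URL http://localhost:11434 is an assumption, and this is not part of this package's API.

// Stopgap sketch: list local models through Ollama's /api/tags endpoint.
// Assumes Ollama runs at its default address, http://localhost:11434.
$json = file_get_contents('http://localhost:11434/api/tags');
$data = json_decode($json, true, 512, JSON_THROW_ON_ERROR);

foreach ($data['models'] as $model) {
    echo $model['name'] . PHP_EOL;
}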
Contributing
Contributions are welcome. Please open an issue or submit a pull request in the main repository at https://github.com/modelflow-ai/.github.
License
This project is licensed under the MIT License. For the full copyright and license information, please view the LICENSE file that was distributed with this source code.