cloudstudio / ollama-laravel
This is my package ollama-laravel
Installs: 15 200
Dependents: 0
Suggesters: 0
Security: 0
Stars: 344
Watchers: 7
Forks: 26
Open Issues: 2
Requires
- php: ^8.2
- guzzlehttp/guzzle: ^7.8
- illuminate/contracts: ^11.0
- spatie/laravel-package-tools: ^1.16
Requires (Dev)
- laravel/pint: ^1.0
- nunomaduro/collision: ^7.8
- orchestra/testbench: ^8.8
- pestphp/pest: ^2.20
- pestphp/pest-plugin-arch: ^2.0
- pestphp/pest-plugin-laravel: ^2.0
README
Ollama-Laravel is a Laravel package that provides seamless integration with the Ollama API. It includes functionality for model management, prompt generation, format setting, and more, making it easy to use Ollama from your Laravel applications.
If you are using Laravel 10.x, please use version v1.0.5:
https://github.com/cloudstudio/ollama-laravel/releases/tag/v1.0.5
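For example, you can require that release explicitly (assuming v1.0.5 is the latest release that supports Laravel 10):

composer require cloudstudio/ollama-laravel:1.0.5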
Installation
composer require cloudstudio/ollama-laravel
Configuration
php artisan vendor:publish --tag="ollama-laravel-config"
Published config file:
return [
    'model' => env('OLLAMA_MODEL', 'llama2'),
    'url' => env('OLLAMA_URL', 'http://127.0.0.1:11434'),
    'default_prompt' => env('OLLAMA_DEFAULT_PROMPT', 'Hello, how can I assist you today?'),
    'connection' => [
        'timeout' => env('OLLAMA_CONNECTION_TIMEOUT', 300),
    ],
];
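Each default above can be overridden through the corresponding environment variable, for example in your .env file (values shown are illustrative):

OLLAMA_MODEL=llama2
OLLAMA_URL=http://127.0.0.1:11434
OLLAMA_DEFAULT_PROMPT="Hello, how can I assist you today?"
OLLAMA_CONNECTION_TIMEOUT=300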
Usage
Basic Usage
use Cloudstudio\Ollama\Facades\Ollama;

/** @var array $response */
$response = Ollama::agent('You are a weather expert...')
    ->prompt('Why is the sky blue?')
    ->model('llama2')
    ->options(['temperature' => 0.8])
    ->stream(false)
    ->ask();
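With streaming disabled, ask() returns the decoded response array from Ollama's /api/generate endpoint, so the generated text can be read from the 'response' key. A minimal sketch; the exact keys depend on your Ollama version:

// Read the completion from the non-streamed response array.
$text  = $response['response'] ?? '';   // generated text
$model = $response['model'] ?? null;    // model that produced it
$done  = $response['done'] ?? false;    // true once generation finished

echo $text;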
Vision Support
/** @var array $response */
$response = Ollama::model('llava:13b')
    ->prompt('What is in this picture?')
    ->image(public_path('images/example.jpg'))
    ->ask();

// "The image features a close-up of a person's hand, wearing bright pink fingernail polish
// and blue nail polish. In addition to the colorful nails, the hand has two tattoos – one
// is a cross and the other is an eye."
Chat Completion
$messages = [
    ['role' => 'user', 'content' => 'My name is Toni Soriano and I live in Spain'],
    ['role' => 'assistant', 'content' => 'Nice to meet you, Toni Soriano'],
    ['role' => 'user', 'content' => 'Where do I live?'],
];

$response = Ollama::agent('You know me really well!')
    ->model('llama2')
    ->chat($messages);

// "You mentioned that you live in Spain."
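Since the reply follows Ollama's /api/chat format, the conversation can be continued by appending the returned assistant message before the next call (a sketch, assuming chat() returns the decoded array):

// Append the assistant's reply and ask a follow-up question.
$messages[] = $response['message']; // ['role' => 'assistant', 'content' => '...']
$messages[] = ['role' => 'user', 'content' => 'And what is my name?'];

$followUp = Ollama::agent('You know me really well!')
    ->model('llama2')
    ->chat($messages);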
Chat Completion with tools
$messages = [
    ['role' => 'user', 'content' => 'What is the weather in Toronto?'],
];

$response = Ollama::model('llama3.1')
    ->tools([
        [
            "type" => "function",
            "function" => [
                "name" => "get_current_weather",
                "description" => "Get the current weather for a location",
                "parameters" => [
                    "type" => "object",
                    "properties" => [
                        "location" => [
                            "type" => "string",
                            "description" => "The location to get the weather for, e.g. San Francisco, CA",
                        ],
                        "format" => [
                            "type" => "string",
                            "description" => "The format to return the weather in, e.g. 'celsius' or 'fahrenheit'",
                            "enum" => ["celsius", "fahrenheit"],
                        ],
                    ],
                    "required" => ["location", "format"],
                ],
            ],
        ],
    ])
    ->chat($messages);
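When the model decides to use a tool, the reply carries the requested calls under message.tool_calls rather than plain text. The sketch below dispatches them, assuming the response mirrors Ollama's /api/chat payload and that get_current_weather() is your own (hypothetical) implementation:

// Handle any tool calls requested by the model.
foreach ($response['message']['tool_calls'] ?? [] as $toolCall) {
    $name      = $toolCall['function']['name'];
    $arguments = $toolCall['function']['arguments'];

    if ($name === 'get_current_weather') {
        // get_current_weather() is a hypothetical local function.
        $result = get_current_weather($arguments['location'], $arguments['format']);

        // Send the tool result back so the model can produce a final answer.
        $messages[] = $response['message'];
        $messages[] = ['role' => 'tool', 'content' => json_encode($result)];
    }
}

$final = Ollama::model('llama3.1')->chat($messages);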
Streamable responses
use Cloudstudio\Ollama\Facades\Ollama;
use Illuminate\Console\BufferedConsoleOutput;

/** @var \GuzzleHttp\Psr7\Response $response */
$response = Ollama::agent('You are a snarky friend with one-line responses')
    ->prompt("I didn't sleep much last night")
    ->model('llama3')
    ->options(['temperature' => 0.1])
    ->stream(true)
    ->ask();

$output = new BufferedConsoleOutput();

$responses = Ollama::processStream($response->getBody(), function ($data) use ($output) {
    $output->write($data['response']);
});

$output->write("\n");

$complete = implode('', array_column($responses, 'response'));
$output->write("<info>$complete</info>");
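The same processStream() helper can forward tokens to the browser as they arrive, for example from a route or controller. A minimal sketch using Laravel's streamed response; the route name and flushing details are illustrative:

use Cloudstudio\Ollama\Facades\Ollama;
use Illuminate\Support\Facades\Route;
use Symfony\Component\HttpFoundation\StreamedResponse;

Route::get('/ask', function () {
    $response = Ollama::model('llama3')
        ->prompt('Tell me a short story')
        ->stream(true)
        ->ask();

    return new StreamedResponse(function () use ($response) {
        Ollama::processStream($response->getBody(), function ($data) {
            // Push each chunk to the client as soon as it arrives.
            echo $data['response'];

            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        });
    }, 200, ['Content-Type' => 'text/plain', 'X-Accel-Buffering' => 'no']);
});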
Show Model Information
$response = Ollama::model('llama2')->show();
Copy a Model
Ollama::model('llama2')->copy('NewModel');
Delete a Model
Ollama::model('llama2')->delete();
Generate Embeddings
$embeddings = Ollama::model('llama2')->embeddings('Your prompt here');
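Embeddings are typically compared with cosine similarity. A small sketch, assuming embeddings() returns Ollama's decoded /api/embeddings payload with an 'embedding' array of floats:

// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(array $a, array $b): float
{
    $dot = $normA = $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

$first  = Ollama::model('llama2')->embeddings('The sky is blue');
$second = Ollama::model('llama2')->embeddings('Why is the sky blue?');

$score = cosineSimilarity($first['embedding'], $second['embedding']);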
Testing
./vendor/bin/pest