codingwisely / taskallama
Taskallama is a Laravel package that seamlessly integrates with Ollama’s LLM API to empower your applications with AI-driven text generation, task management assistance, and more. Designed for simplicity and scalability, Taskallama brings the power of language models to your Laravel projects.
Requires
- php: ^8.3
- illuminate/contracts: ^10.0||^11.0
- spatie/laravel-package-tools: ^1.16
Requires (Dev)
- laravel/pint: ^1.14
- nunomaduro/collision: ^8.1.1||^7.10.0
- orchestra/testbench: ^9.0.0||^8.22.0
- pestphp/pest: ^2.34
- pestphp/pest-plugin-arch: ^2.7
- pestphp/pest-plugin-laravel: ^2.3
This package is auto-updated.
Last update: 2025-02-03 16:35:41 UTC
README
Taskallama is a Laravel package that provides seamless integration with Ollama's LLM API. It simplifies generating AI-powered content, from professional task writing to conversational agents, with minimal effort. Whether you're building a task management system, an HR assistant for job posts, or blog content generation, Taskallama has you covered.
Why did I build it? For a simple reason: I wanted an AI helper in our project and task management system at Taskavel.com to help quickly scaffold tasks. We will also use it in another SaaS project, our advanced ATS at Bagel.blue, to make it easy to create job postings.
Features
- Simple API for generating AI responses via the Ollama LLM.
- Supports task creation, conversational AI, embeddings, and more.
- Customizable agent personalities for tailored responses.
- Integration with Laravel Livewire for real-time interactions.
- Configurable options like streaming, model selection, and temperature.
Prerequisites
- Ollama Installation
  - Taskallama requires Ollama to be installed and running locally on your machine. You can download and install Ollama from their official website.
- Ollama Configuration
  - By default, Taskallama connects to Ollama at `http://127.0.0.1:11434`. Ensure that Ollama is running and accessible at this address. You can update the `OLLAMA_URL` in the config file if it's hosted elsewhere.
- System Requirements
  - PHP `^8.3` or higher.
  - Laravel `^11.0` or higher.
Installation
You can install the package via composer:
```bash
composer require codingwisely/taskallama
```
Next, you should publish the package's configuration file:
```bash
php artisan vendor:publish --tag="taskallama-config"
```
This will publish a `taskallama.php` file in your `config` directory, where you can configure the Ollama URL, default model, and other settings.
```php
return [
    'model' => env('OLLAMA_MODEL', 'llama3.2'),
    'default_format' => 'json',
    'url' => env('OLLAMA_URL', 'http://127.0.0.1:11434'),
    'default_prompt' => env('OLLAMA_DEFAULT_PROMPT', 'Hello Taskavelian, how can I assist you today?'),
    'connection' => [
        'timeout' => env('OLLAMA_CONNECTION_TIMEOUT', 300),
    ],
];
```
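The `env()` calls above read from your application's `.env` file. A minimal sketch of the corresponding entries (the values shown are the package defaults; adjust as needed):

```env
OLLAMA_MODEL=llama3.2
OLLAMA_URL=http://127.0.0.1:11434
OLLAMA_CONNECTION_TIMEOUT=300
```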
Usage
Basic Example (non-stream)
Generate a response using a prompt:
```php
use CodingWisely\Taskallama\Facades\Taskallama;

$response = Taskallama::agent('You are a professional task creator...')
    ->prompt('Write a task for implementing a new feature in a SaaS app.')
    ->model('llama3.2')
    ->options(['temperature' => 0.5])
    ->stream(false)
    ->ask();

return $response['response'];
```
Basic Example (stream)
Generate a stream response using a prompt:
```php
use CodingWisely\Taskallama\Facades\Taskallama;

return response()->stream(function () {
    Taskallama::agent('You are a professional task creator...')
        ->prompt('Write a task for implementing a new feature in a SaaS app.')
        ->model('llama3.2')
        ->options(['temperature' => 0.5])
        ->stream(true)
        ->ask();
}, 200, [
    'Cache-Control' => 'no-cache',
    'X-Accel-Buffering' => 'no',
    'Content-Type' => 'text/event-stream',
]);
```
Chat Example
Create a conversational agent:
```php
use CodingWisely\Taskallama\Facades\Taskallama;

$messages = [
    ['role' => 'user', 'content' => 'Tell me about Laravel'],
    ['role' => 'assistant', 'content' => 'Laravel is a PHP framework for web development.'],
    ['role' => 'user', 'content' => 'Why is it so popular?'],
];

$response = Taskallama::agent('You are a Laravel expert.')
    ->model('llama3.2')
    ->options(['temperature' => 0.7])
    ->chat($messages);
```
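To keep a conversation going across turns, append each reply to the message history before the next `chat()` call. Assuming the response follows Ollama's `/api/chat` payload shape (a `message` key holding `role` and `content` — an assumption, not something the package documents above), a sketch:

```php
// Append the assistant's reply so the next call sees the full history.
// The 'message' key assumes Ollama's /api/chat response shape.
$messages[] = [
    'role' => 'assistant',
    'content' => $response['message']['content'] ?? '',
];

$messages[] = ['role' => 'user', 'content' => 'Is it a good fit for small APIs?'];

$followUp = Taskallama::agent('You are a Laravel expert.')
    ->model('llama3.2')
    ->chat($messages);
```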
Livewire Integration Example
Integrate Taskallama into a Livewire component for real-time task generation:
```php
namespace App\Livewire;

use CodingWisely\Taskallama\Taskallama;
use Livewire\Component;

class AskTaskallama extends Component
{
    public $question = '';
    public $response = '';

    public function ask()
    {
        if (empty(trim($this->question))) {
            $this->response = "Please provide a valid question.";
            return;
        }

        try {
            $this->response = Taskallama::agent('You are a task-writing assistant.')
                ->prompt($this->question)
                ->model('llama3.2')
                ->options(['temperature' => 0])
                ->stream(false)
                ->ask()['response'] ?? "No response received.";
        } catch (\Exception $e) {
            $this->response = "Error: " . $e->getMessage();
        }
    }

    public function render()
    {
        return view('livewire.ask-taskallama');
    }
}
```
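The component renders a `livewire.ask-taskallama` view, which the package does not ship. A minimal sketch of what `resources/views/livewire/ask-taskallama.blade.php` might contain (the markup is illustrative, not part of the package):

```blade
<div>
    {{-- Bind the input to the component's $question property --}}
    <input type="text" wire:model="question" placeholder="Ask Taskallama...">
    <button wire:click="ask">Ask</button>

    @if ($response)
        <p>{{ $response }}</p>
    @endif
</div>
```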
Embeddings Example
Generate embeddings for advanced search or semantic analysis:
```php
$embeddings = Taskallama::agent('Embedding Assistant')
    ->model('llama3.2')
    ->options(['temperature' => 0.5])
    ->ask();

print_r($embeddings);
```
Additional Methods
List Local Models
```php
$models = Taskallama::getInstance()->listLocalModels();

print_r($models);
```
Retrieve Model Information
```php
$modelInfo = Taskallama::getInstance()->getModelInfo('llama3.2');

print_r($modelInfo);
```
Retrieve Model Settings
```php
$modelSettings = Taskallama::getInstance()->getModelSettings('llama3.2');

print_r($modelSettings);
```
Pull or Delete a Model
If you're pulling a model, make sure to run this in a background job, as downloading the model may take a while.
```php
$pullModel = Taskallama::getInstance()->pull('mistral');

$deleteModel = Taskallama::getInstance()->delete('mistral');
```
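Following the advice above, the pull can be wrapped in a queued job so the download runs off the request cycle. A minimal sketch (the `PullOllamaModel` class name and queue wiring are illustrative, not part of the package):

```php
namespace App\Jobs;

use CodingWisely\Taskallama\Facades\Taskallama;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

class PullOllamaModel implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(public string $model)
    {
    }

    public function handle(): void
    {
        // Downloading a model can take several minutes; run it on a queue worker.
        Taskallama::getInstance()->pull($this->model);
    }
}
```

Dispatch it from anywhere in your application, e.g. `PullOllamaModel::dispatch('mistral');`.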
Testing
Run the tests with:
```bash
composer test
```
License
This package is open-source software licensed under the MIT License. Please see the LICENSE.md file for more information.