lzhx00 / laravel-llm-client
Laravel LLM Client Package - A unified interface for multiple LLM providers
v1.0.0
2025-07-09 07:31 UTC
Requires
- php: >=8.1
- laravel/framework: ^10.0|^11.0|^12.0
Requires (Dev)
- orchestra/testbench: ^8.0|^9.0|^10.0
- phpunit/phpunit: ^10.0
README
A Laravel package providing a unified, chainable interface for multiple LLM (Large Language Model) providers: OpenAI, Anthropic (Claude), Gemini, and Ollama.
Requirements
- Laravel 10.x, 11.x, or 12.x (Laravel 10 is the minimum supported version)
- PHP 8.1 or higher
Tested on Laravel 12.x. Other versions may work, but are not officially tested.
Installation
composer require lzhx00/laravel-llm-client
Laravel will auto-discover and register the package.
If you have disabled auto-discovery, add the following to config/app.php:

'providers' => [
    // ...
    Lzhx00\LLMClient\LLMClientServiceProvider::class,
],

'aliases' => [
    // ...
    'LLMClient' => Lzhx00\LLMClient\Facades\LLMClient::class,
],
Configuration
Publish the config file (optional, for customization):
php artisan vendor:publish --tag=llm-client-config
Set your API keys and provider settings in .env or config/llm.php:

LLM_DEFAULT_PROVIDER=openai
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=AIza...
OLLAMA_BASE_URL=http://localhost:11434
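After publishing, config/llm.php typically maps these environment variables to provider settings. The keys below are only an illustrative sketch of that mapping, not the package's actual config schema:

return [
    // Illustrative sketch only; the published config file may use different keys.
    'default' => env('LLM_DEFAULT_PROVIDER', 'openai'),

    'providers' => [
        'openai'    => ['api_key' => env('OPENAI_API_KEY')],
        'anthropic' => ['api_key' => env('ANTHROPIC_API_KEY')],
        'gemini'    => ['api_key' => env('GOOGLE_API_KEY')],
        'ollama'    => ['base_url' => env('OLLAMA_BASE_URL', 'http://localhost:11434')],
    ],
];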
Basic Usage
Generate Text
$response = LLMClient::generate('Say hello in English.');
Specify Provider
$response = LLMClient::use('ollama')->generate('Say hello in English.');
With Options (e.g., model, temperature)
$response = LLMClient::with(['model' => 'llama3', 'temperature' => 0.5])->generate('Say hello.');
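Provider and option calls can be combined in a single chain. A minimal sketch, assuming use() and with() both return the client for further chaining and that the chosen provider accepts the model name shown:

$response = LLMClient::use('openai')
    ->with(['model' => 'gpt-4o', 'temperature' => 0.2])
    ->generate('Summarize this text in one sentence.');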
Streaming Response
LLMClient::generateStream('Tell me a joke.', [], function($chunk) { echo $chunk; });
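Inside a route or controller, the chunk callback can be wired to a standard Laravel streamed response. This is only a sketch, assuming each $chunk is a plain string; the /joke route is a hypothetical example:

use Illuminate\Support\Facades\Route;
use Lzhx00\LLMClient\Facades\LLMClient;

Route::get('/joke', function () {
    return response()->stream(function () {
        // Echo and flush each chunk as soon as the provider sends it.
        LLMClient::generateStream('Tell me a joke.', [], function ($chunk) {
            echo $chunk;
            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        });
    }, 200, ['Content-Type' => 'text/plain; charset=utf-8']);
});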
Embeddings
$vector = LLMClient::use('openai')->embed('hello world');
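Assuming embed() returns a flat array of floats (the usual embedding shape), two vectors can be compared with cosine similarity:

$a = LLMClient::use('openai')->embed('hello world');
$b = LLMClient::use('openai')->embed('hi there');

// Cosine similarity between two equal-length float vectors.
$dot = 0.0; $normA = 0.0; $normB = 0.0;
foreach ($a as $i => $value) {
    $dot   += $value * $b[$i];
    $normA += $value ** 2;
    $normB += $b[$i] ** 2;
}
$similarity = $dot / (sqrt($normA) * sqrt($normB));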
List Models
$models = LLMClient::use('gemini')->models();
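The return shape of models() is provider-specific; a quick way to inspect what a given provider reports is simply to iterate it:

foreach (LLMClient::use('ollama')->models() as $model) {
    // The element shape (string vs. array) depends on the provider.
    var_dump($model);
}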
Supported Providers
- OpenAI (ChatGPT)
- Anthropic (Claude)
- Gemini (Google)
- Ollama
⚠️ Note: Only the Ollama provider has been fully tested.
The other providers are implemented from the official API documentation but have not been tested with real API keys.
📄 License
MIT License