rahasistiyak/laravel-ai-integration

Comprehensive AI integration package for Laravel


pkg:composer/rahasistiyak/laravel-ai-integration

1.2.0 2025-12-28 08:57 UTC



README

Enterprise-ready Laravel package for seamless AI integration with multiple providers through a unified API.


Laravel AI Integration provides a unified, elegant API to interact with multiple AI providers including OpenAI, Anthropic (Claude), Google (Gemini), Ollama, and Groq. Built specifically for Laravel 11+, it abstracts provider complexity while offering powerful features like streaming, function calling, embeddings, and more.

✨ Features

  • 🎯 Multi-Provider Support: OpenAI, Anthropic (Claude), Google (Gemini), Ollama, Groq
  • 💬 Chat Completions: Standard and real-time streaming responses
  • 🧠 Vector Embeddings: Generate embeddings for semantic search
  • 🖼️ Image Generation: DALL-E and compatible APIs
  • 🛠️ Function Calling: Tool/function use support
  • 🔄 Real-Time Streaming: SSE streaming for chat responses
  • 🎨 Eloquent Integration: AI-powered model traits
  • ⚡ Task Abstraction: Pre-built tasks (classification, etc.)
  • 💻 Artisan Commands: CLI tools for code generation
  • 📦 Queue Support: Background job processing

📦 Installation

Install via Composer:

composer require rahasistiyak/laravel-ai-integration

Publish the configuration file:

php artisan vendor:publish --tag=ai-config

โš™๏ธ Configuration

Environment Variables

Add your API keys to .env:

# OpenAI Configuration
OPENAI_API_KEY=sk-...

# Anthropic (Claude) Configuration
ANTHROPIC_API_KEY=sk-ant-...

# Google (Gemini) Configuration
GOOGLE_API_KEY=...

# Groq Configuration
GROQ_API_KEY=...

# Ollama (Local) Configuration
OLLAMA_BASE_URL=http://localhost:11434

# Default Provider
AI_DEFAULT_PROVIDER=openai

Provider Configuration

Edit config/ai.php to customize provider settings:

return [
    'default' => env('AI_DEFAULT_PROVIDER', 'openai'),

    'providers' => [
        'openai' => [
            'driver' => 'openai',
            'api_key' => env('OPENAI_API_KEY'),
            'base_url' => env('OPENAI_BASE_URL', 'https://api.openai.com/v1'),
            'timeout' => 30,
            'models' => [
                'chat' => ['gpt-4', 'gpt-3.5-turbo'],
                'embedding' => ['text-embedding-ada-002'],
            ],
        ],
        // Additional providers...
    ],
];
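The other providers follow the same shape as the openai entry above. As an illustration, an anthropic entry might look like the following; the exact model names and base URL are assumptions, so check the published config/ai.php for the real defaults:

```php
'anthropic' => [
    'driver' => 'anthropic',
    'api_key' => env('ANTHROPIC_API_KEY'),
    // Base URL and model names below are illustrative assumptions.
    'base_url' => env('ANTHROPIC_BASE_URL', 'https://api.anthropic.com'),
    'timeout' => 30,
    'models' => [
        'chat' => ['claude-3-5-sonnet-latest'],
    ],
],
```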

🚀 Usage

Basic Chat

use Rahasistiyak\LaravelAiIntegration\Facades\AI;

$response = AI::chat()
    ->messages([
        ['role' => 'user', 'content' => 'Explain quantum computing in simple terms']
    ])
    ->get();

echo $response->content();

Streaming Responses

Stream responses in real-time:

AI::chat()
    ->messages([
        ['role' => 'user', 'content' => 'Write a short story about AI']
    ])
    ->stream(function ($chunk) {
        echo $chunk; // Output each chunk as it arrives
    });
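In a web context, the chunk callback can be wired to Laravel's streamed responses to push tokens to the browser as they arrive. This is a minimal sketch that assumes only the stream() callback signature shown above; the route and headers are illustrative:

```php
use Illuminate\Support\Facades\Route;
use Rahasistiyak\LaravelAiIntegration\Facades\AI;

Route::get('/story', function () {
    return response()->stream(function () {
        AI::chat()
            ->messages([['role' => 'user', 'content' => 'Write a short story about AI']])
            ->stream(function ($chunk) {
                echo $chunk;                              // forward each token to the client
                if (ob_get_level() > 0) { ob_flush(); }   // flush PHP's output buffer
                flush();                                  // flush the web server buffer
            });
    }, 200, [
        'Content-Type'      => 'text/event-stream',
        'Cache-Control'     => 'no-cache',
        'X-Accel-Buffering' => 'no', // disable nginx proxy buffering, if applicable
    ]);
});
```

Without the explicit flushes, most SAPIs buffer output and the client receives everything at once, defeating the purpose of streaming.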

Using Different Providers

// Use Anthropic (Claude)
$response = AI::driver('anthropic')
    ->chat([
        ['role' => 'user', 'content' => 'Hello Claude!']
    ]);

// Use Google Gemini
$response = AI::driver('google')
    ->chat([
        ['role' => 'user', 'content' => 'Hello Gemini!']
    ]);

// Use Groq
$response = AI::driver('groq')
    ->chat([
        ['role' => 'user', 'content' => 'Hello Groq!']
    ]);

// Use local Ollama
$response = AI::driver('ollama')
    ->chat([
        ['role' => 'user', 'content' => 'Hello Llama!']
    ]);

Embeddings

Generate vector embeddings for semantic search:

$embedding = AI::embed()->generate('Your text here');
// Returns: [0.0123, -0.0234, 0.0156, ...]
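For semantic search, embedding vectors are typically compared with cosine similarity. A small self-contained helper in plain PHP (independent of the package):

```php
// Cosine similarity between two equal-length embedding vectors.
// Returns 1.0 for identical directions, 0.0 for orthogonal vectors.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;
    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }
    return $dot / (sqrt($normA) * sqrt($normB));
}

var_dump(cosineSimilarity([1.0, 0.0], [1.0, 0.0])); // float(1)
var_dump(cosineSimilarity([1.0, 0.0], [0.0, 1.0])); // float(0)
```

Rank stored documents by their similarity to the query embedding and return the top matches.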

Eloquent Model Integration

Add AI capabilities to your models:

use Rahasistiyak\LaravelAiIntegration\Traits\HasAiEmbeddings;

class Article extends Model
{
    use HasAiEmbeddings;
}

// Generate embeddings
$article = Article::find(1);
$embedding = $article->generateEmbedding();

Function Calling / Tools

Use function calling for structured outputs:

$response = AI::chat()
    ->withTools([
        [
            'type' => 'function',
            'function' => [
                'name' => 'get_weather',
                'description' => 'Get the current weather for a location',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'location' => [
                            'type' => 'string',
                            'description' => 'City name',
                        ],
                        'unit' => [
                            'type' => 'string',
                            'enum' => ['celsius', 'fahrenheit'],
                        ],
                    ],
                    'required' => ['location'],
                ],
            ],
        ],
    ])
    ->messages([
        ['role' => 'user', 'content' => 'What\'s the weather in Tokyo?']
    ])
    ->get();
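When the model decides to call a tool, your code must execute it and feed the result back. The accessors below (toolCalls() and the array shape) are illustrative assumptions, not a documented API; check the package's response class for the actual method names:

```php
// HYPOTHETICAL accessors for illustration only; verify against the package source.
foreach ($response->toolCalls() as $call) {
    if ($call['function']['name'] === 'get_weather') {
        $args = json_decode($call['function']['arguments'], true);

        // lookupWeather() is your own application code, not part of the package.
        $weather = lookupWeather($args['location'], $args['unit'] ?? 'celsius');

        // Then send $weather back to the model as a tool-result message
        // and request a final answer.
    }
}
```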

Task Abstraction

Use pre-built tasks for common operations:

// Text classification
$category = AI::task()->classify(
    'This new GPU delivers incredible performance for AI workloads',
    ['Technology', 'Fashion', 'Sports', 'Politics']
);
// Returns: "Technology"

Image Generation

$image = AI::image()->generate('A futuristic city at sunset', [
    'size' => '1024x1024',
    'quality' => 'hd'
]);
// Returns: ['url' => 'https://...']
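If the provider returns a URL, as the comment above suggests, the image can be persisted with Laravel's HTTP client and Storage facade. The 'url' key is taken from the example above; the disk and path are illustrative:

```php
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Storage;

$image = AI::image()->generate('A futuristic city at sunset', ['size' => '1024x1024']);

// Download the generated image and store it on the public disk.
$contents = Http::get($image['url'])->body();
Storage::disk('public')->put('images/city.png', $contents);
```

Generated URLs are often short-lived, so download promptly rather than storing the URL itself.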

Console Commands

Generate code via Artisan:

php artisan ai:generate-code "Create a UserObserver that logs model events" --language=php

Background Jobs

Process AI tasks in the background:

use Rahasistiyak\LaravelAiIntegration\Jobs\ProcessAiTask;

ProcessAiTask::dispatch('classify', $text, [
    'labels' => ['Positive', 'Negative', 'Neutral']
]);

๐Ÿ› ๏ธ Advanced Features

Custom Model Selection

AI::chat()
    ->model('gpt-4')
    ->messages([...])
    ->get();

Custom Parameters

AI::chat()
    ->withParameters([
        'temperature' => 0.9,
        'max_tokens' => 500,
        'top_p' => 0.95,
    ])
    ->messages([...])
    ->get();

Fluent API Chaining

$response = AI::chat()
    ->model('gpt-4')
    ->withParameters(['temperature' => 0.7])
    ->withTools([...])
    ->messages([...])
    ->get();

🔧 Troubleshooting

Common Issues

  • Driver not supported: Verify the driver is properly configured in config/ai.php.
  • 401 Unauthorized: Check the API keys in .env and ensure they are valid.
  • Connection Refused (Ollama): Ensure Ollama is running: ollama serve.
  • SSL Certificate Errors: Update base_url or configure SSL certificates.
  • Timeout Errors: Increase the timeout value in the provider config.

Debug Mode

Enable verbose error messages by turning on Laravel's debug mode (for local debugging only; keep APP_DEBUG=false in production):

config(['app.debug' => true]);

🧪 Testing

Run the test suite:

composer test

Run specific tests:

./vendor/bin/phpunit --filter OpenAIDriverTest

📖 Documentation

For complete documentation and examples, visit the GitHub repository.

๐Ÿค Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

Development Setup

git clone https://github.com/rahasistiyakofficial/laravel-ai-integration.git
cd laravel-ai-integration
composer install
composer test

📋 Requirements

  • PHP: 8.2 or higher
  • Laravel: 11.x or 12.x
  • Dependencies: Guzzle HTTP Client

๐Ÿ“ License

This package is open-source software licensed under the MIT License.

๐Ÿ™ Credits

โญ Support

If you find this package helpful, please consider giving it a star on GitHub!

For issues and feature requests, please use the issue tracker.