saqqal/llm-integration-bundle

Symfony bundle for integrating large language models (LLMs) into applications via external API providers, with support for API Together, OpenAI, and others.

v1.0.0 2024-09-22 20:08 UTC



License: MIT

LLMIntegrationBundle is a powerful Symfony bundle that seamlessly integrates Large Language Models (LLMs) into your Symfony applications. With support for multiple AI providers and a flexible architecture, it's designed for easy extension and customization.


✨ Features

  • 🌐 Support for multiple AI providers
  • ⚙️ Flexible configuration
  • 🛡️ Exception handling with custom exceptions
  • 🖥️ CLI integration for generating new AI service classes
  • 🧩 Extensible architecture
  • 🧪 Comprehensive unit testing

📦 Installation

Install the bundle using Composer:

composer require saqqal/llm-integration-bundle

๐Ÿ› ๏ธ Configuration

  1. Register the bundle in config/bundles.php:
<?php
return [
    // ...
    Saqqal\LlmIntegrationBundle\LlmIntegrationBundle::class => ['all' => true],
];
  2. Create config/packages/llm_integration.yaml:
llm_integration:
    llm_provider: 'api_together'
    llm_api_key: '%env(LLM_PROVIDER_API_KEY)%'
    llm_model: 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo'
  3. Set the API key in your .env file:
LLM_PROVIDER_API_KEY=your_api_key_here

🚀 Usage

Injecting the AI Service

Inject AiServiceInterface into your services or controllers:

use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;

class YourService
{
    private AiServiceInterface $aiService;

    public function __construct(AiServiceInterface $aiService)
    {
        $this->aiService = $aiService;
    }

    // ...
}

Generating Responses

Use the generate method to send prompts and receive responses:

public function generateResponse(string $prompt): string
{
    $response = $this->aiService->generate($prompt);
    return $response->getData()['content'];
}
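To put this in context, a minimal controller wrapping the call might look like the sketch below. The route, controller name, and JSON response shape are illustrative assumptions, not part of the bundle.

```php
// Hypothetical controller wiring; only AiServiceInterface comes from the bundle.
use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Routing\Attribute\Route; // Symfony 6.4+; use Routing\Annotation\Route on older versions

class ChatController
{
    public function __construct(private readonly AiServiceInterface $aiService)
    {
    }

    #[Route('/chat', methods: ['POST'])]
    public function chat(Request $request): JsonResponse
    {
        $prompt = (string) $request->request->get('prompt', '');

        // Delegate to the bundle's AI service and return the generated text.
        $response = $this->aiService->generate($prompt);

        return new JsonResponse(['content' => $response->getData()['content']]);
    }
}
```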

Changing Output Type

You can change the output type to DynamicAiResponse for more flexible access to API responses:

public function generateDynamicResponse(string $prompt): mixed
{
    $response = $this->aiService->generate($prompt, [], true);
    return $response->choices[0]->message->content;
}

๐Ÿค Available AI Clients

LLMIntegrationBundle supports the following AI clients:

  1. API Together (ApiTogetherClient)
  2. OpenAI (OpenAiClient)
  3. Anthropic (AnthropicClient)
  4. Arliai (ArliaiClient)
  5. Deepinfra (DeepinfraClient)
  6. Groq (GroqClient)
  7. HuggingFace (HuggingFaceClient)
  8. Mistral (MistralClient)
  9. OpenRouter (OpenRouterClient)
  10. Tavily (TavilyClient)

To use a specific client, set the llm_provider in your configuration to the corresponding provider name.
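For example, switching from API Together to OpenAI should be a one-line configuration change. The provider key 'openai' and the model name below are assumptions for illustration; check the client class for the exact key your version expects.

```yaml
llm_integration:
    llm_provider: 'openai'            # assumed provider key for OpenAiClient
    llm_api_key: '%env(LLM_PROVIDER_API_KEY)%'
    llm_model: 'gpt-4o-mini'          # hypothetical model name
```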

💻 CLI Commands

Generate a new AI service class

php bin/console llm:create-ai-service

Follow the prompts to enter the provider name and API endpoint.

List available AI clients

php bin/console llm:list-ai-services

This command lists all available AI clients tagged with the #[AiClient] attribute.

🔧 Extending the Bundle

To add a new AI provider:

  1. Create a new client class extending AbstractAiClient:
use Saqqal\LlmIntegrationBundle\Attribute\AiClient;
use Saqqal\LlmIntegrationBundle\Client\AbstractAiClient;

#[AiClient('your_provider')]
class YourProviderClient extends AbstractAiClient
{
    protected function getApiUrl(): string
    {
        return 'https://api.yourprovider.com/v1/chat/completions';
    }

    protected function getAdditionalRequestData(string $prompt, ?string $model): array
    {
        return [
            // Add provider-specific options here
        ];
    }
}
  2. Update your configuration to use the new provider:
llm_integration:
    llm_provider: 'your_provider'
    llm_api_key: '%env(YOUR_PROVIDER_API_KEY)%'
    llm_model: 'your-default-model'
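As an illustration, getAdditionalRequestData is where provider-specific request fields can be merged in. The option names below (temperature, max_tokens) are common chat-completion parameters and are assumptions about your provider's API, not bundle requirements.

```php
protected function getAdditionalRequestData(string $prompt, ?string $model): array
{
    // Hypothetical provider-specific options; adjust to your provider's API.
    return [
        'temperature' => 0.7,  // sampling temperature
        'max_tokens'  => 1024, // cap on response length
    ];
}
```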

🚦 Exception Handling

Create an event subscriber to handle LlmIntegrationExceptionEvent:

use Saqqal\LlmIntegrationBundle\Event\LlmIntegrationExceptionEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class LlmIntegrationExceptionSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [
            LlmIntegrationExceptionEvent::class => 'onLlmIntegrationException',
        ];
    }

    public function onLlmIntegrationException(LlmIntegrationExceptionEvent $event): void
    {
        $exception = $event->getException();
        // Handle the exception
    }
}
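With Symfony's default autoconfiguration, classes implementing EventSubscriberInterface are registered automatically. If autoconfiguration is disabled, a manual definition along these lines would be needed (the namespace is an assumption based on a default Symfony layout):

```yaml
# config/services.yaml — only needed when autoconfigure is off
services:
    App\EventSubscriber\LlmIntegrationExceptionSubscriber:
        tags: ['kernel.event_subscriber']
```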

🧪 Testing

Run the test suite:

./vendor/bin/phpunit

📄 License

This bundle is released under the MIT License. See the LICENSE file for details.

๐Ÿ‘จโ€๐Ÿ’ป Author

Abdelaziz Saqqal - LinkedIn - Portfolio

๐Ÿค Contributing

Contributions are welcome! Please fork the repository and submit a pull request with your changes.

📚 Documentation

For more detailed documentation, please visit our Wiki.