saqqal / llm-integration-bundle
Symfony bundle for integrating large language models (LLMs) into applications via API providers, supporting API Together, OpenAI, and more.
Installs: 9
Dependents: 0
Suggesters: 0
Security: 0
Stars: 1
Watchers: 1
Forks: 0
Open Issues: 0
Type: symfony-bundle
Requires
- php: ^8.0
- symfony/framework-bundle: ^6.0 || ^7.0
- symfony/http-client: ^6.0 || ^7.0
- symfony/monolog-bundle: ^3.10
- symfony/yaml: ^6.0 || ^7.0
Requires (Dev)
- friendsofphp/php-cs-fixer: ^3.64
- phpunit/phpunit: ^9.5
- symfony/test-pack: ^1.1
This package is auto-updated.
Last update: 2025-04-22 23:07:44 UTC
README
LLMIntegrationBundle is a powerful Symfony bundle that seamlessly integrates Large Language Models (LLMs) into your Symfony applications. With support for multiple AI providers and a flexible architecture, it's designed for easy extension and customization.
Table of Contents
- Features
- Installation
- Configuration
- Usage
- Available AI Clients
- CLI Commands
- Extending the Bundle
- Exception Handling
- Testing
- License
- Author
- Contributing
- Documentation
- Acknowledgements
✨ Features
- Support for multiple AI providers
- ⚙️ Flexible configuration
- 🛡️ Exception handling with custom exceptions
- 🖥️ CLI integration for generating new AI service classes
- 🧩 Extensible architecture
- 🧪 Comprehensive unit testing
📦 Installation
Install the bundle using Composer:
```bash
composer require saqqal/llm-integration-bundle
```
🛠️ Configuration
- Register the bundle in `config/bundles.php`:

```php
<?php

return [
    // ...
    Saqqal\LlmIntegrationBundle\LlmIntegrationBundle::class => ['all' => true],
];
```
- Create `config/packages/llm_integration.yaml`:

```yaml
llm_integration:
    llm_provider: 'api_together'
    llm_api_key: '%env(LLM_PROVIDER_API_KEY)%'
    llm_model: 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo'
```
- Set the API key in your `.env` file:

```dotenv
LLM_PROVIDER_API_KEY=your_api_key_here
```
Usage
Injecting the AI Service
Inject `AiServiceInterface` into your services or controllers:
```php
use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;

class YourService
{
    private AiServiceInterface $aiService;

    public function __construct(AiServiceInterface $aiService)
    {
        $this->aiService = $aiService;
    }

    // ...
}
```
Generating Responses
Use the `generate` method to send prompts and receive responses:
```php
public function generateResponse(string $prompt): string
{
    $response = $this->aiService->generate($prompt);

    return $response->getData()['content'];
}
```
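The same service can be autowired into a controller. A minimal sketch, assuming Symfony's default autowiring; the controller class, query parameter, and route wiring are illustrative, not part of the bundle:

```php
use Saqqal\LlmIntegrationBundle\Interface\AiServiceInterface;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;

// Hypothetical controller: forwards a prompt from the query string to the
// AI service and returns the generated content as JSON. Route configuration
// is omitted.
class PromptController extends AbstractController
{
    public function __construct(private AiServiceInterface $aiService)
    {
    }

    public function __invoke(Request $request): JsonResponse
    {
        $prompt = (string) $request->query->get('prompt', '');

        $response = $this->aiService->generate($prompt);

        return $this->json(['content' => $response->getData()['content']]);
    }
}
```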
Changing Output Type
You can change the output type to `DynamicAiResponse` for more flexible access to API responses; the response object then exposes the provider's payload structure directly:
```php
public function generateDynamicResponse(string $prompt): mixed
{
    $response = $this->aiService->generate($prompt, [], true);

    return $response->choices[0]->message->content;
}
```
🤖 Available AI Clients
LLMIntegrationBundle supports the following AI clients:
- API Together (`ApiTogetherClient`)
- OpenAI (`OpenAiClient`)
- Anthropic (`AnthropicClient`)
- Arliai (`ArliaiClient`)
- Deepinfra (`DeepinfraClient`)
- Groq (`GroqClient`)
- HuggingFace (`HuggingFaceClient`)
- Mistral (`MistralClient`)
- OpenRouter (`OpenRouterClient`)
- Tavily (`TavilyClient`)
To use a specific client, set `llm_provider` in your configuration to the corresponding provider name (for example, `api_together` selects `ApiTogetherClient`).
💻 CLI Commands
Generate a new AI service class
```bash
php bin/console llm:create-ai-service
```
Follow the prompts to enter the provider name and API endpoint.
List available AI clients
```bash
php bin/console llm:list-ai-services
```
This command lists all available AI clients tagged with the `#[AiClient]` attribute.
🔧 Extending the Bundle
To add a new AI provider:
- Create a new client class extending `AbstractAiClient`:
```php
use Saqqal\LlmIntegrationBundle\Attribute\AiClient;
use Saqqal\LlmIntegrationBundle\Client\AbstractAiClient;

#[AiClient('your_provider')]
class YourProviderClient extends AbstractAiClient
{
    protected function getApiUrl(): string
    {
        return 'https://api.yourprovider.com/v1/chat/completions';
    }

    protected function getAdditionalRequestData(string $prompt, ?string $model): array
    {
        return [
            // Add provider-specific options here
        ];
    }
}
```
- Update your configuration to use the new provider:

```yaml
llm_integration:
    llm_provider: 'your_provider'
    llm_api_key: '%env(YOUR_PROVIDER_API_KEY)%'
    llm_model: 'your-default-model'
```

The `llm_provider` value must match the name passed to the `#[AiClient]` attribute on your client class.
🚦 Exception Handling
Create an event subscriber to handle `LlmIntegrationExceptionEvent`:
```php
use Saqqal\LlmIntegrationBundle\Event\LlmIntegrationExceptionEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

class LlmIntegrationExceptionSubscriber implements EventSubscriberInterface
{
    public static function getSubscribedEvents(): array
    {
        return [
            LlmIntegrationExceptionEvent::class => 'onLlmIntegrationException',
        ];
    }

    public function onLlmIntegrationException(LlmIntegrationExceptionEvent $event): void
    {
        $exception = $event->getException();
        // Handle the exception
    }
}
```
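For instance, the handler body could log failures through the PSR-3 logger that comes with the required monolog-bundle. A sketch under that assumption; the class name and log message are illustrative:

```php
use Psr\Log\LoggerInterface;
use Saqqal\LlmIntegrationBundle\Event\LlmIntegrationExceptionEvent;
use Symfony\Component\EventDispatcher\EventSubscriberInterface;

// Illustrative subscriber that logs every LLM integration failure.
// With Symfony's default autoconfiguration it is registered automatically.
class LoggingLlmExceptionSubscriber implements EventSubscriberInterface
{
    public function __construct(private LoggerInterface $logger)
    {
    }

    public static function getSubscribedEvents(): array
    {
        return [
            LlmIntegrationExceptionEvent::class => 'onLlmIntegrationException',
        ];
    }

    public function onLlmIntegrationException(LlmIntegrationExceptionEvent $event): void
    {
        $exception = $event->getException();

        $this->logger->error('LLM request failed: '.$exception->getMessage(), [
            'exception' => $exception,
        ]);
    }
}
```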
🧪 Testing
Run the test suite:
```bash
./vendor/bin/phpunit
```
License
This bundle is released under the MIT License. See the LICENSE file for details.
👨‍💻 Author
Abdelaziz Saqqal - LinkedIn - Portfolio
🤝 Contributing
Contributions are welcome! Please fork the repository and submit a pull request with your changes.
Documentation
For more detailed documentation, please visit our Wiki.