woutersf / ai-connection-bundle
Core AI connection plugin for Mautic - manages LiteLLM integration and shared AI services
Type: mautic-plugin
pkg:composer/woutersf/ai-connection-bundle
Requires
- php: ^7.4|^8.0
- mautic/core: ^4.0|^5.0
README
A core AI connection plugin for Mautic that manages LiteLLM integration and provides centralized AI services for all Mautic AI-powered plugins.
Overview
The Mautic AI Connection Bundle serves as the foundation for AI functionality in Mautic. It provides a centralized LiteLLM service that can be used by other AI-powered plugins such as:
- Mautic AI Console - AI-powered console interface with voice input
- Mautic AI Reports - AI-powered report generation
- Mautic AI Eval - AI evaluation features
Features
- Centralized AI Configuration - Single source of truth for LiteLLM endpoint and credentials
- LiteLLM Integration - Connect to multiple AI providers (OpenAI, Anthropic Claude, Llama, etc.) through LiteLLM proxy
- Shared Service Architecture - Other plugins access AI capabilities through this bundle's service
- Model Management - Dynamically fetch available models from your LiteLLM instance
- Secure Credential Storage - API keys are encrypted and stored securely
Requirements
- Mautic 4.0+ or Mautic 5.0+
- PHP 7.4 or 8.0+
- A running LiteLLM instance (proxy server)
Installation
Via Composer
composer require woutersf/ai-connection-bundle
Manual Installation
- Download or clone this repository
- Place the `MauticAIconnectionBundle` folder in `docroot/plugins/`
- Clear the Mautic cache:
php bin/console cache:clear
- Go to Mautic Settings → Plugins
- Click "Install/Upgrade Plugins"
- Find "Mautic AI Connection" and publish it
Configuration
Navigate to Mautic Settings → Plugins → Mautic AI Connection to configure the plugin.
Required Settings
- LiteLLM Endpoint
  - URL of your LiteLLM proxy server (or, alternatively, the OpenAI API base URL).
  - Example: `http://localhost:4000`, `https://your-litellm-server.com`, or `https://api.openai.com/v1`
  - Note: For multi-provider support, this should point to your LiteLLM proxy, not directly to a single provider such as OpenAI.
- LiteLLM Secret Key
  - API key for authenticating with your LiteLLM instance (or your OpenAI API key).
  - This credential is encrypted and stored securely.
Usage in Other Plugins
Other Mautic plugins can use the LiteLLM service provided by this bundle.
Accessing the Service
```php
// Get the service from the container
$liteLLMService = $this->container->get('mautic.ai_connection.service.litellm');
```
Available Methods
1. Chat Completion (with tools support)
```php
$messages = [
    ['role' => 'system', 'content' => 'You are a helpful assistant.'],
    ['role' => 'user', 'content' => 'What is Mautic?'],
];
$options = [
    'model'       => 'gpt-3.5-turbo',
    'temperature' => 0.7,
    'max_tokens'  => 1000,
];
$response = $liteLLMService->getChatCompletion($messages, $options);
```
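LiteLLM exposes an OpenAI-compatible HTTP API, so the request that `getChatCompletion()` ultimately sends to the proxy's `/chat/completions` route can be sketched as follows. This is an illustration of the wire format only; the endpoint and key values are placeholders, and the field names follow the standard OpenAI chat-completions schema rather than anything plugin-specific:

```python
import json

# Placeholder values -- substitute your own LiteLLM endpoint and secret key.
ENDPOINT = "http://localhost:4000"
SECRET_KEY = "sk-your-litellm-key"

# The JSON body mirrors the $messages/$options arrays in the PHP example above.
body = {
    "model": "gpt-3.5-turbo",
    "temperature": 0.7,
    "max_tokens": 1000,
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is Mautic?"},
    ],
}

headers = {
    "Authorization": f"Bearer {SECRET_KEY}",  # OpenAI-style bearer auth
    "Content-Type": "application/json",
}

# This serialized payload is what gets POSTed to ENDPOINT + "/chat/completions".
payload = json.dumps(body)
```

Because the format is OpenAI-compatible, any OpenAI client or plain HTTP client pointed at the proxy URL will work the same way.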
2. Simple Completion
```php
$response = $liteLLMService->getCompletion('Explain marketing automation in 50 words');
```
3. Streaming Completion
```php
$liteLLMService->streamCompletion('Write a blog post about email marketing', function ($chunk) {
    echo $chunk;
});
```
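When streaming, an OpenAI-compatible proxy emits server-sent events: each `data:` line carries a JSON delta, and the stream ends with `data: [DONE]`. The callback passed to `streamCompletion()` receives the extracted text pieces. A minimal parser for that wire format, shown in Python for illustration with a hard-coded sample stream, might look like:

```python
import json

def extract_chunks(sse_text: str):
    """Yield the text deltas from an OpenAI-style SSE stream body."""
    for line in sse_text.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue
        data = line[len("data:"):].strip()
        if data == "[DONE]":  # end-of-stream sentinel
            break
        event = json.loads(data)
        delta = event["choices"][0]["delta"]
        if "content" in delta:  # role-only deltas carry no text
            yield delta["content"]

# Hard-coded sample stream, for illustration only.
sample = (
    'data: {"choices":[{"delta":{"role":"assistant"}}]}\n'
    'data: {"choices":[{"delta":{"content":"Hello"}}]}\n'
    'data: {"choices":[{"delta":{"content":" world"}}]}\n'
    'data: [DONE]\n'
)
chunks = list(extract_chunks(sample))
```

Each yielded piece corresponds to one `$chunk` delivered to the PHP callback.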
4. Speech-to-Text
```php
$audioData = file_get_contents('recording.wav');
$transcription = $liteLLMService->speechToText($audioData, 'en', 'whisper-1');
```
5. Get Available Models
```php
$models = $liteLLMService->getAvailableModels();
// Returns: ['GPT-4' => 'gpt-4', 'Claude 3' => 'claude-3-sonnet', ...]
```
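The `/models` route of an OpenAI-compatible proxy returns a body of the form `{"data": [{"id": "..."}, ...]}`, from which a label => model-id map like the one above can be derived. The sketch below uses a hard-coded sample response, and simply uppercases ids as labels; the friendlier labels shown in the PHP comment (e.g. 'Claude 3') are presumably produced by the plugin itself:

```python
def models_to_choices(response: dict) -> dict:
    """Map an OpenAI-style /models response to a label => model-id dict."""
    return {model["id"].upper(): model["id"] for model in response["data"]}

# Sample /models response, hard-coded for illustration.
sample = {"data": [{"id": "gpt-4"}, {"id": "claude-3-sonnet"}]}
choices = models_to_choices(sample)
```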
Subscribing to the Service in Controllers
```php
use MauticPlugin\MauticAIconnectionBundle\Service\LiteLLMService;

class YourController extends CommonController
{
    public static function getSubscribedServices(): array
    {
        return array_merge(parent::getSubscribedServices(), [
            'mautic.ai_connection.service.litellm' => LiteLLMService::class,
        ]);
    }

    public function yourAction()
    {
        $liteLLMService = $this->container->get('mautic.ai_connection.service.litellm');
        // Use the service...
    }
}
```
Architecture
This plugin follows a centralized service architecture:
┌─────────────────────────────────────┐
│ Mautic AI Connection Bundle │
│ ┌───────────────────────────────┐ │
│ │ LiteLLM Service │ │
│ │ - Chat Completions │ │
│ │ - Streaming │ │
│ │ - Speech-to-Text │ │
│ │ - Model Discovery │ │
│ └───────────────────────────────┘ │
└─────────────────────────────────────┘
↑ ↑ ↑
│ │ │
┌─────────┘ │ └─────────┐
│ │ │
┌───┴────┐ ┌───┴────┐ ┌───┴────┐
│AI │ │AI │ │AI │
│Console │ │Reports │ │Eval │
│Bundle │ │Bundle │ │Bundle │
└────────┘ └────────┘ └────────┘
Composer Dependency
Other AI plugins should declare this bundle as a dependency in their composer.json:
{
"require": {
"woutersf/ai-connection-bundle": "^1.0"
}
}
Security
- API keys are encrypted using Mautic's encryption helper
- All requests use HTTPS when connecting to remote LiteLLM instances
- The service validates configuration before making API calls
Troubleshooting
"LiteLLM endpoint and secret key must be configured"
Solution: Configure the LiteLLM endpoint and secret key in the plugin settings.
"404 Not Found" when making AI requests
Issue: The endpoint is pointing directly to OpenAI/Anthropic instead of LiteLLM proxy.
Solution: Ensure you're using your LiteLLM proxy URL (e.g., http://localhost:4000), not https://api.openai.com.
Models not appearing in dropdown
Issue: LiteLLM instance is not reachable or not properly configured.
Solution:
- Verify LiteLLM is running: `curl http://localhost:4000/models`
- Check the endpoint URL in the plugin settings
- Verify secret key is correct
Development
Running Tests
php bin/phpunit --filter MauticAIconnectionBundle
Code Style
Follow Mautic coding standards:
php bin/php-cs-fixer fix plugins/MauticAIconnectionBundle
Support
- GitHub Issues: Report an issue
- Mautic Community: community.mautic.org
- Documentation: LiteLLM Docs
License
GPL-3.0-or-later
Credits
Created by Frederik Wouters
Version
1.0.0
Changelog
1.0.0 (2024)
- Initial release
- LiteLLM service integration
- Chat completion support
- Streaming support
- Speech-to-text support
- Model discovery
- Secure credential storage