bluefly / drupal_native_ai
Vendor-agnostic AI integration module for Drupal with support for open-source models (Ollama, Hugging Face, ONNX), commercial providers, model management, and inference capabilities. Provides the foundation for building AI-driven Drupal applications with control over AI infrastructure.
Requires
- php: >=8.1
- drupal/ai_swarm_intelligence: ^1.0
- drupal/core: ^11.0
- drupal/experience_builder_converter: ^1.0
- drupal/field_eca_bridge: ^1.0
- drupal/llm: ^1.0
- huggingface/transformers: ^4.0
- langchain/langchain: ^0.1
- microsoft/onnxruntime: ^1.0
- ollama/ollama-php: ^1.0
Requires (Dev)
- dealerdirect/phpcodesniffer-composer-installer: ^1.0
- drupal/coder: ^8.3
- drupal/core-dev: ^11.0
- drupal/phpstan-drupal: ^1.1
- friendsofphp/php-cs-fixer: ^3.0
- mglaman/drupal-check: ^1.4
- phpro/grumphp: ^2.0
- phpunit/phpunit: ^10.0
- squizlabs/php_codesniffer: ^3.7
README
TODO: Drupal.org Compatibility Roadmap
We are working to make this module compatible with drupal.org standards.
Action Items:
Resolve the circular dependency with the llm module
- Move llm from required to optional dependencies in the .info.yml file
- Update service definitions to check for llm module availability
- Create fallback implementations for when llm is not available
Implement shared interfaces within the module
- Create a src/Interface directory for common interfaces
- Define or import core interfaces such as ProviderInterface and ModelInterface
- Update services to type-hint against interfaces instead of concrete classes
Implement service discovery patterns (see the sketch below)
- Create a ServiceDiscovery class to check for module and service availability
- Add graceful degradation when optional services aren't available
- Use dependency injection and factory patterns for flexible service creation
Create a module-specific recipe
- Create drupal_native_ai.recipe.yml for standalone installation
- Define configuration actions for default settings
- Document recipe usage in the installation instructions
See the /todos directory for detailed implementation plans and examples.
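As a rough illustration of the planned service discovery pattern, the sketch below assumes a hypothetical ServiceDiscovery class built on core's ModuleHandlerInterface; the class and method names are illustrative, not the module's actual implementation:

<?php

namespace Drupal\drupal_native_ai\Service;

use Drupal\Core\Extension\ModuleHandlerInterface;

// Hypothetical sketch only: check whether an optional module (e.g. llm) and
// one of its services exist before using them, returning NULL so callers can
// fall back gracefully when the optional integration is absent.
class ServiceDiscovery {

  public function __construct(protected ModuleHandlerInterface $moduleHandler) {}

  public function getOptionalService(string $module, string $service_id): ?object {
    if (!$this->moduleHandler->moduleExists($module)) {
      return NULL;
    }
    $container = \Drupal::getContainer();
    return $container->has($service_id) ? $container->get($service_id) : NULL;
  }

}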
Overview
A suite of enterprise Drupal modules implementing AI functionality with OpenAPI 3.1 compliance. It provides unified AI model integration, distributed processing, and field processing capabilities through a modular, recipe-based architecture.
Features
Core Capabilities
- AI Integration: Unified model provider interface (sketched below)
- Swarm Intelligence: Distributed AI processing
- Field Processing: ECA-based field automation
- Security: Enterprise-grade security framework
- Testing: 95%+ test coverage with TDD
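To make the unified provider interface concrete, here is a minimal sketch of what such a contract could look like, reusing the ProviderInterface name from the roadmap above; the method names are illustrative assumptions, not the module's shipped API:

<?php

namespace Drupal\drupal_native_ai\Interface;

// Illustrative contract only: a vendor-agnostic provider that Ollama,
// Hugging Face, ONNX, or commercial backends could each implement.
interface ProviderInterface {

  // Machine name of the provider, e.g. 'ollama' or 'huggingface'.
  public function id(): string;

  // Model identifiers this provider can serve.
  public function availableModels(): array;

  // Run inference for a prompt against a model and return the generated text.
  public function generate(string $model, string $prompt, array $options = []): string;

}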
Advanced Features
- Content Generation: Experience Builder integration
- Migration Tools: AI-powered content migration
- API Standardization: OpenAPI 3.1 compliance
- Multi-tenancy: Secure multi-tenant support
- Blockchain: Secure transaction logging
Architecture
Core Components
- AI Service: Model provider integration
- Swarm System: Distributed processing
- Field Bridge: ECA field automation
- API Layer: OpenAPI 3.1 endpoints
- Security: Access control and audit
Platform Integration
// Integration stack
Drupal → AI Service → Model Providers
Drupal → Swarm System → Distributed Processing
Drupal → Field Bridge → Content Automation
Drupal → OpenAPI → API Gateway
Infrastructure
- Containerization: DDEV support
- Database: PostgreSQL with extensions
- Testing: PHPUnit with TDD
- CI/CD: GitLab CI templates
- Documentation: OpenAPI integration
Module Structure
Core AI Integration
- drupal_native_ai: AI model integration
- ai_swarm_intelligence: Distributed processing
- field_eca_bridge: Field processing
Experience and Content
- experience_builder_converter: Content generation
- drupal_migration_plus: Migration tools
- api_normalization: API standardization
Infrastructure and Security
- secure_drupal: Security features
- multitenancy: Multi-tenant support
- blockchain_manager: Blockchain integration
Integration and Services
- marketplace_integration: Marketplace connectivity
- mcp_client_extras: MCP extensions
- alternative_services: Service alternatives
Recipes and Templates
- recipe_onboarding: Installation recipes
- Marketplace templates
Prerequisites
Core Requirements
- PHP 8.2 or later
- Drupal 11.x
- Composer 2.x
- DDEV (for local development)
- PostgreSQL 15+ (for vector support)
Optional Requirements
- Redis (for caching)
- Elasticsearch (for search)
- WebSocket server
- Vector database providers
Installation
Quick Start
# Install via Composer
composer require bluefly/drupal_native_ai
# Enable module
drush en drupal_native_ai
# Configure AI providers
drush config:set drupal_native_ai.settings providers.openai_api_key "your-key"
# Enable real-time features
drush config:set drupal_native_ai.settings realtime.enabled true
Production Setup
# Install with all dependencies
composer require bluefly/drupal_native_ai --with-all-dependencies
# Apply recipe
drush recipe:apply drupal_native_ai
# Setup production
drush drupal_native_ai:setup-wizard --environment=production
# Configure OpenAPI
drush drupal_native_ai:openapi:generate
drush drupal_native_ai:openapi:validate
Usage
Module Integration
// Example: Using the AI service in a custom module.
use Drupal\drupal_native_ai\Service\AIService;

$ai_service = \Drupal::service('drupal_native_ai.service');
$result = $ai_service->process($input, [
  'openapi' => [
    'spec' => 'openapi.yaml',
    'validate' => true,
  ],
]);
Recipe Integration
# Example: Including in a Drupal recipe
name: AI-Enabled Site
type: recipe
install:
  - drupal_native_ai
  - ai_swarm_intelligence
config:
  import:
    - drupal_native_ai.settings
    - drupal_native_ai.openapi
Template Integration
# Example: Including in a marketplace template
template:
  name: AI-Powered Site
  type: template
  modules:
    - drupal_native_ai
    - experience_builder_converter
  config:
    - drupal_native_ai.settings
    - drupal_native_ai.openapi
Development
Setup
# Install development dependencies
composer install --dev
# Setup development environment
./setup-ddev.sh
# Start development server
ddev start
Testing
# Run all tests
ddev run-tests
# Run specific test suites
ddev run-tests Unit
ddev run-tests Kernel
ddev run-tests Functional
# Run with PHPUnit
./vendor/bin/phpunit -c phpunit.ddev.xml tests/src/Unit/
# Run TDD workflow
drush recipe:tdd:gen # Generate test skeletons
drush recipe:tdd:verify # Verify tests fail as expected
drush recipe:tdd:check # Check implementation meets tests
drush recipe:tdd:install # Install and validate in environment
Building
# Build for production
drush recipe:build
# Build documentation
drush recipe:build-docs
# Build OpenAPI spec
drush recipe:build-api
Contributing
See CONTRIBUTING.md for development guidelines.
Security
- Report security issues to security@bluefly.io
- Follow our Security Policy
- Review Security Guidelines
License
GPL-2.0 License - see LICENSE for details
Support
Credits
The Drupal Native AI modules provide enterprise-grade AI integration for Drupal 11, with OpenAPI compliance, distributed processing, and field automation capabilities.
Streaming Service
This module implements a streaming service for AI operations in Drupal.
Features
- Response streaming implementation
- Metrics collection (tokens, latency, cost)
- State management for streaming operations
- Error handling and logging
- Integration with Drupal's state and config systems
- Configurable metrics and thresholds
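For instance, a route controller could hand the streamed output straight back to the client. The sketch below is illustrative only: it assumes the ai.streaming service id shown in the migration guide's services.yml example and the createStreamingChatResponse() signature from the API reference; the controller name and chunk data are hypothetical.

<?php

namespace Drupal\my_module\Controller;

use Drupal\Core\Controller\ControllerBase;
use Symfony\Component\HttpFoundation\StreamedResponse;

// Hypothetical controller that streams AI output back to the browser.
class AiChatController extends ControllerBase {

  public function chat(): StreamedResponse {
    // Service id taken from the services.yml example in the migration guide.
    $streaming = \Drupal::service('ai.streaming');
    return $streaming->createStreamingChatResponse(function ($chunkHandler) {
      // Emit chunks as they arrive from the model (placeholder data here).
      foreach (['Hello', ', ', 'world'] as $chunk) {
        $chunkHandler($chunk, ['tokensUsed' => 1, 'cost' => 0.0]);
      }
    });
  }

}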
Migration Guide
For Module Developers
To integrate AI streaming functionality into your module:
Add Service Dependency
In your module's service definition (your_module.services.yml):
services:
  your_module.ai_service:
    class: Drupal\your_module\Service\YourAiService
    arguments: ['@ai.streaming', '@logger.factory']
Use the Streaming Trait
In your service class:
use Drupal\drupal_native_ai\Service\Trait\AiStreamingTrait;
use Drupal\Core\Logger\LoggerChannelFactoryInterface;
use Symfony\Component\HttpFoundation\StreamedResponse;

class YourAiService {

  use AiStreamingTrait;

  public function __construct(
    AiStreamingServiceInterface $streaming_service,
    LoggerChannelFactoryInterface $logger_factory
  ) {
    $this->setStreamingService($streaming_service);
    $this->logger = $logger_factory->get('your_module');
  }

  public function streamChat(array $messages, array $options = []): StreamedResponse {
    return $this->createStreamingResponse(function ($chunkHandler) use ($messages, $options) {
      // Your streaming logic here.
      foreach ($messages as $message) {
        // Process the message and derive $content, $tokens, and $cost
        // from your provider's response.
        $chunkHandler($content, [
          'tokensUsed' => $tokens,
          'cost' => $cost,
        ]);
      }
    }, $options);
  }

}
Handle State and Metrics
The streaming service provides state and metrics objects:
$state = $this->createStreamingState();
$metrics = $this->createStreamingMetrics();

// Update state.
$state->setStatus('streaming');
$state->appendContent($chunk);

// Update metrics.
$metrics->setTokensUsed($tokens);
$metrics->setLatency($latency);
$metrics->setCost($cost);

// Add custom metrics.
$metrics->setCustomMetric('custom_key', $value);
Error Handling
The streaming service handles errors automatically, but you can also handle them in your code:
try {
  // Your streaming logic.
}
catch (\Exception $e) {
  $this->logStreamingError($e->getMessage(), [
    'context' => $context,
  ]);
  throw $e;
}
Configuration
The streaming service supports configuration through Drupal's config system:
// Get editable config (\Drupal::config() returns a read-only object).
$config = \Drupal::configFactory()->getEditable('drupal_native_ai.settings');
// Set thresholds.
$config->set('metrics_thresholds', [
  'max_tokens' => 1000000,
  'max_cost' => 1000.0,
])->save();
Metrics and Monitoring
The service automatically tracks and logs metrics:
- Tokens used
- Operation latency
- Cost in credits
- Custom metrics
Metrics are stored in Drupal's state system and can be accessed for monitoring:
$metrics = \Drupal::state()->get('drupal_native_ai.streaming_metrics', []);
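A small, illustrative aggregation over those stored entries might look like the following; the tokensUsed and cost keys mirror the chunk metadata used earlier and are assumptions about the stored shape:

// Illustrative only: aggregate stored streaming metrics for a simple report.
// The exact array keys depend on how each entry was recorded.
$entries = \Drupal::state()->get('drupal_native_ai.streaming_metrics', []);
$total_tokens = 0;
$total_cost = 0.0;
foreach ($entries as $entry) {
  $total_tokens += $entry['tokensUsed'] ?? 0;
  $total_cost += $entry['cost'] ?? 0.0;
}
\Drupal::logger('drupal_native_ai')->info('Streaming usage: @tokens tokens, @cost credits', [
  '@tokens' => $total_tokens,
  '@cost' => $total_cost,
]);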
Best Practices
Always Use the Trait
The AiStreamingTrait provides a consistent interface and handles common functionality.
Handle State Properly
Use the state object to track operation progress and status.
Track Metrics
Always provide metrics for tokens, latency, and cost when streaming.
Error Handling
Use the provided error handling and logging methods.
Custom Headers
Use the options array to set custom headers when needed:
$options = [
  'headers' => [
    'X-Custom-Header' => 'value',
  ],
];
API Reference
StreamingServiceInterface
createStreamingChatResponse(callable $streamCallback, array $options = []): StreamedResponse
createStreamingState(): AiOperationStateInterface
createStreamingMetrics(): StreamingMetricsInterface
AiOperationStateInterface
getStatus(): string
setStatus(string $status): void
getError(): ?string
setError(?string $error): void
getAccumulatedContent(): string
setAccumulatedContent(string $content): void
isComplete(): bool
setComplete(bool $complete): void
getMessages(): array
setMessages(array $messages): void
getMetrics(): ?array
setMetrics(?array $metrics): void
StreamingMetricsInterface
getTokensUsed(): int
setTokensUsed(int $tokens): void
getLatency(): float
setLatency(float $latency): void
getCost(): float
setCost(float $cost): void
toArray(): array
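As a brief, illustrative sketch of how these interfaces fit together inside a streaming callback (assuming the service class from the migration guide, so $this->logger and the trait helpers are available; $chunk and $tokens come from your provider's streaming loop):

// Illustrative only: completion and error handling with the interfaces above.
$state = $this->createStreamingState();
$metrics = $this->createStreamingMetrics();

try {
  $state->setStatus('streaming');
  $state->setAccumulatedContent($state->getAccumulatedContent() . $chunk);
  $metrics->setTokensUsed($metrics->getTokensUsed() + $tokens);

  $state->setComplete(TRUE);
  $state->setMetrics($metrics->toArray());
}
catch (\Exception $e) {
  $state->setError($e->getMessage());
  $this->logger->error('Streaming failed: @message', ['@message' => $e->getMessage()]);
}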