devcbh / laravel-ollama-insights
There is no license information available for the latest version (1.2.0) of this package.
Laravel package for AI insights using Ollama
1.2.0
2025-09-21 14:38 UTC
Requires
- php: ^7.2
- guzzlehttp/guzzle: ^7.0
- illuminate/support: ^7.0|^8.0|^9.0|^10.0|^11.0|^12.0
README
# Laravel Ollama Insights

A powerful Laravel package that seamlessly integrates Ollama's AI capabilities into your application for generating intelligent insights, data analysis, and predictive modeling.

## ✨ Features

- 🤖 **Ollama Integration**: Full integration with the Ollama API for local AI processing
- 📊 **Pre-built Templates**: Ready-to-use templates for common analysis tasks
- 🎯 **Custom Prompts**: Flexible custom prompt generation
- ⚡ **Artisan Commands**: CLI interface for easy usage
- 🔧 **Configurable**: Easy configuration via environment variables
- 📝 **Comprehensive Logging**: Detailed error handling and logging
- 🚀 **Laravel Native**: Built with Laravel best practices

## 🚀 Quick Start

### Installation

```bash
composer require devcbh/laravel-ollama-insights
```
### Publish Configuration

```bash
php artisan vendor:publish --provider="Devcbh\\OllamaInsights\\OllamaInsightsServiceProvider" --tag="config"
```
### Environment Setup

Add to your `.env` file:

```env
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama2
OLLAMA_TIMEOUT=30
```
If you are using Ollama Turbo, add to your `.env` file:

```env
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama2
OLLAMA_TIMEOUT=30
OLLAMA_API_KEY=API_KEY
```
## 📖 Usage

### Basic Facade Usage

```php
use Devcbh\OllamaInsights\Facades\OllamaInsights;

$salesData = [
    'q1' => 15000,
    'q2' => 18000,
    'q3' => 21000,
    'q4' => 19000,
];

$insight = OllamaInsights::generateInsight('data_analysis', $salesData);
```
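Under the hood, `generateInsight` presumably looks up the named template and substitutes the JSON-encoded data into its `{data}` placeholder (the templates are shown in the Configuration section). A minimal Python sketch of that substitution, using the default `data_analysis` template; the helper name `render_insight_prompt` is illustrative, not part of the package's API:

```python
import json

# Default data_analysis template from config/ollama-insights.php
TEMPLATE = "Analyze this dataset and provide key insights: {data}"

def render_insight_prompt(template: str, data: dict) -> str:
    """Substitute the JSON-encoded payload into the {data} placeholder."""
    return template.replace("{data}", json.dumps(data))

sales_data = {"q1": 15000, "q2": 18000, "q3": 21000, "q4": 19000}
prompt = render_insight_prompt(TEMPLATE, sales_data)
print(prompt)
```

The final prompt the model receives is just the template text followed by the serialized dataset.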
### Artisan Commands

```bash
# Generate sales trend analysis
php artisan ollama:insight trend_analysis --data='{"sales":[100,150,200,250,300]}'

# Use a specific model
php artisan ollama:insight prediction --data='{"growth":[5,8,12,15]}' --model=llama2

# From a JSON file
php artisan ollama:insight summary --file=storage/data/user_activity.json
```
## 🎯 Available Templates

| Template | Description | Example Use Case |
|---|---|---|
| `data_analysis` | Analyze datasets and provide key insights | Sales data analysis |
| `trend_analysis` | Identify patterns and trends | User growth patterns |
| `prediction` | Predict future outcomes | Revenue forecasting |
| `summary` | Create concise summaries | Report summarization |
## ⚙️ Configuration

The package configuration (`config/ollama-insights.php`) includes:

```php
return [
    'base_url' => env('OLLAMA_BASE_URL', 'http://localhost:11434'),
    'model' => env('OLLAMA_MODEL', 'llama2'),
    'timeout' => env('OLLAMA_TIMEOUT', 30),
    'templates' => [
        'data_analysis' => 'Analyze this dataset and provide key insights: {data}',
        'trend_analysis' => 'Identify trends and patterns in this data: {data}',
        'prediction' => 'Based on this data, predict future outcomes: {data}',
        'summary' => 'Summarize this information concisely: {data}',
        // Add your custom templates here
    ],
];
```
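Each template appears to be a plain string carrying a `{data}` placeholder; a template without that token would silently ignore the payload. A small illustrative check (Python, not part of the package) over the default templates:

```python
# The four default templates from config/ollama-insights.php.
templates = {
    "data_analysis": "Analyze this dataset and provide key insights: {data}",
    "trend_analysis": "Identify trends and patterns in this data: {data}",
    "prediction": "Based on this data, predict future outcomes: {data}",
    "summary": "Summarize this information concisely: {data}",
}

# Worth running mentally whenever you add custom entries: every template
# should contain the {data} token exactly where the payload belongs.
missing = [name for name, tpl in templates.items() if "{data}" not in tpl]
assert missing == []
```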
## 🔧 API Methods

### Generate Completion

```php
OllamaInsights::generateCompletion(string $prompt, ?string $model = null): ?string
```

### Generate Insight

```php
OllamaInsights::generateInsight(string $templateKey, array $data, ?string $model = null): ?string
```

### List Models

```php
OllamaInsights::listModels(): array
```

## 📋 Prerequisites

1. **Ollama Installation**:

   ```bash
   # Install Ollama (see https://ollama.ai)
   curl -fsSL https://ollama.ai/install.sh | sh
   ```

2. **Download Models**:

   ```bash
   ollama pull llama2
   ollama pull mistral
   # Add other models as needed
   ```

3. **Start Ollama**:

   ```bash
   ollama serve
   ```
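Behind the facade, the package presumably talks to Ollama's standard HTTP API: `POST /api/generate` for completions and `GET /api/tags` for the model list (the Troubleshooting section uses the latter directly). A sketch of the request `generateCompletion` would plausibly send, building the payload without performing the HTTP call; the helper `build_generate_request` is illustrative, not package code:

```python
import json

def build_generate_request(prompt: str, model: str = "llama2",
                           base_url: str = "http://localhost:11434"):
    """Build the URL and JSON body for a non-streaming Ollama completion."""
    body = {
        "model": model,
        "prompt": prompt,
        # stream=False asks Ollama for one JSON response instead of
        # newline-delimited chunks, which maps naturally onto a ?string return.
        "stream": False,
    }
    return f"{base_url}/api/generate", body

url, body = build_generate_request("Summarize this information concisely: ...")
print(url)  # http://localhost:11434/api/generate
print(json.dumps(body))
```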
## 🛠️ Advanced Usage

### Custom Templates

Add custom templates in your configuration:

```php
'templates' => [
    'sentiment_analysis' => 'Analyze sentiment in this text: {data}',
    'code_review' => 'Review this code for improvements: {data}',
    'content_generation' => 'Generate content based on: {data}',
],
```
### Custom Prompt Generation

```php
$customPrompt = "Analyze this e-commerce data and suggest marketing strategies: ";
$data = json_encode($ecommerceData);

$response = OllamaInsights::generateCompletion($customPrompt . $data, 'mistral');
```
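The pattern is language-agnostic: serialize the structured data to JSON and append it to an instruction, so the model receives one well-formed string. A Python equivalent of the snippet above, with stand-in e-commerce data (the field names are invented for illustration):

```python
import json

# Stand-in data; any JSON-serializable structure works the same way.
ecommerce_data = {"orders": 420, "conversion_rate": 0.031, "top_category": "shoes"}

custom_prompt = "Analyze this e-commerce data and suggest marketing strategies: "
# json.dumps plays the role of PHP's json_encode: the structured payload
# becomes an unambiguous string the model can read.
full_prompt = custom_prompt + json.dumps(ecommerce_data)
print(full_prompt)
```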
## 🐛 Troubleshooting

### Common Issues

1. **Connection Issues**:

   ```bash
   # Verify Ollama is running
   curl http://localhost:11434/api/tags
   ```

2. **Model Not Found**:

   ```bash
   # List available models
   ollama list

   # Pull the missing model
   ollama pull model-name
   ```

3. **Timeout Errors**: Increase the timeout in your configuration.
### Debug Mode

Check the Laravel logs for detailed error information:

```bash
tail -f storage/logs/laravel.log
```
## 🤝 Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## 📄 License
This package is open-sourced software licensed under the MIT license.
## 🆘 Support
- Documentation: Ollama Documentation
- Issues: GitHub Issues
- Discussions: GitHub Discussions
## 🏆 Credits

- [Christian Villegas](https://github.com/devcbh)
---

**Note**: Ensure Ollama is properly installed and running before using this package. Visit [ollama.ai](https://ollama.ai) for installation instructions.