rajanvijayan/ai-engine

Multi-provider AI library for PHP - supports Gemini, Meta Llama, and Groq


A powerful, flexible PHP library for integrating multiple AI providers into your applications. Supports Google Gemini, Meta Llama, and Groq with conversation history.

🚀 Features

  • Multi-Provider Support: Gemini, Meta Llama, and Groq (Llama)
  • Conversation Mode: Maintains chat history for context-aware responses
  • Easy Provider Switching: Switch between providers on the fly
  • System Instructions: Set AI personality/behavior per conversation
  • Simple API: Intuitive interface that works out of the box
  • Extensible: Easy to add new providers

📋 Requirements

  • PHP 8.0 or higher
  • json extension
  • Composer

🛠️ Installation

composer require rajanvijayan/ai-engine

🔑 API Keys Setup

Create a test.ai-key file in your project root:

{
    "gemini": "YOUR_GEMINI_API_KEY",
    "meta": "YOUR_META_LLAMA_API_KEY",
    "groq": "YOUR_GROQ_API_KEY"
}
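
The key file is plain JSON, so loading it in your own bootstrap code takes only a couple of lines of standard PHP. How you load the keys is up to you; the snippet below is ordinary PHP, not a library API:

<?php
// Read the API keys from test.ai-key in the project root.
$keys = json_decode(file_get_contents(__DIR__ . '/test.ai-key'), true);

$geminiKey = $keys['gemini'] ?? null;
$groqKey   = $keys['groq'] ?? null;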

Get Your API Keys

Provider   | URL                   | Notes
Gemini     | makersuite.google.com | Google AI
Meta Llama | llama.meta.com        | Official Meta API
Groq       | console.groq.com      | Free tier available!

🚀 Quick Start

Basic Usage

<?php
require_once 'vendor/autoload.php';

use AIEngine\AIEngine;

// Create with Gemini (default)
$ai = new AIEngine('your-gemini-api-key');
$response = $ai->generateContent('Hello! How are you?');
echo $response;

Choose Provider

// Gemini
$ai = new AIEngine($geminiKey, ['provider' => 'gemini']);

// Meta Llama
$ai = new AIEngine($metaKey, ['provider' => 'meta']);

// Groq (Llama models)
$ai = AIEngine::create('groq', $groqKey);

Conversation Mode

$ai = AIEngine::create('groq', $groqKey);
$ai->setSystemInstruction("You are a helpful assistant.");

// Start a conversation
$ai->chat("My name is John");
$ai->chat("What's my name?");  // AI remembers: "John"

// Clear conversation
$ai->newConversation();

📚 API Reference

Constructor

new AIEngine($apiKey, $config = [])

Config Options:

[
    'provider' => 'gemini',     // 'gemini', 'meta', or 'groq'
    'model' => null,            // Model name (uses provider default if null)
    'timeout' => 60,            // Request timeout in seconds
    'enable_logging' => false   // Enable logging
]
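
For example, a constructor call that sets every option explicitly might look like this (the model name is just one of the Groq models listed further below):

// Groq with an explicit model, a longer timeout, and logging enabled.
$ai = new AIEngine($groqKey, [
    'provider'       => 'groq',
    'model'          => 'llama-3.1-8b-instant',
    'timeout'        => 120,
    'enable_logging' => true,
]);
echo $ai->generateContent('Explain recursion in one sentence.');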

Methods

Method                         | Description
generateContent($prompt)       | Single prompt, no history
chat($message)                 | Send a message with conversation history
newConversation()              | Clear the conversation history
getHistory()                   | Get the conversation history array
setSystemInstruction($text)    | Set the AI personality/behavior
switchProvider($name, $apiKey) | Switch to a different provider
getProviderName()              | Get the current provider name
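
A short sketch combining several of these methods; the exact structure of the array returned by getHistory() depends on the provider, so the dump is only illustrative:

$ai = AIEngine::create('groq', $groqKey);
$ai->setSystemInstruction('You are a helpful assistant.');

$ai->chat('My name is John');
print_r($ai->getHistory());    // messages accumulated so far

$ai->newConversation();        // history is empty again
echo $ai->getProviderName();   // "Groq"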

Static Methods

Method                                    | Description
AIEngine::create($provider, $apiKey)      | Factory method
AIEngine::getModelsForProvider($provider) | List available models
AIEngine::getDefaultModels()              | Get the default model per provider
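
For example, you can inspect the available models before choosing one (the commented output is indicative only):

// Models the library knows about for a given provider.
print_r(AIEngine::getModelsForProvider('groq'));
// e.g. ["llama-3.3-70b-versatile", "llama-3.1-8b-instant", ...]

// Default model for each provider, keyed by provider name.
print_r(AIEngine::getDefaultModels());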

🤖 Available Models

Gemini

  • gemini-2.0-flash (default)
  • gemini-2.0-flash-lite
  • gemini-2.5-flash
  • gemini-2.5-pro

Meta Llama

  • Llama-4-Maverick-17B-128E-Instruct-FP8 (default)
  • Llama-4-Scout-17B-16E-Instruct
  • Llama-3.3-70B-Instruct
  • Llama-3.2-3B-Instruct
  • Llama-3.2-1B-Instruct

Groq

  • llama-3.3-70b-versatile (default)
  • llama-3.1-8b-instant
  • mixtral-8x7b-32768
  • gemma2-9b-it
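
To use a non-default model, pass its name through the 'model' config option described above, for example:

// Gemini with a specific model instead of the provider default.
$ai = new AIEngine($geminiKey, [
    'provider' => 'gemini',
    'model'    => 'gemini-2.5-pro',
]);
echo $ai->generateContent('Summarize the plot of Hamlet in two sentences.');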

💬 Usage Examples

Simple Question

$ai = new AIEngine($apiKey);
$answer = $ai->generateContent('What is the capital of France?');
echo $answer;

Conversation with Memory

$ai = AIEngine::create('groq', $groqKey);
$ai->setSystemInstruction("You are a math tutor. Be concise.");

echo $ai->chat("What is 15 + 27?");      // "42"
echo $ai->chat("Multiply that by 2");     // "84" - remembers previous answer!

Switch Providers Mid-Session

$ai = new AIEngine($geminiKey);
echo $ai->getProviderName();  // "Gemini"

$ai->switchProvider('groq', $groqKey);
echo $ai->getProviderName();  // "Groq"

Error Handling

$response = $ai->chat('Hello');

if (is_array($response) && isset($response['error'])) {
    echo "Error: " . $response['error'];
} else {
    echo "AI: " . $response;
}
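
If you prefer exceptions over inline checks, a small helper can wrap the same test. Note that askOrFail() is not part of the library; it is only a hypothetical convenience wrapper:

/**
 * Hypothetical helper: returns the reply text or throws on an error array.
 */
function askOrFail(AIEngine $ai, string $message): string
{
    $response = $ai->chat($message);

    if (is_array($response) && isset($response['error'])) {
        throw new RuntimeException('AI error: ' . $response['error']);
    }

    return $response;
}

echo askOrFail($ai, 'Hello');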

🧪 Testing

Interactive Test

php test.php

This launches an interactive chat where you can:

  • Select a provider
  • Chat with the AI
  • Use /new to clear conversation
  • Use /quit to exit

📁 Project Structure

ai-engine/
├── src/
│   ├── AIEngine.php              # Main engine class
│   └── Providers/
│       ├── ProviderInterface.php # Provider contract
│       ├── Gemini.php            # Google Gemini
│       ├── MetaLlama.php         # Meta Llama API
│       └── Groq.php              # Groq API
├── test.php                      # Interactive test
├── test.ai-key                   # API keys (JSON)
├── composer.json
└── README.md

🔧 Adding New Providers

  1. Create a new class implementing ProviderInterface
  2. Implement required methods: generateContent(), sendMessage(), startNewConversation(), getConversationHistory(), setSystemInstruction(), isConfigured(), getName()
  3. Add the provider to AIEngine::createProvider()
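
A minimal provider skeleton following these steps might look like the sketch below. The method names come from step 2, but the constructor, parameter types, and return types are assumptions and may differ from the actual ProviderInterface:

<?php

namespace AIEngine\Providers;

// Sketch of a custom provider. The method list follows step 2 above;
// signatures are assumptions, not the real interface definition.
class MyProvider implements ProviderInterface
{
    private string $apiKey;
    private array $history = [];
    private ?string $systemInstruction = null;

    public function __construct(string $apiKey)
    {
        $this->apiKey = $apiKey;
    }

    public function generateContent(string $prompt): string
    {
        // Call your API with $prompt only; no history is involved.
        return 'TODO: response text';
    }

    public function sendMessage(string $message): string
    {
        $this->history[] = ['role' => 'user', 'content' => $message];
        // Call your API with $this->history (and $this->systemInstruction),
        // append the assistant reply to $this->history, then return it.
        return 'TODO: response text';
    }

    public function startNewConversation(): void
    {
        $this->history = [];
    }

    public function getConversationHistory(): array
    {
        return $this->history;
    }

    public function setSystemInstruction(string $text): void
    {
        $this->systemInstruction = $text;
    }

    public function isConfigured(): bool
    {
        return $this->apiKey !== '';
    }

    public function getName(): string
    {
        return 'MyProvider';
    }
}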

🛡️ Error Handling

Common errors returned:

Error                             | Cause
Provider not properly configured  | Invalid or missing API key
Invalid prompt                    | Empty or overly long prompt
Error contacting API              | Network or API issue

📄 License

MIT License - see LICENSE file.

๐Ÿ™ Acknowledgments

  • Google for Gemini API
  • Meta for Llama API
  • Groq for fast inference

Made with ❤️ for the PHP community