mahmoudnaggar/laravel-lmstudio

Advanced LM Studio integration for Laravel - Run local LLMs with an OpenAI-compatible API, model management, streaming, embeddings, and more

README

Advanced LM Studio integration for Laravel - Run powerful local LLMs with a clean, Laravel-friendly API. Perfect for privacy-focused AI applications, offline development, and cost-effective AI solutions.

🚀 Features

  • ✅ OpenAI-Compatible API - Drop-in replacement for OpenAI with local models
  • ✅ Model Management - List, load, and switch between models programmatically
  • ✅ Streaming Support - Real-time token streaming for chat responses
  • ✅ Embeddings - Generate vector embeddings for semantic search
  • ✅ Conversation Management - Maintain context across multiple messages
  • ✅ Health Monitoring - Check LM Studio server status and loaded models
  • ✅ Token Counting - Estimate and track token usage
  • ✅ Artisan Commands - CLI tools for model management and testing
  • ✅ Comprehensive Testing - Full test suite included
  • ✅ Laravel 10 & 11 - Full support for the latest Laravel versions

📋 Requirements

  • PHP 8.1 or higher
  • Laravel 10.x or 11.x
  • LM Studio installed and running
  • LM Studio local server enabled (default: http://localhost:1234)

📦 Installation

Install the package via Composer:

composer require mahmoudnaggar/laravel-lmstudio

Publish the configuration file:

php artisan vendor:publish --tag=lmstudio-config

⚙️ Configuration

Update your .env file:

LMSTUDIO_BASE_URL=http://localhost:1234/v1
LMSTUDIO_TIMEOUT=120
LMSTUDIO_DEFAULT_MODEL=llama-3.2-3b-instruct
LMSTUDIO_MAX_TOKENS=2048
LMSTUDIO_TEMPERATURE=0.7

The configuration file (config/lmstudio.php) provides extensive customization options.
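
As a rough sketch, the published config/lmstudio.php can be expected to mirror the environment variables above (the exact keys and defaults below are assumptions, not copied from the shipped file):

// config/lmstudio.php - illustrative sketch only; check the published file for the real keys
return [
    'base_url' => env('LMSTUDIO_BASE_URL', 'http://localhost:1234/v1'),
    'timeout' => env('LMSTUDIO_TIMEOUT', 120),
    'default_model' => env('LMSTUDIO_DEFAULT_MODEL', 'llama-3.2-3b-instruct'),
    'max_tokens' => env('LMSTUDIO_MAX_TOKENS', 2048),
    'temperature' => env('LMSTUDIO_TEMPERATURE', 0.7),
];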

🎯 Quick Start

Basic Chat

use MahmoudNaggar\LaravelLMStudio\Facades\LMStudio;

// Simple chat
$response = LMStudio::chat('What is Laravel?');
echo $response->content();

// With options
$response = LMStudio::chat('Explain quantum computing', [
    'model' => 'llama-3.2-3b-instruct',
    'temperature' => 0.8,
    'max_tokens' => 500,
]);

Streaming Responses

LMStudio::stream('Write a story about AI', function ($chunk) {
    echo $chunk; // Output each token as it arrives
});
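
To push tokens to a browser as they arrive, the same callback can be wrapped in a streamed HTTP response. This is a minimal sketch using Laravel's response()->stream(); the route, headers, and flushing strategy are illustrative, not part of the package:

use Illuminate\Support\Facades\Route;
use MahmoudNaggar\LaravelLMStudio\Facades\LMStudio;

Route::get('/story', function () {
    return response()->stream(function () {
        LMStudio::stream('Write a story about AI', function ($chunk) {
            echo $chunk; // send each token to the client immediately
            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        });
    }, 200, [
        'Content-Type' => 'text/plain; charset=utf-8',
        'X-Accel-Buffering' => 'no', // ask nginx not to buffer the stream
    ]);
});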

Conversations

$conversation = LMStudio::conversation();

$conversation->addMessage('user', 'Hello! My name is John.');
$response1 = $conversation->send();

$conversation->addMessage('user', 'What is my name?');
$response2 = $conversation->send(); // Will remember "John"

Embeddings

$embedding = LMStudio::embedding('Laravel is a PHP framework');
$vector = $embedding->vector(); // Array of floats
$dimensions = $embedding->dimensions(); // 384, 768, etc.
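
For a basic semantic-search comparison, two embedding vectors can be scored with cosine similarity. This sketch relies only on the vector() call shown above; the helper itself is plain PHP, not part of the package:

// Cosine similarity between two equal-length embedding vectors
function cosineSimilarity(array $a, array $b): float
{
    $dot = $normA = $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

$query = LMStudio::embedding('PHP web framework')->vector();
$doc = LMStudio::embedding('Laravel is a PHP framework')->vector();

$score = cosineSimilarity($query, $doc); // closer to 1.0 means more similar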

Model Management

// List available models
$models = LMStudio::models()->list();

// Get loaded model
$currentModel = LMStudio::models()->loaded();

// Load a specific model
LMStudio::models()->load('mistral-7b-instruct');

// Unload current model
LMStudio::models()->unload();
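
A common pattern is to load a model only when it is not already active. The comparison below assumes loaded() returns the current model's identifier (or null); that return shape is an assumption, not documented behaviour:

$desired = 'mistral-7b-instruct';

// Assumption: loaded() returns a model identifier string, or null when nothing is loaded
if (LMStudio::models()->loaded() !== $desired) {
    LMStudio::models()->load($desired);
}

$response = LMStudio::chat('Summarize this changelog', ['model' => $desired]);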

Health Checks

// Check if LM Studio is running
if (LMStudio::health()->isHealthy()) {
    echo "LM Studio is running!";
}

// Get detailed status
$status = LMStudio::health()->status();
echo "Server: " . $status['server'];
echo "Model: " . $status['model'];
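
In practice, the health check can guard LLM calls so the application degrades gracefully when the local server is offline. This sketch uses only the isHealthy(), chat(), and content() calls shown in this README; the summarize() wrapper is illustrative:

use MahmoudNaggar\LaravelLMStudio\Facades\LMStudio;

// Skip the LLM call entirely if LM Studio is unreachable
function summarize(string $text): string
{
    if (! LMStudio::health()->isHealthy()) {
        return 'AI summary is unavailable right now.';
    }

    return LMStudio::chat('Summarize the following text: ' . $text)->content();
}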

🛠️ Artisan Commands

List Models

php artisan lmstudio:models

Test Connection

php artisan lmstudio:test

Chat from CLI

php artisan lmstudio:chat "What is the meaning of life?"

Load Model

php artisan lmstudio:load mistral-7b-instruct

📚 Advanced Usage

Custom System Prompts

$response = LMStudio::chat('Hello', [
    'system' => 'You are a helpful coding assistant specializing in Laravel.',
]);

Function Calling (Tool Use)

$response = LMStudio::chat('What is the weather in Paris?', [
    'tools' => [
        [
            'type' => 'function',
            'function' => [
                'name' => 'get_weather',
                'description' => 'Get current weather',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'location' => ['type' => 'string'],
                    ],
                ],
            ],
        ],
    ],
]);

if ($response->hasToolCalls()) {
    $toolCalls = $response->toolCalls();
    // Process tool calls...
}
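
Dispatching the returned calls to your own handlers might look like the sketch below. The name() and arguments() accessors and the WeatherService are hypothetical, shown only to illustrate the flow; adjust them to the actual tool-call object exposed by the package:

if ($response->hasToolCalls()) {
    foreach ($response->toolCalls() as $toolCall) {
        // Hypothetical accessors - check the real tool-call object for its API
        $name = $toolCall->name();
        $arguments = $toolCall->arguments(); // e.g. ['location' => 'Paris']

        if ($name === 'get_weather') {
            // WeatherService is your own application code, not part of this package
            $weather = app(WeatherService::class)->current($arguments['location']);

            // Feed the result back to the model in a follow-up request
            $final = LMStudio::chat("The weather in {$arguments['location']} is: {$weather}");
        }
    }
}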

Token Counting

$text = "This is a sample text";
$tokens = LMStudio::countTokens($text);

if (LMStudio::withinTokenLimit($text, 1000)) {
    // Process the text
}
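
The two helpers can be combined to keep a long prompt inside a model's context budget before sending it. This is a rough sketch; the 1,000-token budget and the 10% trimming step are arbitrary choices, not package defaults:

$budget = 1000;
$prompt = str_repeat('Laravel makes local AI integrations straightforward. ', 200); // stand-in for long input

// Trim the prompt until it fits the budget, then send it
while (! LMStudio::withinTokenLimit($prompt, $budget)) {
    $prompt = mb_substr($prompt, 0, (int) (mb_strlen($prompt) * 0.9));
}

$response = LMStudio::chat('Summarize the following text: ' . $prompt);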

Batch Processing

$prompts = ['Question 1', 'Question 2', 'Question 3'];

$responses = collect($prompts)->map(function ($prompt) {
    return LMStudio::chat($prompt);
});

Error Handling

use Illuminate\Support\Facades\Log;
use MahmoudNaggar\LaravelLMStudio\Exceptions\LMStudioException;

try {
    $response = LMStudio::chat('Hello');
} catch (LMStudioException $e) {
    Log::error('LM Studio error: ' . $e->getMessage());
    // Fallback logic
}

🧪 Testing

Run the test suite:

composer test

Run code formatting:

composer format

📖 Documentation

For detailed documentation, visit the Wiki.

🤝 Contributing

Contributions are welcome! Please see CONTRIBUTING.md for details.

🔒 Security

If you discover any security-related issues, please email mahmoud@example.com instead of using the issue tracker.

📄 License

The MIT License (MIT). Please see License File for more information.

๐Ÿ™ Credits

๐ŸŒŸ Show Your Support

If this package helps you, please consider giving it a โญ๏ธ on GitHub!

๐Ÿ“ž Support

Made with โค๏ธ for the Laravel community