ikarolaborda/azure-openai-sdk-php

Azure OpenAI PHP SDK — first-class Azure OpenAI Service support built on top of openai-php/client

Package info

github.com/ikarolaborda/azure-openai-sdk-php

pkg:composer/ikarolaborda/azure-openai-sdk-php

v1.0.0 2026-03-29 20:32 UTC

README

A PHP package that brings first-class Azure OpenAI Service support to your applications. Built on top of openai-php/client — the excellent community-maintained PHP client created by Nuno Maduro and contributors.

This package handles Azure-specific deployment-based URL routing, API versioning, and authentication (API keys and Azure AD / Entra ID) so you can work with Azure OpenAI the same way you work with the standard OpenAI API.
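To make the routing concrete, here is a sketch of the URL shape Azure OpenAI expects (the resource, deployment, and API version below are placeholders; the package builds this for you):

```php
<?php
// Illustration only: how a deployment-scoped request URL is composed.
$endpoint   = 'https://your-resource.openai.azure.com';
$deployment = 'gpt-4o';
$apiVersion = '2024-10-21';

$url = sprintf(
    '%s/openai/deployments/%s/chat/completions?api-version=%s',
    $endpoint,
    $deployment,
    $apiVersion
);

echo $url;
// https://your-resource.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-10-21
```

Unlike the standard OpenAI API, the model is selected by the deployment segment of the path rather than a `model` parameter in the request body.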

Installation

Requires PHP 8.2+

composer require ikarolaborda/azure-openai-sdk-php

If your project doesn't already have a PSR-18 HTTP client, install one:

composer require guzzlehttp/guzzle

Configuration

Laravel

The package auto-discovers its service provider and facade. Publish the config file:

php artisan vendor:publish --tag=azure-openai

This creates config/azure-openai.php. Then add your credentials to .env:

AZURE_OPENAI_API_KEY=your-azure-api-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
AZURE_OPENAI_MODEL_DEPLOYMENT=gpt-4o
AZURE_OPENAI_API_VERSION=2024-10-21

Now you can use the facade anywhere in your application:

use IkaroLaborda\AzureOpenAI\Facades\AzureOpenAI;

$response = AzureOpenAI::chat()->create([
    'messages' => [
        ['role' => 'user', 'content' => 'Hello!'],
    ],
]);

echo $response->choices[0]->message->content;

Or resolve the client from the container:

use OpenAI\Client;

$client = app(Client::class);

$response = $client->chat()->create([
    'messages' => [
        ['role' => 'user', 'content' => 'Hello!'],
    ],
]);

Standalone (non-Laravel)

Without Laravel, configure the client directly. Set your environment variables and call client() with no arguments:

use IkaroLaborda\AzureOpenAI\AzureOpenAI;

// Reads from AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT,
// AZURE_OPENAI_MODEL_DEPLOYMENT, and AZURE_OPENAI_API_VERSION
$client = AzureOpenAI::client();

Or pass parameters explicitly — any argument takes precedence over its environment variable:

$client = AzureOpenAI::client(
    apiKey: 'your-azure-api-key',
    endpoint: 'https://your-resource.openai.azure.com',
    deployment: 'gpt-4o',
    apiVersion: '2024-10-21',
);

For full control, use the factory:

$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->withHttpClient(new \GuzzleHttp\Client(['timeout' => 120]))
    ->make();

If you prefer using the resource name instead of the full endpoint URL:

$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withResource('your-resource-name')     // builds https://your-resource-name.openai.azure.com
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->make();

Usage

Because Azure routes requests based on the deployment name in the URL, you don't need to pass a model parameter in your calls — it's determined by the deployment you configured.

Chat Completions

$response = $client->chat()->create([
    'messages' => [
        ['role' => 'system', 'content' => 'You are a helpful assistant.'],
        ['role' => 'user', 'content' => 'What is the capital of France?'],
    ],
]);

echo $response->choices[0]->message->content;

Responses API

The Responses API is OpenAI's newest interface for generating model output, analogous to client.responses.create() in the Python SDK:

$response = $client->responses()->create([
    'input' => 'What is the meaning of life?',
]);

echo $response->outputText;

With tools (web search, file search, or custom functions):

$response = $client->responses()->create([
    'input' => 'What was the latest news today?',
    'tools' => [
        ['type' => 'web_search_preview'],
    ],
]);

echo $response->outputText;

Streamed responses:

$stream = $client->responses()->createStreamed([
    'input' => 'Write a short poem about PHP.',
]);

foreach ($stream as $response) {
    if ($response->outputText) {
        echo $response->outputText;
    }
}

Streaming

$stream = $client->chat()->createStreamed([
    'messages' => [
        ['role' => 'user', 'content' => 'Write a short poem about PHP.'],
    ],
]);

foreach ($stream as $response) {
    echo $response->choices[0]->delta->content;
}

Embeddings

$response = $client->embeddings()->create([
    'input' => 'The quick brown fox jumps over the lazy dog.',
]);

$vector = $response->embeddings[0]->embedding; // array of floats
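A common next step is comparing vectors. This helper is plain PHP, not part of the package's API — a minimal cosine-similarity sketch assuming two equal-length vectors:

```php
<?php
// Cosine similarity between two equal-length embedding vectors.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

// Identical directions score 1.0; orthogonal vectors score 0.0.
echo cosineSimilarity([1.0, 0.0], [1.0, 0.0]); // 1
```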

Multiple Deployments

Each client targets a single deployment. If your application uses multiple models, create a client for each:

use IkaroLaborda\AzureOpenAI\AzureOpenAI;

// Both clients share the API key and endpoint from env;
// only the deployment differs.
$chatClient = AzureOpenAI::client(deployment: 'gpt-4o');
$embeddingClient = AzureOpenAI::client(deployment: 'text-embedding-3-small');

$chat = $chatClient->chat()->create([
    'messages' => [['role' => 'user', 'content' => 'Summarize this document.']],
]);

$embedding = $embeddingClient->embeddings()->create([
    'input' => 'The quick brown fox jumps over the lazy dog.',
]);

Account-Scoped Operations

Some Azure OpenAI operations, such as listing models or managing files, aren't tied to a deployment, so you can omit it:

$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withApiVersion('2024-10-21')
    ->make();

$models = $client->models()->list();

Authentication

API Key

The most common method. Your key is sent via the api-key header:

$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->make();

Azure AD / Entra ID Token

For production workloads, Azure AD tokens are recommended over API keys:

$client = AzureOpenAI::factory()
    ->withAzureAdToken($token)
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->make();

Azure AD Token Provider

For long-running processes where tokens expire, pass a callable that fetches a fresh token on each request:

$client = AzureOpenAI::factory()
    ->withAzureAdTokenProvider(function () use ($credential) {
        $token = $credential->getToken('https://cognitiveservices.azure.com/.default');

        return $token->token;
    })
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->make();

API Versions

Azure OpenAI requires an explicit API version on every request. The package provides an enum with known versions:

use IkaroLaborda\AzureOpenAI\Enums\AzureApiVersion;

$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion(AzureApiVersion::V2024_10_21->value)
    ->make();

Enum Case            Value
V2024_06_01          2024-06-01
V2024_10_21          2024-10-21
V2025_01_01_PREVIEW  2025-01-01-preview
V2025_03_01_PREVIEW  2025-03-01-preview

Available Resources

This package gives you access to all resources provided by openai-php/client, routed through Azure's deployment-based endpoints. Deployment-scoped resources (chat, completions, embeddings, images, audio, responses) are automatically routed via /deployments/{deployment}/. Account-scoped resources (models, files, fine-tuning, assistants, threads, batches, vector stores, moderations) are routed directly under /openai/.

For full resource documentation, see the openai-php/client README.

Built with care by Ikaro C. Laborda. Licensed under the MIT license.