ikarolaborda / azure-openai-sdk-php
Azure OpenAI PHP SDK — first-class Azure OpenAI Service support built on top of openai-php/client
Package info
github.com/ikarolaborda/azure-openai-sdk-php
pkg:composer/ikarolaborda/azure-openai-sdk-php
Requires
- php: ^8.3
- illuminate/support: ^11.0|^12.0|^13.0
- openai-php/client: ^0.19
Requires (Dev)
- guzzlehttp/guzzle: ^7.10.0
- guzzlehttp/psr7: ^2.8.0
- laravel/pint: ^1.29.0
- mockery/mockery: ^1.6.12
- nunomaduro/collision: ^8.9.1
- pestphp/pest: ^4.0
- pestphp/pest-plugin-arch: ^4.0
- phpstan/phpstan: ^1.12.32|^2.0
This package is auto-updated.
Last update: 2026-03-29 20:41:31 UTC
README
A PHP package that brings first-class Azure OpenAI Service support to your applications. Built on top of openai-php/client — the excellent community-maintained PHP client created by Nuno Maduro and contributors.
This package handles Azure-specific deployment-based URL routing, API versioning, and authentication (API keys and Azure AD / Entra ID) so you can work with Azure OpenAI the same way you work with the standard OpenAI API.
Installation
Requires PHP 8.3+
composer require ikarolaborda/azure-openai-sdk-php
If your project doesn't already have a PSR-18 HTTP client, install one:
composer require guzzlehttp/guzzle
Configuration
Laravel
The package auto-discovers its service provider and facade. Publish the config file:
php artisan vendor:publish --tag=azure-openai
This creates config/azure-openai.php. Then add your credentials to .env:
```
AZURE_OPENAI_API_KEY=your-azure-api-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
AZURE_OPENAI_MODEL_DEPLOYMENT=gpt-4o
AZURE_OPENAI_API_VERSION=2024-10-21
```
Now you can use the facade anywhere in your application:
```php
use IkaroLaborda\AzureOpenAI\Facades\AzureOpenAI;

$response = AzureOpenAI::chat()->create([
    'messages' => [
        ['role' => 'user', 'content' => 'Hello!'],
    ],
]);

echo $response->choices[0]->message->content;
```
Or resolve the client from the container:
```php
use OpenAI\Client;

$client = app(Client::class);

$response = $client->chat()->create([
    'messages' => [
        ['role' => 'user', 'content' => 'Hello!'],
    ],
]);
```
Standalone (non-Laravel)
Without Laravel, configure the client directly. Set your environment variables and call client() with no arguments:
```php
use IkaroLaborda\AzureOpenAI\AzureOpenAI;

// Reads from AZURE_OPENAI_API_KEY, AZURE_OPENAI_ENDPOINT,
// AZURE_OPENAI_MODEL_DEPLOYMENT, and AZURE_OPENAI_API_VERSION
$client = AzureOpenAI::client();
```
Or pass parameters explicitly — any argument takes precedence over its environment variable:
```php
$client = AzureOpenAI::client(
    apiKey: 'your-azure-api-key',
    endpoint: 'https://your-resource.openai.azure.com',
    deployment: 'gpt-4o',
    apiVersion: '2024-10-21',
);
```
For full control, use the factory:
```php
$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->withHttpClient(new \GuzzleHttp\Client(['timeout' => 120]))
    ->make();
```
If you prefer using the resource name instead of the full endpoint URL:
```php
$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withResource('your-resource-name') // builds https://your-resource-name.openai.azure.com
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->make();
```
Usage
Because Azure routes requests based on the deployment name in the URL, you don't need to pass a model parameter in your calls — it's determined by the deployment you configured.
Chat Completions
```php
$response = $client->chat()->create([
    'messages' => [
        ['role' => 'system', 'content' => 'You are a helpful assistant.'],
        ['role' => 'user', 'content' => 'What is the capital of France?'],
    ],
]);

echo $response->choices[0]->message->content;
```
Responses API
The Responses API is OpenAI's newest interface for generating model outputs. Usage mirrors the Python client's `client.responses.create()`:
```php
$response = $client->responses()->create([
    'input' => 'What is the meaning of life?',
]);

echo $response->outputText;
```
With tools (web search, file search, or custom functions):
```php
$response = $client->responses()->create([
    'input' => 'What was the latest news today?',
    'tools' => [
        ['type' => 'web_search_preview'],
    ],
]);

echo $response->outputText;
```
Streamed responses:
```php
$stream = $client->responses()->createStreamed([
    'input' => 'Write a short poem about PHP.',
]);

foreach ($stream as $response) {
    if ($response->outputText) {
        echo $response->outputText;
    }
}
```
Streaming
```php
$stream = $client->chat()->createStreamed([
    'messages' => [
        ['role' => 'user', 'content' => 'Write a short poem about PHP.'],
    ],
]);

foreach ($stream as $response) {
    echo $response->choices[0]->delta->content;
}
```
Embeddings
```php
$response = $client->embeddings()->create([
    'input' => 'The quick brown fox jumps over the lazy dog.',
]);

$vector = $response->embeddings[0]->embedding; // array of floats
```
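A common next step with embedding vectors is comparing them by cosine similarity, e.g. for semantic search. A minimal pure-PHP sketch (the vectors below are made-up illustrative values, not real model output):

```php
<?php

// Cosine similarity between two equal-length vectors:
// dot(a, b) / (|a| * |b|). Returns a value in [-1, 1];
// closer to 1 means the texts are more semantically similar.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

// In practice these would come from $response->embeddings[0]->embedding.
$docVector = [0.1, 0.3, -0.2];
$queryVector = [0.1, 0.25, -0.2];

echo cosineSimilarity($docVector, $queryVector);
```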
Multiple Deployments
Each client targets a single deployment. If your application uses multiple models, create a client for each:
```php
use IkaroLaborda\AzureOpenAI\AzureOpenAI;

// Both clients share the API key and endpoint from env;
// only the deployment differs.
$chatClient = AzureOpenAI::client(deployment: 'gpt-4o');
$embeddingClient = AzureOpenAI::client(deployment: 'text-embedding-3-small');

$chat = $chatClient->chat()->create([
    'messages' => [['role' => 'user', 'content' => 'Summarize this document.']],
]);

$embedding = $embeddingClient->embeddings()->create([
    'input' => 'The quick brown fox jumps over the lazy dog.',
]);
```
Account-Scoped Operations
Some Azure OpenAI operations — like listing models or managing files — don't require a deployment. You can omit the deployment for these:
```php
$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withApiVersion('2024-10-21')
    ->make();

$models = $client->models()->list();
```
Authentication
API Key
The most common method. Your key is sent via the api-key header:
```php
$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->make();
```
Azure AD / Entra ID Token
For production workloads, Azure AD tokens are recommended over API keys:
```php
$client = AzureOpenAI::factory()
    ->withAzureAdToken($token)
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->make();
```
Azure AD Token Provider
For long-running processes where tokens expire, pass a callable that fetches a fresh token on each request:
```php
$client = AzureOpenAI::factory()
    ->withAzureAdTokenProvider(function () use ($credential) {
        $token = $credential->getToken('https://cognitiveservices.azure.com/.default');

        return $token->token;
    })
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion('2024-10-21')
    ->make();
```
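Fetching a fresh token on every request can be wasteful; a provider can cache the token until shortly before it expires. A minimal sketch, assuming `makeCachingTokenProvider` and its `$fetchToken` callable are your own helpers (not part of this package) and that the fetcher returns a token string plus a Unix expiry timestamp:

```php
<?php

// Wraps a token-fetching callable in a closure that reuses the cached
// token until $skewSeconds before expiry, then fetches a new one.
// $fetchToken must return [token string, unix expiry timestamp].
function makeCachingTokenProvider(callable $fetchToken, int $skewSeconds = 300): callable
{
    $cachedToken = null;
    $expiresAt = 0;

    return function () use ($fetchToken, $skewSeconds, &$cachedToken, &$expiresAt): string {
        if ($cachedToken === null || time() >= $expiresAt - $skewSeconds) {
            [$cachedToken, $expiresAt] = $fetchToken();
        }

        return $cachedToken;
    };
}
```

The resulting closure can be passed straight to `withAzureAdTokenProvider()`.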
API Versions
Azure OpenAI requires an explicit API version on every request. The package provides an enum with known versions:
```php
use IkaroLaborda\AzureOpenAI\Enums\AzureApiVersion;

$client = AzureOpenAI::factory()
    ->withApiKey('your-azure-api-key')
    ->withEndpoint('https://your-resource.openai.azure.com')
    ->withDeployment('gpt-4o')
    ->withApiVersion(AzureApiVersion::V2024_10_21->value)
    ->make();
```
| Enum Case | Value |
|---|---|
| V2024_06_01 | 2024-06-01 |
| V2024_10_21 | 2024-10-21 |
| V2025_01_01_PREVIEW | 2025-01-01-preview |
| V2025_03_01_PREVIEW | 2025-03-01-preview |
Available Resources
This package gives you access to all resources provided by openai-php/client, routed through Azure's deployment-based endpoints. Deployment-scoped resources (chat, completions, embeddings, images, audio, responses) are automatically routed via /deployments/{deployment}/. Account-scoped resources (models, files, fine-tuning, assistants, threads, batches, vector stores, moderations) are routed directly under /openai/.
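To make the two routing patterns concrete, here is an illustrative sketch of the URL shapes Azure OpenAI expects (the package builds these for you; the values are placeholders):

```php
<?php

// Deployment-scoped call (e.g. chat completions): the deployment name
// appears in the path, and the API version is a query parameter.
$endpoint = 'https://your-resource.openai.azure.com';
$deployment = 'gpt-4o';
$apiVersion = '2024-10-21';

$chatUrl = "{$endpoint}/openai/deployments/{$deployment}/chat/completions?api-version={$apiVersion}";

// Account-scoped call (e.g. listing models): no deployment segment.
$modelsUrl = "{$endpoint}/openai/models?api-version={$apiVersion}";

echo $chatUrl . PHP_EOL;
echo $modelsUrl . PHP_EOL;
```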
For full resource documentation, see the openai-php/client README.
Built with care by Ikaro C. Laborda. Licensed under the MIT license.