sourceability/openai-client

PHP 8.0+ OpenAI API client with fully typed/documented request and response models, guzzle and symfony/http-client support, and async/parallel requests.

0.3.6 2023-06-13 20:50 UTC



README

PHP 8.0+ OpenAI API client with fully typed and documented request/response models, guzzlehttp/guzzle and symfony/http-client support through HTTPlug, and async/parallel requests.

The client is generated from OpenAI's OpenAPI specification using jane-php.

Features:

- Fully typed and documented request and response models
- guzzlehttp/guzzle and symfony/http-client support through HTTPlug
- Async/parallel requests

This is a community-maintained/unofficial library.

Installation

composer require sourceability/openai-client
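Requests are sent through HTTPlug, so your project also needs an HTTP client implementation that HTTPlug can discover. If you do not already have one, installing guzzlehttp/guzzle or symfony/http-client should work (this is general HTTPlug behaviour; check the package's composer constraints for the exact requirement), for example:

composer require guzzlehttp/guzzle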

Getting started

<?php

require __DIR__ . '/vendor/autoload.php';

use Sourceability\OpenAIClient\Client;
use Sourceability\OpenAIClient\Generated\Model\CreateCompletionRequest;

$apiClient = Client::create(
    apiKey: getenv('OPENAI_API_KEY')
);

$requests = [
    (new CreateCompletionRequest())
        ->setModel('text-davinci-003')
        ->setTemperature(0)
        ->setMaxTokens(512)
        ->setPrompt('The jane php library is very useful because'),
    new CreateCompletionRequest(
        model: 'text-davinci-003',
        temperature: 0,
        maxTokens: 512,
        prompt: 'Symfony symfony symfony is like sourceability on a'
    ),
];
$completionResponses = $apiClient->createCompletions($requests);

var_dump($completionResponses);
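The responses are fully typed model objects rather than raw arrays. As a minimal sketch, assuming the jane-php generated completion response exposes getChoices() and each choice exposes getText() (the usual jane-php getter naming, not verified against this exact release), the generated text can be printed like this:

foreach ($completionResponses as $completionResponse) {
    foreach ($completionResponse->getChoices() as $choice) {
        echo $choice->getText(), PHP_EOL;
    }
}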

ChatGPT with /v1/chat/completions:

<?php

require __DIR__ . '/vendor/autoload.php';

use Sourceability\OpenAIClient\Client;
use Sourceability\OpenAIClient\Generated\Model\ChatCompletionRequestMessage;
use Sourceability\OpenAIClient\Generated\Model\CreateChatCompletionRequest;

$apiClient = Client::create(
    apiKey: getenv('OPENAI_API_KEY')
);

$requests = [
    new CreateChatCompletionRequest(
        model: 'gpt-3.5-turbo',
        temperature: 0,
        messages: [
            new ChatCompletionRequestMessage(
                role: 'user',
                content: 'The jane php library is very useful because'
            )
        ],
    ),
    new CreateChatCompletionRequest(
        model: 'gpt-3.5-turbo',
        temperature: 0,
        messages: [
            new ChatCompletionRequestMessage(
                role: 'user',
                content: 'Symfony symfony symfony is like sourceability on a'
            )
        ],
    ),
];
$completionResponses = $apiClient->createChatCompletions($requests);

var_dump($completionResponses);
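The same idea applies to chat responses; a minimal sketch, assuming the generated chat response exposes getChoices() and each choice carries a message object with getMessage() and getContent() (again the usual jane-php getter convention, not confirmed for this exact release):

foreach ($completionResponses as $chatResponse) {
    foreach ($chatResponse->getChoices() as $choice) {
        echo $choice->getMessage()->getContent(), PHP_EOL;
    }
}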

Cost calculator

You can use ResponseCostCalculator, which relies on brick/money, to calculate the cost of a response:

use Sourceability\OpenAIClient\Pricing\ResponseCostCalculator;

$responseCostCalculator = new ResponseCostCalculator();
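// $myCompletionResponse is e.g. one of the responses returned by createCompletions() above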
$responseCost = $responseCostCalculator->calculate($myCompletionResponse);

var_dump([
    'total' => $responseCost->getTotal()->formatTo('en_US'),
    'prompt' => $responseCost->getPrompt()->formatTo('en_US'),
    'completion' => $responseCost->getCompletion()->formatTo('en_US'),
]);

array(3) {
  ["total"]=>
  string(10) "$0.0001280"
  ["prompt"]=>
  string(10) "$0.0000980"
  ["completion"]=>
  string(10) "$0.0000300"
}
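Since getTotal(), getPrompt() and getCompletion() appear to return brick/money Money objects (they are formatted with formatTo() above), costs can be aggregated across several responses with Money::plus(). A minimal sketch, assuming all responses are priced in the same currency and reusing $completionResponses from the earlier example:

use Sourceability\OpenAIClient\Pricing\ResponseCostCalculator;

$responseCostCalculator = new ResponseCostCalculator();

$totalCost = null;
foreach ($completionResponses as $completionResponse) {
    $responseCost = $responseCostCalculator->calculate($completionResponse);

    // Money::plus() requires matching currencies, which holds when everything is priced in USD
    $totalCost = $totalCost === null
        ? $responseCost->getTotal()
        : $totalCost->plus($responseCost->getTotal());
}

echo $totalCost->formatTo('en_US'), PHP_EOL;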