adachsoft/ai-integration

Unified AI tool-calling chat abstraction for PHP 8.3 with pluggable SPI providers (OpenAI, Deepseek) and a clean Public API.

Repository: gitlab.com/a.adach/AiIntegration

Package: pkg:composer/adachsoft/ai-integration


v0.7.0 2026-03-25 21:48 UTC



README

Unified AI tool-calling chat abstraction for PHP 8.3 with pluggable providers and a clean, framework-agnostic Public API.

  • Public API: simple facade to send chat messages, define tools (function-calling), and receive results.
  • Built-in providers: OpenAI and Deepseek.
  • SPI: implement your own provider by fulfilling a tiny interface and DTO set.
  • HTTP logging: optional, via a small logger interface.
  • Zero-framework: no container required; everything is manually wired with a builder.

Requirements

  • PHP 8.3+
  • ext-json, ext-mbstring (standard in most PHP installations)

Installation

composer require adachsoft/ai-integration

Quick start (Public API)

use AdachSoft\AiIntegration\PublicApi\Builder\ToolCallingChatFacadeBuilder;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\ChatMessageDto;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\ChatRoleEnum;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\Collection\ChatMessageDtoCollection;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\Collection\ToolDefinitionDtoCollection;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\ToolCallingChatRequestDto;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\ToolDefinitionDto;

$builder = ToolCallingChatFacadeBuilder::create()
    ->withOpenAi(apiKey: getenv('OPENAI_API_KEY'));

$messages = new ChatMessageDtoCollection([
    ChatMessageDto::createSystemMessage('You are a helpful assistant.'),
    ChatMessageDto::createUserMessage('Add 2 and 3 using a tool and show the token.'),
]);

$tools = new ToolDefinitionDtoCollection([
    new ToolDefinitionDto(
        name: 'sum',
        description: 'Returns JSON {result: string, token: string}',
        parametersSchema: [
            'type' => 'object',
            'properties' => [
                'a' => ['type' => 'number'],
                'b' => ['type' => 'number'],
            ],
            'required' => ['a', 'b'],
            'additionalProperties' => false,
        ],
    ),
]);

$request = new ToolCallingChatRequestDto(
    messages: $messages,
    tools: $tools,
    providerId: 'openai',
    modelId: 'gpt-4o-mini',
    parameters: [
        // Free-form provider parameters.
        // They are forwarded 1:1 to the provider payload (no defaults are injected).
        // Example (only if your model supports it):
        // 'temperature' => 0.0,
    ],
);

$facade = $builder->build();
$response = $facade->chat($request);

if ($response->result !== null) {
    echo $response->result; // final model answer (should include tool token if your prompt enforces it)
}

foreach ($response->toolCalls as $call) {
    // inspect tool calls if needed
}
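Executing the requested tools is up to you. One plain-PHP way to wire tool names to callables (a sketch, not part of the library; the handler below matches the `sum` tool defined above, and the token scheme is illustrative):

```php
// Map tool names to PHP callables. Each handler receives the decoded
// JSON arguments and returns the JSON string the model will see.
$toolHandlers = [
    'sum' => function (array $args): string {
        return json_encode([
            'result' => (string) ($args['a'] + $args['b']),
            'token'  => bin2hex(random_bytes(4)), // any token scheme works here
        ]);
    },
];

// Dispatch one call by name with its decoded arguments.
$output = $toolHandlers['sum'](['a' => 2, 'b' => 3]);
```

To continue the conversation, feed `$output` back as a tool message (see the factories in the next section) with the matching tool-call id.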

Chat messages and tool calls (Public API)

ChatMessageDto is the main DTO used to build a conversation. Its content can be either a plain string or a list of structured blocks for multimodal payloads (see the next section):

public function __construct(
    public ChatRoleEnum $role,
    public string|array $content,
    public ToolCallDtoCollection $toolCalls,
    public array $metadata = [],
) { }

To make common cases easier and to guarantee that toolCalls is never null, use the static factories:

ChatMessageDto::createSystemMessage(string|array $content, array $metadata = []): self;
ChatMessageDto::createUserMessage(string|array $content, array $metadata = []): self;
ChatMessageDto::createAssistantMessage(string|array $content, ToolCallDtoCollection $toolCalls, array $metadata = []): self;
ChatMessageDto::createToolMessage(string|array $content, ToolCallDtoCollection $toolCalls, array $metadata = []): self;

Typical patterns:

  • system/user messages: empty ToolCallDtoCollection (created by the factory),
  • assistant messages: may include tool calls when the model proposes or summarizes tool usage,
  • tool messages: should carry tool results and the corresponding ToolCallDto instances in toolCalls.
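These conventions mirror the familiar OpenAI chat shape. As plain data (illustrative arrays, not the library DTOs), one tool round-trip looks like:

```php
// Illustrative role sequence for one tool round-trip:
// system -> user -> assistant (proposes a tool call) -> tool (carries the result).
$conversation = [
    ['role' => 'system',    'content' => 'You are a helpful assistant.'],
    ['role' => 'user',      'content' => 'Add 2 and 3 using a tool.'],
    ['role' => 'assistant', 'content' => '', 'tool_calls' => [
        ['id' => 'call_1', 'name' => 'sum', 'arguments' => '{"a":2,"b":3}'],
    ]],
    ['role' => 'tool',      'content' => '{"result":"5"}', 'tool_call_id' => 'call_1'],
];

$roles = array_column($conversation, 'role');
```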

Multimodal (image + text) chat with OpenAI

For OpenAI models that support vision (for example gpt-4o), you can send multimodal content by passing a list of content blocks as the content of a ChatMessageDto. The list is forwarded 1:1 to the provider payload.

Each block in the list follows the OpenAI format, for example:

  • ['type' => 'text', 'text' => 'Describe this image']
  • ['type' => 'image_url', 'image_url' => ['url' => 'data:image/jpeg;base64,BASE64_ENCODED_IMAGE']]

A minimal example using the Public API:

use AdachSoft\AiIntegration\PublicApi\Builder\ToolCallingChatFacadeBuilder;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\ChatMessageDto;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\Collection\ChatMessageDtoCollection;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\Collection\ToolDefinitionDtoCollection;
use AdachSoft\AiIntegration\PublicApi\ToolCalling\Dto\ToolCallingChatRequestDto;

$builder = ToolCallingChatFacadeBuilder::create()
    ->withOpenAi(apiKey: getenv('OPENAI_API_KEY'));

// Load a JPEG binary using any method you like (GD, Imagick, ...)
$imageBinary = file_get_contents('/path/to/image.jpg'); // placeholder path

// Encode it as a data URL understood by OpenAI vision models
$dataUrl = 'data:image/jpeg;base64,' . base64_encode($imageBinary);

$messages = new ChatMessageDtoCollection([
    ChatMessageDto::createUserMessage([
        [
            'type' => 'text',
            'text' => 'Read the text from this image and respond with the exact text only.',
        ],
        [
            'type' => 'image_url',
            'image_url' => [
                'url' => $dataUrl,
            ],
        ],
    ]),
]);

$request = new ToolCallingChatRequestDto(
    messages: $messages,
    tools: new ToolDefinitionDtoCollection([]), // no tools needed for pure vision OCR
    providerId: 'openai',
    modelId: 'gpt-4o', // or another OpenAI multimodal model
    parameters: [
        'max_tokens' => 100,
    ],
);

$facade = $builder->build();
$response = $facade->chat($request);

echo $response->result; // should contain the text read from the image

Notes:

  • The library does not generate images; you are responsible for providing a valid JPEG/PNG binary and building the data URL.
  • The content array is passed as-is to the OpenAI-compatible payload builder, so you must follow the provider's multimodal format.
  • Always use a model that explicitly supports vision (for example gpt-4o) when sending images.
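Since building the data URL is your responsibility, a small helper may be handy (a sketch; the MIME allowlist is an assumption based on the JPEG/PNG note above):

```php
// Build a base64 data URL from an image binary, restricted to the
// formats mentioned above (JPEG/PNG).
function buildImageDataUrl(string $binary, string $mime): string
{
    if (!in_array($mime, ['image/jpeg', 'image/png'], true)) {
        throw new InvalidArgumentException("Unsupported MIME type: {$mime}");
    }

    return 'data:' . $mime . ';base64,' . base64_encode($binary);
}

$dataUrl = buildImageDataUrl("\xFF\xD8\xFF", 'image/jpeg'); // tiny fake JPEG header
```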

Generation parameters (parameters)

Generation settings are a free-form associative array and are provider-specific. The library forwards them 1:1 to the provider payload.

Important:

  • Do not rely on implicit defaults (e.g. temperature). If you do not pass a key, it is not sent.
  • Passing unsupported parameters may result in a provider error.
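Conceptually (a sketch, not the library's actual payload builder), the forwarding behaves like a plain array merge — only keys you pass appear in the payload:

```php
// Sketch of 1:1 parameter forwarding: the payload contains exactly the
// keys the caller supplied; no defaults (e.g. temperature) are injected.
$base = [
    'model'    => 'gpt-4o-mini',
    'messages' => [['role' => 'user', 'content' => 'Hi']],
];
$parameters = ['max_tokens' => 100]; // caller-supplied, free-form

$payload = array_merge($base, $parameters);
```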

Providers

If you pass modelId as null, each provider uses its default. For OpenAI, a safe starter is gpt-4o-mini; for Deepseek: deepseek-chat.
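If you prefer to be explicit rather than rely on provider defaults, a simple lookup (model ids as suggested above) works:

```php
// Explicit starter models per provider, matching the suggestions above.
$defaultModels = [
    'openai'   => 'gpt-4o-mini',
    'deepseek' => 'deepseek-chat',
];

$providerId = 'deepseek';
$modelId = $defaultModels[$providerId] ?? null; // null lets the provider decide
```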

HTTP logging and CLI example

You can inject your own HTTP traffic logger via the builder. A convenient demonstration script is included:

php bin/test-ai-tool-chat.php --provider=openai --model=gpt-4o-mini --show-meta=on --log-http=1 --log-headers=0

The script runs three scenarios (a smoke test and two tool-calling flows) and optionally pretty-prints HTTP request/response payloads.

SPI (Service Provider Interface)

Do not depend on internals. External integrations should use only:

  • AdachSoft\AiIntegration\PublicApi - the facade and DTOs to call the model.
  • AdachSoft\AiIntegration\Spi - the small interface and DTOs to implement your own provider.

In addition to the core Public API and SPI, the library exposes a reusable, public module under src/Support/OpenAiCompatible/* with DTOs, payload builders and HTTP helpers for OpenAI-compatible chat.completions endpoints. If you already have an OpenAI-compatible HTTP API, you can use the ready-to-use SPI provider OpenAiCompatibleToolCallingChatSpi to bridge your endpoint with the domain model.

All other namespaces (Application, Domain, Infrastructure) are internal and may change at any time.

For a complete SPI guide (interface, DTOs, exceptions, examples), see:

  • docs/SPI.md

Production verification of built-in SPI providers

When you implement a new built-in ToolCalling provider inside this library (for example, a new provider wired into ToolCallingChatFacadeBuilder), you must add a production test that verifies the provider end-to-end using the shared base test:

  • Tests\Production\ToolCalling\AbstractToolCallingProviderProductionTestCase

To add a new provider production test:

  1. Ensure your provider is exposed on the builder, for example:
    • ToolCallingChatFacadeBuilder::withMyProvider(string $apiKey): self.
  2. Create tests/Production/ToolCalling/MyProviderProductionTest.php:

    use AdachSoft\AiIntegration\PublicApi\Builder\ToolCallingChatFacadeBuilder;
    use PHPUnit\Framework\Attributes\Group;
    
    #[Group('external')]
    final class MyProviderProductionTest extends AbstractToolCallingProviderProductionTestCase
    {
        protected function getProviderId(): string
        {
            return 'my-provider';
        }
    
        protected function getApiKeyEnvName(): string
        {
            return 'MY_PROVIDER_API_KEY';
        }
    
        protected function getDefaultModelId(): string
        {
            return 'my-model-id';
        }
    
        protected function configureBuilder(ToolCallingChatFacadeBuilder $builder, string $apiKey): ToolCallingChatFacadeBuilder
        {
            return $builder->withMyProvider(apiKey: $apiKey);
        }
    }
    
  3. Provide a real API key via environment variable (MY_PROVIDER_API_KEY) or .env file. The base test will:
    • resolve the API key from env/.env using resolveApiKeyFromEnvironment(),
    • resolve the model id from env (for example, MY_PROVIDER_MODEL) or getDefaultModelId().
  4. Run production tests for the provider (for example):
    vendor/bin/phpunit tests/Production/ToolCalling/MyProviderProductionTest.php --group external
    

The base test will validate both:

  • a simple smoke chat that must return a non-empty final result, and
  • a minimal tool-calling scenario (sum of 2 and 3) that must:
    • produce at least one tool call with non-empty arguments,
    • keep a non-empty toolCallId,
    • return a final answer that includes 5.
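Conceptually (the real assertions live in the base test case), the tool-calling checks amount to the following, shown here against a fake response shape with illustrative field names:

```php
// Sketch of the base test's tool-calling checks against a fake response
// (plain arrays; field names illustrative, not the library DTOs).
$response = [
    'result'    => 'The sum of 2 and 3 is 5.',
    'toolCalls' => [
        ['toolCallId' => 'call_1', 'arguments' => '{"a":2,"b":3}'],
    ],
];

$hasValidToolCall = count($response['toolCalls']) >= 1
    && $response['toolCalls'][0]['arguments'] !== ''
    && $response['toolCalls'][0]['toolCallId'] !== '';
$finalMentionsFive = str_contains($response['result'], '5');
```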

Testing

  • Run unit and integration tests:
    composer test
    
  • Production checks (require API keys): see tests/Production and bin/test-ai-tool-chat.php.