minhyung/error-solutions-openai

An OpenAI API-compatible solution provider for spatie/error-solutions.

Package info

github.com/overworks/error-solutions-openai

pkg:composer/minhyung/error-solutions-openai

README

minhyung/error-solutions-openai provides a small replacement for the OpenAI solution classes in spatie/error-solutions.

It keeps Spatie's existing solution provider flow, but lets you use modern OpenAI models and OpenAI API-compatible providers such as OpenRouter, vLLM, or Ollama-compatible servers.

Installation

composer require minhyung/error-solutions-openai

Publish the optional config file:

php artisan vendor:publish --tag="error-solutions-openai-config"

Configuration

Set your API key and model:

ERROR_SOLUTIONS_OPENAI_KEY=sk-...
ERROR_SOLUTIONS_OPENAI_MODEL=gpt-5.4-mini

For an OpenAI API-compatible provider, set a custom base URL:

ERROR_SOLUTIONS_OPENAI_KEY=...
ERROR_SOLUTIONS_OPENAI_BASE_URL=https://openrouter.ai/api/v1
ERROR_SOLUTIONS_OPENAI_MODEL=openai/gpt-5.4-mini

Extra provider headers can be configured in config/error-solutions-openai.php:

'headers' => [
    'HTTP-Referer' => env('APP_URL'),
    'X-Title' => env('APP_NAME'),
],

If your provider expects a token limit parameter other than max_tokens, set:

ERROR_SOLUTIONS_OPENAI_TOKEN_LIMIT_PARAMETER=max_completion_tokens
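Putting these options together, the published config file might look roughly like the sketch below. This is illustrative only; the exact keys are whatever the package actually ships in `config/error-solutions-openai.php`, so check the published file.

```php
<?php

// config/error-solutions-openai.php — illustrative sketch, not the
// package's verbatim config. Key names are assumptions based on the
// environment variables documented above.
return [
    'api_key' => env('ERROR_SOLUTIONS_OPENAI_KEY'),

    // Any OpenAI API-compatible endpoint, e.g. OpenRouter or a local vLLM server.
    'base_url' => env('ERROR_SOLUTIONS_OPENAI_BASE_URL', 'https://api.openai.com/v1'),

    'model' => env('ERROR_SOLUTIONS_OPENAI_MODEL', 'gpt-5.4-mini'),

    // Extra headers sent with every request (useful for OpenRouter attribution).
    'headers' => [
        'HTTP-Referer' => env('APP_URL'),
        'X-Title' => env('APP_NAME'),
    ],

    // Some providers reject `max_tokens` and expect `max_completion_tokens`.
    'token_limit_parameter' => env('ERROR_SOLUTIONS_OPENAI_TOKEN_LIMIT_PARAMETER', 'max_tokens'),
];
```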

Usage

Register the provider in Spatie's existing config/error-solutions.php:

use Minhyung\ErrorSolutionsOpenAI\OpenAiSolutionProvider;

return [
    'solution_providers' => [
        'php',
        'laravel',
        OpenAiSolutionProvider::class,
    ],
];

You can also instantiate it directly:

use Minhyung\ErrorSolutionsOpenAI\OpenAiSolutionProvider;

$provider = new OpenAiSolutionProvider(
    apiKey: env('ERROR_SOLUTIONS_OPENAI_KEY'),
    model: 'gpt-5.4-mini',
);

The provider uses Chat Completions because it is the broadest common API across OpenAI-compatible model providers.
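For reference, a Chat Completions request body built by a provider like this would look roughly as follows. This is a hand-rolled sketch, not the package's actual internals: the function name, prompt text, and default token limit are all assumptions. It mainly shows why the token limit parameter name is configurable.

```php
<?php

// Illustrative sketch: assemble a Chat Completions payload the way an
// OpenAI API-compatible solution provider might. All names here are
// assumptions, not the package's real internals.
function buildChatCompletionPayload(
    string $model,
    string $errorMessage,
    string $tokenLimitParameter = 'max_tokens',
    int $tokenLimit = 1024
): array {
    return [
        'model' => $model,
        'messages' => [
            [
                'role' => 'system',
                'content' => 'You suggest concise fixes for PHP exceptions.',
            ],
            [
                'role' => 'user',
                'content' => "Suggest a possible solution for this error:\n\n" . $errorMessage,
            ],
        ],
        // The key is configurable because some providers reject
        // `max_tokens` and expect `max_completion_tokens` instead.
        $tokenLimitParameter => $tokenLimit,
    ];
}

$payload = buildChatCompletionPayload(
    'gpt-5.4-mini',
    'Call to undefined method App\Models\User::fullname()',
    'max_completion_tokens'
);

echo json_encode($payload, JSON_PRETTY_PRINT), PHP_EOL;
```

The payload is then POSTed to `{base_url}/chat/completions` with the configured API key and extra headers, which is the part every OpenAI-compatible provider shares.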

Testing

composer test
composer analyse

Changelog

Please see the GitHub releases for more information on what has changed.

Contributing

Pull requests are welcome. Please run the test suite and static analysis before opening a pull request.

Security

If you discover a security vulnerability, please report it privately by emailing urlinee@gmail.com instead of opening a public issue.

Credits

License

The MIT License (MIT). Please see License File for more information.