takaaki-mizuno / llm-json-adapter
0.1.0
2024-04-28 02:52 UTC
Requires
- php: ^8.2
- aws/aws-sdk-php: ^3.305
- guzzlehttp/guzzle: ^7.8
- orhanerday/open-ai: ^5.1
- swaggest/json-schema: ^0.12.42
Requires (Dev)
- friendsofphp/php-cs-fixer: ^3.54
- phpstan/phpstan: ^1.10
- phpunit/phpunit: ^10.5
- vlucas/phpdotenv: ^5.6
This package is auto-updated.
Last update: 2024-12-28 04:59:27 UTC
README
What is it?
When calling LLMs from an application, you often want the output as JSON. OpenAI's GPT API has a mechanism called Function Calling, which can return JSON, but Google's Gemini does not appear to offer an equivalent.
I therefore created a wrapper library that lets you switch between LLMs and still get results as JSON. This library can do the following:
- Define the result you want in JSON Schema
- Switch between LLMs (currently OpenAI's GPT, Google's Gemini, Amazon Bedrock, and Ollama)
- Retry a specified number of times if JSON retrieval fails
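The schema passed to the library follows the standard JSON Schema format. As a hedged sketch, a schema definition might look like the plain PHP array below; the property names (`title`, `summary`) are made up for illustration and are not part of the library's API.

```php
<?php
// Illustrative JSON Schema definition as a PHP array. The field names
// "title" and "summary" are hypothetical examples, not library requirements.
$schema = [
    'type' => 'object',
    'properties' => [
        'title'   => ['type' => 'string'],
        'summary' => ['type' => 'string'],
    ],
    'required' => ['title', 'summary'],
];

// Encode it to see the JSON Schema document the LLM is asked to satisfy.
echo json_encode($schema, JSON_PRETTY_PRINT);
```

An array like this could then be supplied where the examples below show the `schema: [JSON SCHEMA]` placeholder.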
Installation
```shell
composer require takaaki-mizuno/llm-json-adapter
```
How to use
Use the following code to get results in JSON.
OpenAI
```php
$instance = new LLMJsonAdapter(
    providerName: "openai",
    attributes: [
        "api_key" => "[API-KEY]",
        "model" => "gpt-3.5-turbo",
    ],
    maximumRetryCount: 3,
    defaultLanguage: "en"
);

$response = new \TakaakiMizuno\LLMJsonAdapter\Models\Response(
    name: "response data name",
    description: "response data description",
    schema: [JSON SCHEMA]
);
```
Google Gemini
```php
$instance = new LLMJsonAdapter(
    providerName: "google",
    attributes: [
        "api_key" => "[API-KEY]",
        "model" => "gemini-1.5-pro-latest",
    ],
    maximumRetryCount: 3,
    defaultLanguage: "en"
);

$response = new \TakaakiMizuno\LLMJsonAdapter\Models\Response(
    name: "response data name",
    description: "response data description",
    schema: [JSON SCHEMA]
);
```
Amazon Bedrock
```php
$instance = new LLMJsonAdapter(
    providerName: "bedrock",
    attributes: [
        'accessKeyId' => '[ACCESS-KEY]',
        'secretAccessKey' => '[SECRET-KEY]',
        'model' => 'anthropic.claude-3-haiku-20240307-v1:0',
    ],
    maximumRetryCount: 3,
    defaultLanguage: "en"
);

$response = new \TakaakiMizuno\LLMJsonAdapter\Models\Response(
    name: "response data name",
    description: "response data description",
    schema: [JSON SCHEMA]
);
```
Ollama
```php
$instance = new LLMJsonAdapter(
    providerName: "ollama",
    attributes: [
        'url' => "http://localhost:11434",
        'model' => 'llama3',
    ],
    maximumRetryCount: 3,
    defaultLanguage: "en"
);

$response = new \TakaakiMizuno\LLMJsonAdapter\Models\Response(
    name: "response data name",
    description: "response data description",
    schema: [JSON SCHEMA]
);
```
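The retry behavior mentioned above ("retry a specified number of times if JSON retrieval fails") can be illustrated with a minimal, self-contained sketch. This is not the library's actual implementation; `fetchJsonWithRetry` and the `$callLlm` callable are hypothetical stand-ins for the adapter's internal provider call.

```php
<?php
// Illustrative sketch of a JSON-retrieval retry loop (NOT the library's
// internals). $callLlm stands in for a provider request that returns raw text.
function fetchJsonWithRetry(callable $callLlm, int $maxRetries): ?array
{
    for ($attempt = 0; $attempt < $maxRetries; $attempt++) {
        $raw = $callLlm();
        $decoded = json_decode($raw, true);
        if (json_last_error() === JSON_ERROR_NONE && is_array($decoded)) {
            return $decoded; // valid JSON object: stop retrying
        }
        // invalid JSON: fall through and try again
    }
    return null; // every attempt produced invalid JSON
}

// Stubbed "LLM" that returns garbage on the first call, valid JSON after.
$calls = 0;
$stub = function () use (&$calls): string {
    $calls++;
    return $calls < 2 ? 'not json' : '{"name":"example"}';
};

$result = fetchJsonWithRetry($stub, 3); // returns ['name' => 'example']
```

With `maximumRetryCount: 3`, the adapter would similarly give up after three failed attempts rather than looping forever.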