1tomany / llm-sdk
A single, unified, framework-independent library for integration with many popular AI platforms and large language models
Requires
- php: >=8.4
- ext-fileinfo: *
- fakerphp/faker: ^1.24
- phpdocumentor/reflection-docblock: ^5.6
- psr/container: ^2.0
- symfony/http-client: ^7.2|^8.0
- symfony/property-access: ^7.2|^8.0
- symfony/property-info: ^7.2|^8.0
- symfony/serializer: ^7.2|^8.0
Requires (Dev)
- friendsofphp/php-cs-fixer: ^3.93
- phpstan/phpstan: ^2.1
- phpunit/phpunit: ^12.5
README
This library provides a single, unified, framework-independent interface for integrating with several popular AI platforms and large language models.
Installation
Install the library using Composer:
composer require 1tomany/llm-sdk
Usage
There are two ways to use this library:
- **Direct**: Instantiate the AI client you wish to use and send a request object to it. This method is easier to use, but at the cost of making your application less flexible and harder to test.
- **Actions**: Register the clients you wish to use with a `OneToMany\LlmSdk\Factory\ClientFactory` instance, inject that instance into each action you wish to take, and interact with the action instead of the client directly.
Note: A Symfony bundle is available if you wish to integrate this library into your Symfony applications with autowiring and configuration support.
I learn best by looking at actual code samples, so let's take a look at the two methods first.
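The two methods above can be sketched as follows. This is a minimal, illustrative sketch: the `ClientFactory` class name comes from this README, but the client class, the request object, and the method names (`generate()`, `execute()`, and so on) are hypothetical stand-ins for the library's actual API.

```php
<?php

use OneToMany\LlmSdk\Factory\ClientFactory;

// 1. Direct: instantiate a client and send a request object to it.
//    Simple, but couples your code to a concrete client class.
$client = new OpenAIClient(apiKey: getenv('OPENAI_API_KEY')); // hypothetical class
$response = $client->generate($request);                      // hypothetical method

// 2. Actions: register clients with a ClientFactory, inject the factory
//    into an action, and call the action instead of the client. The action
//    resolves the right client at runtime, so it is easy to swap in the
//    Mock platform in tests.
$factory = new ClientFactory();
$action = new GenerateOutputAction($factory);                 // hypothetical action
$response = $action->execute($request);
```

The second style is what makes the Mock platform useful: because your code depends on the factory rather than a concrete client, a test can register the mock client and exercise the same action without any network calls.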
Examples
- `examples/files/upload.php`: Uploads a file to an LLM vendor
- `examples/queries/embed.php`: Embeds content into a high-dimensionality vector
- `examples/queries/generate.php`: Generates output from a prompt sent to an LLM
Supported platforms
- Anthropic
- Gemini
- Mock
- OpenAI
Platform feature support
Note: Each platform refers to running model inference differently: OpenAI uses the word "Responses" while Gemini uses the word "Content". I've decided the word "Query" is the most succinct term to describe interacting with an LLM. The "Embeddings" and "Queries" sections below refer to the ability to compile a query and use it to generate output from an LLM.
| Feature | Anthropic | Gemini | Mock | OpenAI |
|---|---|---|---|---|
| **Batches** | | | | |
| Create | ❌ | ✅ | ✅ | ✅ |
| Read | ❌ | ✅ | ✅ | ✅ |
| Cancel | ❌ | ❌ | ❌ | ❌ |
| **Embeddings** | | | | |
| Compile | ❌ | ✅ | ✅ | ✅ |
| Embed Content | ❌ | ✅ | ✅ | ✅ |
| **Files** | | | | |
| Upload | ✅ | ✅ | ✅ | ✅ |
| Read | ❌ | ❌ | ❌ | ❌ |
| List | ❌ | ❌ | ❌ | ❌ |
| Download | ❌ | ❌ | ❌ | ❌ |
| Delete | ✅ | ✅ | ✅ | ✅ |
| **Queries** | | | | |
| Compile | ❌ | ✅ | ✅ | ✅ |
| Generate Output | ❌ | ✅ | ✅ | ✅ |
Credits
License
The MIT License