janzaba / langfuse
A PHP library for interacting with the Langfuse API
0.0.4
2024-11-15 22:25 UTC
Requires
- php: >=8.1
- guzzlehttp/guzzle: ^7.0
- openai-php/client: ^0.10.3
- ramsey/uuid: ^4.7
- webmozart/assert: ^1.11
Requires (Dev)
- phpunit/phpunit: ^9.0
README
Introduction
This library provides wrapper functions for using Langfuse LLM monitoring in your application. It was built for Symfony but can be used in any PHP application.
Installation
Install the library and required dependencies via Composer:
composer require janzaba/langfuse
Configuration in Symfony
Step 1: Define Environment Variables
In your .env file, add your Langfuse PUBLIC_KEY and SECRET_KEY:

LANGFUSE_PUBLIC_KEY=your-public-key
LANGFUSE_SECRET_KEY=your-secret-key
Step 2: Register Services
In your config/services.yaml, add the following service definitions:

parameters:
    langfuse_config:
        public_key: '%env(LANGFUSE_PUBLIC_KEY)%'
        secret_key: '%env(LANGFUSE_SECRET_KEY)%'
        # Optional:
        # langfuse_base_uri: 'https://custom.langfuse.endpoint/'

services:
    Langfuse\Config\Config:
        class: Langfuse\Config\Config
        arguments:
            - '%langfuse_config%'
        public: false

    Langfuse\Client\LangfuseClient:
        arguments:
            $config: '@Langfuse\Config\Config'

    Langfuse\LangfuseManager:
        arguments:
            $langfuseClient: '@Langfuse\Client\LangfuseClient'
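If you are not using Symfony, you can wire the same objects up by hand. The following is a minimal sketch, assuming the constructor arguments implied by the service definitions above and that the environment variables are set:

<?php

require __DIR__ . '/vendor/autoload.php';

use Langfuse\Client\LangfuseClient;
use Langfuse\Config\Config;
use Langfuse\LangfuseManager;

// Build the same dependency chain the container definitions above describe.
// The config keys mirror the langfuse_config parameter block.
$config = new Config([
    'public_key' => getenv('LANGFUSE_PUBLIC_KEY'),
    'secret_key' => getenv('LANGFUSE_SECRET_KEY'),
    // Optional:
    // 'langfuse_base_uri' => 'https://custom.langfuse.endpoint/',
]);

$client = new LangfuseClient($config);
$manager = new LangfuseManager($client);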
Step 3: Use the OpenAI Client in Your Services
Now you can wrap your code with the manager's helper methods.
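In a Symfony service with autowiring enabled, the manager is typically received via constructor injection. A minimal sketch (the class and namespace below are purely illustrative):

<?php

namespace App\Service;

use Langfuse\LangfuseManager;

class ExampleChatService
{
    public function __construct(
        private readonly LangfuseManager $langfuseManager,
    ) {
    }

    // The helper methods shown below are called on $this->langfuseManager.
}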
Trace
$this->langfuseManager->withTrace(
    'Trace name',
    ['operation' => 'example operation name'],
    function () {
        // Your code here
    }
);
Generation
Inside a trace you can record an LLM generation.
$answer = $this->langfuseManager->withGeneration(
    'prompt name',
    'gpt-4o-mini',
    $prompt,
    function () use ($prompt) {
        return $this->openAIClient->chat()->create([
            'model' => 'gpt-4o-mini',
            'messages' => $prompt,
        ]);
    }
);
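The two helpers compose, so a generation can be recorded inside a trace. A sketch of that pattern; since the trace example above does not show withTrace returning a value, the result is captured by reference, and the trace and prompt names are illustrative:

$answer = null;
$this->langfuseManager->withTrace(
    'chat completion',
    ['operation' => 'answer user question'],
    function () use ($prompt, &$answer) {
        // The generation is recorded within the surrounding trace.
        $answer = $this->langfuseManager->withGeneration(
            'answer prompt',
            'gpt-4o-mini',
            $prompt,
            function () use ($prompt) {
                return $this->openAIClient->chat()->create([
                    'model' => 'gpt-4o-mini',
                    'messages' => $prompt,
                ]);
            }
        );
    }
);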
Contributing
Contributions are welcome! Please submit a pull request or open an issue for any improvements or bugs.
License
This project is licensed under the MIT License.