rmh/vertex

A provider for Prism adding support for Google Vertex AI.

Package info

github.com/RanaMoizHaider/vertex

pkg:composer/rmh/vertex

Statistics

Installs: 90

Dependents: 0

Suggesters: 0

Stars: 3

Open Issues: 0

v1.1.0 2026-03-05 10:03 UTC

This package is auto-updated.

Last update: 2026-04-05 10:30:02 UTC


README


Prism Vertex

A standalone Google Vertex AI provider for the Prism PHP framework. Access Google Gemini and partner model families through a single configuration and unified interface.

Installation

composer require rmh/vertex

Configuration

Add the following to your Prism configuration (config/prism.php):

Standard mode (project + location required)

'vertex' => [
    'project_id'  => env('VERTEX_PROJECT_ID'),
    'location'    => env('VERTEX_LOCATION', 'us-central1'),
    // Auth - choose one (or omit both to use Application Default Credentials):
    'credentials' => env('VERTEX_CREDENTIALS'), // path to service-account.json
    // 'api_key' => env('VERTEX_API_KEY'),
],

Express mode (API key only, Google models only)

When project_id and location are omitted, the package automatically uses Vertex AI Express Mode endpoints:

'vertex' => [
    'api_key' => env('VERTEX_API_KEY'),
    // project_id and location omitted — triggers Express mode
],
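The mode selection can be pictured as follows (a hypothetical sketch of the decision, not the package's actual internals; `resolveBaseUrl` and the exact URL shapes are illustrative):

```php
<?php
// Hypothetical sketch: which endpoint family a config resolves to.
// Standard mode needs project_id + location; Express mode kicks in
// when both are absent and only an api_key is present.
function resolveBaseUrl(array $config): string
{
    $projectId = $config['project_id'] ?? null;
    $location  = $config['location'] ?? null;

    if ($projectId === null && $location === null) {
        // Express mode: global endpoint, Google models only
        return 'https://aiplatform.googleapis.com/v1/publishers/google';
    }

    // Standard mode: regional endpoint scoped to the project
    return sprintf(
        'https://%s-aiplatform.googleapis.com/v1/projects/%s/locations/%s',
        $location, $projectId, $location
    );
}

// Express mode config resolves to the global endpoint
echo resolveBaseUrl(['api_key' => 'AIza...']), "\n";
// Standard mode config resolves to a regional endpoint
echo resolveBaseUrl(['project_id' => 'my-project', 'location' => 'us-central1']), "\n";
```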

Important

Express mode only supports Google Gemini models. Using partner model providers in Express mode will throw an exception.

Usage

Use the provider constants from Prism\Vertex\Enums\Vertex with the Prism facade:

Text Generation

use Prism\Prism\Prism;
use Prism\Vertex\Enums\Vertex;

// Google Gemini
$response = Prism::text()
    ->using(Vertex::Gemini, 'gemini-2.5-flash')
    ->withPrompt('Explain quantum computing in simple terms')
    ->asText();

echo $response->text;

// Anthropic Claude on Vertex AI
$response = Prism::text()
    ->using(Vertex::Anthropic, 'claude-3-5-sonnet@20241022')
    ->withPrompt('Explain quantum computing in simple terms')
    ->asText();

// Meta Llama on Vertex AI
$response = Prism::text()
    ->using(Vertex::Meta, 'llama-4-scout-17b-16e-instruct-maas')
    ->withPrompt('Explain quantum computing in simple terms')
    ->asText();

Structured Output (JSON)

use Prism\Prism\Prism;
use Prism\Vertex\Enums\Vertex;
use Prism\Prism\Schema\ObjectSchema;
use Prism\Prism\Schema\StringSchema;
use Prism\Prism\Schema\ArraySchema;

$schema = new ObjectSchema(
    name: 'languages',
    description: 'Top programming languages',
    properties: [
        new ArraySchema(
            'languages',
            'List of programming languages',
            items: new ObjectSchema(
                name: 'language',
                description: 'Programming language details',
                properties: [
                    new StringSchema('name', 'The language name'),
                    new StringSchema('popularity', 'Popularity description'),
                ]
            )
        )
    ]
);

$response = Prism::structured()
    ->using(Vertex::Gemini, 'gemini-2.5-flash')
    ->withSchema($schema)
    ->withPrompt('List the top 3 programming languages')
    ->asStructured();

$data = $response->structured;
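`$response->structured` is a plain PHP array decoded from the model's JSON, so it can be consumed directly. A small sketch of consuming the shape defined by the schema above (the data here is illustrative, not real model output):

```php
<?php
// Illustrative data in the shape produced by the schema above.
$data = [
    'languages' => [
        ['name' => 'Python', 'popularity' => 'Very popular'],
        ['name' => 'JavaScript', 'popularity' => 'Ubiquitous on the web'],
    ],
];

foreach ($data['languages'] as $language) {
    echo "{$language['name']}: {$language['popularity']}\n";
}
```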

Embeddings

use Prism\Prism\Prism;
use Prism\Vertex\Enums\Vertex;

$response = Prism::embeddings()
    ->using(Vertex::Gemini, 'text-embedding-005')
    ->fromInput('The sky is blue')
    ->asEmbeddings();

$embeddings = $response->embeddings;
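A common next step with embedding vectors is similarity comparison. A self-contained cosine-similarity helper (plain PHP, independent of Prism):

```php
<?php
// Cosine similarity between two equal-length vectors:
// dot(a, b) / (|a| * |b|), ranging from -1 to 1.
function cosineSimilarity(array $a, array $b): float
{
    $dot = $normA = $normB = 0.0;
    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }
    return $dot / (sqrt($normA) * sqrt($normB));
}

echo cosineSimilarity([1.0, 0.0], [1.0, 0.0]), "\n"; // identical vectors: 1
echo cosineSimilarity([1.0, 0.0], [0.0, 1.0]), "\n"; // orthogonal vectors: 0
```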

Streaming

Warning

Streaming is not yet supported. This feature is planned for a future release.

Supported Providers

All providers share one vertex config block. The provider constant determines which API schema and publisher endpoint is used:

Constant          | Publisher | Example Models                                        | Schema
------------------|-----------|-------------------------------------------------------|----------
Vertex::Gemini    | google    | gemini-2.5-flash, text-embedding-005                  | Gemini
Vertex::Anthropic | anthropic | claude-3-5-sonnet@20241022, claude-3-5-haiku@20241022 | Anthropic
Vertex::Mistral   | mistralai | mistral-small-2503, codestral-2501                    | OpenAI
Vertex::Meta      | meta      | llama-4-scout-17b-16e-instruct-maas                   | OpenAI
Vertex::DeepSeek  | deepseek  | deepseek-v3-0324-maas                                 | OpenAI
Vertex::AI21      | ai21      | jamba-1.5-mini@001, jamba-1.5-large@001               | OpenAI
Vertex::Kimi      | kimi      | kimi-k2-0711-maas                                     | OpenAI
Vertex::MiniMax   | minimax   | minimax-m1-40k-0709-maas                              | OpenAI
Vertex::OpenAI    | openai    | gpt-oss-4o-mini-maas                                  | OpenAI
Vertex::Qwen      | qwen      | qwen2.5-72b-instruct-maas                             | OpenAI
Vertex::ZAI       | zaiorg    | glm-4-plus-maas                                       | OpenAI

API Schemas

Prism Vertex uses three API schemas to handle the different formats used by Vertex AI publishers:

Schema    | Text | Structured | Embeddings
----------|------|------------|-----------
Gemini    | yes  | yes        | yes
Anthropic | yes  | yes        | no
OpenAI    | yes  | yes        | no

  • Gemini — Google's native generateContent / predict endpoints.
  • Anthropic — Uses :rawPredict with the Anthropic Messages API format (anthropic_version: vertex-2023-10-16).
  • OpenAI — Uses :rawPredict or :chatCompletions with OpenAI-compatible format for partner models (Mistral, Meta, DeepSeek, AI21, Kimi, MiniMax, OpenAI-OSS, Qwen, ZAI).
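The constant-to-schema mapping in the table above can be pictured as a simple match on the publisher name (an illustration of the selection logic, not the package's actual code):

```php
<?php
// Illustrative mapping from publisher name to API schema,
// mirroring the table above.
function schemaFor(string $publisher): string
{
    return match ($publisher) {
        'google'    => 'Gemini',
        'anthropic' => 'Anthropic',
        default     => 'OpenAI', // all other partner (MaaS) publishers
    };
}

echo schemaFor('google'), "\n";    // Gemini
echo schemaFor('mistralai'), "\n"; // OpenAI
```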

The provider constant automatically selects the correct schema. You can override it via withProviderOptions():

use Prism\Vertex\Enums\VertexSchema;

$response = Prism::text()
    ->using(Vertex::Gemini, 'some-model')
    ->withProviderOptions(['apiSchema' => VertexSchema::Anthropic])
    ->withPrompt('Hello')
    ->asText();

Structured Output Details

Each schema handles structured output differently:

  • Gemini — Native structured output via response_mime_type: application/json and response_schema. The model is constrained to produce valid JSON matching your schema.
  • OpenAI — Uses response_format: { type: "json_object" } combined with a schema instruction message. All MaaS partner models support this mode.
  • Anthropic — No native JSON mode. A prompt is appended instructing the model to respond with JSON conforming to the provided schema.

You can override the schema instruction prompt for Anthropic and OpenAI schemas using withProviderOptions():

Prism::structured()
    ->using(Vertex::Anthropic, 'claude-3-5-sonnet@20241022')
    ->withSchema($schema)
    ->withProviderOptions([
        'jsonModeMessage' => 'My custom JSON instruction message',
    ])
    ->withPrompt('My prompt')
    ->asStructured();

Authentication

API Key

Set the api_key config option. The key is sent as a ?key= query parameter on every request. This works for both Standard and Express modes.
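How the key ends up on the request can be pictured as follows (a hypothetical sketch; the package's internals may differ):

```php
<?php
// Illustrative: appending the API key as a ?key= query parameter.
function withApiKey(string $url, string $apiKey): string
{
    $separator = str_contains($url, '?') ? '&' : '?';
    return $url . $separator . http_build_query(['key' => $apiKey]);
}

echo withApiKey('https://aiplatform.googleapis.com/v1/...:generateContent', 'AIza...'), "\n";
```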

Service Account JSON file

Set the credentials config option to the path of your service account JSON key file. The package uses google/auth to obtain a Bearer token automatically.

Application Default Credentials (ADC)

When neither credentials nor api_key is set in Standard mode, the package automatically falls back to Application Default Credentials. This is the recommended approach for code running on Google Cloud Platform: no credentials need to be configured at all.

ADC resolves credentials in this order:

  1. GOOGLE_APPLICATION_CREDENTIALS environment variable (path to a JSON key file)
  2. gcloud auth application-default login (local development)
  3. Attached service account (GCE, GKE, Cloud Run, Cloud Functions, etc.)

For example:

'vertex' => [
    'project_id' => env('VERTEX_PROJECT_ID'),
    'location'   => env('VERTEX_LOCATION', 'us-central1'),
    // no api_key or credentials - ADC handles auth automatically
],

Per-Provider Config Overrides

All providers read from the shared vertex config. If you need different credentials for a specific provider, add a per-provider config block — it will override the shared config:

'vertex' => [
    'project_id'  => env('VERTEX_PROJECT_ID'),
    'location'    => env('VERTEX_LOCATION', 'us-central1'),
    'credentials' => env('VERTEX_CREDENTIALS'),
],

// Override for Anthropic only (e.g. different region)
'vertex-anthropic' => [
    'location' => 'europe-west1',
],
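The override semantics can be pictured as a shallow merge of the provider-specific block over the shared block (an illustration, not the package's exact code):

```php
<?php
// Shared config plus a per-provider override block.
$shared = [
    'project_id'  => 'my-project',
    'location'    => 'us-central1',
    'credentials' => '/path/to/service-account.json',
];
$anthropicOverride = ['location' => 'europe-west1'];

// Later keys win: only 'location' is replaced, the rest is inherited.
$effective = array_merge($shared, $anthropicOverride);

echo $effective['location'], "\n";   // europe-west1
echo $effective['project_id'], "\n"; // my-project
```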

Requirements

License

The MIT License (MIT). Please see License File for more information.