llm-speak / ollama
A Laravel package for integrating local Ollama models into LLMSpeak
0.3.3
2025-07-14 19:47 UTC
Requires
- php: ^8.2
README
```php
use LLMSpeak\Ollama\Support\Facades\Ollama;

// Chat completions
Ollama::messages()            // <--- OllamaMessagesAPIRepository instance
    ->withModel($model)       // <--- OllamaMessagesAPIRepository instance
    ->withTools($tools)       // <--- OllamaMessagesAPIRepository instance
    ->withMessages($messages) // <--- MessagesEndpoint instance
    ->handle();

// Embeddings
Ollama::embeddings()
    ->withModel($model)
    ->withInput($convo)
    ->handle();
```