The Ollama provider for the AI SDK contains language model support for the Ollama chat and completion APIs and embedding model support for the Ollama embeddings API.
The Ollama provider is available in the `@ai-sdk/ollama` module. You can install it with:

```shell
npm i @ai-sdk/ollama
```
You can import the default provider instance `ollama` from `@ai-sdk/ollama`:

```ts
import { ollama } from '@ai-sdk/ollama';
```
You can then use it with functions such as `generateText`, passing the name of a model that you have pulled locally:

```ts
import { ollama } from '@ai-sdk/ollama';
import { generateText } from 'ai';

// Use a locally available model, e.g. after running `ollama pull llama3`.
const { text } = await generateText({
  model: ollama('llama3'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
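The provider also supports the Ollama embeddings API mentioned above. A minimal sketch, assuming the provider follows the common AI SDK embedding pattern with an `ollama.embedding()` model factory and a locally pulled `nomic-embed-text` model (both are assumptions; check the provider documentation for the exact factory name):

```typescript
import { ollama } from '@ai-sdk/ollama';
import { embed } from 'ai';

// Assumes a running local Ollama server and a pulled embedding model,
// e.g. after `ollama pull nomic-embed-text`.
const { embedding } = await embed({
  model: ollama.embedding('nomic-embed-text'),
  value: 'sunny day at the beach',
});

// `embedding` is a numeric vector; its length is the model's dimensionality.
console.log(embedding.length);
```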
Please check out the Ollama provider documentation for more information.