Getting Started
Add the power of generative AI to your Nuxt project with the Nuxt Ollama module.
Add Nuxt Ollama to your project
- Install Ollama:
Follow the official guide to install Ollama on your system: https://ollama.com/download
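Once installed, Ollama serves its HTTP API on http://localhost:11434 by default and answers GET / with the plain text "Ollama is running". As a quick sanity check before wiring up the module, you can probe the server from a script (a minimal sketch using the built-in fetch; the file name is illustrative):

```ts
// check-ollama.ts — illustrative connectivity check (run with e.g. `npx tsx check-ollama.ts`)
try {
  const res = await fetch('http://localhost:11434/')
  console.log(await res.text()) // prints "Ollama is running" on a default install
} catch {
  console.error('Ollama is not reachable on localhost:11434')
}
```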
- Install the module:
- Easy installation using nuxi:

```sh
npx nuxi module add nuxt-ollama
```
- or install it with your package manager:

```sh
npm install -D nuxt-ollama
```
- Add the module to your `nuxt.config.ts` and set your Ollama options:
```ts
// ~/nuxt.config.ts
export default defineNuxtConfig({
  // ...
  modules: [
    'nuxt-ollama'
  ],
  ollama: {
    protocol: 'http',  // or 'https'
    host: 'localhost', // domain or IP address
    port: 11434,       // port number (Ollama's default)
    proxy: false,      // use proxy
  }
})
```
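With these defaults the module targets http://localhost:11434, which is where a local Ollama install listens. As an illustration (the hostname below is a placeholder, and the exact behavior of `proxy` is defined by the module), a remote HTTPS deployment only needs the same options adjusted:

```ts
// ~/nuxt.config.ts — hypothetical remote setup
export default defineNuxtConfig({
  modules: ['nuxt-ollama'],
  ollama: {
    protocol: 'https',          // TLS endpoint instead of plain http
    host: 'ollama.example.com', // placeholder domain
    port: 443,
    proxy: true,                // see the module's proxy option above
  }
})
```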
- Send a chat message to your Ollama server:
The Vue 3 composable and the server-side util share the same name, so the code is identical on both sides.
```ts
// identical in a component's <script setup> and in server-side code
const ollama = useOllama()

const response = await ollama.chat({
  model: 'llama3.1', // any model you have pulled into your Ollama instance
  messages: [
    {
      role: 'user',
      content: 'Hello world!'
    }
  ]
})
```
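On the server side, the same call can back a Nitro API route. The sketch below is illustrative: the route path and request shape are made up for the example, and the response is assumed to follow the official ollama-js shape, where the reply text lives in `response.message.content`.

```ts
// ~/server/api/chat.post.ts — hypothetical route built on the server-side util
export default defineEventHandler(async (event) => {
  // illustrative request shape: { prompt: string }
  const { prompt } = await readBody<{ prompt: string }>(event)

  const ollama = useOllama()
  const response = await ollama.chat({
    model: 'llama3.1',
    messages: [{ role: 'user', content: prompt }]
  })

  // assumes the ollama-js response shape
  return { reply: response.message.content }
})
```

A component could then call this route with `await $fetch('/api/chat', { method: 'POST', body: { prompt: 'Hello world!' } })`.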