### chat
```javascript
ollama.chat(request)
```
- `request` `<Object>`: The request object containing chat parameters.
  - `model` `<string>`: The name of the model to use for the chat.
  - `messages` `<Message[]>`: Array of message objects representing the chat history.
    - `role` `<string>`: The role of the message sender (`'user'`, `'system'`, or `'assistant'`).
    - `content` `<string>`: The content of the message.
    - `images` `<Uint8Array[] | string[]>`: (Optional) Images to be included in the message, either as `Uint8Array` or base64-encoded strings.
  - `format` `<string>`: (Optional) Set the expected format of the response (`json`).
  - `stream` `<boolean>`: (Optional) When `true`, an `AsyncGenerator` is returned.
  - `keep_alive` `<string | number>`: (Optional) How long to keep the model loaded.
  - `tools` `<Tool[]>`: (Optional) A list of tools the model may call.
  - `options` `<Options>`: (Optional) Options to configure the runtime.